Sample records for structural reliability models

  1. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic Willam-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  2. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  3. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  4. A Note on Structural Equation Modeling Estimates of Reliability

    ERIC Educational Resources Information Center

    Yang, Yanyun; Green, Samuel B.

    2010-01-01

    Reliability can be estimated using structural equation modeling (SEM). Two potential problems with this approach are that estimates may be unstable with small sample sizes and biased with misspecified models. A Monte Carlo study was conducted to investigate the quality of SEM estimates of reliability by themselves and relative to coefficient…
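
    As a concrete illustration of the SEM-based reliability estimate discussed in this record, the sketch below computes coefficient omega from the loadings and error variances of a fitted one-factor (congeneric) model and compares it with coefficient alpha. The numbers are invented for illustration and are not taken from the study.

      import numpy as np

      # Hypothetical standardized loadings from a fitted one-factor model
      # (illustrative values only, not data from the study).
      loadings = np.array([0.7, 0.6, 0.8, 0.5])
      error_var = 1.0 - loadings**2        # item uniquenesses under standardization

      # SEM-based reliability of the summed score (coefficient omega):
      # omega = (sum of loadings)^2 / [(sum of loadings)^2 + sum of error variances]
      omega = loadings.sum()**2 / (loadings.sum()**2 + error_var.sum())

      # Coefficient alpha computed from the model-implied item covariance matrix.
      sigma = np.outer(loadings, loadings)
      np.fill_diagonal(sigma, 1.0)         # unit item variances
      k = len(loadings)
      alpha = (k / (k - 1)) * (1.0 - np.trace(sigma) / sigma.sum())

      print(f"omega = {omega:.3f}, alpha = {alpha:.3f}")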

  5. Reliability of Summed Item Scores Using Structural Equation Modeling: An Alternative to Coefficient Alpha

    ERIC Educational Resources Information Center

    Green, Samuel B.; Yang, Yanyun

    2009-01-01

    A method is presented for estimating reliability using structural equation modeling (SEM) that allows for nonlinearity between factors and item scores. Assuming the focus is on consistency of summed item scores, this method for estimating reliability is preferred to those based on linear SEM models and to the most commonly reported estimate of…

  6. Reliability Modeling of Double Beam Bridge Crane

    NASA Astrophysics Data System (ADS)

    Han, Zhu; Tong, Yifei; Luan, Jiahui; Xiangdong, Li

    2018-05-01

    This paper briefly describes the structure of the double beam bridge crane and defines its basic parameters. According to the structure and system division of the double beam bridge crane, the reliability architecture of the double beam bridge crane system is proposed, and the reliability mathematical model is constructed.

  7. Reliability Analysis of Sealing Structure of Electromechanical System Based on Kriging Model

    NASA Astrophysics Data System (ADS)

    Zhang, F.; Wang, Y. M.; Chen, R. W.; Deng, W. W.; Gao, Y.

    2018-05-01

    The sealing performance of an aircraft electromechanical system has a great influence on flight safety, and the reliability of its typical seal structure has been analyzed by researchers. In this paper, we take the reciprocating seal structure as the research object for studying structural reliability. Based on finite element numerical simulation, the contact stress between the rubber sealing ring and the cylinder wall is calculated, the relationship between the contact stress and the pressure of the hydraulic medium is established, and the friction forces under different working conditions are compared. Through co-simulation, an adaptive Kriging model obtained with the EFF learning mechanism is used to describe the failure probability of the seal ring, so as to evaluate the reliability of the sealing structure. This article proposes a new numerical-evaluation approach for the reliability analysis of sealing structures, and also provides a theoretical basis for their optimal design.
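
    A rough sketch of the active-learning Kriging idea behind this record is given below. It uses scikit-learn's Gaussian process regressor and, for brevity, the simpler U learning function of AK-MCS rather than the EFF criterion used in the paper; the limit state g(x) is a made-up analytic function standing in for the finite element contact-stress model, so everything here is an illustrative assumption.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(0)

      def g(x):
          # Hypothetical limit state: failure when g <= 0 (stand-in for the FE seal model).
          return 4.0 - x[:, 0]**2 - x[:, 1]

      X_mc = rng.standard_normal((20000, 2))     # Monte Carlo population of the random inputs

      X_doe = rng.standard_normal((12, 2))       # initial design of experiments
      y_doe = g(X_doe)
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)

      for _ in range(30):                        # active-learning iterations
          gp.fit(X_doe, y_doe)
          mu, sd = gp.predict(X_mc, return_std=True)
          U = np.abs(mu) / np.maximum(sd, 1e-12) # U learning function (AK-MCS)
          if U.min() >= 2.0:                     # sign of g predicted reliably everywhere
              break
          x_new = X_mc[np.argmin(U)]             # most ambiguous Monte Carlo sample
          X_doe = np.vstack([X_doe, x_new])
          y_doe = np.append(y_doe, g(x_new[None, :]))

      pf = np.mean(mu <= 0.0)                    # failure probability from the surrogate
      print(f"estimated failure probability: {pf:.4f} after {len(y_doe)} g-evaluations")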

  8. Interval Estimation of Revision Effect on Scale Reliability via Covariance Structure Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2009-01-01

    A didactic discussion of a procedure for interval estimation of change in scale reliability due to revision is provided, which is developed within the framework of covariance structure modeling. The method yields ranges of plausible values for the population gain or loss in reliability of unidimensional composites, which results from deletion or…

  9. Estimate of the Reliability in Geological Forecasts for Tunnels: Toward a Structured Approach

    NASA Astrophysics Data System (ADS)

    Perello, Paolo

    2011-11-01

    In tunnelling, a reliable geological model often allows the designer to provide an effective design and to face the construction phase without unpleasant surprises. A geological model can be considered reliable when it is a valid support for correctly foreseeing the rock mass behaviour, therefore preventing unexpected events during the excavation. The higher the model reliability, the lower the probability of unforeseen rock mass behaviour. Unfortunately, owing to different reasons, geological models are affected by uncertainties, and a fully reliable knowledge of the rock mass is, in most cases, impossible. Therefore, estimating the degree to which a geological model is reliable becomes a primary requirement in order to save time and money and to adopt the appropriate construction strategy. The definition of geological model reliability is often achieved by engineering geologists through an unstructured analytical process and variable criteria. This paper focuses on geological models for projects of linear underground structures and represents an effort to analyse and include in a conceptual framework the factors influencing such models. An empirical parametric procedure is then developed with the aim of obtaining an index called the "geological model rating (GMR)", which can be used to provide a more standardised definition of geological model reliability.

  10. Probabilistic Finite Element Analysis & Design Optimization for Structural Designs

    NASA Astrophysics Data System (ADS)

    Deivanayagam, Arumugam

    This study focuses on implementing the probabilistic nature of material properties (Kevlar® 49) in the existing deterministic finite element analysis (FEA) of a fabric-based engine containment system through Monte Carlo simulations (MCS), and on implementing probabilistic analysis in engineering design through reliability-based design optimization (RBDO). First, the emphasis is on experimental data analysis focusing on probabilistic distribution models which characterize the randomness associated with the experimental data. The material properties of Kevlar® 49 are modeled using experimental data analysis and implemented along with an existing spiral modeling scheme (SMS) and user-defined constitutive model (UMAT) for fabric-based engine containment simulations in LS-DYNA. MCS of the model are performed to observe the failure pattern and exit velocities of the models, and the solutions are compared with NASA experimental tests and deterministic results. MCS with probabilistic material data give a better perspective on the results than a single deterministic simulation. The next part of the research implements the probabilistic material properties in engineering designs. The main aim of structural design is to obtain optimal solutions. However, even though a deterministic optimization problem yields cost-effective structures, the result becomes highly unreliable if the uncertainty that may be associated with the system (material properties, loading, etc.) is not represented or considered in the solution process. A reliable and optimal solution can be obtained by performing reliability optimization along with the deterministic optimization, which is RBDO. In the RBDO problem formulation, reliability constraints are considered in addition to structural performance constraints. This part of the research starts with an introduction to reliability analysis methods, such as first-order and second-order reliability analysis, followed by simulation techniques that are performed to obtain the probability of failure and reliability of structures. Next, a decoupled RBDO procedure is proposed with a new reliability analysis formulation with sensitivity analysis, which is performed to remove the highly reliable constraints in the RBDO, thereby reducing the computational time and function evaluations. Finally, the implementation of the reliability analysis concepts and RBDO in finite element 2D truss problems and a planar beam problem is presented and discussed.
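
    The core Monte Carlo step described in this record can be sketched in a few lines: sample the material strength from a distribution fitted to test data, evaluate a (here drastically simplified) response model for each sample, and count exceedances. The lognormal strength parameters and the demand model below are illustrative assumptions, not the Kevlar 49 data or the LS-DYNA containment model from the study.

      import numpy as np

      rng = np.random.default_rng(42)
      n_samples = 100_000

      # Illustrative probabilistic material strength (lognormal) and applied demand (normal).
      strength = rng.lognormal(mean=np.log(550.0), sigma=0.08, size=n_samples)   # MPa
      demand = rng.normal(loc=450.0, scale=40.0, size=n_samples)                 # MPa

      # Limit state g = strength - demand; failure when g < 0.
      pf = np.count_nonzero(strength - demand < 0.0) / n_samples
      cov = np.sqrt((1.0 - pf) / (pf * n_samples))   # coefficient of variation of the estimator

      print(f"pf ~ {pf:.4e}  (estimator c.o.v. ~ {cov:.2f})")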

  11. Composite Stress Rupture: A New Reliability Model Based on Strength Decay

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2012-01-01

    A model is proposed to estimate reliability for stress rupture of composite overwrap pressure vessels (COPVs) and similar composite structures. This new reliability model is generated by assuming a strength degradation (or decay) over time. The model suggests that most of the strength decay occurs late in life. The strength decay model will be shown to predict a response similar to that predicted by a traditional reliability model for stress rupture based on tests at a single stress level. In addition, the model predicts that even though there is strength decay due to proof loading, a significant overall increase in reliability is gained by eliminating any weak vessels, which would fail early. The model predicts that there should be significant periods of safe life following proof loading, because time is required for the strength to decay from the proof stress level to the subsequent loading level. Suggestions for testing the strength decay reliability model have been made. If the strength decay reliability model predictions are shown through testing to be accurate, COPVs may be designed to carry a higher level of stress than is currently allowed, which will enable the production of lighter structures.

  12. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
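
    One way to see how the epistemic uncertainty "governs" durability reliability in the purely probabilistic formulation mentioned here is a double-loop simulation: the outer loop samples the imprecisely known distribution parameters, the inner loop computes the failure probability conditional on them, and the output is a range of reliabilities rather than a single number. The distributions below are placeholders, not the marine-chloride corrosion model of the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def conditional_pf(mu_capacity, n_inner=20_000):
          """Aleatory (inner) loop: failure probability given one epistemic realization."""
          capacity = rng.normal(mu_capacity, 5.0, n_inner)   # e.g. resistance of the cover
          demand = rng.normal(35.0, 6.0, n_inner)            # e.g. chloride-induced demand
          return np.mean(capacity < demand)

      # Epistemic (outer) loop: the mean capacity itself is uncertain due to scarce data.
      mu_samples = rng.normal(45.0, 3.0, 500)
      pf_samples = np.array([conditional_pf(m) for m in mu_samples])

      print(f"pf 5th-95th percentile: [{np.percentile(pf_samples, 5):.3e}, "
            f"{np.percentile(pf_samples, 95):.3e}], mean pf: {pf_samples.mean():.3e}")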

  13. Behavioral Scale Reliability and Measurement Invariance Evaluation Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2004-01-01

    A latent variable modeling approach to reliability and measurement invariance evaluation for multiple-component measuring instruments is outlined. An initial discussion deals with the limitations of coefficient alpha, a frequently used index of composite reliability. A widely and readily applicable structural modeling framework is next described…

  14. System reliability approaches for advanced propulsion system structures

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Mahadevan, S.

    1991-01-01

    This paper identifies significant issues that pertain to the estimation and use of system reliability in the design of advanced propulsion system structures. Linkages between the reliabilities of individual components and their effect on system design issues such as performance, cost, availability, and certification are examined. The need for system reliability computation to address the continuum nature of propulsion system structures and synergistic progressive damage modes has been highlighted. Available system reliability models are observed to apply only to discrete systems. Therefore a sequential structural reanalysis procedure is formulated to rigorously compute the conditional dependencies between various failure modes. The method is developed in a manner that supports both top-down and bottom-up analyses in system reliability.

  15. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) code that includes fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.

  16. Fatigue reliability of deck structures subjected to correlated crack growth

    NASA Astrophysics Data System (ADS)

    Feng, G. Q.; Garbatov, Y.; Guedes Soares, C.

    2013-12-01

    The objective of this work is to analyse the fatigue reliability of deck structures subjected to correlated crack growth. The stress intensity factors of the correlated cracks are obtained by finite element analysis, based on which the geometry correction functions are derived. Monte Carlo simulations are applied to predict the statistical descriptors of correlated cracks based on the Paris-Erdogan equation. A probabilistic model of crack growth as a function of time is used to analyse the fatigue reliability of deck structures accounting for the crack propagation correlation. A deck structure is modelled as a series system of stiffened panels, where a stiffened panel is regarded as a parallel system composed of plates and longitudinals. It is shown that the method developed here can be conveniently applied to perform the fatigue reliability assessment of structures subjected to correlated crack growth.
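
    The crack-growth sampling described in this record rests on the Paris-Erdogan law, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa). A minimal Monte Carlo sketch for a single (uncorrelated) crack is shown below; the material constants, geometry factor and stress-range statistics are generic placeholders, and the correlation between cracks modelled in the paper is not reproduced.

      import numpy as np

      rng = np.random.default_rng(7)
      n_samples = 2_000
      Y = 1.12                          # geometry correction factor (assumed constant)
      a_crit = 0.05                     # critical crack size, m
      n_years, cycles_per_year = 20, 1.0e6
      step = 10_000                     # integration block, cycles

      # Random inputs (placeholder distributions).
      a = rng.lognormal(np.log(0.5e-3), 0.3, n_samples)        # initial crack size, m
      C = rng.lognormal(np.log(2.0e-11), 0.3, n_samples)       # Paris coefficient
      dsigma = rng.normal(60.0, 10.0, n_samples).clip(min=1.0) # stress range, MPa
      m = 3.0                                                   # Paris exponent (fixed)

      failed_year = np.full(n_samples, np.inf)
      for block in range(int(n_years * cycles_per_year // step)):
          dK = Y * dsigma * np.sqrt(np.pi * a)                  # MPa*sqrt(m)
          a = np.minimum(a + C * dK**m * step, a_crit)          # Euler step of the Paris law
          year = (block + 1) * step / cycles_per_year
          failed_year[(a >= a_crit) & np.isinf(failed_year)] = year

      pf = [np.mean(failed_year <= t) for t in range(1, n_years + 1)]
      print("cumulative failure probability at years 5, 10, 15, 20:",
            [f"{pf[t - 1]:.3f}" for t in (5, 10, 15, 20)])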

  17. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability p (or p = 1). Weight can be reduced to a small value for the most failure-prone design with a reliability that approaches zero (p = 0). Reliability can be changed for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas the reliability can be relaxed to a small extent for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code was the deterministic analysis tool, (2) the fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  18. Assessment of concrete damage and strength degradation caused by reinforcement corrosion

    NASA Astrophysics Data System (ADS)

    Nepal, Jaya; Chen, Hua-Peng

    2015-07-01

    The structural performance deterioration of reinforced concrete structures has been extensively investigated, but very limited studies have been carried out on the effect of reinforcement corrosion on time-dependent reliability with consideration of the influence of the mechanical characteristics of the bond interface due to corrosion. This paper deals with how corrosion in the reinforcement creates different types of defects in a concrete structure and how these defects are responsible for the structural capacity deterioration of corrosion-affected reinforced concrete structures during their service life. Cracking in the cover concrete due to reinforcement corrosion is investigated by using a rebar-concrete model and realistic concrete properties. The flexural strength deterioration is analytically predicted on the basis of the bond strength evolution due to reinforcement corrosion, which is verified against the available experimental data. A time-dependent reliability analysis is undertaken to calculate the lifetime structural reliability of corrosion-damaged concrete structures by stochastic deterioration modelling of reinforced concrete. The results from the numerical example show that the proposed approach is capable of evaluating the damage caused by reinforcement corrosion and also of predicting the structural reliability of concrete structures during their lifecycle.

  19. Structural reliability analysis under evidence theory using the active learning kriging model

    NASA Astrophysics Data System (ADS)

    Yang, Xufeng; Liu, Yongshou; Ma, Panke

    2017-11-01

    Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.

  20. Evaluation of Validity and Reliability for Hierarchical Scales Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2012-01-01

    A latent variable modeling method is outlined, which accomplishes estimation of criterion validity and reliability for a multicomponent measuring instrument with hierarchical structure. The approach provides point and interval estimates for the scale criterion validity and reliability coefficients, and can also be used for testing composite or…

  1. Synthesis, Characterization And Modeling Of Functionally Graded Multifunctional Hybrid Composites For Extreme Environments

    DTIC Science & Technology

    2017-04-04

    The research thrust areas are designed to enable the development of reliable, damage tolerant, lightweight structures with excellent thermal management…

  2. Software reliability: Application of a reliability model to requirements error analysis

    NASA Technical Reports Server (NTRS)

    Logan, J.

    1980-01-01

    The application of a software reliability model having a well defined correspondence of computer program properties to requirements error analysis is described. Requirements error categories which can be related to program structural elements are identified and their effect on program execution considered. The model is applied to a hypothetical B-5 requirement specification for a program module.

  3. A Reliability Estimation in Modeling Watershed Runoff With Uncertainties

    NASA Astrophysics Data System (ADS)

    Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.

    1990-10-01

    The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
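
    For reference, the first-order second-moment (FOSM) technique mentioned in this record propagates parameter means and variances through the model via first-order sensitivities. The toy "runoff model" (an SCS curve-number relation) and the parameter statistics below are assumptions for illustration, not HEC-1.

      import numpy as np

      def peak_discharge(params):
          """Toy stand-in for a watershed model: peak flow from curve number CN and storm depth."""
          cn, rain = params
          s = 25400.0 / cn - 254.0                         # potential retention, mm
          runoff = max(rain - 0.2 * s, 0.0)**2 / (rain + 0.8 * s)
          return 2.5 * runoff                              # arbitrary routing factor

      mean = np.array([75.0, 120.0])                       # mean CN and mean storm depth (mm)
      std = np.array([8.0, 25.0])                          # standard deviations (assumed independent)

      # First-order sensitivities by central finite differences at the mean point.
      grad = np.zeros(2)
      for i in range(2):
          dp = 1e-3 * mean[i]
          up, dn = mean.copy(), mean.copy()
          up[i] += dp
          dn[i] -= dp
          grad[i] = (peak_discharge(up) - peak_discharge(dn)) / (2.0 * dp)

      q_mean = peak_discharge(mean)
      q_std = np.sqrt(np.sum((grad * std)**2))             # FOSM variance propagation
      print(f"peak discharge ~ mean {q_mean:.1f}, std {q_std:.1f} (first-order estimate)")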

  4. GalaxyTBM: template-based modeling by building a reliable core and refining unreliable local regions.

    PubMed

    Ko, Junsu; Park, Hahnbeom; Seok, Chaok

    2012-08-10

    Protein structures can be reliably predicted by template-based modeling (TBM) when experimental structures of homologous proteins are available. However, it is challenging to obtain structures more accurate than the single best templates by either combining information from multiple templates or by modeling regions that vary among templates or are not covered by any templates. We introduce GalaxyTBM, a new TBM method in which the more reliable core region is modeled first from multiple templates and less reliable, variable local regions, such as loops or termini, are then detected and re-modeled by an ab initio method. This TBM method is based on "Seok-server," which was tested in CASP9 and assessed to be amongst the top TBM servers. The accuracy of the initial core modeling is enhanced by focusing on more conserved regions in the multiple-template selection and multiple sequence alignment stages. Additional improvement is achieved by ab initio modeling of up to 3 unreliable local regions in the fixed framework of the core structure. Overall, GalaxyTBM reproduced the performance of Seok-server, with GalaxyTBM and Seok-server resulting in average GDT-TS of 68.1 and 68.4, respectively, when tested on 68 single-domain CASP9 TBM targets. For application to multi-domain proteins, GalaxyTBM must be combined with domain-splitting methods. Application of GalaxyTBM to CASP9 targets demonstrates that accurate protein structure prediction is possible by use of a multiple-template-based approach, and ab initio modeling of variable regions can further enhance the model quality.

  5. A general graphical user interface for automatic reliability modeling

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1991-01-01

    Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.

  6. Improved reliability of wind turbine towers with active tuned mass dampers (ATMDs)

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Breiffni; Sarkar, Saptarshi; Staino, Andrea

    2018-04-01

    Modern multi-megawatt wind turbines are composed of slender, flexible, and lightly damped blades and towers. These components exhibit high susceptibility to wind-induced vibrations. As the size, flexibility and cost of the towers have increased in recent years, the need to protect these structures against damage induced by turbulent aerodynamic loading has become apparent. This paper combines structural dynamic models and probabilistic assessment tools to demonstrate improvements in structural reliability when modern wind turbine towers are equipped with active tuned mass dampers (ATMDs). The study proposes a multi-modal wind turbine model for wind turbine control design and analysis, and incorporates an ATMD into the tower of this model. The model is subjected to stochastically generated wind loads of varying speeds to develop wind-induced probabilistic demand models for towers of modern multi-megawatt wind turbines under structural uncertainty. Numerical simulations have been carried out to ascertain the effectiveness of the active control system in improving the structural performance of the wind turbine and its reliability. The study constructs fragility curves, which illustrate reductions in the vulnerability of towers to wind loading owing to the inclusion of the damper. Results show that the active controller is successful in increasing the reliability of the tower responses. According to the analysis carried out in this paper, a strong reduction of the probability of exceeding a given displacement at the rated wind speed has been observed.

  7. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
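
    The static (lower-bound) formulation described in this record is an ordinary linear program: maximize the load factor subject to equilibrium and yield constraints. The sketch below applies it to a deliberately tiny rigid-plastic system (three parallel bars sharing one load) using scipy; the rigid plastic frames of the paper would only change the equilibrium matrix and yield limits, and the capacities and reference load here are invented.

      import numpy as np
      from scipy.optimize import linprog

      # Three rigid-plastic bars in parallel carrying a vertical reference load P.
      # Unknowns: bar forces N1, N2, N3 and the load factor lam.
      P = 100.0                                     # reference load, kN
      N_yield = np.array([60.0, 80.0, 100.0])       # plastic capacities, kN (illustrative)

      c = np.array([0.0, 0.0, 0.0, -1.0])           # maximize lam  ->  minimize -lam
      A_eq = np.array([[1.0, 1.0, 1.0, -P]])        # equilibrium: N1 + N2 + N3 = lam * P
      b_eq = np.array([0.0])
      bounds = [(-n, n) for n in N_yield] + [(0.0, None)]   # yield limits, lam >= 0

      res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
      print(f"collapse load factor lam* = {res.x[-1]:.2f}")  # expect (60+80+100)/100 = 2.4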

  8. Computer-aided design of polymers and composites

    NASA Technical Reports Server (NTRS)

    Kaelble, D. H.

    1985-01-01

    This book on computer-aided design of polymers and composites introduces and discusses the subject from the viewpoint of atomic and molecular models. Thus, the origins of stiffness, strength, extensibility, and fracture toughness in composite materials can be analyzed directly in terms of chemical composition and molecular structure. Aspects of polymer composite reliability are considered along with characterization techniques for composite reliability, relations between atomic and molecular properties, computer aided design and manufacture, polymer CAD/CAM models, and composite CAD/CAM models. Attention is given to multiphase structural adhesives, fibrous composite reliability, metal joint reliability, polymer physical states and transitions, chemical quality assurance, processability testing, cure monitoring and management, nondestructive evaluation (NDE), surface NDE, elementary properties, ionic-covalent bonding, molecular analysis, acid-base interactions, the manufacturing science, and peel mechanics.

  9. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    NASA Astrophysics Data System (ADS)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the use of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption on improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to the SEM analysis, employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079<0.08, GFI=0.824, CFI=0.962>0.90, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships among the constructs are positive and significant. Ultimately, the analysis verified that most of the respondents foresee a better understanding of project input information through BIM visualization, its reliable database and coordinated data, in developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.

  10. Design of high reliability organizations in health care.

    PubMed

    Carroll, J S; Rudolph, J W

    2006-12-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self-understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self-design for safety and reliability.

  11. Care 3 model overview and user's guide, first revision

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Petersen, P. L.

    1985-01-01

    A manual was written to introduce the CARE III (Computer-Aided Reliability Estimation) capability to reliability and design engineers who are interested in predicting the reliability of highly reliable fault-tolerant systems. It was also structured to serve as a quick-look reference manual for more experienced users. The guide covers CARE III modeling and reliability predictions for execution on the CDC Cyber 170 series computers, the DEC VAX-11/700 series computers, and most machines that compile ANSI Standard FORTRAN 77.

  12. Probabilistic simulation of the human factor in structural reliability

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Many structural failures have occasionally been attributed to human factors in engineering design, analyses, maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Factors such as societal, physical, professional, psychological, and many others introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and the associated uncertainties in structural reliability requires: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with human performance. This approach consists of a multifactor model in conjunction with direct Monte Carlo simulation.

  13. Protocol and Demonstrations of Probabilistic Reliability Assessment for Structural Health Monitoring Systems (Preprint)

    DTIC Science & Technology

    2011-11-01

    assessment to quality of localization/characterization estimates. This protocol includes four critical components: (1) a procedure to identify the...critical factors impacting SHM system performance; (2) a multistage or hierarchical approach to SHM system validation; (3) a model-assisted evaluation...Lindgren, E. A., Buynak, C. F., Steffes, G., Derriso, M., "Model-assisted Probabilistic Reliability Assessment for Structural Health Monitoring

  14. Interactive Reliability Model for Whisker-toughened Ceramics

    NASA Technical Reports Server (NTRS)

    Palko, Joseph L.

    1993-01-01

    Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.

  15. Advances in Homology Protein Structure Modeling

    PubMed Central

    Xiang, Zhexin

    2007-01-01

    Homology modeling plays a central role in determining protein structure in the structural genomics project. The importance of homology modeling has been steadily increasing because of the large gap that exists between the overwhelming number of available protein sequences and experimentally solved protein structures, and also, more importantly, because of the increasing reliability and accuracy of the method. In fact, a protein sequence with over 30% identity to a known structure can often be predicted with an accuracy equivalent to a low-resolution X-ray structure. The recent advances in homology modeling, especially in detecting distant homologues, aligning sequences with template structures, modeling of loops and side chains, as well as detecting errors in a model, have contributed to reliable prediction of protein structure, which was not possible even several years ago. The ongoing efforts in solving protein structures, which can be time-consuming and often difficult, will continue to spur the development of a host of new computational methods that can fill in the gap and further contribute to understanding the relationship between protein structure and function. PMID:16787261

  16. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    DOE PAGES

    Guthrie, Michael A.

    2013-01-01

    A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
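
    For a linear limit state with independent normal variables, the Hasofer-Lind index reviewed in this record has a closed form, which the sketch below evaluates for an energy-based limit state g = (modal strain energy at failure) - (peak modal strain energy). The means and standard deviations are invented for illustration, not taken from the paper's examples.

      import numpy as np
      from scipy.stats import norm

      # Capacity: modal strain energy at failure; demand: peak modal strain energy in the shock.
      # Both treated as independent normal variables (illustrative statistics, in joules).
      mu_cap, sd_cap = 250.0, 30.0
      mu_dem, sd_dem = 160.0, 45.0

      # g = capacity - demand is linear, so the Hasofer-Lind index is available in closed form.
      beta = (mu_cap - mu_dem) / np.sqrt(sd_cap**2 + sd_dem**2)
      pf = norm.cdf(-beta)          # failure probability implied by the index

      print(f"reliability index beta = {beta:.2f}, implied pf = {pf:.3e}")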

  17. Experimental application of OMA solutions on the model of industrial structure

    NASA Astrophysics Data System (ADS)

    Mironov, A.; Mironovs, D.

    2017-10-01

    It is very important and sometimes even vital to maintain the reliability of industrial structures. High quality control during production and structural health monitoring (SHM) in exploitation provide reliable functioning of large, massive and remote structures, like wind generators, pipelines, power line posts, etc. This paper introduces a complex of technological and methodical solutions for SHM and diagnostics of industrial structures, including those that are actuated by periodic forces. The solutions were verified on a scaled wind generator model with an integrated system of piezo-film deformation sensors. Simultaneous and multi-patch Operational Modal Analysis (OMA) approaches were implemented as methodical means for structural diagnostics and monitoring. Specially designed data processing algorithms provide objective evaluation of structural state modification.

  18. System reliability of randomly vibrating structures: Computational modeling and laboratory testing

    NASA Astrophysics Data System (ADS)

    Sundar, V. S.; Ammanagi, S.; Manohar, C. S.

    2015-09-01

    The problem of determination of system reliability of randomly vibrating structures arises in many application areas of engineering. We discuss in this paper approaches based on Monte Carlo simulations and laboratory testing to tackle problems of time variant system reliability estimation. The strategy we adopt is based on the application of Girsanov's transformation to the governing stochastic differential equations which enables estimation of probability of failure with significantly reduced number of samples than what is needed in a direct simulation study. Notably, we show that the ideas from Girsanov's transformation based Monte Carlo simulations can be extended to conduct laboratory testing to assess system reliability of engineering structures with reduced number of samples and hence with reduced testing times. Illustrative examples include computational studies on a 10-degree of freedom nonlinear system model and laboratory/computational investigations on road load response of an automotive system tested on a four-post test rig.
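
    As a crude-Monte-Carlo baseline for the approach in this record, the sketch below estimates a time-variant first-passage probability for a single-degree-of-freedom oscillator under Gaussian white noise using Euler-Maruyama integration. The Girsanov-transformation-based importance sampling that the paper uses to cut the sample count is not reproduced, and all system and excitation parameters are assumed values.

      import numpy as np

      rng = np.random.default_rng(3)

      # SDOF oscillator x'' + 2*zeta*wn*x' + wn^2*x = w(t), with w(t) Gaussian white noise.
      wn, zeta = 2.0 * np.pi, 0.05        # natural frequency (rad/s), damping ratio
      S0 = 0.2                            # white-noise spectral intensity (assumed)
      dt, T = 1.0e-3, 10.0                # time step and duration, s
      threshold = 0.28                    # displacement barrier, m (assumed)
      n_samples, n_steps = 5_000, int(T / dt)

      x = np.zeros(n_samples)
      v = np.zeros(n_samples)
      crossed = np.zeros(n_samples, dtype=bool)
      sigma_dW = np.sqrt(2.0 * np.pi * S0 * dt)   # std of the excitation increment per step

      for _ in range(n_steps):            # Euler-Maruyama integration, vectorized over samples
          a = -2.0 * zeta * wn * v - wn**2 * x
          x = x + v * dt
          v = v + a * dt + rng.normal(0.0, sigma_dW, n_samples)
          crossed |= np.abs(x) > threshold

      print(f"first-passage probability over {T:.0f} s: {crossed.mean():.3f}")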

  19. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is multidisciplinary design optimization software. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk, or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability. Weight can be reduced to a small value for the most failure-prone design with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool; the fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator; and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.

  20. A systematic review of the factor structure and reliability of the Spence Children's Anxiety Scale.

    PubMed

    Orgilés, Mireia; Fernández-Martínez, Iván; Guillén-Riquelme, Alejandro; Espada, José P; Essau, Cecilia A

    2016-01-15

    The Spence Children's Anxiety Scale (SCAS) is a widely used instrument for assessing symptoms of anxiety disorders among children and adolescents. Previous studies have demonstrated its good reliability for children and adolescents from different backgrounds. However, remarkable variability in the reliability of the SCAS across studies and inconsistent results regarding its factor structure have been found. The present study aims to examine the SCAS factor structure by means of a systematic review with narrative synthesis, the mean reliability of the SCAS by means of a meta-analysis, and the influence of moderators on the SCAS reliability. Databases employed to collect the studies included Google Scholar, PsycARTICLES, PsycINFO, Web of Science, and Scopus, covering publications since 1997. Twenty-nine and 32 studies, which examined the factor structure and the internal consistency of the SCAS, respectively, were included. The SCAS was found to have strong internal consistency, influenced by different moderators. The systematic review demonstrated that the original six-factor model was supported by most studies. Factorial invariance studies (across age, gender, country) and test-retest reliability of the SCAS were not examined in this study. It is concluded that the SCAS is a reliable instrument for cross-cultural use, and it is suggested that the original six-factor model is appropriate for cross-cultural application. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Design of high reliability organizations in health care

    PubMed Central

    Carroll, J S; Rudolph, J W

    2006-01-01

    To improve safety performance, many healthcare organizations have sought to emulate high reliability organizations from industries such as nuclear power, chemical processing, and military operations. We outline high reliability design principles for healthcare organizations including both the formal structures and the informal practices that complement those structures. A stage model of organizational structures and practices, moving from local autonomy to formal controls to open inquiry to deep self‐understanding, is used to illustrate typical challenges and design possibilities at each stage. We suggest how organizations can use the concepts and examples presented to increase their capacity to self‐design for safety and reliability. PMID:17142607

  2. Computing Reliabilities Of Ceramic Components Subject To Fracture

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.; Gyekenyesi, J. P.; Manderscheid, J. M.

    1992-01-01

    CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. The program uses results from a commercial structural-analysis program (MSC/NASTRAN or ANSYS) to evaluate the reliability of a component in the presence of inherent surface- and/or volume-type flaws. It computes a measure of reliability by use of a finite-element mathematical model applicable to multiple materials, in the sense that the model is made a function of the statistical characterizations of many ceramic materials. The reliability analysis uses element stress, temperature, area, and volume outputs, obtained from two-dimensional shell and three-dimensional solid isoparametric or axisymmetric finite elements. Written in FORTRAN 77.
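
    The volume-flaw calculation that CARES performs can be illustrated with a two-parameter Weibull, weakest-link model: each element contributes exp[-V_i (sigma_i / sigma_0)^m] to the survival probability, with element stresses and volumes normally taken from the MSC/NASTRAN or ANSYS run. The element data and Weibull parameters below are invented placeholders.

      import numpy as np

      # Element data that CARES would read from a finite element run (placeholder values).
      elem_volume = np.array([2.0e-5, 1.5e-5, 3.0e-5, 2.5e-5])   # m^3
      elem_stress = np.array([180.0, 220.0, 150.0, 240.0])       # max principal stress, MPa

      # Two-parameter Weibull material statistics for volume flaws (illustrative).
      m = 10.0                 # Weibull modulus
      sigma_0 = 200.0          # scale parameter, MPa * m^(3/m)

      # Weakest-link survival probability: Ps = exp(-sum V_i * (sigma_i / sigma_0)^m),
      # counting only elements in tension; component failure probability is the complement.
      tensile = elem_stress > 0.0
      risk = np.sum(elem_volume[tensile] * (elem_stress[tensile] / sigma_0)**m)
      ps = np.exp(-risk)
      print(f"survival probability: {ps:.6f}, failure probability: {1.0 - ps:.2e}")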

  3. Fine reservoir structure modeling based upon 3D visualized stratigraphic correlation between horizontal wells: methodology and its application

    NASA Astrophysics Data System (ADS)

    Chenghua, Ou; Chaochun, Li; Siyuan, Huang; Sheng, James J.; Yuan, Xu

    2017-12-01

    As the platform-based horizontal well production mode has been widely applied in the petroleum industry, building a reliable fine reservoir structure model by using horizontal well stratigraphic correlation has become very important. Horizontal wells usually extend between the upper and bottom boundaries of the target formation, with limited penetration points. Using these limited penetration points to conduct well deviation correction means that the formation depth information obtained is not accurate, which makes it hard to build a fine structure model. In order to solve this problem, a method of fine reservoir structure modeling, based on 3D visualized stratigraphic correlation among horizontal wells, is proposed. This method can increase the accuracy when estimating the depth of the penetration points, and can also effectively predict the top and bottom interfaces in the horizontal penetrating section. Moreover, this method will greatly increase not only the number of depth data points available, but also the accuracy of these data, which achieves the goal of building a reliable fine reservoir structure model by using the stratigraphic correlation among horizontal wells. Using this method, four 3D fine structure layer models have been successfully built for a specimen shale gas field with the platform-based horizontal well production mode. The shale gas field is located in the east of the Sichuan Basin, China; the successful application of the method has proven its feasibility and reliability.

  4. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also 0.0001.

  5. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and that for the composite built-up structure is also 0.0001.
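
    The multifactor interaction model referred to in this record (and its duplicate above) expresses a material property as a product of factor ratios. A hedged sketch of that product form is given below; the factors, reference values, and exponents are made up for illustration, since the abstract does not list the ones actually used.

      import numpy as np

      def mfim_property(p0, current, reference, final, exponents):
          """
          Multifactor interaction model (product form):
              P / P0 = prod_i [ (A_F,i - A_i) / (A_F,i - A_0,i) ]**n_i
          where A_i is the current value of factor i, A_0,i its reference value,
          A_F,i its final (ultimate) value, and n_i an empirical exponent.
          """
          ratios = (final - current) / (final - reference)
          return p0 * np.prod(ratios ** exponents)

      # Illustrative example: ply strength degraded by temperature and moisture.
      p0 = 1500.0                                   # reference ply strength, MPa
      current = np.array([120.0, 0.5])              # current temperature (C), moisture (%)
      reference = np.array([20.0, 0.0])             # reference conditions
      final = np.array([300.0, 2.0])                # final (ultimate) conditions
      exponents = np.array([0.5, 0.25])             # assumed exponents

      print(f"degraded strength ~ {mfim_property(p0, current, reference, final, exponents):.0f} MPa")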

  6. Integrated material state awareness system with self-learning symbiotic diagnostic algorithms and models

    NASA Astrophysics Data System (ADS)

    Banerjee, Sourav; Liu, Lie; Liu, S. T.; Yuan, Fuh-Gwo; Beard, Shawn

    2011-04-01

    Materials State Awareness (MSA) goes beyond traditional NDE and SHM in its challenge to characterize the current state of material damage before the onset of macro-damage such as cracks. A highly reliable, minimally invasive system for MSA of aerospace structures, naval structures, and next generation space systems is critically needed. Development of such a system will require a reliable SHM system that can detect the onset of damage well before the flaw grows to a critical size. Therefore, it is important to develop an integrated SHM system that not only detects macroscale damage in the structures but also provides an early indication of flaw precursors and microdamage. The early warning for flaw precursors and their evolution provided by an SHM system can then be used to define remedial strategies before the structural damage leads to failure, and to significantly improve the safety and reliability of the structures. Thus, in this article a preliminary concept of developing the Hybrid Distributed Sensor Network Integrated with Self-learning Symbiotic Diagnostic Algorithms and Models to accurately and reliably detect the precursors to damage that occurs in the structure is discussed. Experiments conducted in a laboratory environment show the potential of the proposed technique.

  7. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    NASA Astrophysics Data System (ADS)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network of regression (FR), called DCFRM, is proposed by integrating the distributed collaborative response surface method and the fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode of the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. The comparison of methods shows that DCFRM reshapes the probabilistic analysis of multi-failure structures and improves the computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and method of mechanical reliability design.

  8. Scheduling structural health monitoring activities for optimizing life-cycle costs and reliability of wind turbines

    NASA Astrophysics Data System (ADS)

    Hanish Nithin, Anu; Omenzetter, Piotr

    2017-04-01

    Optimization of the life-cycle costs and reliability of offshore wind turbines (OWTs) is an area of immense interest due to the widespread increase in wind power generation across the world. Most of the existing studies have used structural reliability and the Bayesian pre-posterior analysis for optimization. This paper proposes an extension of the previous approaches in a framework for probabilistic optimization of the total life-cycle costs and reliability of OWTs, combining elements of structural reliability/risk analysis (SRA) and the Bayesian pre-posterior analysis with optimization through a genetic algorithm (GA). The SRA techniques are adopted to compute the probabilities of damage occurrence and failure associated with the deterioration model. The probabilities are used in the decision tree and are updated using the Bayesian analysis. The output of this framework determines the optimal structural health monitoring and maintenance schedules to be implemented during the life span of OWTs while maintaining a trade-off between the life-cycle costs and the risk of structural failure. Numerical illustrations with a generic deterioration model for one monitoring exercise in the life cycle of a system are demonstrated. Two case scenarios, namely building an initially expensive but robust structure versus a cheaper but more quickly deteriorating one, and adopting an expensive monitoring system, are presented to aid in the decision-making process.

  9. A semi-supervised learning approach for RNA secondary structure prediction.

    PubMed

    Yonemoto, Haruka; Asai, Kiyoshi; Hamada, Michiaki

    2015-08-01

    RNA secondary structure prediction is a key technology in RNA bioinformatics. Most algorithms for RNA secondary structure prediction use probabilistic models, in which the model parameters are trained with reliable RNA secondary structures. Because of the difficulty of determining RNA secondary structures by experimental procedures, such as NMR or X-ray crystal structural analyses, there are still many RNA sequences that could be useful for training whose secondary structures have not been experimentally determined. In this paper, we introduce a novel semi-supervised learning approach for training parameters in a probabilistic model of RNA secondary structures in which we employ not only RNA sequences with annotated secondary structures but also ones with unknown secondary structures. Our model is based on a hybrid of generative (stochastic context-free grammars) and discriminative models (conditional random fields) that has been successfully applied to natural language processing. Computational experiments indicate that the accuracy of secondary structure prediction is improved by incorporating RNA sequences with unknown secondary structures into training. To our knowledge, this is the first study of a semi-supervised learning approach for RNA secondary structure prediction. This technique will be useful when the number of reliable structures is limited. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Target recognition and scene interpretation in image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-08-01

    Vision is only a part of a system that converts visual information into knowledge structures. These structures drive the vision process, resolving ambiguity and uncertainty via feedback, and provide image understanding, which is an interpretation of visual information in terms of these knowledge models. These mechanisms provide reliable recognition if the object is occluded or cannot be recognized as a whole. It is hard to split the entire system apart, and reliable solutions to the target recognition problems are possible only within the solution of a more generic Image Understanding Problem. The brain reduces informational and computational complexities, using implicit symbolic coding of features, hierarchical compression, and selective processing of visual information. Biologically inspired Network-Symbolic representation, where both systematic structural/logical methods and neural/statistical methods are parts of a single mechanism, is the most feasible for such models. It converts visual information into relational Network-Symbolic structures, avoiding artificial precise computations of 3-dimensional models. Network-Symbolic Transformations derive abstract structures, which allows for invariant recognition of an object as an exemplar of a class. Active vision helps create consistent models. Attention, separation of figure from ground and perceptual grouping are special kinds of network-symbolic transformations. Such Image/Video Understanding Systems will reliably recognize targets.

  11. Evaluation of 3D-Jury on CASP7 models.

    PubMed

    Kaján, László; Rychlewski, Leszek

    2007-08-21

    3D-Jury, the structure prediction consensus method publicly available in the Meta Server http://meta.bioinfo.pl/, was evaluated using models gathered in the 7th round of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7). 3D-Jury is an automated expert process that generates protein structure meta-predictions from sets of models obtained from partner servers. The performance of 3D-Jury was analysed for three aspects. First, we examined the correlation between the 3D-Jury score and a model quality measure: the number of correctly predicted residues. The 3D-Jury score was shown to correlate significantly with the number of correctly predicted residues; the correlation is good enough to be used for prediction. 3D-Jury was also found to improve upon the competing servers' choice of the best structure model in most cases. The value of the 3D-Jury score as a generic reliability measure was also examined. We found that the 3D-Jury score separates bad models from good models better than the reliability score of the original server in 27 cases and falls short of it in only 5 cases out of a total of 38. We report the release of a new Meta Server feature: instant 3D-Jury scoring of uploaded user models. The 3D-Jury score continues to be a good indicator of structural model quality. It also provides a generic reliability score, especially important for models that were not assigned one by the original server. Individual structure modellers can also benefit from the 3D-Jury scoring system by testing their models in the new instant scoring feature http://meta.bioinfo.pl/compare_your_model_example.pl available in the Meta Server.

  12. Why the Major Field Test in Business Does Not Report Subscores: Reliability and Construct Validity Evidence. Research Report. ETS RR-12-11

    ERIC Educational Resources Information Center

    Ling, Guangming

    2012-01-01

    To assess the value of individual students' subscores on the Major Field Test in Business (MFT Business), I examined the test's internal structure with factor analysis and structural equation model methods, and analyzed the subscore reliabilities using the augmented scores method. Analyses of the internal structure suggested that the MFT Business…

  13. Evaluation of 3D-Jury on CASP7 models

    PubMed Central

    Kaján, László; Rychlewski, Leszek

    2007-01-01

    Background 3D-Jury, the structure prediction consensus method publicly available in the Meta Server, was evaluated using models gathered in the 7th round of the Critical Assessment of Techniques for Protein Structure Prediction (CASP7). 3D-Jury is an automated expert process that generates protein structure meta-predictions from sets of models obtained from partner servers. Results The performance of 3D-Jury was analysed for three aspects. First, we examined the correlation between the 3D-Jury score and a model quality measure: the number of correctly predicted residues. The 3D-Jury score was shown to correlate significantly with the number of correctly predicted residues; the correlation is good enough to be used for prediction. 3D-Jury was also found to improve upon the competing servers' choice of the best structure model in most cases. The value of the 3D-Jury score as a generic reliability measure was also examined. We found that the 3D-Jury score separates bad models from good models better than the reliability score of the original server in 27 cases and falls short of it in only 5 cases out of a total of 38. We report the release of a new Meta Server feature: instant 3D-Jury scoring of uploaded user models. Conclusion The 3D-Jury score continues to be a good indicator of structural model quality. It also provides a generic reliability score, especially important for models that were not assigned one by the original server. Individual structure modellers can also benefit from the 3D-Jury scoring system by testing their models in the new instant scoring feature available in the Meta Server. PMID:17711571

  14. Functionalization of MEMS cantilever beams for interconnect reliability investigation: development practice

    NASA Astrophysics Data System (ADS)

    Bieniek, T.; Janczyk, G.; Dobrowolski, R.; Wojciechowska, K.; Malinowska, A.; Panas, A.; Nieprzecki, M.; Kłos, H.

    2016-11-01

    This paper covers research results on the development of cantilever beam test structures for interconnect reliability and robustness investigation. The presented results include the design, modelling, simulation, optimization and, finally, the fabrication stage, performed on 4-inch Si wafers using the ITE microfabrication facility. This paper also covers experimental results from the characterization of the test structures.

  15. Care 3 phase 2 report, maintenance manual

    NASA Technical Reports Server (NTRS)

    Bryant, L. A.; Stiffler, J. J.

    1982-01-01

    CARE 3 (Computer-Aided Reliability Estimation, version three) is a computer program designed to help estimate the reliability of complex, redundant systems. Although the program can model a wide variety of redundant structures, it was developed specifically for fault-tolerant avionics systems--systems distinguished by the need for extremely reliable performance since a system failure could well result in the loss of human life. It substantially generalizes the class of redundant configurations that could be accommodated, and includes a coverage model to determine the various coverage probabilities as a function of the applicable fault recovery mechanisms (detection delay, diagnostic scheduling interval, isolation and recovery delay, etc.). CARE 3 further generalizes the class of system structures that can be modeled and greatly expands the coverage model to take into account such effects as intermittent and transient faults, latent faults, error propagation, etc.
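
    To make the role of a coverage model concrete, the sketch below evaluates a textbook-style static approximation of an n-unit active-parallel system in which every fault must be covered (detected, isolated and recovered from) for the system to survive. This is a simplified stand-in under assumed unit reliability and coverage values, not the CARE 3 model itself.

```python
from math import comb

def parallel_reliability_with_coverage(n_units, unit_rel, coverage):
    """Mission reliability of an n-unit active-parallel system with imperfect coverage.

    Simplified static model: each unit fails independently with probability
    1 - unit_rel; every fault must be covered (probability `coverage`), otherwise
    the system fails immediately; the system also fails if all units fail.
    A textbook-style sketch, not the CARE 3 coverage model itself.
    """
    q = 1.0 - unit_rel
    # k = number of (covered) unit failures; the system survives as long as k < n_units
    return sum(comb(n_units, k) * (q * coverage) ** k * unit_rel ** (n_units - k)
               for k in range(n_units))

if __name__ == "__main__":
    for cov in (1.0, 0.999, 0.99, 0.9):
        rels = [round(parallel_reliability_with_coverage(n, 0.99, cov), 6) for n in (1, 2, 3, 4)]
        print(f"coverage = {cov}: reliability for 1-4 units = {rels}")
```

    Even with highly reliable units, the uncovered-fault term quickly caps the benefit of adding redundancy, which is why coverage modeling is central to tools of this kind.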

  16. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.
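
    The abstract does not give the acceleration function explicitly; a commonly used empirical form for BaTiO3-based MLCC life testing is the Prokopowicz-Vaskas voltage-temperature model sketched below. The exponent n, the activation energy Ea and the test/use conditions are illustrative assumptions, and the geometry-dependent empirical function (N, d, grain size, A) is not modeled here.

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def acceleration_factor(v_test, v_use, t_test_k, t_use_k, n=3.0, ea_ev=1.3):
    """Voltage-temperature acceleration factor in the Prokopowicz-Vaskas form:
    AF = (V_test / V_use)^n * exp(Ea/k_B * (1/T_use - 1/T_test)).

    n and Ea are illustrative values for BaTiO3-based MLCCs, not the ones
    identified in the paper.
    """
    voltage_term = (v_test / v_use) ** n
    thermal_term = math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_use_k - 1.0 / t_test_k))
    return voltage_term * thermal_term

# Accelerated life test at twice rated voltage and 125 C vs. use at rated voltage and 55 C.
af = acceleration_factor(v_test=100.0, v_use=50.0, t_test_k=398.15, t_use_k=328.15)
print(f"acceleration factor ~ {af:.0f} (use-condition hours per test hour)")
```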

  17. A General Reliability Model for Ni-BaTiO3-Based Multilayer Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with Ni electrode and BaTiO3 dielectric material for potential space project applications requires an in-depth understanding of their reliability. A general reliability model for Ni-BaTiO3 MLCCs is developed and discussed in this paper. The model consists of three parts: a statistical distribution; an acceleration function that describes how a capacitor's reliability life responds to external stresses; and an empirical function that defines the contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size A. Application examples are also discussed based on the proposed reliability model for Ni-BaTiO3 MLCCs.

  18. A New Reliability Analysis Model of the Chegongzhuang Heat-Supplying Tunnel Structure Considering the Coupling of Pipeline Thrust and Thermal Effect

    PubMed Central

    Zhang, Jiawen; He, Shaohui; Wang, Dahai; Liu, Yangpeng; Yao, Wenbo; Liu, Xiabing

    2018-01-01

    Based on the operating Chegongzhuang heat-supplying tunnel in Beijing, the reliability of its lining structure under the action of large pipeline thrust and thermal effect is studied. According to the characteristics of heat-supplying tunnel service, a three-dimensional numerical analysis model was established based on mechanical tests on in-situ specimens. The stress and strain of the tunnel structure were obtained before and after the operation, and the rationality of the model was verified by comparison with field monitoring data. After extracting the internal forces of the lining structure to define the performance function, an improved subset simulation method was proposed to calculate the reliability of the main control section of the tunnel. In contrast to the traditional calculation method, the analytic relationship between the number of samples required by the subset simulation method and by the Monte Carlo method was given. The results indicate that the lining structure is greatly influenced by the coupling within six meters of the fixed brackets, especially at the tunnel floor. The improved subset simulation method can greatly save computation time and improve computational efficiency while ensuring the accuracy of the calculation. It is suitable for reliability calculations in tunnel engineering, because "the lower the probability, the more efficient the calculation." PMID:29401691
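
    For readers unfamiliar with subset simulation, the sketch below shows the basic algorithm (intermediate failure thresholds, a fixed conditional probability p0, and a component-wise modified-Metropolis sampler) on a toy two-variable limit state in standard normal space. It is a generic implementation under assumed settings, not the improved method or the tunnel-lining performance function of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(u):
    """Toy performance function in standard normal space (failure when g <= 0);
    a placeholder, not the tunnel-lining limit state of the paper."""
    return 3.5 - (u[..., 0] + u[..., 1]) / np.sqrt(2.0)

def subset_simulation(limit_state, dim, n_per_level=1000, p0=0.1, max_levels=10, spread=1.0):
    """Basic subset simulation with a component-wise (modified Metropolis) sampler."""
    n_seeds = int(p0 * n_per_level)
    chain_len = n_per_level // n_seeds
    u = rng.standard_normal((n_per_level, dim))
    pf = 1.0
    for _ in range(max_levels):
        gv = limit_state(u)
        order = np.argsort(gv)
        b = gv[order[n_seeds - 1]]                   # intermediate threshold (p0-quantile)
        if b <= 0.0:                                 # final level: count true failures
            return pf * np.mean(gv <= 0.0)
        pf *= p0
        seeds = u[order[:n_seeds]]
        samples = []                                 # grow a chain from each seed, conditional on g <= b
        for seed in seeds:
            cur = seed.copy()
            for _ in range(chain_len):
                cand = cur + spread * rng.standard_normal(dim)
                ratio = np.exp(0.5 * (cur**2 - cand**2))        # component-wise N(0,1) prior ratio
                reject = rng.random(dim) >= np.minimum(1.0, ratio)
                cand[reject] = cur[reject]
                if limit_state(cand) <= b:           # accept only if still inside the intermediate failure domain
                    cur = cand
                samples.append(cur.copy())
        u = np.array(samples)
    return pf * np.mean(limit_state(u) <= 0.0)

pf_ss = subset_simulation(g, dim=2)
print(f"subset simulation Pf ~ {pf_ss:.2e}  (exact value Phi(-3.5) ~ 2.3e-4)")
```

    With p0 = 0.1 and 1000 samples per level, the estimate is obtained with a few thousand limit-state evaluations, whereas crude Monte Carlo would need on the order of a million samples for a comparable coefficient of variation.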

  19. Structural Probability Concepts Adapted to Electrical Engineering

    NASA Technical Reports Server (NTRS)

    Steinberg, Eric P.; Chamis, Christos C.

    1994-01-01

    Through the use of equivalent variable analogies, the authors demonstrate how an electrical subsystem can be modeled by an equivalent structural subsystem. This allows the electrical subsystem to be probabilistically analyzed by using available structural reliability computer codes such as NESSUS. With the ability to analyze the electrical subsystem probabilistically, we can evaluate the reliability of systems that include both structural and electrical subsystems. Common examples of such systems are a structural subsystem integrated with a health-monitoring subsystem, and smart structures. Since these systems have electrical subsystems that directly affect the operation of the overall system, probabilistically analyzing them could lead to improved reliability and reduced costs. The direct effect of the electrical subsystem on the structural subsystem is of secondary order and is not considered in the scope of this work.

  20. The use of test structures for reliability prediction and process control of integrated circuits and photovoltaics

    NASA Astrophysics Data System (ADS)

    Trachtenberg, I.

    How a reliability model might be developed with new data from accelerated stress testing, failure mechanisms, process control monitoring, and test structure evaluations is illustrated. The effect of temperature acceleration on operating life is discussed, along with test structures that further accelerate the failure rate. Corrosion testing is addressed: the uncoated structure is encapsulated in a variety of mold compounds and subjected to pressure-cooker testing.

  1. Assessment of family functioning in Caucasian and Hispanic Americans: reliability, validity, and factor structure of the Family Assessment Device.

    PubMed

    Aarons, Gregory A; McDonald, Elizabeth J; Connelly, Cynthia D; Newton, Rae R

    2007-12-01

    The purpose of this study was to examine the factor structure, reliability, and validity of the Family Assessment Device (FAD) among a national sample of Caucasian and Hispanic American families receiving public sector mental health services. A confirmatory factor analysis conducted to test model fit yielded equivocal findings. With few exceptions, indices of model fit, reliability, and validity were poorer for Hispanic Americans compared with Caucasian Americans. Contrary to our expectation, an exploratory factor analysis did not result in a better fitting model of family functioning. Without stronger evidence supporting a reformulation of the FAD, we recommend against such a course of action. Findings highlight the need for additional research on the role of culture in measurement of family functioning.

  2. Probabilistic durability assessment of concrete structures in marine environments: Reliability and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Yu, Bo; Ning, Chao-lie; Li, Bing

    2017-03-01

    A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
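
    A minimal numerical counterpart of such a framework, assuming the standard error-function solution of Fick's second law with an ageing diffusion coefficient and illustrative (not the paper's) random variables, is sketched below using crude Monte Carlo in place of the first-order reliability method:

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(2)

def chloride_content(depth_m, t_years, c_surface, d_ref, age_factor, t_ref=28.0 / 365.0):
    """Chloride content at depth from the 1-D Fick solution, with an ageing
    diffusion coefficient D(t) = d_ref * (t_ref / t)^age_factor."""
    d_t = d_ref * (t_ref / t_years) ** age_factor
    return c_surface * (1.0 - erf(depth_m / (2.0 * np.sqrt(d_t * t_years))))

def failure_probability(t_years, n=200_000):
    """Crude Monte Carlo on the durability limit state g = C_crit - C(cover, t).
    Distributions and parameter values are illustrative, not those of the paper."""
    c_s   = rng.lognormal(np.log(3.0), 0.3, n)       # surface chloride content (% binder)
    d_ref = rng.lognormal(np.log(1.0e-4), 0.4, n)    # reference diffusion coefficient (m^2/year)
    cover = rng.normal(0.05, 0.008, n)               # concrete cover (m)
    age   = rng.normal(0.4, 0.08, n)                 # age factor
    c_cr  = rng.normal(0.6, 0.09, n)                 # critical chloride content (% binder)
    g = c_cr - chloride_content(cover, t_years, c_s, d_ref, age)
    return np.mean(g <= 0.0)

for t in (25, 50, 100):
    print(f"t = {t:3d} years: Pf ~ {failure_probability(t):.3e}")
```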

  3. Software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1992-01-01

    Accomplishments in the following research areas are summarized: structure based testing, reliability growth, and design testability with risk evaluation; reliability growth models and software risk management; and evaluation of consensus voting, consensus recovery block, and acceptance voting. Four papers generated during the reporting period are included as appendices.

  4. Coefficient Alpha: A Reliability Coefficient for the 21st Century?

    ERIC Educational Resources Information Center

    Yang, Yanyun; Green, Samuel B.

    2011-01-01

    Coefficient alpha is almost universally applied to assess reliability of scales in psychology. We argue that researchers should consider alternatives to coefficient alpha. Our preference is for structural equation modeling (SEM) estimates of reliability because they are informative and allow for an empirical evaluation of the assumptions…

  5. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  6. Digital Avionics Information System (DAIS): Reliability and Maintainability Model Users Guide. Final Report, May 1975-July 1977.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    This report provides a complete guide to the stand-alone mode operation of the reliability and maintainability (R&M) model, which was developed to facilitate the performance of design versus cost trade-offs within the digital avionics information system (DAIS) acquisition process. The features and structure of the model, its input data…

  7. @NWTC Newsletter: Summer 2015 | Wind | NREL

    Science.gov Websites

    Newsletter items include: Structural Testing of the Blade Reliability Collaborative Effect of Defect Wind Turbine Blades; Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Report; Modeling Dynamic Stall on Wind Turbine Blades Under…

  8. Development of reliable pavement models.

    DOT National Transportation Integrated Search

    2011-05-01

    The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit response surface, in plac...

  9. Reliability based fatigue design and maintenance procedures

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1977-01-01

    A stochastic model has been developed to describe the probability of the fatigue process by assuming a varying hazard rate. This stochastic model can be used to obtain the desired probability of a crack of a certain length at a given location after a certain number of cycles or time. Quantitative estimation of the developed model is also discussed. Application of the model to develop a procedure for reliability-based, cost-effective, fail-safe structural design is presented. This design procedure includes the reliability improvement due to inspection and repair. Methods of obtaining optimum inspection and maintenance schemes are treated.

  10. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    PubMed

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
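
    The core BMA-CC idea, weighting per-model violation probabilities by posterior model weights and checking them against a chance constraint, can be written in a few lines; the weights and probabilities below are purely illustrative:

```python
import numpy as np

def bma_violation_probability(model_weights, model_pf):
    """BMA-averaged probability that the design constraint is violated,
    weighting each structure model's estimate by its posterior weight."""
    w = np.asarray(model_weights, dtype=float)
    return float(np.dot(w / w.sum(), np.asarray(model_pf, dtype=float)))

def meets_chance_constraint(model_weights, model_pf, reliability_target=0.90):
    """Chance constraint: averaged violation probability must not exceed 1 - target."""
    return bma_violation_probability(model_weights, model_pf) <= 1.0 - reliability_target

# Three hypothetical hydrostratigraphic models with posterior weights, and per-model
# probabilities that the salinity constraint is violated for one candidate design.
weights = [0.5, 0.3, 0.2]
pf_per_model = [0.02, 0.15, 0.40]
print("BMA violation probability:", bma_violation_probability(weights, pf_per_model))
print("acceptable at 90% reliability?", meets_chance_constraint(weights, pf_per_model))
```

    In this toy example the most probable model alone (violation probability 0.02) would pass the 90% reliability check, while the model-averaged probability (0.135) does not, mirroring the paper's finding that single-model chance-constrained programming can overestimate design reliability.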

  11. A reliability design method for a lithium-ion battery pack considering the thermal disequilibrium in electric vehicles

    NASA Astrophysics Data System (ADS)

    Xia, Quan; Wang, Zili; Ren, Yi; Sun, Bo; Yang, Dezhen; Feng, Qiang

    2018-05-01

    With the rapid development of lithium-ion battery technology in the electric vehicle (EV) industry, the lifetime of the battery cell has increased substantially; however, the reliability of the battery pack is still inadequate. Because of the complexity of the battery pack, a reliability design method for a lithium-ion battery pack considering thermal disequilibrium is proposed in this paper based on cell redundancy. Based on this method, a three-dimensional electric-thermal-flow-coupled model, a stochastic degradation model of cells under field dynamic conditions and a multi-state system reliability model of a battery pack are established. The relationships between the multi-physics coupling model, the degradation model and the system reliability model are first constructed to analyze the reliability of the battery pack, followed by analysis examples with different redundancy strategies. By comparing the reliability of battery packs with different numbers and configurations of redundant cells, several conclusions for the redundancy strategy are obtained. Most notably, the reliability does not monotonically increase with the number of redundant cells because of thermal disequilibrium effects. In this work, a 6 × 5 parallel-series configuration is found to be the optimal system structure. In addition, the effects of the cell arrangement and cooling conditions are investigated.
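
    Ignoring the thermal coupling that is the focus of the paper, the independence-based reliability of a series arrangement of k-out-of-n parallel cell groups can be computed as below; the cell reliability and capacity requirement are assumptions, so the numbers only show the monotone trend that the paper's coupled model corrects.

```python
from math import comb

def group_reliability(r_cell, n_cells, k_required):
    """A parallel group works if at least k_required of its n_cells still work."""
    return sum(comb(n_cells, j) * r_cell**j * (1.0 - r_cell)**(n_cells - j)
               for j in range(k_required, n_cells + 1))

def pack_reliability(r_cell, n_series_groups, n_cells_per_group, k_required):
    """n_series_groups of k-out-of-n parallel groups connected in series.
    Cells are assumed independent and identical; the thermal-disequilibrium
    coupling studied in the paper, which can make added redundancy
    counterproductive, is not modelled here."""
    return group_reliability(r_cell, n_cells_per_group, k_required) ** n_series_groups

r_cell = 0.95  # hypothetical single-cell reliability over the design life
for n_cells in (5, 6, 7, 8):   # cells per parallel group; 5 are needed for rated capacity
    r_pack = pack_reliability(r_cell, n_series_groups=5, n_cells_per_group=n_cells, k_required=5)
    print(f"{n_cells} cells/group x 5 series groups: pack reliability = {r_pack:.5f}")
```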

  12. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via the exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on the development of a language for the specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.

  13. Assessing the applicability of template-based protein docking in the twilight zone.

    PubMed

    Negroni, Jacopo; Mosca, Roberto; Aloy, Patrick

    2014-09-02

    The structural modeling of protein interactions in the absence of close homologous templates is a challenging task. Recently, template-based docking methods have emerged to exploit local structural similarities to help ab-initio protocols provide reliable 3D models for protein interactions. In this work, we critically assess the performance of template-based docking in the twilight zone. Our results show that, while it is possible to find templates for nearly all known interactions, the quality of the obtained models is rather limited. We can increase the precision of the models at the expense of coverage, but this drastically reduces the potential applicability of the method, as illustrated by the whole-interactome modeling of nine organisms. Template-based docking is likely to play an important role in the structural characterization of the interaction space, but we still need to improve the repertoire of structural templates onto which we can reliably model protein complexes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the cumulation of work performed through the use of the probabilistic finite element program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in distribution from Gaussian to Weibull for the centrifugal load. The sensitivity factors determined to be most dominant were the centrifugal loading and the initial strength of the material. These two sensitivity factors were influenced most by a change in distribution type from Gaussian to Weibull. The education portion of this report describes short-term and long-term educational objectives. Such objectives serve to integrate the research and education components of this project, resulting in opportunities for ethnic minority students, principally Hispanic. The primary vehicle to facilitate such integration was the teaching of two probabilistic finite element method courses to undergraduate engineering students in the summers of 1998 and 1999.

  15. Locally adaptive MR intensity models and MRF-based segmentation of multiple sclerosis lesions

    NASA Astrophysics Data System (ADS)

    Galimzianova, Alfiia; Lesjak, Žiga; Likar, Boštjan; Pernuš, Franjo; Špiclin, Žiga

    2015-03-01

    Neuroimaging biomarkers are an important paraclinical tool used to characterize a number of neurological diseases; however, their extraction requires accurate and reliable segmentation of normal and pathological brain structures. For MR images of healthy brains, intensity models of normal-appearing brain tissue (NABT) in combination with Markov random field (MRF) models are known to give reliable and smooth NABT segmentation. However, the presence of pathology, MR intensity bias and natural tissue-dependent intensity variability together represent difficult challenges for reliable estimation of the NABT intensity model from MR images. In this paper, we propose a novel method for segmentation of normal and pathological structures in brain MR images of multiple sclerosis (MS) patients that is based on a locally adaptive NABT model, a robust method for the estimation of model parameters and an MRF-based segmentation framework. Experiments on multi-sequence brain MR images of 27 MS patients show that, compared to a whole-brain model and to the widely used Expectation-Maximization Segmentation (EMS) method, the locally adaptive NABT model increases the accuracy of MS lesion segmentation.

  16. Study on safety level of RC beam bridges under earthquake

    NASA Astrophysics Data System (ADS)

    Zhao, Jun; Lin, Junqi; Liu, Jinlong; Li, Jia

    2017-08-01

    Based on reliability theory, this study considers uncertainties in material strengths and in modeling, which have important effects on structural resistance. After analyzing the failure mechanism of an RC bridge, the structural performance functions and reliability were formulated, and the safety level against earthquakes of the piers of a reinforced concrete continuous girder bridge with stochastic structural parameters was analyzed. Using the response surface method to calculate the failure probabilities of the bridge piers under a high-level earthquake, their seismic reliability for different damage states within the design reference period was calculated applying a two-stage design, which describes, to some extent, the seismic safety level of the built bridges.

  17. Reliability prediction of large fuel cell stack based on structure stress analysis

    NASA Astrophysics Data System (ADS)

    Liu, L. F.; Liu, B.; Wu, C. W.

    2017-09-01

    The aim of this paper is to improve the reliability of a proton exchange membrane fuel cell (PEMFC) stack by designing the clamping force and the thickness difference between the membrane electrode assembly (MEA) and the gasket. The stack reliability is directly determined by the component reliability, which is affected by the material properties and the contact stress. The component contact stress is a random variable because it is usually affected by many uncertain factors in the production and clamping processes. We investigated the influence of the parameter variation coefficient on the probability distribution of the contact stress using an equivalent stiffness model and the first-order second-moment method. The optimal contact stress that keeps the component at the highest reliability level is obtained by the stress-strength interference model. To obtain the optimal contact stress between the contacting components, the optimal thickness of the component and the stack clamping force are designed. Finally, a detailed description is given of how to design the MEA and gasket dimensions to obtain the highest stack reliability. This work can provide valuable guidance in the design of the stack structure for a high reliability of the fuel cell stack.
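
    The stress-strength interference step for normally distributed contact stress and component strength reduces to a closed form, sketched here with illustrative values rather than the parameters identified in the paper:

```python
from math import sqrt
from statistics import NormalDist

def stress_strength_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """Stress-strength interference for independent normal variables:
    R = P(strength > stress) = Phi((mu_S - mu_L) / sqrt(sd_S^2 + sd_L^2))."""
    beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return NormalDist().cdf(beta)

# Hypothetical MEA contact-stress design: clamping force and thickness tolerances set
# the stress mean and scatter; all values in MPa and purely illustrative.
for sd_stress in (0.05, 0.10, 0.15):
    rel = stress_strength_reliability(mu_strength=1.6, sd_strength=0.12,
                                      mu_stress=1.2, sd_stress=sd_stress)
    print(f"contact-stress s.d. = {sd_stress:.2f} MPa -> component reliability = {rel:.4f}")
```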

  18. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    NASA Astrophysics Data System (ADS)

    Gromek, Katherine Emily

    A novel computational and inference framework of physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that the system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in the modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements make the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  19. Evaluating shallow-flow rock structures as scour countermeasures at bridges.

    DOT National Transportation Integrated Search

    2009-12-01

    A study to determine whether or not shallow-flow rock structures could reliably be used at bridge abutments in place of riprap. Research was conducted in a two-phase effort beginning with numerical modeling and ending with field verification of model...

  20. Model-Based Heterogeneous Data Fusion for Reliable Force Estimation in Dynamic Structures under Uncertainties

    PubMed Central

    Khodabandeloo, Babak; Melvin, Dyan; Jo, Hongki

    2017-01-01

    Direct measurements of external forces acting on a structure are infeasible in many cases. The Augmented Kalman Filter (AKF) has several attractive features that can be utilized to solve the inverse problem of identifying applied forces, as it requires the dynamic model and the measured responses of structure at only a few locations. But, the AKF intrinsically suffers from numerical instabilities when accelerations, which are the most common response measurements in structural dynamics, are the only measured responses. Although displacement measurements can be used to overcome the instability issue, the absolute displacement measurements are challenging and expensive for full-scale dynamic structures. In this paper, a reliable model-based data fusion approach to reconstruct dynamic forces applied to structures using heterogeneous structural measurements (i.e., strains and accelerations) in combination with AKF is investigated. The way of incorporating multi-sensor measurements in the AKF is formulated. Then the formulation is implemented and validated through numerical examples considering possible uncertainties in numerical modeling and sensor measurement. A planar truss example was chosen to clearly explain the formulation, while the method and formulation are applicable to other structures as well. PMID:29149088
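
    A minimal version of the idea, an augmented state whose force component follows a random walk and is updated from heterogeneous (displacement-like and acceleration) measurements, is sketched below for a single-degree-of-freedom system. The structural parameters, noise levels and forward-Euler discretisation are assumptions for illustration, not the paper's formulation or its truss example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical single-DOF structure: m*x'' + c*x' + k*x = f(t)
m, c, k = 2.0, 8.0, 500.0
dt, n_steps = 0.002, 4000

# Augmented state z = [displacement, velocity, force]; the unknown force is modelled
# as a random walk, which is the usual device behind the Augmented Kalman Filter.
A = np.array([[1.0,          dt,               0.0],
              [-k / m * dt,  1.0 - c / m * dt, dt / m],
              [0.0,          0.0,              1.0]])    # forward-Euler discretisation
H = np.array([[1.0,     0.0,    0.0],                    # displacement (stand-in for a strain-type measurement)
              [-k / m, -c / m,  1.0 / m]])               # acceleration
Q = np.diag([1e-12, 1e-12, 1e-1])                        # process noise, large on the force state
R = np.diag([1e-10, 1e-4])                               # measurement noise covariance

t = np.arange(n_steps) * dt
f_true = 20.0 * np.sin(2.0 * np.pi * 1.5 * t)            # force to be reconstructed

z_true = np.zeros(3)
z_est, P = np.zeros(3), np.eye(3)
f_est = np.zeros(n_steps)
for i in range(n_steps):
    # simulate the "true" response and noisy measurements
    z_true = A @ z_true
    z_true[2] = f_true[i]
    y = H @ z_true + rng.normal(0.0, np.sqrt(np.diag(R)))
    # AKF prediction and update
    z_est = A @ z_est
    P = A @ P @ A.T + Q
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    z_est = z_est + K @ (y - H @ z_est)
    P = (np.eye(3) - K @ H) @ P
    f_est[i] = z_est[2]

rms = np.sqrt(np.mean((f_est[500:] - f_true[500:]) ** 2))
print(f"RMS force-reconstruction error after burn-in: {rms:.2f} N (peak force 20 N)")
```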

  1. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  2. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  3. Attention-deficit/hyperactivity disorder dimensionality: the reliable 'g' and the elusive 's' dimensions.

    PubMed

    Wagner, Flávia; Martel, Michelle M; Cogo-Moreira, Hugo; Maia, Carlos Renato Moreira; Pan, Pedro Mario; Rohde, Luis Augusto; Salum, Giovanni Abrahão

    2016-01-01

    The best structural model for attention-deficit/hyperactivity disorder (ADHD) symptoms remains a matter of debate. The objective of this study is to test the fit and factor reliability of competing models of the dimensional structure of ADHD symptoms in a sample of randomly selected and high-risk children and pre-adolescents from Brazil. Our sample comprised 2512 children aged 6-12 years from 57 schools in Brazil. The ADHD symptoms were assessed using parent report on the development and well-being assessment (DAWBA). Fit indexes from confirmatory factor analysis were used to test unidimensional, correlated, and bifactor models of ADHD, the latter including "g" ADHD and "s" symptom domain factors. Reliability of all models was measured with omega coefficients. A bifactor model with one general factor and three specific factors (inattention, hyperactivity, impulsivity) exhibited the best fit to the data, according to fit indices, as well as the most consistent factor loadings. However, based on omega reliability statistics, the specific inattention, hyperactivity, and impulsivity dimensions provided very little reliable information after accounting for the reliable general ADHD factor. Our study presents some psychometric evidence that ADHD specific ("s") factors might be unreliable after taking common ("g" factor) variance into account. These results are in accordance with the lack of longitudinal stability among subtypes, the absence of dimension-specific molecular genetic findings and non-specific effects of treatment strategies. Therefore, researchers and clinicians might most effectively rely on the "g" ADHD to characterize ADHD dimensional phenotype, based on currently available symptom items.
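
    The omega logic behind that conclusion can be reproduced with a few lines of algebra on standardized bifactor loadings. The loadings below are invented for illustration (they are not the DAWBA estimates) but show how a subscale can look reliable overall while its specific factor carries almost no unique reliable variance.

```python
import numpy as np

def omega_subscale(gen_loadings, spec_loadings, residual_vars):
    """Total omega for a subscale score under a bifactor model: the share of
    subscale-score variance explained by the general plus specific factors."""
    var_sub = (np.sum(gen_loadings) ** 2 + np.sum(spec_loadings) ** 2
               + np.sum(residual_vars))
    return (np.sum(gen_loadings) ** 2 + np.sum(spec_loadings) ** 2) / var_sub

def omega_hs(gen_loadings, spec_loadings, residual_vars):
    """Omega-hierarchical-subscale: reliable subscale variance that is unique to
    the specific factor once the general factor is partialled out."""
    var_sub = (np.sum(gen_loadings) ** 2 + np.sum(spec_loadings) ** 2
               + np.sum(residual_vars))
    return np.sum(spec_loadings) ** 2 / var_sub

# Invented standardized loadings for a 6-item symptom domain: strong general-factor
# loadings, weak specific-factor loadings (not the actual DAWBA estimates).
gen = np.array([0.70, 0.65, 0.72, 0.60, 0.68, 0.66])
spec = np.array([0.25, 0.20, 0.30, 0.15, 0.22, 0.18])
resid = 1.0 - gen**2 - spec**2
print("subscale omega (general + specific):", round(omega_subscale(gen, spec, resid), 3))
print("subscale omegaHS (specific only):   ", round(omega_hs(gen, spec, resid), 3))
```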

  4. Predicting laser weld reliability with stochastic reduced-order models

    DOE PAGES

    Emery, John M.; Field, Richard V.; Foulk, James W.; ...

    2015-05-26

    Laser welds are prevalent in complex engineering systems and they frequently govern failure. The weld process often results in partial penetration of the base metals, leaving sharp crack-like features with a high degree of variability in the geometry and material properties of the welded structure. Furthermore, accurate finite element predictions of the structural reliability of components containing laser welds require the analysis of a large number of finite element meshes with very fine spatial resolution, where each mesh has different geometry and/or material properties in the welded region to address variability. We found that traditional modeling approaches could not be efficiently employed. Consequently, a surrogate model based on stochastic reduced-order models is proposed to represent the laser welds within the component. Here, the uncertainty in weld microstructure and geometry is captured by calibrating plasticity parameters to experimental observations of necking as, because of the ductility of the welds, necking – and thus peak load – plays the pivotal role in structural failure. The proposed method is exercised for a simplified verification problem and compared with traditional Monte Carlo simulation with rather remarkable results.

  5. Evaluation of a model of violence risk assessment among forensic psychiatric patients.

    PubMed

    Douglas, Kevin S; Ogloff, James R P; Hart, Stephen D

    2003-10-01

    This study tested the interrater reliability and criterion-related validity of structured violence risk judgments made by using one application of the structured professional judgment model of violence risk assessment, the HCR-20 violence risk assessment scheme, which assesses 20 key risk factors in three domains: historical, clinical, and risk management. The HCR-20 was completed for a sample of 100 forensic psychiatric patients who had been found not guilty by reason of a mental disorder and were subsequently released to the community. Violence in the community was determined from multiple file-based sources. Interrater reliability of structured final risk judgments of low, moderate, or high violence risk made on the basis of the structured professional judgment model was acceptable (weighted kappa=.61). Structured final risk judgments were significantly predictive of postrelease community violence, yielding moderate to large effect sizes. Event history analyses showed that final risk judgments made with the structured professional judgment model added incremental validity to the HCR-20 used in an actuarial (numerical) sense. The findings support the structured professional judgment model of risk assessment as well as the HCR-20 specifically and suggest that clinical judgment, if made within a structured context, can contribute in meaningful ways to the assessment of violence risk.

  6. Sustainability of transport structures - some aspects of the nonlinear reliability assessment

    NASA Astrophysics Data System (ADS)

    Pukl, Radomír; Sajdlová, Tereza; Strauss, Alfred; Lehký, David; Novák, Drahomír

    2017-09-01

    Efficient techniques for both nonlinear numerical analysis of concrete structures and advanced stochastic simulation methods have been combined in order to offer an advanced tool for realistic assessment of the behaviour, failure and safety of transport structures. The utilized approach is based on randomization of the nonlinear finite element analysis of the structural models. Degradation aspects such as carbonation of concrete can be accounted for in order to predict the durability of the investigated structure and its sustainability. Results can serve as a rational basis for the performance and sustainability assessment based on advanced nonlinear computer analysis of the structures of transport infrastructure such as bridges or tunnels. In the stochastic simulation, the input material parameters obtained from material tests, including their randomness and uncertainty, are represented as random variables or fields. Appropriate identification of material parameters is crucial for the virtual failure modelling of structures and structural elements. Inverse analysis using artificial neural networks and a virtual stochastic simulation approach is applied to determine the fracture mechanical parameters of the structural material and its numerical model. Structural response, reliability and sustainability have been investigated on different types of transport structures made from various materials using the above-mentioned methodology and tools.

  7. Ceramic component reliability with the restructured NASA/CARES computer program

    NASA Technical Reports Server (NTRS)

    Powers, Lynn M.; Starlinger, Alois; Gyekenyesi, John P.

    1992-01-01

    The Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design program for statistical fast-fracture reliability of monolithic ceramic components is enhanced to include the use of a neutral data base, two-dimensional modeling, and variable problem size. The data base allows for the efficient transfer of element stresses, temperatures, and volumes/areas from the finite element output to the reliability analysis program. Elements are divided to ensure a direct correspondence between the subelements and the Gaussian integration points. Two-dimensional modeling is accomplished by assessing the volume-flaw reliability with shell elements. To demonstrate the improvements in the algorithm, example problems are selected from a round-robin conducted by WELFEP (WEakest Link failure probability prediction by Finite Element Postprocessors).
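
    The volume-flaw, weakest-link calculation that CARES-type postprocessors perform at element integration points follows the two-parameter Weibull form sketched below; the stresses, volumes and Weibull parameters are illustrative assumptions, not values from the round-robin problems.

```python
import numpy as np

def weakest_link_pof(stresses_mpa, volumes, sigma_0, m):
    """Two-parameter Weibull, volume-flaw weakest-link failure probability:
    Pf = 1 - exp(-sum_e V_e * (sigma_e / sigma_0)^m), summed over element
    integration points; only tensile stresses contribute."""
    s = np.clip(np.asarray(stresses_mpa, dtype=float), 0.0, None)   # ignore compressive points
    risk = np.sum(np.asarray(volumes, dtype=float) * (s / sigma_0) ** m)
    return 1.0 - np.exp(-risk)

# Hypothetical principal stresses (MPa) and associated volumes (mm^3) at the Gaussian
# points of a small ceramic component, with illustrative Weibull parameters
# (sigma_0 absorbs the unit-volume normalization).
stresses = [180.0, 220.0, 150.0, 90.0, -40.0, 260.0]
volumes = [2.0, 2.0, 2.5, 3.0, 3.0, 1.5]
print("Pf =", round(weakest_link_pof(stresses, volumes, sigma_0=350.0, m=10.0), 4))
```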

  8. Uncertainty and Intelligence in Computational Stochastic Mechanics

    NASA Technical Reports Server (NTRS)

    Ayyub, Bilal M.

    1996-01-01

    Classical structural reliability assessment techniques are based on precise and crisp (sharp) definitions of failure and non-failure (survival) of a structure in meeting a set of strength, function and serviceability criteria. These definitions are provided in the form of performance functions and limit state equations. Thus, the criteria provide a dichotomous definition of what real physical situations represent, in the form of abrupt change from structural survival to failure. However, based on observing the failure and survival of real structures according to the serviceability and strength criteria, the transition from a survival state to a failure state and from serviceability criteria to strength criteria are continuous and gradual rather than crisp and abrupt. That is, an entire spectrum of damage or failure levels (grades) is observed during the transition to total collapse. In the process, serviceability criteria are gradually violated with monotonically increasing level of violation, and progressively lead into the strength criteria violation. Classical structural reliability methods correctly and adequately include the ambiguity sources of uncertainty (physical randomness, statistical and modeling uncertainty) by varying amounts. However, they are unable to adequately incorporate the presence of a damage spectrum, and do not consider in their mathematical framework any sources of uncertainty of the vagueness type. Vagueness can be attributed to sources of fuzziness, unclearness, indistinctiveness, sharplessness and grayness; whereas ambiguity can be attributed to nonspecificity, one-to-many relations, variety, generality, diversity and divergence. Using the nomenclature of structural reliability, vagueness and ambiguity can be accounted for in the form of realistic delineation of structural damage based on subjective judgment of engineers. For situations that require decisions under uncertainty with cost/benefit objectives, the risk of failure should depend on the underlying level of damage and the uncertainties associated with its definition. A mathematical model for structural reliability assessment that includes both ambiguity and vagueness types of uncertainty was suggested to result in the likelihood of failure over a damage spectrum. The resulting structural reliability estimates properly represent the continuous transition from serviceability to strength limit states over the ultimate time exposure of the structure. In this section, a structural reliability assessment method based on a fuzzy definition of failure is suggested to meet these practical needs. A failure definition can be developed to indicate the relationship between failure level and structural response. In this fuzzy model, a subjective index is introduced to represent all levels of damage (or failure). This index can be interpreted as either a measure of failure level or a measure of a degree of belief in the occurrence of some performance condition (e.g., failure). The index allows expressing the transition state between complete survival and complete failure for some structural response based on subjective evaluation and judgment.

  9. Structural Test Laboratory | Water Power | NREL

    Science.gov Websites

    Structural Test Laboratory: NREL engineers design and configure structural tests; testing of structural components can validate models, demonstrate system reliability, inform design margins, and assess properties, including mass and center of gravity, to ensure compliance with design goals. Dynamic Characterization: …

  10. Factor Structure, Reliability and Measurement Invariance of the Alberta Context Tool and the Conceptual Research Utilization Scale, for German Residential Long Term Care

    PubMed Central

    Hoben, Matthias; Estabrooks, Carole A.; Squires, Janet E.; Behrens, Johann

    2016-01-01

    We translated the Canadian residential long term care versions of the Alberta Context Tool (ACT) and the Conceptual Research Utilization (CRU) Scale into German, to study the association between organizational context factors and research utilization in German nursing homes. The rigorous translation process was based on best practice guidelines for tool translation, and we previously published methods and results of this process in two papers. Both instruments are self-report questionnaires used with care providers working in nursing homes. The aim of this study was to assess the factor structure, reliability, and measurement invariance (MI) between care provider groups responding to these instruments. In a stratified random sample of 38 nursing homes in one German region (Metropolregion Rhein-Neckar), we collected questionnaires from 273 care aides, 196 regulated nurses, 152 allied health providers, 6 quality improvement specialists, 129 clinical leaders, and 65 nursing students. The factor structure was assessed using confirmatory factor models. The first model included all 10 ACT concepts. We also decided a priori to run two separate models for the scale-based and the count-based ACT concepts as suggested by the instrument developers. The fourth model included the five CRU Scale items. Reliability scores were calculated based on the parameters of the best-fitting factor models. Multiple-group confirmatory factor models were used to assess MI between provider groups. Rather than the hypothesized ten-factor structure of the ACT, confirmatory factor models suggested 13 factors. The one-factor solution of the CRU Scale was confirmed. The reliability was acceptable (>0.7 in the entire sample and in all provider groups) for 10 of 13 ACT concepts, and high (0.90–0.96) for the CRU Scale. We could demonstrate partial strong MI for both ACT models and partial strict MI for the CRU Scale. Our results suggest that the scores of the German ACT and the CRU Scale for nursing homes are acceptably reliable and valid. However, as the ACT lacked strict MI, observed variables (or scale scores based on them) cannot be compared between provider groups. Rather, group comparisons should be based on latent variable models, which consider the different residual variances of each group. PMID:27656156

  11. Protein Models Docking Benchmark 2

    PubMed Central

    Anishchenko, Ivan; Kundrotas, Petras J.; Tuzikov, Alexander V.; Vakser, Ilya A.

    2015-01-01

    Structural characterization of protein-protein interactions is essential for our ability to understand life processes. However, only a fraction of known proteins have experimentally determined structures. Such structures provide templates for modeling of a large part of the proteome, where individual proteins can be docked by template-free or template-based techniques. Still, the sensitivity of the docking methods to the inherent inaccuracies of protein models, as opposed to the experimentally determined high-resolution structures, remains largely untested, primarily due to the absence of appropriate benchmark set(s). Structures in such a set should have pre-defined inaccuracy levels and, at the same time, resemble actual protein models in terms of structural motifs/packing. The set should also be large enough to ensure statistical reliability of the benchmarking results. We present a major update of the previously developed benchmark set of protein models. For each interactor, six models were generated with the model-to-native Cα RMSD in the 1 to 6 Å range. The models in the set were generated by a new approach, which corresponds to the actual modeling of new protein structures in the “real case scenario,” as opposed to the previous set, where a significant number of structures were model-like only. In addition, the larger number of complexes (165 vs. 63 in the previous set) increases the statistical reliability of the benchmarking. We estimated the highest accuracy of the predicted complexes (according to CAPRI criteria), which can be attained using the benchmark structures. The set is available at http://dockground.bioinformatics.ku.edu. PMID:25712716

  12. Probabilistic design of fibre concrete structures

    NASA Astrophysics Data System (ADS)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties, such as the shape of the tensile softening branch, high toughness and ductility, are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. The uncertainty or randomness of the material properties obtained from material tests is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete, corrosion of reinforcement, etc., can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented methodology is illustrated on results from two probabilistic studies with different types of concrete structures related to practical applications and made from various materials (with the parameters obtained from real material tests).

  13. Towards automatic Markov reliability modeling of computer architectures

    NASA Technical Reports Server (NTRS)

    Liceaga, C. A.; Siewiorek, D. P.

    1986-01-01

    The analysis and evaluation of reliability measures using time-varying Markov models is required for Processor-Memory-Switch (PMS) structures that have competing processes such as standby redundancy and repair, or renewal processes such as transient or intermittent faults. The task of generating these models is tedious and prone to human error due to the large number of states and transitions involved in any reasonable system. Therefore, model formulation is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model formulation. This paper presents an overview of the Automated Reliability Modeling (ARM) program, under development at NASA Langley Research Center. ARM will accept as input a description of the PMS interconnection graph, the behavior of the PMS components, the fault-tolerant strategies, and the operational requirements. The output of ARM will be the reliability or availability Markov model formulated for direct use by evaluation programs. The advantages of such an approach are (a) utility to a large class of users, not necessarily expert in reliability analysis, and (b) a lower probability of human error in the computation.
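
    For readers unfamiliar with the Markov models ARM is intended to generate, the sketch below evaluates a hypothetical two-unit standby system by integrating the forward Chapman-Kolmogorov equations for its transition-rate matrix; the states and rates are illustrative and are not ARM output.

```python
import numpy as np
from scipy.integrate import solve_ivp

# States: 0 = both units good, 1 = one failed (spare active, repair under way),
#         2 = system failed (absorbing).
lam, mu = 1e-3, 1e-1     # per-hour failure and repair rates (illustrative)

# Generator (transition-rate) matrix Q, rows = "from" state.
Q = np.array([[-lam,         lam,  0.0],
              [  mu, -(mu + lam),  lam],
              [ 0.0,         0.0,  0.0]])

def kolmogorov(t, p):
    # dp/dt = p Q  (forward Chapman-Kolmogorov equations)
    return p @ Q

p0 = np.array([1.0, 0.0, 0.0])     # start with both units good
t_end = 10_000.0                   # mission time in hours
sol = solve_ivp(kolmogorov, (0.0, t_end), p0, rtol=1e-8)

p = sol.y[:, -1]
print(f"Reliability R({t_end:.0f} h) = {1.0 - p[2]:.6f}")
```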

  14. Efficient finite element modelling for the investigation of the dynamic behaviour of a structure with bolted joints

    NASA Astrophysics Data System (ADS)

    Omar, R.; Rani, M. N. Abdul; Yunus, M. A.; Mirza, W. I. I. Wan Iskandar; Zin, M. S. Mohd

    2018-04-01

    A simple structure with bolted joints consists of the structural components, bolts and nuts. Although there are several methods to model structures with bolted joints, there is no reliable, efficient and economical modelling method that can accurately predict their dynamic behaviour. This paper describes an investigation conducted to obtain an appropriate modelling method for bolted joints. Four different finite element (FE) models of the assembled plates and bolts were evaluated, namely the solid plates-bolts model, the plates-without-bolts model, the hybrid plates-bolts model and the simplified plates-bolts model. FE modal analysis was conducted for all four initial FE models of the bolted joints, and the results were compared with the experimental modal analysis (EMA) results. EMA was performed to extract the natural frequencies and mode shapes of the physical test structure with bolted joints. The evaluation compared the number of nodes, the number of elements, the elapsed central processing unit (CPU) time, and the total percentage error of each initial FE model relative to the EMA results. The evaluation showed that the simplified plates-bolts model most accurately predicted the dynamic behaviour of the structure with bolted joints. This study shows that reliable, efficient and economical modelling of bolted joints, mainly the representation of the bolting, plays a crucial role in ensuring the accuracy of the dynamic behaviour prediction.
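
    The accuracy criterion used above, the total percentage error between FE and EMA natural frequencies, reduces to a few lines of arithmetic; the frequencies below are invented for illustration and are not the paper's data.

```python
# Hypothetical natural frequencies [Hz] for the first five modes.
ema_freqs = [112.4, 198.7, 305.2, 412.9, 520.3]       # experimental modal analysis
fe_freqs  = [110.1, 203.5, 298.8, 420.6, 515.0]       # candidate plates-bolts FE model

errors = [abs(fe - ema) / ema * 100.0 for fe, ema in zip(fe_freqs, ema_freqs)]
for mode, err in enumerate(errors, start=1):
    print(f"mode {mode}: {err:5.2f} % error")
print(f"total percentage error: {sum(errors):.2f} %")
```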

  15. A lightweight thermal heat switch for redundant cryocooling on satellites

    NASA Astrophysics Data System (ADS)

    Dietrich, M.; Euler, A.; Thummes, G.

    2017-04-01

    A previously designed cryogenic thermal heat switch for space applications has been optimized for low mass, high structural stability, and reliability. The heat switch makes use of the large linear thermal expansion coefficient (CTE) of the thermoplastic UHMW-PE for actuation. A structure model, which includes the temperature dependent properties of the actuator, is derived to be able to predict the contact pressure between the switch parts. This pressure was used in a thermal model in order to predict the switch performance under different heat loads and operating temperatures. The two models were used to optimize the mass and stability of the switch. Its reliability was proven by cyclic actuation of the switch and by shaker tests.

  16. Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample.

    PubMed

    Lyon, Aaron R; Pullmann, Michael D; Dorsey, Shannon; Martin, Prerna; Grigore, Alexandra A; Becker, Emily M; Jensen-Doss, Amanda

    2018-05-11

    Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.

  17. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with average diameter and average length of 15 inches. Structural reliability is based on axial buckling strength of the cylinder. Both Monte Carlo simulation and First Order Reliability Method are considered for reliability analysis with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure and optimization results are included in this report.
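
    As a hedged simplification of the quantities discussed above (not the report's FORM/response-surface procedure), a limit state that is linear in a normally distributed buckling strength R and applied load S has a closed-form reliability index and closed-form sensitivities to each mean and standard deviation; the numbers below are illustrative rather than the cylinder data.

```python
import math

# Illustrative means / standard deviations (not the report's cylinder data).
mu_R, sigma_R = 950.0, 80.0   # axial buckling strength
mu_S, sigma_S = 600.0, 90.0   # applied axial load

sigma_g = math.hypot(sigma_R, sigma_S)        # std. dev. of the margin g = R - S
beta = (mu_R - mu_S) / sigma_g                # reliability index for the linear-normal case

# Closed-form sensitivities of beta for this limit state.
dbeta_dmuR = 1.0 / sigma_g
dbeta_dmuS = -1.0 / sigma_g
dbeta_dsigR = -beta * sigma_R / sigma_g**2
dbeta_dsigS = -beta * sigma_S / sigma_g**2

print(f"beta = {beta:.3f}")
print(f"d beta/d mu_R = {dbeta_dmuR:+.4f},  d beta/d mu_S = {dbeta_dmuS:+.4f}")
print(f"d beta/d sigma_R = {dbeta_dsigR:+.4f},  d beta/d sigma_S = {dbeta_dsigS:+.4f}")
```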

  18. Time-Varying, Multi-Scale Adaptive System Reliability Analysis of Lifeline Infrastructure Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gearhart, Jared Lee; Kurtz, Nolan Scot

    2014-09-01

    The majority of current societal and economic needs world-wide are met by the existing networked, civil infrastructure. Because the cost of managing such infrastructure is high and increases with time, risk-informed decision making is essential for those with management responsibilities for these systems. To address such concerns, a methodology that accounts for new information, deterioration, component models, component importance, group importance, network reliability, hierarchical structure organization, and efficiency concerns has been developed. This methodology analyzes the use of new information through the lens of adaptive Importance Sampling for structural reliability problems. Deterioration, multi-scale bridge models, and time-variant component importance are investigated for a specific network. Furthermore, both bridge and pipeline networks are studied for group and component importance, as well as for hierarchical structures in the context of specific networks. Efficiency is the primary driver throughout this study. With this risk-informed approach, those responsible for management can address deteriorating infrastructure networks in an organized manner.

  19. On the Reliability and Validity of a Numerical Reasoning Speed Dimension Derived from Response Times Collected in Computerized Testing

    ERIC Educational Resources Information Center

    Davison, Mark L.; Semmes, Robert; Huang, Lan; Close, Catherine N.

    2012-01-01

    Data from 181 college students were used to assess whether math reasoning item response times in computerized testing can provide valid and reliable measures of a speed dimension. The alternate forms reliability of the speed dimension was .85. A two-dimensional structural equation model suggests that the speed dimension is related to the accuracy…

  20. Separating Common from Unique Variance Within Emotional Distress: An Examination of Reliability and Relations to Worry.

    PubMed

    Marshall, Andrew J; Evanovich, Emma K; David, Sarah Jo; Mumma, Gregory H

    2018-01-17

    High comorbidity rates among emotional disorders have led researchers to examine transdiagnostic factors that may contribute to shared psychopathology. Bifactor models provide a unique method for examining transdiagnostic variables by modelling the common and unique factors within measures. Previous findings suggest that the bifactor model of the Depression Anxiety and Stress Scale (DASS) may provide a method for examining transdiagnostic factors within emotional disorders. This study aimed to replicate the bifactor model of the DASS, a multidimensional measure of psychological distress, within a US adult sample and provide initial estimates of the reliability of the general and domain-specific factors. Furthermore, this study hypothesized that Worry, a theorized transdiagnostic variable, would show stronger relations to general emotional distress than domain-specific subscales. Confirmatory factor analysis was used to evaluate the bifactor model structure of the DASS in 456 US adult participants (279 females and 177 males, mean age 35.9 years) recruited online. The DASS bifactor model fitted well (CFI = 0.98; RMSEA = 0.05). The General Emotional Distress factor accounted for most of the reliable variance in item scores. Domain-specific subscales accounted for modest portions of reliable variance in items after accounting for the general scale. Finally, structural equation modelling indicated that Worry was strongly predicted by the General Emotional Distress factor. The DASS bifactor model is generalizable to a US community sample and General Emotional Distress, but not domain-specific factors, strongly predict the transdiagnostic variable Worry.

  1. Model testing for reliability and validity of the Outcome Expectations for Exercise Scale.

    PubMed

    Resnick, B; Zimmerman, S; Orwig, D; Furstenberg, A L; Magaziner, J

    2001-01-01

    Development of a reliable and valid measure of outcome expectations for exercise appropriate for older adults will help establish the relationship between outcome expectations and exercise. Once established, this measure can be used to facilitate the development of interventions to strengthen outcome expectations and improve adherence to regular exercise in older adults. Building on initial psychometrics of the Outcome Expectation for Exercise (OEE) Scale, the purpose of the current study was to use structural equation modeling to provide additional support for the reliability and validity of this measure. The OEE scale is a 9-item measure specifically focusing on the perceived consequences of exercise for older adults. The OEE scale was given to 191 residents in a continuing care retirement community. The mean age of the participants was 85 ± 6.1 and the majority were female (76%), White (99%), and unmarried (76%). Using structural equation modeling, reliability was based on R² values, and validity was based on a confirmatory factor analysis and path coefficients. There was continued evidence for reliability of the OEE based on R² values ranging from .42 to .77, and validity with path coefficients ranging from .69 to .87, and evidence of model fit (χ² = 69, df = 27, p < .05, NFI = .98, RMSEA = .07). The evidence of reliability and validity of this measure has important implications for clinical work and research. The OEE scale can be used to identify older adults who have low outcome expectations for exercise, and interventions can then be implemented to strengthen these expectations and thereby improve exercise behavior.

  2. Automatic specification of reliability models for fault-tolerant computers

    NASA Technical Reports Server (NTRS)

    Liceaga, Carlos A.; Siewiorek, Daniel P.

    1993-01-01

    The calculation of reliability measures using Markov models is required for life-critical processor-memory-switch structures that have standby redundancy or that are subject to transient or intermittent faults or repair. The task of specifying these models is tedious and prone to human error because of the large number of states and transitions required in any reasonable system. Therefore, model specification is a major analysis bottleneck, and model verification is a major validation problem. The general unfamiliarity of computer architects with Markov modeling techniques further increases the necessity of automating the model specification. Automation requires a general system description language (SDL). For practicality, this SDL should also provide a high level of abstraction and be easy to learn and use. The first attempt to define and implement an SDL with those characteristics is presented. A program named Automated Reliability Modeling (ARM) was constructed as a research vehicle. The ARM program uses a graphical interface as its SDL, and it outputs a Markov reliability model specification formulated for direct use by programs that generate and evaluate the model.

  3. A Critique of a Phenomenological Fiber Breakage Model for Stress Rupture of Composite Materials

    NASA Technical Reports Server (NTRS)

    Reeder, James R.

    2010-01-01

    Stress rupture is not a critical failure mode for most composite structures, but there are a few applications where it can be critical. One application where stress rupture can be a critical design issue is in Composite Overwrapped Pressure Vessels (COPV's), where the composite material is highly and uniformly loaded for long periods of time and where very high reliability is required. COPV's are normally required to be proof loaded before being put into service to insure strength, but it is feared that the proof load may cause damage that reduces the stress rupture reliability. Recently, a fiber breakage model was proposed specifically to estimate a reduced reliability due to proof loading. The fiber breakage model attempts to model physics believed to occur at the microscopic scale, but validation of the model has not occurred. In this paper, the fiber breakage model is re-derived while highlighting assumptions that were made during the derivation. Some of the assumptions are examined to assess their effect on the final predicted reliability.

  4. Reliability of the Suicide Opinion Questionnaire.

    ERIC Educational Resources Information Center

    Rogers, James R.; DeShon, Richard P.

    The lack of systematic psychometric information on the Suicide Opinion Questionnaire (SOQ) was addressed by investigating the factor structure and reliability of the eight-factor clinical scale model (mental illness, cry for help, right to die, religion, impulsivity, normality, aggression, and moral evil), developed for interpreting responses to…

  5. Advances in Micromechanics Modeling of Composites Structures for Structural Health Monitoring

    NASA Astrophysics Data System (ADS)

    Moncada, Albert

    Although high performance, light-weight composites are increasingly being used in applications ranging from aircraft, rotorcraft, weapon systems and ground vehicles, the assurance of structural reliability remains a critical issue. In composites, damage is absorbed through various fracture processes, including fiber failure, matrix cracking and delamination. An important element in achieving reliable composite systems is a strong capability of assessing and inspecting physical damage of critical structural components. Installation of a robust Structural Health Monitoring (SHM) system would be very valuable in detecting the onset of composite failure. A number of major issues still require serious attention in connection with the research and development aspects of sensor-integrated reliable SHM systems for composite structures. In particular, the sensitivity of currently available sensor systems does not allow detection of micro level damage; this limits the capability of data driven SHM systems. As a fundamental layer in SHM, modeling can provide in-depth information on material and structural behavior for sensing and detection, as well as data for learning algorithms. This dissertation focuses on the development of a multiscale analysis framework, which is used to detect various forms of damage in complex composite structures. A generalized method of cells based micromechanics analysis, as implemented in NASA's MAC/GMC code, is used for the micro-level analysis. First, a baseline study of MAC/GMC is performed to determine the governing failure theories that best capture the damage progression. The deficiencies associated with various layups and loading conditions are addressed. In most micromechanics analysis, a representative unit cell (RUC) with a common fiber packing arrangement is used. The effect of variation in this arrangement within the RUC has been studied and results indicate this variation influences the macro-scale effective material properties and failure stresses. The developed model has been used to simulate impact damage in a composite beam and an airfoil structure. The model data was verified through active interrogation using piezoelectric sensors. The multiscale model was further extended to develop a coupled damage and wave attenuation model, which was used to study different damage states such as fiber-matrix debonding in composite structures with surface bonded piezoelectric sensors.

  6. Structural reliability methods: Code development status

    NASA Astrophysics Data System (ADS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  7. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  8. Constraining uncertainties in water supply reliability in a tropical data scarce basin

    NASA Astrophysics Data System (ADS)

    Kaune, Alexander; Werner, Micha; Rodriguez, Erasmo; de Fraiture, Charlotte

    2015-04-01

    Assessing the water supply reliability in river basins is essential for adequate planning and development of irrigated agriculture and urban water systems. In many cases, hydrological models are applied to determine the surface water availability in river basins. However, surface water availability and variability are often not appropriately quantified due to epistemic uncertainties, leading to water supply insecurity. The objective of this research is to determine the water supply reliability in order to support planning and development of irrigated agriculture in a tropical, data scarce environment. The approach proposed uses a simple hydrological model, but explicitly includes model parameter uncertainty. A transboundary river basin in the tropical region of Colombia and Venezuela with an area of approximately 2100 km² was selected as a case study. The Budyko hydrological framework was extended to consider climatological input variability and model parameter uncertainty, and through this the surface water reliability to satisfy the irrigation and urban demand was estimated. This provides a spatial estimate of the water supply reliability across the basin. For the middle basin the reliability was found to be less than 30% for most of the months when the water is extracted from an upstream source. Conversely, the monthly water supply reliability was high (r>98%) in the lower basin irrigation areas when water was withdrawn from a source located further downstream. Including model parameter uncertainty provides a complete estimate of the water supply reliability, but that estimate is influenced by the uncertainty in the model. Reducing the uncertainty in the model through improved data and perhaps improved model structure will improve the estimate of the water supply reliability, allowing better planning of irrigated agriculture and dependable water allocation decisions.
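
    As a minimal sketch of the kind of calculation described, the snippet below propagates uncertainty in the shape parameter of Fu's form of the Budyko curve to a monthly water supply reliability; the climate forcing, demand, and parameter range are invented for illustration and are not the case-study data.

```python
import numpy as np

rng = np.random.default_rng(7)

def fu_runoff(P, PET, w):
    """Fu's form of the Budyko curve: actual evapotranspiration from P, PET and
    shape parameter w; runoff (available water) is the remainder."""
    E = P * (1.0 + PET / P - (1.0 + (PET / P) ** w) ** (1.0 / w))
    return P - E

# Illustrative monthly precipitation / potential ET [mm] and demand [mm].
P   = np.array([320, 280, 240, 150,  90,  60,  50,  70, 120, 210, 290, 330], float)
PET = np.array([110, 115, 125, 140, 150, 155, 160, 150, 140, 125, 115, 110], float)
demand = 80.0

# Model parameter uncertainty: sample Fu's shape parameter w.
w_samples = rng.uniform(1.5, 3.5, size=5_000)

for m in range(12):
    Q = fu_runoff(P[m], PET[m], w_samples)      # available water for each sampled w
    reliability = np.mean(Q >= demand)          # fraction of samples meeting the demand
    print(f"month {m + 1:2d}: supply reliability = {reliability:5.1%}")
```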

  9. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As a part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transform (LFT) model of the transport aircraft longitudinal dynamics is developed over the flight envelope by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and the real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for a transport aircraft closed-loop system.

  10. An Acoustic Charge Transport Imager for High Definition Television Applications: Reliability Modeling and Parametric Yield Prediction of GaAs Multiple Quantum Well Avalanche Photodiodes. Degree awarded Oct. 1997

    NASA Technical Reports Server (NTRS)

    Hunt, W. D.; Brennan, K. F.; Summers, C. J.; Yun, Ilgu

    1994-01-01

    Reliability modeling and parametric yield prediction of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiodes (APDs), which are of interest as an ultra-low noise image capture mechanism for high definition systems, have been investigated. First, the effect of various doping methods on the reliability of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiode (APD) structures fabricated by molecular beam epitaxy is investigated. Reliability is examined by accelerated life tests by monitoring dark current and breakdown voltage. Median device lifetime and the activation energy of the degradation mechanism are computed for undoped, doped-barrier, and doped-well APD structures. Lifetimes for each device structure are examined via a statistically designed experiment. Analysis of variance shows that dark-current is affected primarily by device diameter, temperature and stressing time, and breakdown voltage depends on the diameter, stressing time and APD type. It is concluded that the undoped APD has the highest reliability, followed by the doped well and doped barrier devices, respectively. To determine the source of the degradation mechanism for each device structure, failure analysis using the electron-beam induced current method is performed. This analysis reveals some degree of device degradation caused by ionic impurities in the passivation layer, and energy-dispersive spectrometry subsequently verified the presence of ionic sodium as the primary contaminant. However, since all device structures are similarly passivated, sodium contamination alone does not account for the observed variation between the differently doped APDs. This effect is explained by the dopant migration during stressing, which is verified by free carrier concentration measurements using the capacitance-voltage technique.
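
    The activation energy quoted from accelerated life tests is typically extracted from an Arrhenius fit of median lifetime against inverse absolute temperature; the sketch below uses invented stress temperatures and median lifetimes, not the measurements reported in this study.

```python
import numpy as np

k_B = 8.617e-5                               # Boltzmann constant [eV/K]

# Hypothetical accelerated-life-test results: stress temperature vs. median lifetime.
T = np.array([423.0, 448.0, 473.0])          # stress temperatures [K]
t50 = np.array([5200.0, 1900.0, 800.0])      # median times to failure [h]

# Arrhenius model: ln(t50) = ln(A) + Ea / (k_B * T)  ->  linear in 1/T.
x = 1.0 / T
slope, intercept = np.polyfit(x, np.log(t50), 1)
Ea = slope * k_B
print(f"activation energy Ea ≈ {Ea:.2f} eV")

# Extrapolate the median lifetime to an operating temperature of 300 K.
t50_use = np.exp(intercept + slope / 300.0)
print(f"extrapolated median lifetime at 300 K ≈ {t50_use:.3g} h")
```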

  11. Mid-frequency Band Dynamics of Large Space Structures

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.; Adams, Douglas S.

    2004-01-01

    High and low intensity dynamic environments experienced by a spacecraft during launch and on-orbit operations, respectively, induce structural loads and motions, which are difficult to reliably predict. Structural dynamics in low- and mid-frequency bands are sensitive to component interface uncertainty and non-linearity as evidenced in laboratory testing and flight operations. Analytical tools for prediction of linear system response are not necessarily adequate for reliable prediction of mid-frequency band dynamics and analysis of measured laboratory and flight data. A new MATLAB toolbox, designed to address the key challenges of mid-frequency band dynamics, is introduced in this paper. Finite-element models of major subassemblies are defined following rational frequency-wavelength guidelines. For computational efficiency, these subassemblies are described as linear, component mode models. The complete structural system model is composed of component mode subassemblies and linear or non-linear joint descriptions. Computation and display of structural dynamic responses are accomplished employing well-established, stable numerical methods, modern signal processing procedures and descriptive graphical tools. Parametric sensitivity and Monte-Carlo based system identification tools are used to reconcile models with experimental data and investigate the effects of uncertainties. Models and dynamic responses are exported for employment in applications, such as detailed structural integrity and mechanical-optical-control performance analyses.

  12. Work-in-Progress Presented at the Army Symposium on Solid Mechanics, 1980 - Designing for Extremes: Environment, Loading, and Structural Behavior Held at Cape Cod, Massachusetts, 29 September-2 October 1980

    DTIC Science & Technology

    1980-09-01

    [Figure 2: Basic Laboratory Simulation Model] A probabilistic reliability model for the XM 753 projectile rocket motor-to-bulkhead joint under extreme loading conditions is constructed.

  13. Portuguese version of the PTSD Checklist-Military Version (PCL-M)-I: Confirmatory Factor Analysis and reliability.

    PubMed

    Carvalho, Teresa; Cunha, Marina; Pinto-Gouveia, José; Duarte, Joana

    2015-03-30

    The PTSD Checklist-Military Version (PCL-M) is a brief self-report instrument widely used to assess Post-traumatic Stress Disorder (PTSD) symptomatology in war Veterans, according to DSM-IV. This study sought to explore the factor structure and reliability of the Portuguese version of the PCL-M. A sample of 660 Portuguese Colonial War Veterans completed the PCL-M. Several Confirmatory Factor Analyses were conducted to test different structures for PCL-M PTSD symptoms. Although the respecified first-order four-factor model based on King et al.'s model showed the best fit to the data, the respecified first- and second-order models based on the DSM-IV symptom clusters also presented an acceptable fit. In addition, the PCL-M showed adequate reliability. The Portuguese version of the PCL-M is thus a valid and reliable measure to assess the severity of PTSD symptoms as described in DSM-IV. Its use with Portuguese Colonial War Veterans may ease screening of possible PTSD cases, promote more suitable treatment planning, and enable monitoring of therapeutic outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  14. CPHmodels-3.0--remote homology modeling using structure-guided sequence profiles.

    PubMed

    Nielsen, Morten; Lundegaard, Claus; Lund, Ole; Petersen, Thomas Nordahl

    2010-07-01

    CPHmodels-3.0 is a web server predicting protein 3D structure by use of single template homology modeling. The server employs a hybrid of the scoring functions of CPHmodels-2.0 and a novel remote homology-modeling algorithm. A query sequence is first modeled using the fast CPHmodels-2.0 profile-profile scoring function, which is suitable for close homology modeling. The new, computationally costly remote homology-modeling algorithm is engaged only if no suitable PDB template is identified in the initial search. CPHmodels-3.0 was benchmarked in the CASP8 competition and produced models for 94% of the targets (117 out of 128); 74% of these were predicted as high-reliability models (87 out of 117), which achieved an average RMSD of 4.6 Å when superimposed onto the 3D structure. The remaining 26% (30 out of 117) were low-reliability models that superimposed onto the true 3D structure with an average RMSD of 9.3 Å. These performance values place the CPHmodels-3.0 method in the group of high-performing 3D prediction tools. Besides its accuracy, one of the important features of the method is its speed. For most queries, the response time of the server is <20 min. The web server is available at http://www.cbs.dtu.dk/services/CPHmodels/.

  15. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    NASA Astrophysics Data System (ADS)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A reliability mathematical model for a high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key, critically stressed components and the fatigue sensitivity curve of each component are calculated and analyzed by combining the fatigue life analysis of the control valve with the reliability model, and the contribution of each component to the fatigue failure of the control valve system is obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life expectancy of the main pressure-bearing parts meets the technical requirements, and that the valve body and the sleeve have a pronounced influence on the system reliability; the stress concentration in the key parts of the control valve can be reduced during the design process by improving the structure.
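
    The stress-strength interference model referred to above has a closed form when both the operating stress and the temperature-corrected fatigue strength are treated as normal random variables; the sketch below uses illustrative parameters rather than the HMDCV data.

```python
from math import sqrt
from scipy.stats import norm

# Illustrative parameters (not the HMDCV data).
mu_strength_room, cov_strength = 420.0, 0.08    # fatigue limit at room temperature [MPa], CoV
k_temp = 0.85                                   # temperature correction coefficient
mu_stress, cov_stress = 260.0, 0.12             # operating stress amplitude [MPa], CoV

# Apply the high-temperature correction to the strength distribution.
mu_S = k_temp * mu_strength_room
sigma_S = cov_strength * mu_S
sigma_L = cov_stress * mu_stress

# Stress-strength interference: reliability = P(strength > stress).
beta = (mu_S - mu_stress) / sqrt(sigma_S**2 + sigma_L**2)
reliability = norm.cdf(beta)
print(f"reliability index beta = {beta:.2f}, component reliability = {reliability:.4f}")
```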

  16. Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH) program

    NASA Astrophysics Data System (ADS)

    Fayette, Daniel F.; Speicher, Patricia; Stoklosa, Mark J.; Evans, Jillian V.; Evans, John W.; Gentile, Mike; Pagel, Chuck A.; Hakim, Edward

    1993-08-01

    A joint military-commercial effort to evaluate multichip module (MCM) structures is discussed. The program, Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH), has been designed to identify the failure mechanisms that are possible in MCM structures. The RELTECH test vehicles, technical assessment task, product evaluation plan, reliability modeling task, accelerated and environmental testing, and post-test physical analysis and failure analysis are described. The information obtained through RELTECH can be used to address standardization issues, through development of cost effective qualification and appropriate screening criteria, for inclusion into a commercial specification and the MIL-H-38534 general specification for hybrid microcircuits.

  17. Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH) program

    NASA Technical Reports Server (NTRS)

    Fayette, Daniel F.; Speicher, Patricia; Stoklosa, Mark J.; Evans, Jillian V.; Evans, John W.; Gentile, Mike; Pagel, Chuck A.; Hakim, Edward

    1993-01-01

    A joint military-commercial effort to evaluate multichip module (MCM) structures is discussed. The program, Reliability Technology to Achieve Insertion of Advanced Packaging (RELTECH), has been designed to identify the failure mechanisms that are possible in MCM structures. The RELTECH test vehicles, technical assessment task, product evaluation plan, reliability modeling task, accelerated and environmental testing, and post-test physical analysis and failure analysis are described. The information obtained through RELTECH can be used to address standardization issues, through development of cost effective qualification and appropriate screening criteria, for inclusion into a commercial specification and the MIL-H-38534 general specification for hybrid microcircuits.

  18. Improving Water Level and Soil Moisture Over Peatlands in a Global Land Modeling System

    NASA Technical Reports Server (NTRS)

    Bechtold, M.; De Lannoy, G. J. M.; Roose, D.; Reichle, R. H.; Koster, R. D.; Mahanama, S. P.

    2017-01-01

    A new model structure for peatlands results in improved skill metrics (without any parameter calibration). Simulated surface soil moisture is strongly affected by the new model, but reliable soil moisture data are lacking for validation.

  19. The German version of the Posttraumatic Stress Disorder Checklist for DSM-5 (PCL-5): psychometric properties and diagnostic utility.

    PubMed

    Krüger-Gottschalk, Antje; Knaevelsrud, Christine; Rau, Heinrich; Dyer, Anne; Schäfer, Ingo; Schellong, Julia; Ehring, Thomas

    2017-11-28

    The Posttraumatic Stress Disorder (PTSD) Checklist (PCL, now PCL-5) has recently been revised to reflect the new diagnostic criteria of the disorder. A clinical sample of trauma-exposed individuals (N = 352) was assessed with the Clinician Administered PTSD Scale for DSM-5 (CAPS-5) and the PCL-5. Internal consistencies and test-retest reliability were computed. To investigate diagnostic accuracy, we calculated receiver operating characteristic (ROC) curves. Confirmatory factor analyses (CFA) were performed to analyze the structural validity. Results showed high internal consistency (α = .95), high test-retest reliability (r = .91) and a high correlation with the total severity score of the CAPS-5, r = .77. In addition, the recommended cutoff of 33 on the PCL-5 showed high diagnostic accuracy when compared to the diagnosis established by the CAPS-5. CFAs comparing the DSM-5 model with alternative models (the three-factor solution and the dysphoria, anhedonia, externalizing behaviors, and hybrid models) to account for the structural validity of the PCL-5 remained inconclusive. Overall, the findings show that the German PCL-5 is a reliable instrument with good diagnostic accuracy. However, more research evaluating the underlying factor structure is needed.

  20. Fast and reliable prediction of domain-peptide binding affinity using coarse-grained structure models.

    PubMed

    Tian, Feifei; Tan, Rui; Guo, Tailin; Zhou, Peng; Yang, Li

    2013-07-01

    Domain-peptide recognition and interaction are fundamentally important for eukaryotic signaling and regulatory networks. It is thus essential to quantitatively infer the binding stability and specificity of such interactions from large-scale but low-accuracy complex structure models, which can readily be obtained from sophisticated molecular modeling procedures. In the present study, a new method is described for the fast and reliable prediction of domain-peptide binding affinity with coarse-grained structure models. This method is designed to tolerate strong random noise in domain-peptide complex structures and uses a statistical modeling approach to eliminate systematic bias associated with a group of investigated samples. As a paradigm, this method was employed to model and predict the binding behavior of various peptides to four evolutionarily unrelated peptide-recognition domains (PRDs), i.e. human amph SH3, human nherf PDZ, yeast syh GYF and yeast bmh 14-3-3; moreover, we explored the molecular mechanism and biological implications underlying the binding of cognate and noncognate peptide ligands to their domain receptors. It is expected that the newly proposed method could be further used to perform genome-wide inference of domain-peptide binding at the three-dimensional structure level. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.

    2005-01-01

    An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/ Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
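
    At the core of such calculations is the Weibull description of stochastic strength. The sketch below evaluates only the simplest ingredient, a fast-fracture (no slow crack growth) survival probability for a surface-flaw population summed over hypothetical finite elements, with illustrative Weibull parameters; the full transient, multiaxial, SCG-capable analysis belongs to CARES/Life itself.

```python
import numpy as np

# Illustrative surface-flaw Weibull parameters (not CARES/Life input).
m = 10.0            # Weibull modulus
sigma_0 = 400.0     # Weibull scale parameter [MPa·mm^(2/m)], unit-area basis

# Hypothetical per-element results from a finite element analysis:
# surface area [mm^2] and first principal stress [MPa] in each element.
area   = np.array([2.0, 2.0, 1.5, 1.5, 1.0, 1.0])
stress = np.array([180., 220., 260., 300., 340., 120.])

# Weakest-link assumption: element survival probabilities multiply
# (risks of rupture add); only tensile stresses contribute.
risk = np.sum(area * np.clip(stress, 0.0, None) ** m) / sigma_0 ** m
P_s = np.exp(-risk)
print(f"probability of survival = {P_s:.4f}, probability of failure = {1 - P_s:.2e}")
```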

  2. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets a source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of models in each level. To account for uncertainty, we employ chance-constrained (CC) programming for stochastic remediation design. Chance-constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance-constrained programming along with HBMA can provide a rigorous tool for groundwater remediation designs under uncertainty. In this research, the HBMA-CC approach was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, when only the single best model is considered, variances that stem from uncertainty in the model structure are ignored. Second, considering the best model with a non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow a remediation design to be developed with a desired reliability. However, if only the single best model is considered, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance-constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels in the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate was sensitive to the prediction variance. Also, the prediction variance changed with the extraction rate: using a very high extraction rate causes the prediction variances of chloride concentration at the production wells to approach zero regardless of which HBMA models are used.
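
    The central quantity discussed above, the prediction variance under hierarchical Bayesian model averaging, combines within-model variance with between-model variance at every level of the hierarchy; the toy two-level example below uses invented weights, means and variances, not the study's models.

```python
def bma_combine(children):
    """Combine child models given as (weight, mean, variance) tuples into one
    BMA node: returns the averaged mean and the total variance
    (within-model variance + between-model variance)."""
    total_w = sum(w for w, _, _ in children)
    mean = sum(w * m for w, m, _ in children) / total_w
    var = sum(w * (v + (m - mean) ** 2) for w, m, v in children) / total_w
    return mean, var

# Toy two-level hierarchy for a predicted chloride concentration [mg/L]:
# the lower level groups alternative parameter estimates under each structural model.
structural_model_A = bma_combine([(0.6, 105.0, 16.0), (0.4, 112.0, 25.0)])
structural_model_B = bma_combine([(0.5,  95.0, 20.0), (0.5, 101.0, 18.0)])

# Upper level: average over the two structural models (weights are illustrative).
mean, var = bma_combine([(0.55, *structural_model_A), (0.45, *structural_model_B)])
print(f"HBMA prediction: mean = {mean:.1f} mg/L, variance = {var:.1f} (mg/L)^2")
print("single-best-model variance (ignores the between-model term):",
      f"{structural_model_A[1]:.1f} (mg/L)^2")
```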

  3. TED analysis of the Si(113) surface structure

    NASA Astrophysics Data System (ADS)

    Suzuki, T.; Minoda, H.; Tanishiro, Y.; Yagi, K.

    1999-09-01

    We carried out a TED (transmission electron diffraction) analysis of the Si(113) surface structure. The TED patterns taken at room temperature showed reflections due to the 3×2 reconstructed structure. The TED pattern indicated that a glide plane parallel to the <332> direction suggested in some models is excluded. We calculated the R-factors (reliability factors) for six surface structure models proposed previously. All structure models with energy-optimized atomic positions have large R-factors. After revision of the atomic positions, the R-factors of all the structure models decreased below 0.3, and the revised version of Dabrowski's 3×2 model has the smallest R-factor of 0.17.
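
    The reliability (R) factor used to rank the structure models is the normalized discrepancy between observed and calculated diffraction intensities; a minimal sketch with invented intensities follows (conventions based on amplitudes rather than intensities also exist, and the paper's exact definition may differ).

```python
def r_factor(i_obs, i_calc):
    """Reliability (R) factor: normalized discrepancy between observed and
    calculated diffraction intensities."""
    return sum(abs(o - c) for o, c in zip(i_obs, i_calc)) / sum(i_obs)

# Invented TED intensities for a handful of 3x2 superstructure reflections.
observed   = [1.00, 0.62, 0.35, 0.18, 0.09]
calculated = [0.91, 0.70, 0.30, 0.21, 0.07]
print(f"R = {r_factor(observed, calculated):.2f}")
```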

  4. Factor structure and reliability of the depression, anxiety and stress scales in a large Portuguese community sample.

    PubMed

    Vasconcelos-Raposo, José; Fernandes, Helder Miguel; Teixeira, Carla M

    2013-01-01

    The purpose of the present study was to assess the factor structure and reliability of the Depression, Anxiety and Stress Scales (DASS-21) in a large Portuguese community sample. Participants were 1020 adults (585 women and 435 men), with a mean age of 36.74 (SD = 11.90) years. All scales revealed good reliability, with Cronbach's alpha values between .80 (anxiety) and .84 (depression). The internal consistency of the total score was .92. Confirmatory factor analysis revealed that the best-fitting model (*CFI = .940, *RMSEA = .038) consisted of a latent component of general psychological distress (or negative affectivity) plus orthogonal depression, anxiety and stress factors. The Portuguese version of the DASS-21 showed good psychometric properties (factorial validity and reliability) and thus can be used as a reliable and valid instrument for measuring depression, anxiety and stress symptoms.

  5. Deterministic and reliability based optimization of integrated thermal protection system composite panel using adaptive sampling techniques

    NASA Astrophysics Data System (ADS)

    Ravishankar, Bharani

    Conventional space vehicles have thermal protection systems (TPS) that protect an underlying structure that carries the flight loads. In an attempt to save weight, there is interest in an integrated TPS (ITPS) that combines the structural function and the TPS function. This has weight-saving potential, but it complicates the design of the ITPS, which now has both thermal and structural failure modes. The main objective of this dissertation was to optimally design the ITPS, subjected to thermal and mechanical loads, through deterministic and reliability-based optimization. The optimization of the ITPS structure requires computationally expensive finite element analyses of the 3D ITPS (solid) model. To reduce the computational expense involved in the structural analysis, a finite element based homogenization method was employed, homogenizing the 3D ITPS model to a 2D orthotropic plate. However, it was found that homogenization was applicable only for panels that are much larger than the characteristic dimensions of the repeating unit cell in the ITPS panel; hence a single unit cell was used for the optimization process to reduce the computational cost. Deterministic and probabilistic optimization of the ITPS panel required evaluation of failure constraints at various design points. This in turn demanded computationally expensive finite element analyses, which were replaced by efficient, low-fidelity surrogate models. In an optimization process, it is important to represent the constraints accurately to find the optimum design. Instead of building global surrogate models using a large number of designs, the computational resources were directed towards target regions near the constraint boundaries for accurate representation of the constraints, using adaptive sampling strategies. Efficient Global Reliability Analysis (EGRA) facilitates sequential sampling of design points around the region of interest in the design space. EGRA was applied to the response surface construction of the failure constraints in the deterministic and reliability-based optimization of the ITPS panel. It was shown that, using adaptive sampling, the number of designs required to find the optimum was reduced drastically while the accuracy improved. The system reliability of the ITPS was estimated using a Monte Carlo simulation (MCS) based method. The separable Monte Carlo method was employed, which allows separable sampling of the random variables to predict the probability of failure accurately. The reliability analysis considered uncertainties in the geometry, material properties and loading conditions of the panel, as well as error in the finite element modeling. These uncertainties further increased the computational cost of the MCS techniques, which was also reduced by employing surrogate models. To estimate the error in the probability of failure estimate, the bootstrapping method was applied. This research work thus demonstrates optimization of an ITPS composite panel with multiple failure modes and a large number of uncertainties using adaptive sampling techniques.
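
    The separable Monte Carlo idea mentioned above reuses every capacity sample against every response sample instead of pairing them one-to-one; the sketch below illustrates this for a scalar limit state, with invented distributions standing in for the ITPS thermal/structural responses.

```python
import numpy as np

rng = np.random.default_rng(3)

# Separable sampling: capacities and responses are drawn independently, and
# every capacity sample is compared with every response sample.
n_cap, n_resp = 2_000, 2_000
capacity = rng.normal(loc=1200.0, scale=90.0, size=n_cap)    # e.g. allowable temperature/stress
response = rng.normal(loc=900.0, scale=110.0, size=n_resp)   # e.g. computed temperature/stress

# Crude (paired) Monte Carlo uses min(n_cap, n_resp) comparisons;
# separable MC uses all n_cap * n_resp pairs for the same analysis budget.
pf_separable = np.mean(capacity[:, None] < response[None, :])
pf_paired = np.mean(capacity[:n_resp] < response)
print(f"separable MC  P_f ≈ {pf_separable:.5f}")
print(f"paired MC     P_f ≈ {pf_paired:.5f}")
```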

  6. Models of determining deformations

    NASA Astrophysics Data System (ADS)

    Gladilin, V. N.

    2016-12-01

    In recent years, many functions have been designed to determine deformation values, which arise mostly as a result of settlement of structures and industrial equipment. Some authors suggest such advanced mathematical functions, which approximate the deformations, as general methods for determining deformations. This article describes models of deformations as physical processes. When comparing static, kinematic and dynamic models, it was found that the dynamic model reflects the deformation of structures and industrial equipment most reliably.

  7. Average inactivity time model, associated orderings and reliability properties

    NASA Astrophysics Data System (ADS)

    Kayid, M.; Izadkhah, S.; Abouammoh, A. M.

    2018-02-01

    In this paper, we introduce and study a new model called the 'average inactivity time model'. This new model is specifically applicable for handling the heterogeneity of the failure time of a system in which some inactive items exist. We provide some bounds for the mean average inactivity time of a lifespan unit. In addition, we discuss some dependence structures between the average variable and the mixing variable in the model when the original random variable possesses certain aging behaviors. Based on the concept of the new model, we introduce and study a new stochastic order. Finally, to illustrate the concept of the model, some interesting reliability problems are reserved.

  8. eHealth literacy in chronic disease patients: An item response theory analysis of the eHealth literacy scale (eHEALS).

    PubMed

    Paige, Samantha R; Krieger, Janice L; Stellefson, Michael; Alber, Julia M

    2017-02-01

    Chronic disease patients are affected by low computer and health literacy, which negatively affects their ability to benefit from access to online health information. The aim was to estimate reliability and confirm model specifications for eHealth Literacy Scale (eHEALS) scores among chronic disease patients using Classical Test Theory (CTT) and Item Response Theory techniques. A stratified sample of Black/African American (N=341) and Caucasian (N=343) adults with chronic disease completed an online survey including the eHEALS. Item discrimination was explored using bivariate correlations and Cronbach's alpha for internal consistency. A categorical confirmatory factor analysis tested a one-factor structure of eHEALS scores. Item characteristic curves, infit/outfit statistics, omega coefficient, and item reliability and separation estimates were computed. A 1-factor structure of eHEALS was confirmed by statistically significant standardized item loadings, acceptable model fit indices (CFI/TLI>0.90), and 70% variance explained by the model. Item response categories increased with higher theta levels, and there was evidence of acceptable reliability (ω=0.94; item reliability=89; item separation=8.54). eHEALS scores are a valid and reliable measure of self-reported eHealth literacy among Internet-using chronic disease patients. Providers can use eHEALS to help identify patients' eHealth literacy skills. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Feasibility model of a high reliability five-year tape transport, Volume 1. [development, performance, and test results

    NASA Technical Reports Server (NTRS)

    Eshleman, R. L.; Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.; Anderson, M. E.

    1973-01-01

    The development, performance, and test results for the spaceborne magnetic tape transport are discussed. An analytical model of the tape transport was used to optimize its conceptual design. Each of the subsystems was subjected to reliability analyses which included structural integrity, maintenance of system performance within acceptable bounds, and avoidance of fatigue failure. These subsystems were also compared with each other in order to evaluate reliability characteristics. The transport uses no mechanical couplings. Four drive motors, one for each reel and one for each of two capstans, are used in a differential mode. There are two hybrid, spherical, cone tapered-crown rollers for tape guidance. Storage of the magnetic tape is provided by a reel assembly which includes the reel, a reel support structure and bearings, dust seals, and a dc drive motor. A summary of transport test results on tape guidance, flutter, and skew is provided.

  10. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  11. Structural validity and reliability of the Positive and Negative Affect Schedule (PANAS): evidence from a large Brazilian community sample.

    PubMed

    Carvalho, Hudson W de; Andreoli, Sérgio B; Lara, Diogo R; Patrick, Christopher J; Quintana, Maria Inês; Bressan, Rodrigo A; Melo, Marcelo F de; Mari, Jair de J; Jorge, Miguel R

    2013-01-01

    Positive and negative affect are the two psychobiological-dispositional dimensions reflecting proneness to positive and negative activation that influence the extent to which individuals experience life events as joyful or as distressful. The Positive and Negative Affect Schedule (PANAS) is a structured questionnaire that provides independent indexes of positive and negative affect. This study aimed to validate a Brazilian interview-version of the PANAS by means of factor and internal consistency analysis. A representative community sample of 3,728 individuals residing in the cities of São Paulo and Rio de Janeiro, Brazil, voluntarily completed the PANAS. Exploratory structural equation model analysis was based on maximum likelihood estimation and reliability was calculated via Cronbach's alpha coefficient. Our results provide support for the hypothesis that the PANAS reliably measures two distinct dimensions of positive and negative affect. The structure and reliability of the Brazilian version of the PANAS are consistent with those of its original version. Taken together, these results attest the validity of the Brazilian adaptation of the instrument.

  12. Validation of the Spanish version of Mackey childbirth satisfaction rating scale.

    PubMed

    Caballero, Pablo; Delgado-García, Beatriz E; Orts-Cortes, Isabel; Moncho, Joaquin; Pereyra-Zamora, Pamela; Nolasco, Andreu

    2016-04-16

    The "Mackey Childbirth Satisfaction Rating Scale" (MCSRS) is a complete non-validated scale which includes the most important factors associated with maternal satisfaction. Our primary purpose was to describe the internal structure of the scale and validate the reliability and validity of concept of its Spanish version MCSRS-E. The MCSRS was translated into Spanish, back-translated and adapted to the Spanish population. It was then administered following a pilot test with women who met the study participant requirements. The scale structure was obtained by performing an exploratory factorial analysis using a sample of 304 women. The structures obtained were tested by conducting a confirmatory factorial analysis using a sample of 159 women. To test the validity of concept, the structure factors were correlated with expectations prior to childbirth experiences. McDonald's omegas were calculated for each model to establish the reliability of each factor. The study was carried out at four University Hospitals; Alicante, Elche, Torrevieja and Vinalopo Salud of Elche. The inclusion criteria were women aged 18-45 years old who had just delivered a singleton live baby at 38-42 weeks through vaginal delivery. Women who had difficulty speaking and understanding Spanish were excluded. The process generated 5 different possible internal structures in a nested model more consistent with the theory than other internal structures of the MCSRS applied hitherto. All of them had good levels of validation and reliability. This nested model to explain internal structure of MCSRS-E can accommodate different clinical practice scenarios better than the other structures applied to date, and it is a flexible tool which can be used to identify the aspects that should be changed to improve maternal satisfaction and hence maternal health.

  13. The effect of leverage and/or influential on structure-activity relationships.

    PubMed

    Bolboacă, Sorana D; Jäntschi, Lorentz

    2013-05-01

    In the spirit of reporting valid and reliable Quantitative Structure-Activity Relationship (QSAR) models, the aim of our research was to assess how leverage (analysis with the hat matrix, h(i)) and influence (analysis with Cook's distance, D(i)) may reflect the reliability and characteristics of QSAR models. The datasets included in this research were collected from previously published papers. Seven datasets that met the imposed inclusion criteria were analyzed. Three models were obtained for each dataset (full model, h(i)-model, and D(i)-model), and several statistical validation criteria were applied to the models. In 5 out of 7 sets the correlation coefficient increased when compounds with either h(i) or D(i) higher than the threshold were removed. The number of withdrawn compounds varied from 2 to 4 for h(i)-models and from 1 to 13 for D(i)-models. Validation statistics showed that D(i)-models possess systematically better agreement than both full models and h(i)-models. Removal of influential compounds from the training set significantly improves the model and is recommended during the development of quantitative structure-activity relationships. The Cook's distance approach should be combined with hat matrix analysis in order to identify the compounds that are candidates for removal.
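    The leverage and influence diagnostics named above reduce to standard regression quantities; a minimal sketch, assuming an ordinary-least-squares QSAR model and the commonly used cutoffs 3p/n for h(i) and 4/n for D(i) (the paper's exact thresholds may differ), is shown below.

```python
import numpy as np

def leverage_and_cooks_distance(X: np.ndarray, y: np.ndarray):
    """Hat-matrix leverages h_i and Cook's distances D_i for an OLS fit.

    X is the (n, p) descriptor matrix including a column of ones.
    """
    xtx_inv = np.linalg.pinv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, xtx_inv, X)    # diagonal of the hat matrix
    beta = xtx_inv @ X.T @ y
    resid = y - X @ beta
    p = X.shape[1]
    mse = resid @ resid / (len(y) - p)
    d = resid**2 / (p * mse) * h / (1.0 - h) ** 2  # Cook's distance
    return h, d

# Hypothetical descriptor matrix and activities, for illustration only
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(40), rng.normal(size=(40, 3))])
y = X @ np.array([1.0, 0.5, -0.3, 0.8]) + rng.normal(scale=0.2, size=40)
h, d = leverage_and_cooks_distance(X, y)
n, p = X.shape
print("high-leverage compounds:", np.where(h > 3 * p / n)[0])
print("influential compounds:  ", np.where(d > 4 / n)[0])
```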

  14. Locating, characterizing and minimizing sources of error for a paper case-based structured oral examination in a multi-campus clerkship.

    PubMed

    Kumar, A; Bridgham, R; Potts, M; Gushurst, C; Hamp, M; Passal, D

    2001-01-01

    To determine consistency of assessment in a new paper case-based structured oral examination in a multi-community pediatrics clerkship, and to identify correctable problems in the administration of the examination and the assessment process. Nine paper case-based oral examinations were audio-taped. From the audio-tapes, five community coordinators scored examiner behaviors and graded student performance. Correlations among examiner behavior scores were examined. Graphs identified the grading patterns of evaluators. The effect of exam-giving on evaluators was assessed by t-test. The reliability of grades was calculated, and the effect of reducing assessment problems was modeled. Exam-givers differed most in their "teaching-guiding" behavior, which was negatively correlated with student grades. Exam reliability was lowered mainly by evaluator differences in leniency and grading pattern; less important was the absence of standardization across cases. While grade reliability was low in early use of the paper case-based oral examination, modeling of the plausible effects of training and monitoring for greater uniformity in administering the examination and assigning scores suggests that more adequate reliabilities can be attained.

  15. Two Models of Raters in a Structured Oral Examination: Does It Make a Difference?

    ERIC Educational Resources Information Center

    Touchie, Claire; Humphrey-Murto, Susan; Ainslie, Martha; Myers, Kathryn; Wood, Timothy J.

    2010-01-01

    Oral examinations have become more standardized over recent years. Traditionally a small number of raters were used for this type of examination. Past studies suggested that more raters should improve reliability. We compared the results of a multi-station structured oral examination using two different rater models, those based in a station,…

  16. A reliability-based cost effective fail-safe design procedure

    NASA Technical Reports Server (NTRS)

    Hanagud, S.; Uppaluri, B.

    1976-01-01

    The authors have developed a methodology for cost-effective fatigue design of structures subject to random fatigue loading. A stochastic model for fatigue crack propagation under random loading has been discussed. Fracture mechanics is then used to estimate the parameters of the model and the residual strength of structures with cracks. The stochastic model and residual strength variations have been used to develop procedures for estimating the probability of failure and its changes with inspection frequency. This information on reliability is then used to construct an objective function in terms of either a total weight function or a cost function. A procedure for selecting the design variables, subject to constraints, by optimizing the objective function has been illustrated by examples. In particular, optimum design of a stiffened panel has been discussed.

  17. A system methodology for optimization design of the structural crashworthiness of a vehicle subjected to a high-speed frontal crash

    NASA Astrophysics Data System (ADS)

    Xia, Liang; Liu, Weiguo; Lv, Xiaojiang; Gu, Xianguang

    2018-04-01

    The structural crashworthiness design of vehicles has become an important research direction to ensure the safety of the occupants. To effectively improve the structural safety of a vehicle in a frontal crash, a system methodology is presented in this study. The surrogate model of Online support vector regression (Online-SVR) is adopted to approximate crashworthiness criteria and different kernel functions are selected to enhance the accuracy of the model. The Online-SVR model is demonstrated to have the advantages of solving highly nonlinear problems and saving training costs, and can effectively be applied for vehicle structural crashworthiness design. By combining the non-dominated sorting genetic algorithm II and Monte Carlo simulation, both deterministic optimization and reliability-based design optimization (RBDO) are conducted. The optimization solutions are further validated by finite element analysis, which shows the effectiveness of the RBDO solution in the structural crashworthiness design process. The results demonstrate the advantages of using RBDO, resulting in not only increased energy absorption and decreased structural weight from a baseline design, but also a significant improvement in the reliability of the design.
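    The reliability step described above amounts to propagating input scatter through a surrogate of the crash response; the sketch below uses scikit-learn's standard SVR as a stand-in for the Online-SVR model and a hypothetical limit state, so the variables, distributions, and threshold are illustrative assumptions rather than the paper's setup.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical training data: design/noise variables x -> crashworthiness margin g(x)
rng = np.random.default_rng(2)
X_train = rng.uniform(-1.0, 1.0, size=(200, 3))
y_train = 1.5 - (X_train**2).sum(axis=1) + rng.normal(scale=0.05, size=200)

surrogate = SVR(kernel="rbf", C=10.0).fit(X_train, y_train)  # stand-in for Online-SVR

# Monte Carlo estimate of the failure probability P[g(x) < 0] under input scatter
samples = rng.normal(loc=0.0, scale=0.4, size=(100_000, 3))
p_fail = float(np.mean(surrogate.predict(samples) < 0.0))
print(f"estimated P_f = {p_fail:.4f}, reliability = {1.0 - p_fail:.4f}")
```

    In an RBDO loop of the kind described, a reliability estimate like this would sit inside the genetic search as a constraint evaluated for each candidate design.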

  18. Reliability and life prediction of ceramic composite structures at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Gyekenyesi, John P.

    1994-01-01

    Methods are highlighted that ascertain the structural reliability of components fabricated of composites with ceramic matrices reinforced with ceramic fibers or whiskers and subject to quasi-static load conditions at elevated temperatures. Each method focuses on a particular composite microstructure: whisker-toughened ceramics, laminated ceramic matrix composites, and fabric reinforced ceramic matrix composites. In addition, since elevated service temperatures usually involve time-dependent effects, a section dealing with reliability degradation as a function of load history has been included. A recurring theme throughout this chapter is that even though component failure is controlled by a sequence of many microfailure events, failure of ceramic composites will be modeled using macrovariables.

  19. Formation of integrated structural units using the systematic and integrated method when implementing high-rise construction projects

    NASA Astrophysics Data System (ADS)

    Abramov, Ivan

    2018-03-01

    Development of design documentation for a future construction project gives rise to a number of issues with the main one being selection of manpower for structural units of the project's overall implementation system. Well planned and competently staffed integrated structural construction units will help achieve a high level of reliability and labor productivity and avoid negative (extraordinary) situations during the construction period eventually ensuring improved project performance. Research priorities include the development of theoretical recommendations for enhancing reliability of a structural unit staffed as an integrated construction crew. The author focuses on identification of destabilizing factors affecting formation of an integrated construction crew; assessment of these destabilizing factors; based on the developed mathematical model, highlighting the impact of these factors on the integration criterion with subsequent identification of an efficiency and reliability criterion for the structural unit in general. The purpose of this article is to develop theoretical recommendations and scientific and methodological provisions of an organizational and technological nature in order to identify a reliability criterion for a structural unit based on manpower integration and productivity criteria. With this purpose in mind, complex scientific tasks have been defined requiring special research, development of corresponding provisions and recommendations based on the system analysis findings presented herein.

  20. Finite element modeling and analysis of reinforced-concrete bridge.

    DOT National Transportation Integrated Search

    2000-09-01

    Despite its long history, the finite element method continues to be the predominant strategy employed by engineers to conduct structural analysis. A reliable method is needed for analyzing structures made of reinforced concrete, a complex but common ...

  1. Structural reliability assessment of the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Sharif, A.M.; Preston, R.

    1996-12-31

    Reliability techniques are increasingly finding application in design. The special design conditions for the deep water sections of the Oman India Pipeline dictate their use since the experience basis for application of standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first and second order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.

  2. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  3. An integrated approach to system design, reliability, and diagnosis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Iverson, David L.

    1990-01-01

    The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms. Procedures have been developed at Ames that perform reliability evaluations during design and failure diagnoses during system operation. These procedures utilize information from a central source, structured as object-oriented fault trees. Fault trees were selected because they are a flexible model widely used in aerospace applications and because they give a concise, structured representation of system behavior. The utility of this integrated environment for aerospace applications in light of our experiences during its development and use is described. The techniques for reliability evaluation and failure diagnosis are discussed, and current extensions of the environment and areas requiring further development are summarized.
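    To make the object-oriented fault-tree idea concrete, here is a minimal sketch of a tree whose nodes compute failure probability under an independence assumption; the class names, gate types, and example probabilities are illustrative and not taken from the Ames environment.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BasicEvent:
    """Leaf node: a component failure with a known probability."""
    name: str
    prob: float

    def failure_probability(self) -> float:
        return self.prob

@dataclass
class Gate:
    """AND/OR gate combining child events, assuming independent failures."""
    name: str
    kind: str                                   # "AND" or "OR"
    children: List[object] = field(default_factory=list)

    def failure_probability(self) -> float:
        probs = [c.failure_probability() for c in self.children]
        if self.kind == "AND":
            out = 1.0
            for p in probs:
                out *= p
            return out
        surv = 1.0                              # OR gate: 1 - product of survivals
        for p in probs:
            surv *= (1.0 - p)
        return 1.0 - surv

# Hypothetical system: fails if the sensor fails OR both redundant processors fail
tree = Gate("system", "OR", [
    BasicEvent("sensor", 1e-3),
    Gate("cpu_pair", "AND", [BasicEvent("cpu_a", 1e-2), BasicEvent("cpu_b", 1e-2)]),
])
print(f"top-event probability = {tree.failure_probability():.6f}")
```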

  4. An integrated approach to system design, reliability, and diagnosis

    NASA Astrophysics Data System (ADS)

    Patterson-Hine, F. A.; Iverson, David L.

    1990-12-01

    The requirement for ultradependability of computer systems in future avionics and space applications necessitates a top-down, integrated systems engineering approach for design, implementation, testing, and operation. The functional analyses of hardware and software systems must be combined by models that are flexible enough to represent their interactions and behavior. The information contained in these models must be accessible throughout all phases of the system life cycle in order to maintain consistency and accuracy in design and operational decisions. One approach being taken by researchers at Ames Research Center is the creation of an object-oriented environment that integrates information about system components required in the reliability evaluation with behavioral information useful for diagnostic algorithms. Procedures have been developed at Ames that perform reliability evaluations during design and failure diagnoses during system operation. These procedures utilize information from a central source, structured as object-oriented fault trees. Fault trees were selected because they are a flexible model widely used in aerospace applications and because they give a concise, structured representation of system behavior. The utility of this integrated environment for aerospace applications in light of our experiences during its development and use is described. The techniques for reliability evaluation and failure diagnosis are discussed, and current extensions of the environment and areas requiring further development are summarized.

  5. Universal first-order reliability concept applied to semistatic structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.
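    For a linear stress-strength limit state with normal variables, the first-order reliability index and its relation to a deterministic safety factor can be sketched as below; the strength and stress statistics are placeholder values, not data from the report.

```python
import math
from scipy.stats import norm

# Hypothetical normal strength R and stress S (MPa) for a semistatic member
mu_R, sigma_R = 300.0, 15.0
mu_S, sigma_S = 220.0, 25.0

# First-order reliability index for the limit state g = R - S
beta = (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)
print(f"reliability index beta = {beta:.2f}")
print(f"reliability = {norm.cdf(beta):.6f}")

# The deterministic (central) safety factor the method is contrasted with
print(f"central safety factor = {mu_R / mu_S:.2f}")
```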

  6. Universal first-order reliability concept applied to semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-07-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.

  7. Reliability and Validity of the Sexual Pressure Scale for Women-Revised

    PubMed Central

    Jones, Rachel; Gulick, Elsie

    2008-01-01

    Sexual pressure among young urban women represents adherence to gender stereotypical expectations to engage in sex. Revision of the original 5-factor Sexual Pressure Scale was undertaken in two studies to improve reliabilities in two of the five factors. In Study 1 the reliability of the Sexual Pressure Scale for Women-Revised (SPSW-R) was tested, and principal components analysis was performed in a sample of 325 young, urban women. A parsimonious 18-item, 4-factor model explained 61% of the variance. In Study 2 the theory underlying sexual pressure was supported by confirmatory factor analysis using structural equation modeling in a sample of 181 women. Reliabilities of the SPSW-R total and subscales were very satisfactory, suggesting it may be used in intervention research. PMID:18666222

  8. Is there a reliable factorial structure in the 20-item Toronto Alexithymia Scale? A comparison of factor models in clinical and normal adult samples.

    PubMed

    Müller, Jochen; Bühner, Markus; Ellgring, Heiner

    2003-12-01

    The 20-item Toronto Alexithymia Scale (TAS-20) is the most widely used instrument for measuring alexithymia. However, different studies did not always yield identical factor structures of this scale. The present study aims at clarifying some discrepant results. Maximum likelihood confirmatory factor analyses of a German version of the TAS-20 were conducted on data from a clinical sample (N=204) and a sample of normal adults (N=224). Five different models with one to four factors were compared. A four-factor model with factors (F1) "Difficulty identifying feelings", (F2) "Difficulty describing feelings", (F3) "Low importance of emotion", and (F4) "Pragmatic thinking", and a three-factor model with the combined factor "Difficulties in identifying and describing feelings" described the data best. Factors related to "externally oriented thinking" provided no acceptable level of reliability. Results from the present and other studies indicate that the factorial structure of the TAS-20 may vary across samples. Whether factor structures different from the common three-factor structure are an exception in some mainly clinical populations or a common phenomenon outside student populations has still to be determined. For a further exploration of the factor structure of the TAS-20 in different populations, it would be important not only to test the fit of the common three-factor model, but also to consider other competing solutions like the models of the present study.

  9. Epistemic belief structures within introductory astronomy

    NASA Astrophysics Data System (ADS)

    Johnson, Keith; Willoughby, Shannon D.

    2018-06-01

    The reliability and validity of inventories should be verified in multiple ways. Although the epistemological beliefs about the physical science survey (EBAPS) has been deemed to be reliable and valid by the authors, the axes or factor structure proposed by the authors has not been independently checked. Using data from a study sample we discussed in previous publications, we performed exploratory factor analysis on 1,258 post-test EBAPS surveys. The students in the sample were from an introductory Astronomy course at a mid-sized western university. Inspection suggested the use of either a three-factor model or a five-factor model. Each of the factors is interpreted and discussed, and the factors are compared to the axes proposed by the authors of the EBAPS. We find that the five-factor model extrapolated from our data partially overlaps with the model put forth by the authors of the EBAPS, and that many of the questions did not load onto any factors.

  10. Microstructure-Evolution and Reliability Assessment Tool for Lead-Free Component Insertion in Army Electronics

    DTIC Science & Technology

    2008-10-01

    provide adequate means for thermal heat dissipation and cooling. Thus electronic packaging has four main functions [1]: • signal distribution which... dissipation, involving structural and materials considerations; • mechanical, chemical and electromagnetic protection of components and... nature when compared to phenomenological models. The microelectronic packaging industry typically spends several months building and reliability

  11. Large-scale systems: Complexity, stability, reliability

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.

    1975-01-01

    After showing that a complex dynamic system with a competitive structure has highly reliable stability, a class of noncompetitive dynamic systems for which competitive models can be constructed is defined. It is shown that such a construction is possible in the context of the hierarchic stability analysis. The scheme is based on the comparison principle and vector Liapunov functions.

  12. Alarms about structural alerts.

    PubMed

    Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander

    2016-08-21

    Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or group compounds into categories for read-across. However, there has been a growing concern that alerts disproportionally flag too many chemicals as toxic, which questions their reliability as toxicity markers. Conversely, the rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that contrary to the common perception of QSAR models as "black boxes" they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as hypotheses of possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.

  13. Probabilistic Structural Analysis Methods (PSAM) for Select Space Propulsion System Components

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Probabilistic Structural Analysis Methods (PSAM) are described for the probabilistic structural analysis of engine components for current and future space propulsion systems. Components for these systems are subjected to stochastic thermomechanical launch loads. Uncertainties or randomness also occur in material properties, structural geometry, and boundary conditions. Material property stochasticity, such as in modulus of elasticity or yield strength, exists in every structure and is a consequence of variations in material composition and manufacturing processes. Procedures are outlined for computing the probabilistic structural response or reliability of the structural components. The response variables include static or dynamic deflections, strains, and stresses at one or several locations, natural frequencies, fatigue or creep life, etc. Sample cases illustrate how the PSAM methods and codes simulate input uncertainties and compute probabilistic response or reliability using a finite element model with probabilistic methods.

  14. Reliable and More Powerful Methods for Power Analysis in Structural Equation Modeling

    ERIC Educational Resources Information Center

    Yuan, Ke-Hai; Zhang, Zhiyong; Zhao, Yanyun

    2017-01-01

    The normal-distribution-based likelihood ratio statistic T_ml = nF_ml is widely used for power analysis in structural equation modeling (SEM). In such an analysis, power and sample size are computed by assuming that T_ml follows a central chi-square distribution under H_0 and a noncentral chi-square…

  15. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    2003-01-01

    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
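    As a simplified illustration of the strength model named above, the sketch below evaluates a two-parameter Weibull survival probability and combines principal stresses with the principle of independent action for a single element; the volume integration, subcritical crack growth, and time dependence handled by CARES/LIFE are omitted, and the Weibull parameters are placeholders.

```python
import numpy as np

def weibull_survival(sigma, sigma_0: float, m: float):
    """Two-parameter Weibull survival probability for a tensile stress sigma."""
    s = np.clip(np.asarray(sigma, dtype=float), 0.0, None)   # compression ignored here
    return np.exp(-((s / sigma_0) ** m))

def pia_reliability(principal_stresses, sigma_0: float, m: float) -> float:
    """Principle of independent action: each principal stress acts independently."""
    r = 1.0
    for s in principal_stresses:
        r *= float(weibull_survival(s, sigma_0, m))
    return r

# Hypothetical element with principal stresses (MPa) and assumed Weibull parameters
print(f"element reliability = {pia_reliability([120.0, 80.0, -30.0], sigma_0=350.0, m=10.0):.6f}")
```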

  16. Reliability analysis in interdependent smart grid systems

    NASA Astrophysics Data System (ADS)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, their interactions and relationships, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. In addition, based on percolation theory, we study the effect of cascading failures and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
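    The size of the giant functioning component under random failures, which the authors use as their reliability measure, can be estimated numerically along the lines below; the Erdős-Rényi graph stands in for a single grid layer, and the sketch ignores the interdependence and cascade dynamics analyzed in the paper.

```python
import random
import networkx as nx

def giant_component_fraction(g: nx.Graph, removal_fraction: float, seed: int = 0) -> float:
    """Fraction of original nodes in the largest component after random node failures."""
    rng = random.Random(seed)
    h = g.copy()
    failed = rng.sample(list(h.nodes()), int(removal_fraction * h.number_of_nodes()))
    h.remove_nodes_from(failed)
    if h.number_of_nodes() == 0:
        return 0.0
    giant = max(nx.connected_components(h), key=len)
    return len(giant) / g.number_of_nodes()

# Hypothetical stand-in for one layer of the grid
g = nx.erdos_renyi_graph(n=2000, p=0.002, seed=42)
for f in (0.1, 0.3, 0.5, 0.7):
    print(f"remove {f:.0%} of nodes -> giant component fraction {giant_component_fraction(g, f):.3f}")
```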

  17. Factor structure and psychometric properties of the trier inventory for chronic stress (TICS) in a representative german sample

    PubMed Central

    2012-01-01

    Background Chronic stress results from an imbalance of personal traits, resources and the demands placed upon an individual by social and occupational situations. This chronic stress can be measured using the Trier Inventory for Chronic Stress (TICS). Aims of the present study are to test the factorial structure of the TICS, report its psychometric properties, and evaluate the influence of gender and age on chronic stress. Methods The TICS was answered by N = 2,339 healthy participants aged 14 to 99. The sample was selected by random-route sampling. Exploratory factor analyses with Oblimin-rotated Principal Axis extraction were calculated. Confirmatory factor analyses applying Robust Maximum Likelihood estimations (MLM) tested model fit and configural invariance as well as the measurement invariance for gender and age. Reliability estimations and effect sizes are reported. Results In the exploratory factor analyses, both a two-factor and a nine-factor model emerged. Confirmatory factor analyses resulted in acceptable model fit (RMSEA), with model comparison fit statistics corroborating the superiority of the nine-factor model. Most factors were moderately to highly intercorrelated. Reliabilities were good to very good. Measurement invariance tests gave evidence for differential effects of gender and age on the factor structure. Furthermore, women and younger individuals, especially those aged 35 to 44, tended to report more chronic stress than men and older individuals. Conclusions The proposed nine-factor structure could be factorially validated, results in good scale reliability, and heuristically can be grouped by two higher-order factors: "High Demands" and "Lack of Satisfaction". Age and gender represent differentiable and meaningful contributors to the perception of chronic stress. PMID:22463771

  18. Estimation and enhancement of real-time software reliability through mutation analysis

    NASA Technical Reports Server (NTRS)

    Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.

    1992-01-01

    A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.

  19. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  20. Highly uniform and reliable resistive switching characteristics of a Ni/WOx/p+-Si memory device

    NASA Astrophysics Data System (ADS)

    Kim, Tae-Hyeon; Kim, Sungjun; Kim, Hyungjin; Kim, Min-Hwi; Bang, Suhyun; Cho, Seongjae; Park, Byung-Gook

    2018-02-01

    In this paper, we investigate the resistive switching behavior of a CMOS-compatible Ni/WOx/p+-Si bipolar resistive random-access memory (RRAM). Highly uniform and reliable bipolar resistive switching characteristics are observed under DC voltage sweeping, and the switching mechanism can be explained by the space-charge-limited conduction (SCLC) model. As a result, the possibility of applying the metal-insulator-silicon (MIS) structural WOx-based RRAM to Si-based 1D (diode)-1R (RRAM) or 1T (transistor)-1R (RRAM) structures is demonstrated.

  1. Magnetic resonance imaging can accurately assess the long-term progression of knee structural changes in experimental dog osteoarthritis.

    PubMed

    Boileau, C; Martel-Pelletier, J; Abram, F; Raynauld, J-P; Troncy, E; D'Anjou, M-A; Moreau, M; Pelletier, J-P

    2008-07-01

    Osteoarthritis (OA) structural changes take place over decades in humans. MRI can provide precise and reliable information on the joint structure and changes over time. In this study, we investigated the reliability of quantitative MRI in assessing knee OA structural changes in the experimental anterior cruciate ligament (ACL) dog model of OA. OA was surgically induced by transection of the ACL of the right knee in five dogs. High resolution three dimensional MRI using a 1.5 T magnet was performed at baseline, 4, 8 and 26 weeks post surgery. Cartilage volume/thickness, cartilage defects, trochlear osteophyte formation and subchondral bone lesion (hypersignal) were assessed on MRI images. Animals were killed 26 weeks post surgery and macroscopic evaluation was performed. There was a progressive and significant increase over time in the loss of knee cartilage volume, the cartilage defect and subchondral bone hypersignal. The trochlear osteophyte size also progressed over time. The greatest cartilage loss at 26 weeks was found on the tibial plateaus and in the medial compartment. There was a highly significant correlation between total knee cartilage volume loss or defect and subchondral bone hypersignal, and also a good correlation between the macroscopic and the MRI findings. This study demonstrated that MRI is a useful technology to provide a non-invasive and reliable assessment of the joint structural changes during the development of OA in the ACL dog model. The combination of this OA model with MRI evaluation provides a promising tool for the evaluation of new disease-modifying osteoarthritis drugs (DMOADs).

  2. Predicting wettability behavior of fluorosilica coated metal surface using optimum neural network

    NASA Astrophysics Data System (ADS)

    Taghipour-Gorjikolaie, Mehran; Valipour Motlagh, Naser

    2018-02-01

    The interactions among the variables that affect surface wettability make it very difficult to predict the contact angles and sliding angles of liquid drops. In this paper, to address this complexity, artificial neural networks were used to develop reliable models for predicting the angles of liquid drops. The experimental data were divided into training data and testing data. Using the training data, a feed-forward neural network structure, and particle swarm optimization to train the network-based models, optimum models were developed. The results showed that the regression indices of the proposed models for the contact angles and sliding angles are 0.9874 and 0.9920, respectively. These values are close to unity, indicating the reliable performance of the models. The results also indicate that the proposed models perform more reliably than multi-layer perceptron and radial basis function based models.

  3. Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.

  4. Probabilistic structural mechanics research for parallel processing computers

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.

    1991-01-01

    Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods was hampered by their computationally intense nature. Solution of PSM problems requires repeated analyses of structures that are often large, and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large scale PSM problems practical.

  5. Identification of Extracellular Segments by Mass Spectrometry Improves Topology Prediction of Transmembrane Proteins.

    PubMed

    Langó, Tamás; Róna, Gergely; Hunyadi-Gulyás, Éva; Turiák, Lilla; Varga, Julia; Dobson, László; Várady, György; Drahos, László; Vértessy, Beáta G; Medzihradszky, Katalin F; Szakács, Gergely; Tusnády, Gábor E

    2017-02-13

    Transmembrane proteins play a crucial role in signaling, ion transport, and nutrient uptake, as well as in maintaining the dynamic equilibrium between the internal and external environment of cells. Despite their important biological functions and abundance, less than 2% of all determined structures are transmembrane proteins. Given the persisting technical difficulties associated with high resolution structure determination of transmembrane proteins, additional methods, including computational and experimental techniques, remain vital in promoting our understanding of their topologies, 3D structures, functions and interactions. Here we report a method for the high-throughput determination of extracellular segments of transmembrane proteins based on the identification of surface-labeled and biotin-captured peptide fragments by LC/MS/MS. We show that reliable identification of extracellular protein segments increases the accuracy and reliability of existing topology prediction algorithms. Using the experimental topology data as constraints, our improved prediction tool provides accurate and reliable topology models for hundreds of human transmembrane proteins.

  6. Dynamic analysis of space structures including elastic, multibody, and control behavior

    NASA Technical Reports Server (NTRS)

    Pinson, Larry; Soosaar, Keto

    1989-01-01

    The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.

  7. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, John M.; Coffin, Peter; Robbins, Brian A.

    Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.

  8. Reliability Sensitivity Analysis and Design Optimization of Composite Structures Based on Response Surface Methodology

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2003-01-01

    This report discusses the development and application of two alternative strategies in the form of global and sequential local response surface (RS) techniques for the solution of reliability-based optimization (RBO) problems. The problem of a thin-walled composite circular cylinder under axial buckling instability is used as a demonstrative example. In this case, the global technique uses a single second-order RS model to estimate the axial buckling load over the entire feasible design space (FDS) whereas the local technique uses multiple first-order RS models with each applied to a small subregion of FDS. Alternative methods for the calculation of unknown coefficients in each RS model are explored prior to the solution of the optimization problem. The example RBO problem is formulated as a function of 23 uncorrelated random variables that include material properties, thickness and orientation angle of each ply, cylinder diameter and length, as well as the applied load. The mean values of the 8 ply thicknesses are treated as independent design variables. While the coefficients of variation of all random variables are held fixed, the standard deviations of ply thicknesses can vary during the optimization process as a result of changes in the design variables. The structural reliability analysis is based on the first-order reliability method with reliability index treated as the design constraint. In addition to the probabilistic sensitivity analysis of reliability index, the results of the RBO problem are presented for different combinations of cylinder length and diameter and laminate ply patterns. The two strategies are found to produce similar results in terms of accuracy with the sequential local RS technique having a considerably better computational efficiency.
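    The global strategy described above hinges on fitting a single second-order polynomial to sampled responses; a minimal sketch with scikit-learn is given below, where the four design variables, the sample size, and the synthetic buckling-load function are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical samples: design variables (e.g. ply thicknesses) -> buckling load
rng = np.random.default_rng(3)
X = rng.uniform(0.1, 0.3, size=(60, 4))
y = 50.0 * X.sum(axis=1) + 120.0 * X[:, 0] * X[:, 1] + rng.normal(scale=0.5, size=60)

# Global second-order response surface over the feasible design space
rs_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

x_new = np.array([[0.20, 0.22, 0.18, 0.25]])
print(f"predicted buckling load: {rs_model.predict(x_new)[0]:.2f}")
```

    The sequential local variant would instead refit a low-order surface of this kind on each small subregion visited during the optimization.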

  9. Computational methods for structural load and resistance modeling

    NASA Technical Reports Server (NTRS)

    Thacker, B. H.; Millwater, H. R.; Harren, S. V.

    1991-01-01

    An automated capability for computing structural reliability considering uncertainties in both load and resistance variables is presented. The computations are carried out using an automated Advanced Mean Value iteration algorithm (AMV+) with performance functions involving load and resistance variables obtained by both explicit and implicit methods. A complete description of the procedures used is given, as well as several illustrative examples verified by Monte Carlo analysis. In particular, the computational methods described in the paper are shown to be quite accurate and efficient for a materially nonlinear structure considering material damage as a function of several primitive random variables. The results show clearly the effectiveness of the algorithms for computing the reliability of large-scale structural systems with a maximum number of resolutions.

  10. Sample size requirements for the design of reliability studies: precision consideration.

    PubMed

    Shieh, Gwowen

    2014-09-01

    In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands the sample size methodology for the design of reliability studies that have not previously been discussed in the literature.
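    For reference, the intraclass correlation coefficient under the one-way random-effects model reduces to a ratio of ANOVA mean squares; the balanced-design sketch below uses simulated data with an assumed group structure, not values from the article.

```python
import numpy as np

def icc_oneway(scores: np.ndarray) -> float:
    """ICC(1) from a balanced (n_groups, k_members) matrix, one-way random effects."""
    n, k = scores.shape
    grand_mean = scores.mean()
    group_means = scores.mean(axis=1)
    ms_between = k * ((group_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - group_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Simulated data: 30 groups of 5 members sharing a common group effect
rng = np.random.default_rng(4)
group_effect = rng.normal(scale=1.0, size=(30, 1))
data = group_effect + rng.normal(scale=1.5, size=(30, 5))
print(f"ICC(1) = {icc_oneway(data):.3f}")
```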

  11. Testing comparison models of DASS-12 and its reliability among adolescents in Malaysia.

    PubMed

    Osman, Zubaidah Jamil; Mukhtar, Firdaus; Hashim, Hairul Anuar; Abdul Latiff, Latiffah; Mohd Sidik, Sherina; Awang, Hamidin; Ibrahim, Normala; Abdul Rahman, Hejar; Ismail, Siti Irma Fadhilah; Ibrahim, Faisal; Tajik, Esra; Othman, Norlijah

    2014-10-01

    The 21-item Depression, Anxiety and Stress Scale (DASS-21) is frequently used in non-clinical research to measure mental health factors among adults. However, previous studies have concluded that the 21 items are not stable for utilization among the adolescent population. Thus, the aims of this study are to examine the structure of the factors and to report on the reliability of the refined version of the DASS that consists of 12 items. A total of 2850 students (aged 13 to 17 years old) from the three major ethnic groups in Malaysia completed the DASS-21. The study was conducted at 10 randomly selected secondary schools in the northern state of Peninsular Malaysia. The study population comprised secondary school students (Forms 1, 2 and 4) from the selected schools. Based on the results of the EFA stage, 12 items were included in a final CFA to test the fit of the model. Using maximum likelihood procedures to estimate the model, the selected fit indices indicated a close model fit (χ(2)=132.94, df=57, p=.000; CFI=.96; RMR=.02; RMSEA=.04). Moreover, significant loadings of all the unstandardized regression weights implied an acceptable convergent validity. Besides the convergent validity of the items, discriminant validity of the subscales was also evident from the moderate latent factor inter-correlations, which ranged from .62 to .75. The subscale reliability was further estimated using Cronbach's alpha and adequate reliability of the subscales was obtained (Total=.76; Depression=.68; Anxiety=.53; Stress=.52). The new version of the 12-item DASS for adolescents in Malaysia (DASS-12) is reliable and has a stable factor structure, and thus it is a useful instrument for distinguishing between depression, anxiety and stress. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 2. Computer-program documentation. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    This report describes the structure and operation of prototype computer programs developed for a Monte Carlo simulation model, GENESIS, and for two analytical models, OPCON and OPPLAN. It includes input data requirements and sample test cases.

  13. Assessing the impact of land use change on hydrology by ensemble modeling (LUCHEM) III: Scenario analysis

    USGS Publications Warehouse

    Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.

    2009-01-01

    An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was obvious. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
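    The deterministic trimmed-mean combination mentioned above is straightforward to reproduce; the discharge values below are invented placeholders for the ten ensemble members, and the trimming proportion is an assumption for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical mean-annual-discharge predictions (mm) from 10 ensemble members
predictions = np.array([412.0, 388.0, 430.0, 405.0, 520.0,
                        398.0, 415.0, 301.0, 409.0, 421.0])

# Deterministic ensemble estimate: trim the extreme members before averaging
print(f"trimmed mean = {stats.trim_mean(predictions, proportiontocut=0.1):.1f} mm")
print(f"plain mean   = {predictions.mean():.1f} mm")
```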

  14. Reliability and Validity of the Telephone-Based eHealth Literacy Scale Among Older Adults: Cross-Sectional Survey.

    PubMed

    Stellefson, Michael; Paige, Samantha R; Tennant, Bethany; Alber, Julia M; Chaney, Beth H; Chaney, Don; Grossman, Suzanne

    2017-10-26

    Only a handful of studies have examined reliability and validity evidence of scores produced by the 8-item eHealth literacy Scale (eHEALS) among older adults. Older adults are generally more comfortable responding to survey items when asked by a real person rather than by completing self-administered paper-and-pencil or online questionnaires. However, no studies have explored the psychometrics of this scale when administered to older adults over the telephone. The objective of our study was to examine the reliability and internal structure of eHEALS data collected from older adults aged 50 years or older responding to items over the telephone. Respondents (N=283) completed eHEALS as part of a cross-sectional landline telephone survey. Exploratory structural equation modeling (E-SEM) analyses examined model fit of eHEALS scores with 1-, 2-, and 3-factor structures. Subsequent analyses based on the partial credit model explored the internal structure of eHEALS data. Compared with 1- and 2-factor models, the 3-factor eHEALS structure showed the best global E-SEM model fit indices (root mean square error of approximation=.07; comparative fit index=1.0; Tucker-Lewis index=1.0). Nonetheless, the 3 factors were highly correlated (r range .36 to .65). Item analyses revealed that eHEALS items 2 through 5 were overfit to a minor degree (mean square infit/outfit values <1.0; t statistics less than -2.0), but the internal structure of Likert scale response options functioned as expected. Overfitting eHEALS items (2-5) displayed a similar degree of information for respondents at similar points on the latent continuum. Test information curves suggested that eHEALS may capture more information about older adults at the higher end of the latent continuum (ie, those with high eHealth literacy) than at the lower end of the continuum (ie, those with low eHealth literacy). Item reliability (value=.92) and item separation (value=11.31) estimates indicated that eHEALS responses were reliable and stable. Results support administering eHEALS over the telephone when surveying older adults regarding their use of the Internet for health information. eHEALS scores best captured 3 factors (or subscales) to measure eHealth literacy in older adults; however, statistically significant correlations between these 3 factors suggest an overarching unidimensional structure with 3 underlying dimensions. As older adults continue to use the Internet more frequently to find and evaluate health information, it will be important to consider modifying the original eHEALS to adequately measure societal shifts in online health information seeking among aging populations. ©Michael Stellefson, Samantha R Paige, Bethany Tennant, Julia M Alber, Beth H Chaney, Don Chaney, Suzanne Grossman. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.10.2017.

  15. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft, and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads, and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens), and manufacturing effects (shot peening effects). Due to their inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is therefore proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
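    The crack-growth core of the metallic-material model can be sketched as a Paris-law integration from an initial flaw size to a critical size; the geometry factor, material constants, and stress range below are placeholder values, and the equivalent-initial-flaw-size and random-loading aspects of the study are not reproduced.

```python
import numpy as np

def paris_law_life(a0: float, ac: float, c: float, m: float,
                   delta_sigma: float, geometry: float = 1.12) -> float:
    """Cycles to grow a crack from a0 to ac under constant-amplitude loading.

    da/dN = c * (delta_K)^m with delta_K = geometry * delta_sigma * sqrt(pi * a);
    units must be consistent (here MPa and metres).
    """
    a = np.linspace(a0, ac, 10_000)
    delta_k = geometry * delta_sigma * np.sqrt(np.pi * a)
    dn_da = 1.0 / (c * delta_k**m)                 # cycles per unit crack extension
    return float(np.sum(0.5 * (dn_da[1:] + dn_da[:-1]) * np.diff(a)))  # trapezoid rule

# Hypothetical aluminium-like case: 0.5 mm initial flaw, 25 mm critical size
life = paris_law_life(a0=5e-4, ac=2.5e-2, c=3e-11, m=3.0, delta_sigma=100.0)
print(f"predicted life ~ {life:,.0f} cycles")
```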

  16. A Reliability Model for Ni-BaTiO3-Based (BME) Ceramic Capacitors

    NASA Technical Reports Server (NTRS)

    Liu, Donhang

    2014-01-01

    The evaluation of multilayer ceramic capacitors (MLCCs) with base-metal electrodes (BMEs) for potential NASA space project applications requires an in-depth understanding of their reliability. The reliability of an MLCC is defined as the ability of the dielectric material to retain its insulating properties under stated environmental and operational conditions for a specified period of time t. In this presentation, a general mathematical expression of a reliability model for a BME MLCC is developed and discussed. The reliability model consists of three parts: (1) a statistical distribution that describes the individual variation of properties in a test group of samples (Weibull, lognormal, normal, etc.); (2) an acceleration function that describes how a capacitor's reliability responds to external stresses such as applied voltage and temperature (all units in the test group should follow the same acceleration function if they share the same failure mode, independent of individual units); and (3) the effect and contribution of the structural and constructional characteristics of a multilayer capacitor device, such as the number of dielectric layers N, dielectric thickness d, average grain size r, and capacitor chip size S. In general, a two-parameter Weibull statistical distribution model is used to describe a BME capacitor's reliability as a function of time. The acceleration function that relates a capacitor's reliability to external stresses depends on the failure mode. Two failure modes have been identified in BME MLCCs: catastrophic and slow degradation. A catastrophic failure is characterized by a time-accelerating increase in leakage current that is mainly due to existing processing defects (voids, cracks, delamination, etc.), i.e., extrinsic defects. A slow degradation failure is characterized by a near-linear increase in leakage current with stress time; this is caused by the electromigration of oxygen vacancies (intrinsic defects). The two identified failure modes follow different acceleration functions: catastrophic failures follow the traditional power-law relationship with applied voltage, whereas slow degradation failures fit well to an exponential-law relationship with the applied electric field. Finally, the impact of capacitor structure on the reliability of BME capacitors is discussed with respect to the number of dielectric layers in an MLCC unit, the number of BaTiO3 grains per dielectric layer, and the chip size of the capacitor device.
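
    The skeleton of such a model can be written down compactly. The sketch below combines a two-parameter Weibull time-to-failure distribution with a power-law voltage term and an Arrhenius temperature term (a common Prokopowicz-Vaskas-style assumption); it covers only the statistical-distribution and acceleration-function parts of the model, and every numerical value is an illustrative assumption rather than a figure from the paper.

```python
import numpy as np

def weibull_reliability(t, eta, beta):
    """Two-parameter Weibull reliability R(t) = exp(-(t/eta)**beta)."""
    return np.exp(-(t / eta) ** beta)

def accelerated_scale(eta_ref, V, V_ref, n, T, T_ref, Ea=1.2, k=8.617e-5):
    """Characteristic life at use conditions, given the life eta_ref measured at a
    reference voltage/temperature: power-law voltage acceleration combined with an
    Arrhenius temperature term (an assumed Prokopowicz-style form, not from the paper)."""
    return eta_ref * (V_ref / V) ** n * np.exp(Ea / k * (1.0 / T - 1.0 / T_ref))

# Illustrative numbers only: a characteristic life of 1e4 h measured at 50 V / 358 K,
# extrapolated to assumed use conditions of 25 V / 318 K.
eta_use = accelerated_scale(1e4, V=25.0, V_ref=50.0, n=3.0, T=318.0, T_ref=358.0)
print("R(10 years) ~", weibull_reliability(t=87_600.0, eta=eta_use, beta=1.5))
```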

  17. Comprehensive Deployment Method for Technical Characteristics Base on Multi-failure Modes Correlation Analysis

    NASA Astrophysics Data System (ADS)

    Zheng, W.; Gao, J. M.; Wang, R. X.; Chen, K.; Jiang, Y.

    2017-12-01

    This paper puts forward a new method of technical characteristics deployment based on Reliability Function Deployment (RFD), motivated by an analysis of the advantages and shortcomings of related research on mechanical reliability design. The matrix decomposition structure of RFD was used to describe the correlations between failure mechanisms, soft failures, and hard failures. By considering the correlation of multiple failure modes, the reliability loss contributed by each failure mode to the whole part was defined, and a calculation and analysis model for this reliability loss was presented. According to the reliability loss, the reliability index value of the whole part was allocated to each failure mode. On the basis of this deployment of the reliability index value, the inverse reliability method was employed to obtain the values of the technical characteristics. The feasibility and validity of the proposed method were illustrated by a development case of a machining centre’s transmission system.

  18. Factor structure of the Childhood Autism Rating Scale as per DSM-5.

    PubMed

    Park, Eun-Young; Kim, Joungmin

    2016-02-01

    The DSM-5 recently proposed new diagnostic criteria for autism spectrum disorder (ASD). Although many new or updated tools have been developed since the DSM-IV was published in 1994, the Childhood Autism Rating Scale (CARS) has been used consistently in ASD diagnosis and research due to its technical adequacy, cost-effectiveness, and practicality. Additionally, the CARS items were not altered following the release of the revised DSM-IV because factor analysis showed the CARS factor structure to be consistent with the revised criteria. For that reason, confirmatory factor analysis was used in this study to identify the factor structure of the CARS. Participants (n = 150) were children with an ASD diagnosis or who met the criteria for broader autism or emotional/behavior disorder, with comorbid conditions such as attention-deficit hyperactivity disorder, bipolar disorder, and intellectual or developmental disabilities. Previous studies used one-, two-, and four-factor models, all of which we examined to identify the best-fitting model in confirmatory factor analysis. Appropriate comparative fit indices and root mean square errors were obtained for all models examined. The two-factor model, based on DSM-5 criteria, was the most valid and reliable. The inter-item consistency of the CARS was 0.926, demonstrating adequate reliability and thereby supporting the validity and reliability of the two-factor model of the CARS. Although the CARS was developed prior to the introduction of the DSM-5, its psychometric properties, conceptual relevance, and flexible administration procedures support its continued role as a screening device in the diagnostic decision-making process. © 2015 Japan Pediatric Society.

  19. What makes an accurate and reliable subject-specific finite element model? A case study of an elephant femur

    PubMed Central

    Panagiotopoulou, O.; Wilshin, S. D.; Rayfield, E. J.; Shefelbine, S. J.; Hutchinson, J. R.

    2012-01-01

    Finite element modelling is well entrenched in comparative vertebrate biomechanics as a tool to assess the mechanical design of skeletal structures and to better comprehend the complex interaction of their form–function relationships. But what makes a reliable subject-specific finite element model? To approach this question, we here present a set of convergence and sensitivity analyses and a validation study as an example, for finite element analysis (FEA) in general, of ways to ensure a reliable model. We detail how choices of element size, type and material properties in FEA influence the results of simulations. We also present an empirical model for estimating heterogeneous material properties throughout an elephant femur (but of broad applicability to FEA). We then use an ex vivo experimental validation test of a cadaveric femur to check our FEA results and find that the heterogeneous model matches the experimental results extremely well, and far better than the homogeneous model. We emphasize how considering heterogeneous material properties in FEA may be critical, so this should become standard practice in comparative FEA studies along with convergence analyses, consideration of element size, type and experimental validation. These steps may be required to obtain accurate models and derive reliable conclusions from them. PMID:21752810

  20. Technique for Early Reliability Prediction of Software Components Using Behaviour Models

    PubMed Central

    Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad

    2016-01-01

    Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
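
    A highly simplified illustration of the idea, using hypothetical components of a robotic wheelchair, is given below: component reliabilities sit on the nodes of a dependency graph, transition probabilities sit on the edges, and the expected system reliability is accumulated over execution paths. This recursive sketch is not the paper's stack-based CPDG algorithm, and all names and numbers are invented.

```python
# Minimal sketch of computing system reliability from a component dependency
# graph. Nodes carry component reliabilities, edges carry transition
# probabilities; loops are truncated at a fixed recursion depth.
node_reliability = {"UI": 0.999, "Planner": 0.995, "MotorCtrl": 0.99, "Exit": 1.0}
transitions = {                      # P(next component | current component)
    "UI":        [("Planner", 1.0)],
    "Planner":   [("MotorCtrl", 0.8), ("Exit", 0.2)],
    "MotorCtrl": [("Planner", 0.3), ("Exit", 0.7)],
    "Exit":      [],
}

def system_reliability(node, depth=0, max_depth=20):
    """Expected reliability of all executions starting at `node`."""
    if node == "Exit" or depth == max_depth:
        return node_reliability[node]
    r = 0.0
    for nxt, p in transitions[node]:
        r += p * system_reliability(nxt, depth + 1, max_depth)
    return node_reliability[node] * r

print("Estimated system reliability:", round(system_reliability("UI"), 4))
```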

  1. Structural Validation of the Holistic Wellness Assessment

    ERIC Educational Resources Information Center

    Brown, Charlene; Applegate, E. Brooks; Yildiz, Mustafa

    2015-01-01

    The Holistic Wellness Assessment (HWA) is a relatively new assessment instrument based on an emergent transdisciplinary model of wellness. This study validated the factor structure identified via exploratory factor analysis (EFA), assessed test-retest reliability, and investigated concurrent validity of the HWA in three separate samples. The…

  2. On the magnetic circular dichroism of benzene. A density-functional study

    NASA Astrophysics Data System (ADS)

    Kaminský, Jakub; Kříž, Jan; Bouř, Petr

    2017-04-01

    Spectroscopy of magnetic circular dichroism (MCD) provides enhanced information on molecular structure and a more reliable assignment of spectral bands than absorption alone. Theoretical modeling can significantly enhance the information obtained from experimental spectra. In the present study, the time dependent density functional theory is employed to model the lowest-energy benzene transitions, in particular to investigate the role of the Rydberg states and vibrational interference in spectral intensities. The effect of solvent is explored on model benzene-methane clusters. For the lowest-energy excitation, the vibrational sub-structure of absorption and MCD spectra is modeled within the harmonic approximation, providing a very good agreement with the experiment. The simulations demonstrate that the Rydberg states have a much stronger effect on the MCD intensities than on the absorption, and a very diffuse basis set must be used to obtain reliable results. The modeling also indicates that the Rydberg-like states and associated transitions may persist in solutions. Continuum-like solvent models are thus not suitable for their modeling; solvent-solute clusters appear to be more appropriate, providing they are large enough.

  3. Applications of a global nuclear-structure model to studies of the heaviest elements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moeller, P.; Nix, J.R.

    1993-10-01

    We present some new results on heavy-element nuclear-structure properties calculated on the basis of the finite-range droplet model and folded-Yukawa single-particle potential. Specifically, we discuss calculations of nuclear ground-state masses and microscopic corrections, α-decay properties, β-decay properties, fission potential-energy surfaces, and spontaneous-fission half-lives. These results, obtained in a global nuclear-structure approach, are particularly reliable for describing the stability properties of the heaviest elements.

  4. Nuclear surface diffuseness revealed in nucleon-nucleus diffraction

    NASA Astrophysics Data System (ADS)

    Hatakeyama, S.; Horiuchi, W.; Kohama, A.

    2018-05-01

    The nuclear surface provides useful information on the nuclear radius, nuclear structure, as well as properties of nuclear matter. We discuss the relationship between the nuclear surface diffuseness and the elastic scattering differential cross section at the first diffraction peak of high-energy nucleon-nucleus scattering as an efficient tool for extracting nuclear surface information from the limited experimental data available for short-lived unstable nuclei. The high-energy reaction is described by a reliable microscopic reaction theory, the Glauber model. Extending the idea of the black sphere model, we find a one-to-one correspondence between the nuclear bulk structure information and the proton-nucleus elastic scattering diffraction peak. This implies that we can extract both the nuclear radius and the diffuseness simultaneously, using the position and the magnitude of the first diffraction peak of the elastic scattering differential cross section. We confirm the reliability of this approach by using realistic density distributions obtained with a mean-field model.

  5. Estimation and Identifiability of Model Parameters in Human Nociceptive Processing Using Yes-No Detection Responses to Electrocutaneous Stimulation.

    PubMed

    Yang, Huan; Meijer, Hil G E; Buitenweg, Jan R; van Gils, Stephan A

    2016-01-01

    Healthy or pathological states of nociceptive subsystems determine different stimulus-response relations measured from quantitative sensory testing. In turn, stimulus-response measurements may be used to assess these states. In a recently developed computational model, six model parameters characterize activation of nerve endings and spinal neurons. However, both model nonlinearity and the limited information in yes-no detection responses to electrocutaneous stimuli make it challenging to estimate the model parameters. Here, we address the question of whether and how one can overcome these difficulties for reliable parameter estimation. First, we fit the computational model to experimental stimulus-response pairs by maximizing the likelihood. To evaluate the balance between model fit and complexity, i.e., the number of model parameters, we use the Bayesian Information Criterion. We find that the computational model is better than a conventional logistic model with regard to this balance. Second, our theoretical analysis suggests varying the pulse width among applied stimuli as a necessary condition to prevent structural non-identifiability. In addition, the numerically implemented profile likelihood approach reveals both structural and practical non-identifiability. Our model-based approach with integration of psychophysical measurements can be useful for a reliable assessment of states of the nociceptive system.
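
    The model-comparison step can be illustrated with the conventional logistic baseline alone: fit a logistic psychometric function to simulated yes/no responses by maximum likelihood and compute the BIC. The six-parameter neurophysiological model is not reproduced here; the stimulus values, parameter names, and data are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, amplitude, detected):
    """Bernoulli log-likelihood of yes/no responses under a logistic
    psychometric function P(detect) = 1 / (1 + exp(-(x - mu) / s))."""
    mu, log_s = params
    p = 1.0 / (1.0 + np.exp(-(amplitude - mu) / np.exp(log_s)))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(detected * np.log(p) + (1 - detected) * np.log(1 - p))

rng = np.random.default_rng(1)
amplitude = rng.uniform(0.1, 2.0, size=200)           # illustrative stimulus amplitudes (mA)
true_p = 1.0 / (1.0 + np.exp(-(amplitude - 1.0) / 0.2))
detected = rng.binomial(1, true_p)                     # simulated yes/no responses

fit = minimize(neg_log_likelihood, x0=[0.5, np.log(0.5)],
               args=(amplitude, detected), method="Nelder-Mead")
k, n = 2, len(amplitude)
bic = k * np.log(n) + 2.0 * fit.fun                    # BIC = k*ln(n) - 2*ln(L)
print("Logistic-model BIC:", round(bic, 1))
```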

  6. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process are described. Given assumed parameters of a particular project, the users of this program are able to generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful in comparison with actual timeline data, for validating the project input parameters, and for providing data for researchers in reliability prediction modeling.

  7. Finite element modelling of aluminum alloy 2024-T3 under transverse impact loading

    NASA Astrophysics Data System (ADS)

    Abdullah, Ahmad Sufian; Kuntjoro, Wahyu; Yamin, A. F. M.

    2017-12-01

    The fiber metal laminate GLARE is a new aerospace material with great potential to be widely used in future lightweight aircraft. It consists of aluminum alloy 2024-T3 and glass-fiber reinforced laminate. In order to produce a reliable finite element model of the impact response or crashworthiness of structures made of GLARE, one can initially model and validate finite element models of the impact response of its constituents separately. The objective of this study was to develop a reliable finite element model of aluminum alloy 2024-T3 under low-velocity transverse impact loading using the commercial software ABAQUS. The Johnson-Cook plasticity and damage models were used to predict the alloy's material properties and impact behavior. The results of the finite element analysis were compared with an experiment with similar material and impact conditions. The results showed good correlation in terms of impact forces, deformation, and failure progression, leading to the conclusion that the finite element model of 2024-T3 aluminum alloy under low-velocity transverse impact conditions using the Johnson-Cook plasticity and damage models is reliable.
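
    For reference, the Johnson-Cook flow stress used in such simulations has the familiar multiplicative form sketched below. The 2024-T3 constants are representative values often quoted in the literature, not necessarily those calibrated in this study, so treat them as placeholders.

```python
import numpy as np

def johnson_cook_flow_stress(eps_p, eps_dot, T,
                             A=369e6, B=684e6, n=0.73, C=0.0083, m=1.7,
                             eps_dot0=1.0, T_room=293.0, T_melt=775.0):
    """Johnson-Cook flow stress (Pa):
    sigma = (A + B*eps_p**n) * (1 + C*ln(eps_dot/eps_dot0)) * (1 - T_hom**m).
    Constants are representative literature-style values for 2024-T3, used here
    only as placeholders."""
    T_hom = np.clip((T - T_room) / (T_melt - T_room), 0.0, 1.0)
    rate_term = 1.0 + C * np.log(max(eps_dot / eps_dot0, 1e-12))
    return (A + B * eps_p ** n) * rate_term * (1.0 - T_hom ** m)

print("Flow stress at 5% plastic strain, 100/s, 293 K:",
      round(johnson_cook_flow_stress(0.05, 100.0, 293.0) / 1e6, 1), "MPa")
```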

  8. Estimating forest canopy fuel parameters using LIDAR data.

    Treesearch

    Hans-Erik Andersen; Robert J. McGaughey; Stephen E. Reutebuch

    2005-01-01

    Fire researchers and resource managers are dependent upon accurate, spatially-explicit forest structure information to support the application of forest fire behavior models. In particular, reliable estimates of several critical forest canopy structure metrics, including canopy bulk density, canopy height, canopy fuel weight, and canopy base height, are required to...

  9. Optimization of life support systems and their systems reliability

    NASA Technical Reports Server (NTRS)

    Fan, L. T.; Hwang, C. L.; Erickson, L. E.

    1971-01-01

    The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation, and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.

  10. Decision-theoretic methodology for reliability and risk allocation in nuclear power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1985-01-01

    This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria which are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, which is performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation and preference assessment are discussed.

  11. Psychometric Properties and Factor Structure of the German Version of the Clinician-Administered PTSD Scale for DSM-5.

    PubMed

    Müller-Engelmann, Meike; Schnyder, Ulrich; Dittmann, Clara; Priebe, Kathlen; Bohus, Martin; Thome, Janine; Fydrich, Thomas; Pfaltz, Monique C; Steil, Regina

    2018-05-01

    The Clinician-Administered PTSD Scale (CAPS) is a widely used diagnostic interview for posttraumatic stress disorder (PTSD). Following fundamental modifications in the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), the CAPS had to be revised. This study examined the psychometric properties (internal consistency, interrater reliability, convergent and discriminant validity, and structural validity) of the German version of the CAPS-5 in a trauma-exposed sample (n = 223 with PTSD; n = 51 without PTSD). The results demonstrated high internal consistency (αs = .65-.93) and high interrater reliability (ICCs = .81-.89). With regard to convergent and discriminant validity, we found high correlations between the CAPS severity score and both the Posttraumatic Diagnostic Scale sum score (r = .87) and the Beck Depression Inventory total score (r = .72). Regarding the underlying factor structure, the hybrid model demonstrated the best fit, followed by the anhedonia model. However, we encountered some nonpositive estimates for the correlations of the latent variables (factors) for both models. The model with the best fit without methodological problems was the externalizing behaviors model, but the results also supported the DSM-5 model. Overall, the results demonstrate that the German version of the CAPS-5 is a psychometrically sound measure.

  12. HDMR methods to assess reliability in slope stability analyses

    NASA Astrophysics Data System (ADS)

    Kozubal, Janusz; Pula, Wojciech; Vessia, Giovanna

    2014-05-01

    Stability analyses of complex rock-soil deposits must account for the complex structure of discontinuities within the rock mass and embedded soil layers. These materials are characterized by high variability in physical and mechanical properties. Thus, to calculate the slope safety factor in stability analyses, two issues must be taken into account: (1) the uncertainties related to the structural setting of the rock-slope mass, and (2) the variability in the mechanical properties of soils and rocks. High Dimensional Model Representation (HDMR) (Chowdhury et al. 2009; Chowdhury and Rao 2010) can be used to compute the reliability index of complex rock-soil slopes when numerous random variables with high coefficients of variation are considered. HDMR implements the inverse reliability analysis, meaning that the unknown design parameters are sought such that prescribed reliability index values are attained. This approach uses implicit response functions according to the Response Surface Method (RSM). The simple RSM can be efficiently applied when fewer than four random variables are considered; as the number of variables increases, the efficiency of the reliability index estimation decreases due to the large amount of calculation required. HDMR is therefore used to improve the computational accuracy. In this study, sliding mechanisms in the Polish Flysch Carpathian Mountains have been studied by means of HDMR. The southern part of Poland, where the Carpathian Mountains are located, is characterized by a rather complicated sedimentary pattern of flysch rocky-soil deposits that can be simplified into three main categories: (1) normal flysch, consisting of adjacent sandstone and shale beds of approximately equal thickness; (2) shale flysch, where shale beds are thicker than adjacent sandstone beds; and (3) sandstone flysch, where the opposite holds. Landslides occur in all flysch deposit types, so several configurations of possible unstable settings (within fractured rocky-soil masses) resulting in sliding mechanisms have been investigated in this study. The reliability index values obtained from the HDMR method have been compared with conventional approaches such as neural networks, and the efficiency of HDMR is shown in the case studied. References: Chowdhury R., Rao B.N. and Prasad A.M. 2009. High-dimensional model representation for structural reliability analysis. Commun. Numer. Meth. Engng, 25: 301-337. Chowdhury R. and Rao B. 2010. Probabilistic Stability Assessment of Slopes Using High Dimensional Model Representation. Computers and Geotechnics, 37: 876-884.
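
    As a toy counterpart to the reliability-index workflow described above, the sketch below estimates a reliability index for an infinite-slope factor of safety by plain Monte Carlo. It stands in for the HDMR/response-surface machinery, and the soil parameters, distributions, and slope geometry are invented for the example.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 200_000
c = rng.normal(25e3, 7.5e3, n)                 # cohesion [Pa], assumed CoV = 0.3
phi = np.radians(rng.normal(28.0, 4.2, n))     # friction angle [deg], assumed CoV = 0.15
gamma, H, slope = 19e3, 6.0, np.radians(35.0)  # unit weight [N/m^3], depth [m], slope angle

# Infinite-slope factor of safety (dry case), evaluated per Monte Carlo sample.
fs = (c + gamma * H * np.cos(slope) ** 2 * np.tan(phi)) \
     / (gamma * H * np.sin(slope) * np.cos(slope))
pf = np.mean(fs < 1.0)
print("P_f =", pf, " reliability index beta =", round(-norm.ppf(pf), 2))
```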

  13. The Role of Human Error in Design, Construction, and Reliability of Marine Structures.

    DTIC Science & Technology

    1994-10-01

    The 1979 Three Mile Island nuclear plant accident was largely a result of a failure to properly sort out and recognize critically important information ... determining the goals and objectives of the program and by evaluating and interpreting the results in terms of structural design, construction, and ... Checking Models in Structural Design ... Nuclear Power Plants

  14. Cross-cultural adaptation and psychometric properties of the Korean Scale for Internet Addiction (K-Scale) in Japanese high school students.

    PubMed

    Mak, Kwok-Kei; Nam, JeeEun Karin; Kim, Dongil; Aum, Narae; Choi, Jung-Seok; Cheng, Cecilia; Ko, Huei-Chen; Watanabe, Hiroko

    2017-03-01

    The Korean Scale for Internet Addiction (K-Scale) was developed in Korea for assessing addictive internet behaviors. This study aims to adapt the K-Scale and examine its psychometric properties in Japanese adolescents. In 2014, 589 high school students (36.0% boys; Grades 10-12) from Japan completed a survey including items from Japanese versions of the K-Scale and the Smartphone Scale for Smartphone Addiction (S-Scale). Model fit indices of the original four-factor structure, a three-factor structure obtained from exploratory factor analysis, and an improved two-factor structure of the K-Scale were computed using confirmatory factor analysis, with the internal reliability of the included items reported. The convergent validity of the K-Scale was tested against self-rated internet addiction and the S-Scale using multiple regression models. The results showed that a second-order, two-factor, 13-item structure was the most parsimonious model (NFI=0.919, NNFI=0.935, CFI=0.949, and RMSEA=0.05) with good internal reliability (Cronbach's alpha=0.87). The two factors revealed were "Disturbance of Adaptation and Life Orientation" and "Withdrawal and Tolerance". Moreover, the correlation between internet user classifications defined by the K-Scale and self-rating was significant. The K-Scale total score was significantly and positively associated with the S-Scale total score (adjusted R² = 0.440) and subscale scores (adjusted R² = 0.439). In conclusion, after modification, the K-Scale is a valid and reliable assessment scale of internet addiction for Japanese high school students. Copyright © 2017. Published by Elsevier B.V.

  15. Test Reliability at the Individual Level

    PubMed Central

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

    Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107
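
    A stripped-down version of the parallel-tests idea can be expressed in a few lines: split the items into two half-forms, score each half on every occasion, correlate the halves across a person's repeated measurements, and apply the Spearman-Brown correction. The code below illustrates the idea on simulated data; it is not the study's exact estimator or the PANAS data.

```python
import numpy as np

def person_reliability(daily_item_scores):
    """Individual-level reliability via a simple parallel-forms split:
    odd vs. even items scored per day, correlated across days, then
    Spearman-Brown corrected."""
    half_a = daily_item_scores[:, 0::2].sum(axis=1)   # odd items, one score per day
    half_b = daily_item_scores[:, 1::2].sum(axis=1)   # even items
    r = np.corrcoef(half_a, half_b)[0, 1]
    return 2 * r / (1 + r)

rng = np.random.default_rng(3)
days, items = 45, 10
trait = rng.normal(0, 1, size=(days, 1))                  # one person's daily affect level
scores = trait + rng.normal(0, 0.8, size=(days, items))   # item scores = daily state + noise
print("Estimated reliability for this person:", round(person_reliability(scores), 2))
```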

  16. Measuring leader perceptions of school readiness for reforms: use of an iterative model combining classical and Rasch methods.

    PubMed

    Chatterji, Madhabi

    2002-01-01

    This study examines validity of data generated by the School Readiness for Reforms: Leader Questionnaire (SRR-LQ) using an iterative procedure that combines classical and Rasch rating scale analysis. Following content-validation and pilot-testing, principal axis factor extraction and promax rotation of factors yielded a five factor structure consistent with the content-validated subscales of the original instrument. Factors were identified based on inspection of pattern and structure coefficients. The rotated factor pattern, inter-factor correlations, convergent validity coefficients, and Cronbach's alpha reliability estimates supported the hypothesized construct properties. To further examine unidimensionality and efficacy of the rating scale structures, item-level data from each factor-defined subscale were subjected to analysis with the Rasch rating scale model. Data-to-model fit statistics and separation reliability for items and persons met acceptable criteria. Rating scale results suggested consistency of expected and observed step difficulties in rating categories, and correspondence of step calibrations with increases in the underlying variables. The combined approach yielded more comprehensive diagnostic information on the quality of the five SRR-LQ subscales; further research is continuing.

  17. Confirmatory Factor Analysis of the Malay Version of the Confusion, Hubbub and Order Scale (CHAOS-6) among Myocardial Infarction Survivors in a Malaysian Cardiac Healthcare Facility.

    PubMed

    Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul

    2017-08-01

    The six item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were yielded to determine the Malay version's validity. Composite reliability was tested to determine the scale's reliability. All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single factor CHAOS-6 was 0.65, confirming that the scale is reliable for Malay speakers. The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6) , for the Malaysian population.

  18. Confirmatory Factor Analysis of the Malay Version of the Confusion, Hubbub and Order Scale (CHAOS-6) among Myocardial Infarction Survivors in a Malaysian Cardiac Healthcare Facility

    PubMed Central

    Ganasegeran, Kurubaran; Selvaraj, Kamaraj; Rashid, Abdul

    2017-01-01

    Background The six item Confusion, Hubbub and Order Scale (CHAOS-6) has been validated as a reliable tool to measure levels of household disorder. We aimed to investigate the goodness of fit and reliability of a new Malay version of the CHAOS-6. Methods The original English version of the CHAOS-6 underwent forward-backward translation into the Malay language. The finalised Malay version was administered to 105 myocardial infarction survivors in a Malaysian cardiac health facility. We performed confirmatory factor analyses (CFAs) using structural equation modelling. A path diagram and fit statistics were yielded to determine the Malay version’s validity. Composite reliability was tested to determine the scale’s reliability. Results All 105 myocardial infarction survivors participated in the study. The CFA yielded a six-item, one-factor model with excellent fit statistics. Composite reliability for the single factor CHAOS-6 was 0.65, confirming that the scale is reliable for Malay speakers. Conclusion The Malay version of the CHAOS-6 was reliable and showed the best fit statistics for our study sample. We thus offer a simple, brief, validated, reliable and novel instrument to measure chaos, the Skala Kecelaruan, Keriuhan & Tertib Terubahsuai (CHAOS-6), for the Malaysian population. PMID:28951688

  19. A scoring function based on solvation thermodynamics for protein structure prediction

    PubMed Central

    Du, Shiqiao; Harano, Yuichi; Kinoshita, Masahiro; Sakurai, Minoru

    2012-01-01

    We predict protein structure using our recently developed free energy function for describing protein stability, which is focused on solvation thermodynamics. The function is combined with the current most reliable sampling methods, i.e., fragment assembly (FA) and comparative modeling (CM). The prediction is tested using 11 small proteins for which high-resolution crystal structures are available. For 8 of these proteins, sequence similarities are found in the database, and the prediction is performed with CM. Fairly accurate models with average Cα root mean square deviation (RMSD) ∼ 2.0 Å are successfully obtained for all cases. For the rest of the target proteins, we perform the prediction following FA protocols. For 2 cases, we obtain predicted models with an RMSD ∼ 3.0 Å as the best-scored structures. For the other case, the RMSD remains larger than 7 Å. For all the 11 target proteins, our scoring function identifies the experimentally determined native structure as the best structure. Starting from the predicted structure, replica exchange molecular dynamics is performed to further refine the structures. However, we are unable to improve its RMSD toward the experimental structure. The exhaustive sampling by coarse-grained normal mode analysis around the native structures reveals that our function has a linear correlation with RMSDs < 3.0 Å. These results suggest that the function is quite reliable for the protein structure prediction while the sampling method remains one of the major limiting factors in it. The aspects through which the methodology could further be improved are discussed. PMID:27493529

  20. Factorial Validity of the ADHD Adult Symptom Rating Scale in a French Community Sample: Results From the ChiP-ARD Study.

    PubMed

    Morin, Alexandre J S; Tran, Antoine; Caci, Hervé

    2016-06-01

    Recent publications reported that a bifactor model better represented the underlying structure of ADHD than classical models, at least in youth. The Adult ADHD Symptoms Rating Scale (ASRS) has been translated into many languages, but a single study compared its structure in adults across Diagnostic and Statistical Manual of Mental Disorders (4th ed.; DSM-IV) and International Classification of Diseases (ICD-10) classifications. We investigated the factor structure, reliability, and measurement invariance of the ASRS among a community sample of 1,171 adults. Results support a bifactor model, including one general ADHD factor and three specific Inattention, Hyperactivity, and Impulsivity factors corresponding to ICD-10, albeit the Impulsivity specific factor was weakly defined. Results also support the complete measurement invariance of this model across gender and age groups, and that men have higher scores than women on the ADHD G-factor but lower scores on all three S-factors. Results suggest that a total ASRS-ADHD score is meaningful, reliable, and valid in adults. (J. of Att. Dis. 2016; 20(6) 530-541). © The Author(s) 2013.

  1. Use of a structured functional evaluation process for independent medical evaluations of claimants presenting with disabling mental illness: rationale and design for a multi-center reliability study.

    PubMed

    Bachmann, Monica; de Boer, Wout; Schandelmaier, Stefan; Leibold, Andrea; Marelli, Renato; Jeger, Joerg; Hoffmann-Richter, Ulrike; Mager, Ralph; Schaad, Heinz; Zumbrunn, Thomas; Vogel, Nicole; Bänziger, Oskar; Busse, Jason W; Fischer, Katrin; Kunz, Regina

    2016-07-29

    Work capacity evaluations by independent medical experts are widely used to inform insurers whether injured or ill workers are capable of engaging in competitive employment. In many countries, evaluation processes lack a clearly structured approach, standardized instruments, and an explicit focus on claimants' functional abilities. Evaluation of subjective complaints, such as mental illness, presents additional challenges in the determination of work capacity. We have therefore developed a process for the functional evaluation of claimants with mental disorders which complements the usual psychiatric evaluation. Here we report the design of a study to measure the reliability of our approach in determining work capacity among patients with mental illness applying for disability benefits. We will conduct a multi-center reliability study in which 20 psychiatrists trained in our functional evaluation process will assess 30 claimants presenting with mental illness for eligibility to receive disability benefits [Reliability of Functional Evaluation in Psychiatry, RELY-study]. The functional evaluation process entails a five-step structured interview and a reporting instrument (Instrument of Functional Assessment in Psychiatry [IFAP]) to document the severity of work-related functional limitations. We will videotape all evaluations, which will be viewed by three psychiatrists who will independently rate claimants' functional limitations. Our primary outcome measure is the evaluation of claimants' work capacity as a percentage (0 to 100%), and our secondary outcomes are the 12 mental functions and 13 functional capacities assessed by the IFAP instrument. Inter-rater reliability across the four psychiatric experts will be explored using multilevel models to estimate the intraclass correlation coefficient (ICC). Additional analyses include subgroups according to mental disorder, the typicality of claimants, and claimant-perceived fairness of the assessment process. We hypothesize that a structured functional approach will show moderate reliability (ICC ≥ 0.6) of the psychiatric evaluation of work capacity. Enrollment of actual claimants with mental disorders referred for evaluation by disability/accident insurers will increase the external validity of our findings. If we find moderate reliability, we will continue with a randomized trial to test the reliability of a structured functional approach versus evaluation-as-usual.
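
    The planned ICC analysis can be illustrated with a compact two-way random-effects calculation (Shrout & Fleiss ICC(2,1)). The sketch below is a simplified stand-in for the multilevel model named in the protocol, and the ratings matrix is entirely hypothetical.

```python
import numpy as np

def icc_two_way_random(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_subjects x n_raters) array; textbook ANOVA formula."""
    n, k = ratings.shape
    grand = ratings.mean()
    ms_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2) / (k - 1)
    resid = ratings - ratings.mean(axis=1, keepdims=True) \
            - ratings.mean(axis=0, keepdims=True) + grand
    ms_err = np.sum(resid ** 2) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err
                                 + k * (ms_cols - ms_err) / n)

# Hypothetical work-capacity percentages from 4 raters on 6 claimants.
ratings = np.array([[70, 65, 75, 70],
                    [40, 45, 35, 40],
                    [90, 85, 90, 80],
                    [55, 60, 50, 55],
                    [20, 25, 30, 20],
                    [60, 55, 65, 60]], dtype=float)
print("ICC(2,1) =", round(icc_two_way_random(ratings), 2))
```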

  2. A Reliable Data Transmission Model for IEEE 802.15.4e Enabled Wireless Sensor Network under WiFi Interference.

    PubMed

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-06-07

    The IEEE 802.15.4e standard proposes Medium Access Control (MAC) mechanisms to support collision-free wireless channel access for industrial, commercial, and healthcare applications. However, energy and bandwidth are unnecessarily wasted due to inefficient backoff management and collisions. In this paper, a new channel access mechanism is designed for buffer-constrained sensor devices to reduce the packet drop rate, energy consumption, and collisions. In order to avoid collisions due to the hidden terminal problem, a new frame structure is designed for data transmission. A new superframe structure is proposed to mitigate the problems due to WiFi and ZigBee interference. A modified superframe structure with a new retransmission opportunity for failed devices is proposed to reduce collisions and retransmission delay with high reliability. Performance evaluation and validation of our scheme indicate that the packet drop rate, throughput, reliability, energy consumption, and average delay of the nodes can be improved significantly.

  3. An overview of reliability assessment and control for design of civil engineering structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Field, R.V. Jr.; Grigoriadis, K.M.; Bergman, L.A.

    1998-06-01

    Random variations, whether they occur in the input signal or the system parameters, are phenomena that occur in nearly all engineering systems of interest. As a result, nondeterministic modeling techniques must somehow account for these variations to ensure validity of the solution. As might be expected, this is a difficult proposition and the focus of many current research efforts. Controlling seismically excited structures is one pertinent application of nondeterministic analysis and is the subject of the work presented herein. This overview paper is organized into two sections. First, techniques to assess system reliability, in a context familiar to civil engineers, are discussed. Second, and as a consequence of the first, active control methods that ensure good performance in this random environment are presented. It is the hope of the authors that these discussions will ignite further interest in the area of reliability assessment and design of controlled civil engineering structures.

  4. A Reliable Data Transmission Model for IEEE 802.15.4e Enabled Wireless Sensor Network under WiFi Interference

    PubMed Central

    Sahoo, Prasan Kumar; Pattanaik, Sudhir Ranjan; Wu, Shih-Lin

    2017-01-01

    The IEEE 802.15.4e standard proposes Medium Access Control (MAC) mechanisms to support collision-free wireless channel access for industrial, commercial, and healthcare applications. However, energy and bandwidth are unnecessarily wasted due to inefficient backoff management and collisions. In this paper, a new channel access mechanism is designed for buffer-constrained sensor devices to reduce the packet drop rate, energy consumption, and collisions. In order to avoid collisions due to the hidden terminal problem, a new frame structure is designed for data transmission. A new superframe structure is proposed to mitigate the problems due to WiFi and ZigBee interference. A modified superframe structure with a new retransmission opportunity for failed devices is proposed to reduce collisions and retransmission delay with high reliability. Performance evaluation and validation of our scheme indicate that the packet drop rate, throughput, reliability, energy consumption, and average delay of the nodes can be improved significantly. PMID:28590434

  5. Factor structure and psychometric properties of the english version of the trier inventory for chronic stress (TICS-E).

    PubMed

    Petrowski, Katja; Kliem, Sören; Sadler, Michael; Meuret, Alicia E; Ritz, Thomas; Brähler, Elmar

    2018-02-06

    Demands placed on individuals in occupational and social settings, as well as imbalances in personal traits and resources, can lead to chronic stress. The Trier Inventory for Chronic Stress (TICS) measures chronic stress while incorporating domain-specific aspects, and has been found to be a highly reliable and valid research tool. The aims of the present study were to confirm the factorial structure of the German TICS in an English translation of the instrument (TICS-E) and to report its psychometric properties. A random route sample of healthy participants (N = 483) aged 18-30 years completed the TICS-E. Robust maximum likelihood estimation with a mean-adjusted chi-square test statistic was applied due to the sample's significant deviation from the multivariate normal distribution. Goodness of fit, absolute model fit, and relative model fit were assessed by means of the root mean square error of approximation (RMSEA), the Comparative Fit Index (CFI), and the Tucker-Lewis Index (TLI). Reliability estimates (Cronbach's α and adjusted split-half reliability) ranged from .84 to .92. Item-scale correlations ranged from .50 to .85. Measures of fit showed values of .052 for the RMSEA (CI = .050-.054) and .067 for the SRMR for absolute model fit, and values of .846 (TLI) and .855 (CFI) for relative model fit. Factor loadings ranged from .55 to .91. The psychometric properties and factor structure of the TICS-E are comparable to the German version of the TICS. The instrument therefore meets quality standards for an adequate measurement of chronic stress.

  6. The Brazilian version of the effort-reward imbalance questionnaire to assess job stress.

    PubMed

    Chor, Dóra; Werneck, Guilherme Loureiro; Faerstein, Eduardo; Alves, Márcia Guimarães de Mello; Rotenberg, Lúcia

    2008-01-01

    The effort-reward imbalance (ERI) model has been used to assess the health impact of job stress. We aimed to describe the cross-cultural adaptation of the ERI questionnaire into Portuguese and some of its psychometric properties, in particular internal consistency, test-retest reliability, and factorial structure. We developed a Brazilian version of the ERI using a back-translation method and tested its reliability. The test-retest reliability study was conducted with 111 health workers and university staff. The current analyses are based on 89 participants, after exclusion of those with missing data. Reproducibility (intraclass correlation coefficients) for the "effort", "reward", and "overcommitment" dimensions of the scale was estimated at 0.76, 0.86, and 0.78, respectively. Internal consistency (Cronbach's alpha) estimates for these same dimensions were 0.68, 0.78, and 0.78, respectively. The exploratory factorial structure was fairly consistent with the model's theoretical components. We conclude that the results of this study represent the first evidence in favor of the application of the Brazilian Portuguese version of the ERI scale in health research in populations with similar socioeconomic characteristics.

  7. The Psychometric Properties of the Center for Epidemiologic Studies Depression Scale in Chinese Primary Care Patients: Factor Structure, Construct Validity, Reliability, Sensitivity and Responsiveness.

    PubMed

    Chin, Weng Yee; Choi, Edmond P H; Chan, Kit T Y; Wong, Carlos K H

    2015-01-01

    The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument to measure depressive symptomatology. Despite this, the evidence for its psychometric properties remains poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining factor structure, construct validity, reliability, sensitivity and responsiveness. The psychometric properties were assessed amongst a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original CES-D four-structure model had adequate fit, however the data was better fit into a bi-factor model. For the internal construct validity, corrected item-total correlations were 0.4 for most items. The convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS). The CES-D had a strong correlation with the PHQ-9 (coefficient: 0.78) and SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed by McDonald's omega hierarchical (ωH). The ωH value for the general depression factor was 0.855. The ωH values for "somatic", "depressed affect", "positive affect" and "interpersonal problems" were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups, with the AUC >0.7. Internal responsiveness of the CES-D to detect positive and negative changes was satisfactory (with p value <0.01 and all effect size statistics >0.2). The CES-D was externally responsive, with the AUC>0.7. The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. In its original four-factor and bi-factor structure, the CES-D is supported for cross-cultural comparisons of depression in multi-center studies.
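
    For readers unfamiliar with omega-hierarchical, the quantity reported above can be computed directly from a bifactor solution's loadings. The sketch below shows the textbook formula with invented loadings for six hypothetical items; these are not the CES-D estimates.

```python
import numpy as np

def omega_hierarchical(general_loadings, group_loadings, uniquenesses):
    """McDonald's omega-hierarchical from a bifactor solution:
    omega_H = (sum of general-factor loadings)^2 / total score variance, where
    total variance = (sum g)^2 + sum_f (sum of factor f's loadings)^2 + sum of uniquenesses."""
    num = np.sum(general_loadings) ** 2
    group_var = sum(np.sum(lam) ** 2 for lam in group_loadings)
    return num / (num + group_var + np.sum(uniquenesses))

# Invented standardized loadings for six hypothetical items on a general factor
# and three group factors.
g = np.array([0.6, 0.7, 0.5, 0.65, 0.55, 0.6])
groups = [np.array([0.3, 0.35, 0.0, 0.0, 0.0, 0.0]),    # "somatic" items
          np.array([0.0, 0.0, 0.4, 0.45, 0.0, 0.0]),    # "depressed affect" items
          np.array([0.0, 0.0, 0.0, 0.0, 0.3, 0.25])]    # "positive affect" items
u = 1.0 - g**2 - sum(lam**2 for lam in groups)          # item uniquenesses
print("omega_H =", round(omega_hierarchical(g, groups, u), 2))
```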

  8. Protocol for Reliability Assessment of Structural Health Monitoring Systems Incorporating Model-assisted Probability of Detection (MAPOD) Approach

    DTIC Science & Technology

    2011-09-01

    a quality evaluation with limited data, a model-based assessment must be ... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range

  9. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures.

    PubMed

    Zhan, Yijian; Meschke, Günther

    2017-07-08

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense.

  10. Adaptive Crack Modeling with Interface Solid Elements for Plain and Fiber Reinforced Concrete Structures

    PubMed Central

    Zhan, Yijian

    2017-01-01

    The effective analysis of the nonlinear behavior of cement-based engineering structures not only demands physically-reliable models, but also computationally-efficient algorithms. Based on a continuum interface element formulation that is suitable to capture complex cracking phenomena in concrete materials and structures, an adaptive mesh processing technique is proposed for computational simulations of plain and fiber-reinforced concrete structures to progressively disintegrate the initial finite element mesh and to add degenerated solid elements into the interfacial gaps. In comparison with the implementation where the entire mesh is processed prior to the computation, the proposed adaptive cracking model allows simulating the failure behavior of plain and fiber-reinforced concrete structures with remarkably reduced computational expense. PMID:28773130

  11. Factor Structure, Reliability and Criterion Validity of the Autism-Spectrum Quotient (AQ): A Study in Dutch Population and Patient Groups

    PubMed Central

    Bartels, Meike; Cath, Danielle C.; Boomsma, Dorret I.

    2008-01-01

    The factor structure of the Dutch translation of the Autism-Spectrum Quotient (AQ; a continuous, quantitative measure of autistic traits) was evaluated with confirmatory factor analyses in a large general population and student sample. The criterion validity of the AQ was examined in three matched patient groups (autism spectrum conditions (ASC), social anxiety disorder, and obsessive–compulsive disorder). A two factor model, consisting of a “Social interaction” factor and “Attention to detail” factor could be identified. The internal consistency and test–retest reliability of the AQ were satisfactory. High total AQ and factor scores were specific to ASC patients. Men scored higher than women and science students higher than non-science students. The Dutch translation of the AQ is a reliable instrument to assess autism spectrum conditions. PMID:18302013

  12. Young Children's Psychological Selves: Convergence with Maternal Reports of Child Personality

    ERIC Educational Resources Information Center

    Brown, Geoffrey L.; Mangelsdorf, Sarah C.; Agathen, Jean M.; Ho, Moon-Ho

    2008-01-01

    The present research examined five-year-old children's psychological self-concepts. Non-linear factor analysis was used to model the latent structure of the children's self-view questionnaire (CSVQ; Eder, 1990), a measure of children's self-concepts. The coherence and reliability of the emerging factor structure indicated that young children are…

  13. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Micro-devices are now commonly applied, especially in the automotive industry, since they combine the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the development of virtual prototyping tools that can be applied in design optimization while accounting for technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices based on the theory of reliability-based robust design optimization, which takes into consideration the performance of a micro-device and its reliability as assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between the electrodes. The optimization was preceded by a sensitivity analysis to establish the design and uncertainty domains. Genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced via a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  14. Self-esteem among nursing assistants: reliability and validity of the Rosenberg Self-Esteem Scale.

    PubMed

    McMullen, Tara; Resnick, Barbara

    2013-01-01

    To establish the reliability and validity of the Rosenberg Self-Esteem Scale (RSES) when used with nursing assistants (NAs). The RSES was tested using baseline data from a randomized controlled trial of the Res-Care Intervention. Female NAs were recruited from nursing homes (n = 508). Validity testing for the positive and negative subscales of the RSES was based on confirmatory factor analysis (CFA) using structural equation modeling and Rasch analysis. Estimates of reliability were based on Rasch analysis and the person separation index. Evidence supports the reliability and validity of the RSES in NAs, although we recommend minor revisions to the measure for subsequent use. Establishing reliable and valid measures of self-esteem in NAs will facilitate testing of interventions to strengthen workplace self-esteem, job satisfaction, and retention.

  15. Structural design methodologies for ceramic-based material systems

    NASA Technical Reports Server (NTRS)

    Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.

    1991-01-01

    One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.

  16. A reliability study on brain activation during active and passive arm movements supported by an MRI-compatible robot.

    PubMed

    Estévez, Natalia; Yu, Ningbo; Brügger, Mike; Villiger, Michael; Hepp-Reymond, Marie-Claude; Riener, Robert; Kollias, Spyros

    2014-11-01

    In neurorehabilitation, longitudinal assessment of arm movement related brain function in patients with motor disability is challenging due to variability in task performance. MRI-compatible robots monitor and control task performance, yielding more reliable evaluation of brain function over time. The main goals of the present study were first to define the brain network activated while performing active and passive elbow movements with an MRI-compatible arm robot (MaRIA) in healthy subjects, and second to test the reproducibility of this activation over time. For the fMRI analysis two models were compared. In model 1 movement onset and duration were included, whereas in model 2 force and range of motion were added to the analysis. Reliability of brain activation was tested with several statistical approaches applied on individual and group activation maps and on summary statistics. The activated network included mainly the primary motor cortex, primary and secondary somatosensory cortex, superior and inferior parietal cortex, medial and lateral premotor regions, and subcortical structures. Reliability analyses revealed robust activation for active movements with both fMRI models and all the statistical methods used. Imposed passive movements also elicited mainly robust brain activation for individual and group activation maps, and reliability was improved by including additional force and range of motion using model 2. These findings demonstrate that the use of robotic devices, such as MaRIA, can be useful to reliably assess arm movement related brain activation in longitudinal studies and may contribute in studies evaluating therapies and brain plasticity following injury in the nervous system.

  17. The DSM-5 Trait Measure in a Psychiatric Sample of Late Adolescents and Emerging Adults: Structure, Reliability, and Validity.

    PubMed

    De Caluwé, Elien; Verbeke, Lize; van Aken, Marcel; van der Heijden, Paul T; De Clercq, Barbara

    2018-02-22

    The inclusion of a dimensional trait model of personality pathology in DSM-5 creates new opportunities for research on developmental antecedents of personality pathology. The traits of this model can be measured with the Personality Inventory for DSM-5 (PID-5), initially developed for adults, but also demonstrating validity in adolescents. The present study adds to the growing body of literature on the psychometrics of the PID-5, by examining its structure, validity, and reliability in 187 psychiatric-referred late adolescents and emerging adults. PID-5, Big Five Inventory, and Kidscreen self-reports were provided, and 88 non-clinical matched controls completed the PID-5. Results confirm the PID-5's five-factor structure, indicate adequate psychometric properties, and underscore the construct and criterion validity, showing meaningful associations with adaptive traits and quality of life. Results are discussed in terms of the PID-5's applicability in vulnerable populations who are going through important developmental transition phases, such as the step towards early adulthood.

  18. An Improved Gaussian Mixture Model for Damage Propagation Monitoring of an Aircraft Wing Spar under Changing Structural Boundary Conditions.

    PubMed

    Qiu, Lei; Yuan, Shenfang; Mei, Hanfei; Fang, Fang

    2016-02-26

    Structural Health Monitoring (SHM) technology is considered to be a key technology to reduce the maintenance cost and meanwhile ensure the operational safety of aircraft structures. It has gradually developed from theoretic and fundamental research to real-world engineering applications in recent decades. The problem of reliable damage monitoring under time-varying conditions is a main issue for the aerospace engineering applications of SHM technology. Among the existing SHM methods, Guided Wave (GW) and piezoelectric sensor-based SHM technique is a promising method due to its high damage sensitivity and long monitoring range. Nevertheless the reliability problem should be addressed. Several methods including environmental parameter compensation, baseline signal dependency reduction and data normalization, have been well studied but limitations remain. This paper proposes a damage propagation monitoring method based on an improved Gaussian Mixture Model (GMM). It can be used on-line without any structural mechanical model and a priori knowledge of damage and time-varying conditions. With this method, a baseline GMM is constructed first based on the GW features obtained under time-varying conditions when the structure under monitoring is in the healthy state. When a new GW feature is obtained during the on-line damage monitoring process, the GMM can be updated by an adaptive migration mechanism including dynamic learning and Gaussian components split-merge. The mixture probability distribution structure of the GMM and the number of Gaussian components can be optimized adaptively. Then an on-line GMM can be obtained. Finally, a best match based Kullback-Leibler (KL) divergence is studied to measure the migration degree between the baseline GMM and the on-line GMM to reveal the weak cumulative changes of the damage propagation mixed in the time-varying influence. A wing spar of an aircraft is used to validate the proposed method. The results indicate that the crack propagation under changing structural boundary conditions can be monitored reliably. The method is not limited by the properties of the structure, and thus it is feasible to be applied to composite structure.
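    To illustrate the kind of migration measure described above, the sketch below computes a best-match Kullback-Leibler index between a baseline GMM and an on-line GMM using the closed-form KL divergence between Gaussian components. It is a hedged, simplified illustration, not the authors' algorithm: the component parameters and the pairing/weighting scheme are assumptions.

```python
# Hedged sketch (not the paper's implementation): a best-match KL-based
# migration index between a baseline GMM and an on-line GMM, using the
# closed-form KL divergence between multivariate Gaussian components.
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ) in closed form."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def best_match_migration(baseline, online):
    """baseline/online: lists of (weight, mean, covariance) tuples."""
    index = 0.0
    for w, mu, cov in baseline:
        # pair each baseline component with its closest on-line component
        index += w * min(gaussian_kl(mu, cov, mu_o, cov_o)
                         for _, mu_o, cov_o in online)
    return index

# toy 2-D GW-feature example: the on-line GMM has drifted slightly
base = [(0.5, np.zeros(2), np.eye(2)), (0.5, np.array([3.0, 0.0]), np.eye(2))]
onln = [(0.5, np.array([0.3, 0.1]), np.eye(2)), (0.5, np.array([3.4, 0.2]), np.eye(2))]
print(best_match_migration(base, onln))   # grows as damage-induced drift grows
```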

  19. Goal Structuring Notation in a Radiation Hardening Assurance Case for COTS-Based Spacecraft

    NASA Technical Reports Server (NTRS)

    Witulski, A.; Austin, R.; Evans, J.; Mahadevan, N.; Karsai, G.; Sierawski, B.; LaBel, K.; Reed, R.

    2016-01-01

    The attached presentation summarizes how mission assurance is supported by model-based representations of spacecraft systems that can define sub-system functionality, interfacing, and reliability parameters, and it details a new paradigm for assurance: a model-centric rather than document-centric process.

  20. A flexible count data regression model for risk analysis.

    PubMed

    Guikema, Seth D; Coffelt, Jeremy P

    2008-02-01

    In many cases, risk and reliability analyses involve estimating the probabilities of discrete events such as hardware failures and occurrences of disease or death. There is often additional information in the form of explanatory variables that can be used to help estimate the likelihood of different numbers of events in the future through the use of an appropriate regression model, such as a generalized linear model. However, existing generalized linear models (GLM) are limited in their ability to handle the types of variance structures often encountered in using count data in risk and reliability analysis. In particular, standard models cannot handle both underdispersed data (variance less than the mean) and overdispersed data (variance greater than the mean) in a single coherent modeling framework. This article presents a new GLM based on a reformulation of the Conway-Maxwell Poisson (COM) distribution that is useful for both underdispersed and overdispersed count data and demonstrates this model by applying it to the assessment of electric power system reliability. The results show that the proposed COM GLM fits overdispersed data sets as well as the commonly used existing models while outperforming those models for underdispersed data sets.
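    As a quick illustration of why the COM distribution handles both dispersion regimes, the sketch below evaluates its probability mass function in log space and shows how the variance-to-mean ratio moves from above one (nu < 1) to below one (nu > 1). The parameter values and truncation level are illustrative choices, not taken from the article.

```python
# Illustrative sketch (parameters and truncation are illustrative, not the
# article's model): the Conway-Maxwell-Poisson pmf, whose second parameter nu
# covers overdispersion (nu < 1) and underdispersion (nu > 1); nu = 1 is Poisson.
import math

def com_poisson_pmf(y, lam, nu, truncation=200):
    # log-terms j*log(lam) - nu*log(j!) and a log-sum-exp normalizing constant
    logs = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(truncation)]
    m = max(logs)
    log_z = m + math.log(sum(math.exp(l - m) for l in logs))
    return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

def mean_and_variance(lam, nu, truncation=200):
    pmf = [com_poisson_pmf(y, lam, nu, truncation) for y in range(truncation)]
    mean = sum(y * p for y, p in enumerate(pmf))
    var = sum((y - mean) ** 2 * p for y, p in enumerate(pmf))
    return mean, var

for nu in (0.5, 1.0, 2.0):
    mean, var = mean_and_variance(3.0, nu)
    print(f"nu={nu}: mean={mean:.2f}, variance={var:.2f}, var/mean={var/mean:.2f}")
```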

  1. Stirling engine - Approach for long-term durability assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.; Bartolotta, Paul A.; Halford, Gary R.; Freed, Alan D.

    1992-01-01

    The approach employed by NASA Lewis for the long-term durability assessment of the Stirling engine hot-section components is summarized. The approach consists of: preliminary structural assessment; development of a viscoplastic constitutive model to accurately determine material behavior under high-temperature thermomechanical loads; an experimental program to characterize material constants for the viscoplastic constitutive model; finite-element thermal analysis and structural analysis using a viscoplastic constitutive model to obtain stress/strain/temperature at the critical location of the hot-section components for life assessment; and development of a life prediction model applicable for long-term durability assessment at high temperatures. The approach should aid in the provision of long-term structural durability and reliability of Stirling engines.

  2. Ab Initio structure prediction for Escherichia coli: towards genome-wide protein structure modeling and fold assignment

    PubMed Central

    Xu, Dong; Zhang, Yang

    2013-01-01

    Genome-wide protein structure prediction and structure-based function annotation have been a long-term goal in molecular biology but not yet become possible due to difficulties in modeling distant-homology targets. We developed a hybrid pipeline combining ab initio folding and template-based modeling for genome-wide structure prediction applied to the Escherichia coli genome. The pipeline was tested on 43 known sequences, where QUARK-based ab initio folding simulation generated models with TM-score 17% higher than that by traditional comparative modeling methods. For 495 unknown hard sequences, 72 are predicted to have a correct fold (TM-score > 0.5) and 321 have a substantial portion of structure correctly modeled (TM-score > 0.35). 317 sequences can be reliably assigned to a SCOP fold family based on structural analogy to existing proteins in PDB. The presented results, as a case study of E. coli, represent promising progress towards genome-wide structure modeling and fold family assignment using state-of-the-art ab initio folding algorithms. PMID:23719418

  3. Assessing a Norwegian translation of the Organizational Climate Measure.

    PubMed

    Bernstrøm, Vilde Hoff; Lone, Jon Anders; Bjørkli, Cato A; Ulleberg, Pål; Hoff, Thomas

    2013-04-01

    This study investigated the Norwegian translation of the Organizational Climate Measure developed by Patterson and colleagues. The Organizational Climate Measure is a global measure of organizational climate based on Quinn and Rohrbaugh's competing values model. The survey was administered to a Norwegian branch of an international service sector company (N = 555). The results revealed satisfactory internal reliability and interrater agreement for the 17 scales, and confirmatory factor analysis supported the original factor structure. The findings gave preliminary support for the Organizational Climate Measure as a reliable measure with a stable factor structure, and indicated that it is potentially useful in the Norwegian context.

  4. Optimizing the Reliability and Performance of Service Composition Applications with Fault Tolerance in Wireless Sensor Networks

    PubMed Central

    Wu, Zhao; Xiong, Naixue; Huang, Yannong; Xu, Degang; Hu, Chunyang

    2015-01-01

    The services composition technology provides flexible methods for building service composition applications (SCAs) in wireless sensor networks (WSNs). The high reliability and high performance of SCAs help services composition technology promote the practical application of WSNs. The optimization methods for reliability and performance used for traditional software systems are mostly based on the instantiations of software components, which are inapplicable and inefficient in the ever-changing SCAs in WSNs. In this paper, we consider the SCAs with fault tolerance in WSNs. Based on a Universal Generating Function (UGF) we propose a reliability and performance model of SCAs in WSNs, which generalizes a redundancy optimization problem to a multi-state system. Based on this model, an efficient optimization algorithm for reliability and performance of SCAs in WSNs is developed based on a Genetic Algorithm (GA) to find the optimal structure of SCAs with fault-tolerance in WSNs. In order to examine the feasibility of our algorithm, we have evaluated the performance. Furthermore, the interrelationships between the reliability, performance and cost are investigated. In addition, a distinct approach to determine the most suitable parameters in the suggested algorithm is proposed. PMID:26561818
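    A minimal sketch of the UGF idea referenced above is given below: a multi-state element is represented as a mapping from performance levels to probabilities, and composition operators combine elements in parallel (performances add) and in series (the bottleneck governs). The component data and demand level are hypothetical, and the GA optimization layer is omitted.

```python
# Hedged sketch (assumed component data): a Universal Generating Function (UGF)
# represented as {performance_level: probability}, composed for parallel and
# series structures of a multi-state service composition.
from collections import defaultdict
from itertools import product

def compose(u1, u2, op):
    out = defaultdict(float)
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        out[op(g1, g2)] += p1 * p2
    return dict(out)

def parallel(u1, u2):          # redundant elements: performances add
    return compose(u1, u2, lambda a, b: a + b)

def series(u1, u2):            # chained elements: the bottleneck governs
    return compose(u1, u2, min)

# two redundant sensor services in parallel, feeding one aggregation service
sensor = {0: 0.05, 50: 0.15, 100: 0.80}     # throughput levels vs. probability
aggreg = {0: 0.02, 120: 0.98}
system = series(parallel(sensor, sensor), aggreg)

demand = 100
reliability = sum(p for g, p in system.items() if g >= demand)
expected_performance = sum(g * p for g, p in system.items())
print(system, reliability, expected_performance)
```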

  5. The Personality Inventory for DSM-5 Short Form (PID-5-SF): psychometric properties and association with big five traits and pathological beliefs in a Norwegian population.

    PubMed

    Thimm, Jens C; Jordan, Stian; Bach, Bo

    2016-12-07

    With the publication of the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), an alternative model for personality disorders based on personality dysfunction and pathological personality traits was introduced. The Personality Inventory for DSM-5 (PID-5) is a 220-item self-report inventory designed to assess the personality traits of this model. Recently, a short 100-item version of the PID-5 (PID-5-SF) has been developed. The aim of this study was to investigate the score reliability and structure of the Norwegian PID-5-SF. Further, criterion validity with the five factor model of personality (FFM) and pathological personality beliefs was examined. A derivation sample of university students (N = 503) completed the PID-5, the Big Five Inventory (BFI), and the Personality Beliefs Questionnaire - Short Form (PBQ-SF), whereas a replication sample of 127 students completed the PID-5-SF along with the aforementioned measures. The short PID-5 showed overall good score reliability and structural validity. The associations with FFM traits and pathological personality beliefs were conceptually coherent and similar for the two forms of the PID-5. The results suggest that the Norwegian PID-5 short form is a reliable and efficient measure of the trait criterion of the alternative model for personality disorders in DSM-5.

  6. Reliability Quantification of Advanced Stirling Convertor (ASC) Components

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Zampino, Edward

    2010-01-01

    The Advanced Stirling Convertor (ASC) is intended to provide power for an unmanned planetary spacecraft and has an operational life requirement of 17 years. Over this 17-year mission, the ASC must provide power with the desired performance and efficiency and require no corrective maintenance. Reliability demonstration testing for the ASC was found to be very limited due to schedule and resource constraints. Reliability demonstration must involve the application of analysis, system and component level testing, and simulation models, taken collectively. Therefore, computer simulation with limited test data verification is a viable approach to assess the reliability of ASC components. This approach is based on physics-of-failure mechanisms and involves the relationships among the design variables based on physics, mechanics, material behavior models, and the interaction of different components and their respective disciplines such as structures, materials, fluids, thermal, mechanical, and electrical. In addition, these models are based on the available test data, which can be updated, and the analysis refined as more data and information become available. The failure mechanisms and causes of failure are included in the analysis, especially in light of new information, in order to develop guidelines to improve design reliability and better operating controls to reduce the probability of failure. Quantified reliability assessment based on the fundamental physical behavior of components and their relationships with other components has proven to be superior to conventional reliability approaches that rely on failure rates derived from similar equipment or on expert judgment alone.

  7. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  8. Towards An Engineering Discipline of Computational Security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mili, Ali; Sheldon, Frederick T; Jilani, Lamia Labed

    2007-01-01

    George Boole ushered in the era of modern logic by arguing that logical reasoning does not fall in the realm of philosophy, as it was considered up to his time, but in the realm of mathematics. As such, logical propositions and logical arguments are modeled using algebraic structures. Likewise, we submit that security attributes must be modeled as formal mathematical propositions that are subject to mathematical analysis. In this paper, we approach this problem by attempting to model security attributes in a refinement-like framework that has traditionally been used to represent reliability and safety claims. Keywords: Computable security attributes, survivability, integrity, dependability, reliability, safety, security, verification, testing, fault tolerance.

  9. Psychometric Properties of the “Sport Motivation Scale (SMS)” Adapted to Physical Education

    PubMed Central

    Granero-Gallegos, Antonio; Baena-Extremera, Antonio; Gómez-López, Manuel; Sánchez-Fuentes, José Antonio; Abraldes, J. Arturo

    2014-01-01

    The aim of this study was to investigate the factor structure of a Spanish version of the Sport Motivation Scale adapted to physical education. A second aim was to test which of three hypothesized models (three, five, and seven factors) provided the best model fit. 758 Spanish high school students completed the Sport Motivation Scale adapted for Physical Education and also completed the Learning and Performance Orientation in Physical Education Classes Questionnaire. We examined the factor structure of each model using confirmatory factor analysis and also assessed internal consistency and convergent validity. The results showed that all three models produce good fit indices in Spanish, but we suggest using the seven-factor model (χ²/gl = 2.73; ECVI = 1.38), as it produces better values when adapted to physical education than the five-factor model (χ²/gl = 2.82; ECVI = 1.44) and the three-factor model (χ²/gl = 3.02; ECVI = 1.53). Key Points Physical education research conducted in Spain has used the version of the SMS designed to assess motivation in sport, but reliability and validity results in physical education have not been reported. Results of the present study lend support to the factorial validity and internal reliability of three alternative factor structures (3, 5, and 7 factors) of the SMS adapted to Physical Education in Spanish. Although all three models produce good fit indices, we suggest using the seven-factor model. PMID:25435772

  10. Reliability analysis of the solar array based on Fault Tree Analysis

    NASA Astrophysics Data System (ADS)

    Jianing, Wu; Shaoze, Yan

    2011-07-01

    The solar array is an important device used in the spacecraft, which influences the quality of in-orbit operation of the spacecraft and even the launches. This paper analyzes the reliability of the mechanical system and certifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system based on DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structure importance(SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of the fault, so limiting damage is significant to prevent faults. Furthermore, recommendations for improving reliability associated with damage limitation are discussed, which can be used for the redesigning of the solar array and the reliability growth planning.
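    The sketch below illustrates the kind of computation involved: the top-event probability of a small fault tree is obtained by enumerating basic-event states, and a Birnbaum-style importance measure ranks the basic events. The event names, probabilities, and gate logic are hypothetical stand-ins, not the DFH-3 fault tree from the paper.

```python
# Illustrative sketch (hypothetical event probabilities and gate logic):
# top-event probability of a small fault tree by enumeration of basic-event
# states, plus a simple Birnbaum-style importance ranking.
from itertools import product

BASIC_EVENTS = {"seal_fault": 0.01, "spring_torque_low": 0.02,
                "thermal_load": 0.015, "friction_excess": 0.03}

def top_event(state):
    # hinge fails if the seal fails together with low locking-spring torque,
    # OR if thermal load and excessive friction occur together (assumed logic)
    return (state["seal_fault"] and state["spring_torque_low"]) or \
           (state["thermal_load"] and state["friction_excess"])

def probability(fixed=None):
    fixed = fixed or {}
    free = [n for n in BASIC_EVENTS if n not in fixed]
    total = 0.0
    for values in product([0, 1], repeat=len(free)):
        state = dict(fixed)
        p = 1.0
        for n, v in zip(free, values):
            state[n] = v
            p *= BASIC_EVENTS[n] if v else 1.0 - BASIC_EVENTS[n]
        total += p * top_event(state)
    return total

print("P(top) =", probability())
for name in BASIC_EVENTS:   # Birnbaum importance: P(top | e = 1) - P(top | e = 0)
    print(name, probability({name: 1}) - probability({name: 0}))
```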

  11. A strategy to improve the identification reliability of the chemical constituents by high-resolution mass spectrometry-based isomer structure prediction combined with a quantitative structure retention relationship analysis: Phthalide compounds in Chuanxiong as a test case.

    PubMed

    Zhang, Qingqing; Huo, Mengqi; Zhang, Yanling; Qiao, Yanjiang; Gao, Xiaoyan

    2018-06-01

    High-resolution mass spectrometry (HRMS) provides a powerful tool for the rapid analysis and identification of compounds in herbs. However, the diversity and large differences in the content of the chemical constituents in herbal medicines, especially isomers, are a great challenge for mass spectrometry-based structural identification. In the current study, a new strategy for the structural characterization of potential new phthalide compounds was proposed by isomer structure predictions combined with a quantitative structure-retention relationship (QSRR) analysis using phthalide compounds in Chuanxiong as an example. This strategy consists of three steps. First, the structures of phthalide compounds were reasonably predicted on the basis of the structure features and MS/MS fragmentation patterns: (1) the collected raw HRMS data were preliminarily screened by an in-house database; (2) the MS/MS fragmentation patterns of the analogous compounds were summarized; (3) the reported phthalide compounds were identified, and the structures of the isomers were reasonably predicted. Second, the QSRR model was established and verified using representative phthalide compound standards. Finally, the retention times of the predicted isomers were calculated by the QSRR model, and the structures of these peaks were rationally characterized by matching retention times of the detected chromatographic peaks and the predicted isomers. A multiple linear regression QSRR model in which 6 physicochemical variables were screened was built using 23 phthalide standards. The retention times of the phthalide isomers in Chuanxiong were well predicted by the QSRR model combined with reasonable structure predictions (R² = 0.955). A total of 81 peaks were detected from Chuanxiong and assigned to reasonable structures, and 26 potential new phthalide compounds were structurally characterized. This strategy can improve the identification efficiency and reliability of homologues in complex materials. Copyright © 2018 Elsevier B.V. All rights reserved.
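    A minimal sketch of the regression step is shown below: an ordinary least-squares QSRR model maps molecular descriptors of phthalide standards to retention times, and the fitted model predicts the retention time of a candidate isomer. The descriptor values and coefficients are synthetic placeholders, not the article's data or its six screened variables.

```python
# Minimal sketch (synthetic descriptor values, not the article's data): a
# multiple linear regression QSRR model mapping physicochemical descriptors of
# candidate phthalide isomers to predicted retention times.
import numpy as np

rng = np.random.default_rng(1)

# rows = training standards, columns = 6 descriptors (e.g. logP, polar surface
# area, molar refractivity, ...); values here are synthetic placeholders
X_train = rng.normal(size=(23, 6))
t_train = X_train @ np.array([2.1, -0.8, 0.5, 1.3, -0.2, 0.9]) + 12.0 \
          + rng.normal(scale=0.3, size=23)

# ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X_train)), X_train])
coef, *_ = np.linalg.lstsq(A, t_train, rcond=None)

def predict_rt(descriptors):
    return coef[0] + np.asarray(descriptors) @ coef[1:]

# the predicted retention time of a candidate isomer is then matched against
# the retention times of unassigned chromatographic peaks
candidate = rng.normal(size=6)
print(f"predicted retention time: {predict_rt(candidate):.2f} min")
```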

  12. System Statement of Tasks of Calculating and Providing the Reliability of Heating Cogeneration Plants in Power Systems

    NASA Astrophysics Data System (ADS)

    Biryuk, V. V.; Tsapkova, A. B.; Larin, E. A.; Livshiz, M. Y.; Sheludko, L. P.

    2018-01-01

    A set of mathematical models for calculating the reliability indexes of structurally complex multifunctional combined installations in heat and power supply systems was developed. Reliability of energy supply is considered a required condition for the creation and operation of heat and power supply systems. The optimal value of the power supply system coefficient F is based on an economic assessment of the consumers' losses caused by the under-supply of electric power and the additional system expenses for creating and operating an emergency capacity reserve. Rationing of the reliability indexes (RI) of industrial heat supply is based on the concept of a technological margin of safety of the technological processes. Rationed RI values for the heat supply of communal consumers are defined from the air temperature level inside the heated premises. The complex allows solving a number of practical tasks for ensuring the reliability of heat supply to consumers. A probabilistic model is developed for calculating the reliability indexes of combined multipurpose heat and power plants in heat-and-power supply systems. The complex of models and calculation programs can be used to solve a wide range of specific tasks of optimizing the schemes and parameters of combined heat and power plants and systems, as well as determining the efficiency of various redundancy methods to ensure a specified reliability of power supply.

  13. Nonlinear analyses and failure patterns of typical masonry school buildings in the epicentral zone of the 2016 Italian earthquakes

    NASA Astrophysics Data System (ADS)

    Clementi, Cristhian; Clementi, Francesco; Lenci, Stefano

    2017-11-01

    The paper discusses the behavior of typical masonry school buildings in central Italy built at the end of the 1950s without any seismic guidelines. These structures withstood the 2016 Italian earthquakes without widespread damage. Global numerical models of the buildings were built, with the masonry simulated as a nonlinear material. Sensitivity analyses were performed to evaluate the reliability of the structural models.

  14. Building Habitats on the Moon: Engineering Approaches to Lunar Settlements

    NASA Astrophysics Data System (ADS)

    Benaroya, H.

    This book provides an overview of various concepts for lunar habitats and structural designs and characterizes the lunar environment - the technical and the nontechnical. The designs take into consideration psychological comfort, structural strength against seismic and thermal activity, as well as internal pressurization and 1/6 g. Also discussed are micrometeoroid modelling, risk and redundancy as well as probability and reliability, with an introduction to analytical tools that can be useful in modelling uncertainties.

  15. Getting on the same page: The effect of normative feedback interventions on structured interview ratings.

    PubMed

    Hartwell, Christopher J; Campion, Michael A

    2016-06-01

    This study explores normative feedback as a way to reduce rating errors and increase the reliability and validity of structured interview ratings. Based in control theory and social comparison theory, we propose a model of normative feedback interventions (NFIs) in the context of structured interviews and test our model using data from over 20,000 interviews conducted by more than 100 interviewers over a period of more than 4 years. Results indicate that lenient and severe interviewers reduced discrepancies between their ratings and the overall normative mean rating after receipt of normative feedback, though changes were greater for lenient interviewers. When various waves of feedback were presented in later NFIs, the combined normative mean rating over multiple time periods was more predictive of subsequent rating changes than the normative mean rating from the most recent time period. Mean within-interviewer rating variance, along with interrater agreement and interrater reliability, increased after the initial NFI, but results from later NFIs were more complex and revealed that feedback interventions may lose effectiveness over time. A second study using simulated data indicated that leniency and severity errors did not impact rating validity, but did affect which applicants were hired. We conclude that giving normative feedback to interviewers will aid in minimizing interviewer rating differences and enhance the reliability of structured interview ratings. We suggest that interviewer feedback might be considered as a potential new component of interview structure, though future research is needed before a definitive conclusion can be drawn. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Understanding the Reliability of Solder Joints Used in Advanced Structural and Electronics Applications: Part 2 - Reliability Performance.

    DOE PAGES

    Vianco, Paul T.

    2017-03-01

    Whether structural or electronic, all solder joints must provide the necessary level of reliability for the application. The Part 1 report examined the effects of filler metal properties and the soldering process on joint reliability. Filler metal solderability and mechanical properties, as well as the extents of base material dissolution and interface reaction that occur during the soldering process, were shown to affect reliability performance. The continuation of this discussion is presented in this Part 2 report, which highlights those factors that directly affect solder joint reliability. There is the growth of an intermetallic compound (IMC) reaction layer at the solder/base material interface by means of solid-state diffusion processes. In terms of mechanical response by the solder joint, fatigue remains as the foremost concern for long-term performance. Thermal mechanical fatigue (TMF), a form of low-cycle fatigue (LCF), occurs when temperature cycling is combined with mismatched values of the coefficient of thermal expansion (CTE) between materials comprising the solder joint “system.” Vibration environments give rise to high-cycle fatigue (HCF) degradation. Although accelerated aging studies provide valuable empirical data, too many variants of filler metals, base materials, joint geometries, and service environments are forcing design engineers to embrace computational modeling to predict the long-term reliability of solder joints.

  17. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data of the aerodynamic loads would be used onboard to control the aircraft and accumulated data would be used for the condition-based maintenance and the feedback for the fatigue and critical load modeling. The effective sensing techniques such as fiber optic distributed sensing have been developed and demonstrated promising capability of monitoring structural responses, i.e., strains on the surface of the aircraft wings. By using the developed techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in a discrete form of concentrated forces, however, the distributed form of the loads is essential for the accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify the distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads on a flat panel model, and conduct the inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains. The first one uses structural models and the second one uses neural networks. We compare the performance of the two approaches, and discuss the effect of the amount of the strain sensing information.
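    As a sketch of the structural-model approach to the inverse problem, the code below assembles a linear strain-load influence matrix (as a finite element model would provide) and recovers a distributed load vector from noisy strain measurements with a regularized least-squares inverse. The matrix, noise level, and regularization weight are hypothetical; the neural-network variant is not shown.

```python
# Hedged sketch (hypothetical influence matrix): identifying a distributed load
# vector from measured strains by inverting a linear strain-load relation built
# from a structural (e.g. finite element) model, with Tikhonov regularization.
import numpy as np

rng = np.random.default_rng(2)

n_sensors, n_load_dofs = 12, 8
# H[i, j]: strain at sensor i caused by a unit load at load DOF j; in practice
# each column would come from a finite element run with a unit nodal load applied
H = rng.normal(size=(n_sensors, n_load_dofs))

true_load = np.linspace(1.0, 0.2, n_load_dofs)          # assumed distribution
strain_measured = H @ true_load + rng.normal(scale=0.01, size=n_sensors)

def identify_loads(H, strain, reg=1e-3):
    # minimize ||H q - strain||^2 + reg * ||q||^2 via the normal equations
    lhs = H.T @ H + reg * np.eye(H.shape[1])
    return np.linalg.solve(lhs, H.T @ strain)

print(np.round(identify_loads(H, strain_measured), 3))
```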

  18. GRIZZLY Model of Multi-Reactive Species Diffusion, Moisture/Heat Transfer and Alkali-Silica Reaction for Simulating Concrete Aging and Degradation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Hai; Spencer, Benjamin W.; Cai, Guowei

    Concrete is widely used in the construction of nuclear facilities because of its structural strength and its ability to shield radiation. The use of concrete in nuclear power plants for containment and shielding of radiation and radioactive materials has made its performance crucial for the safe operation of the facility. As such, when life extension is considered for nuclear power plants, it is critical to have accurate and reliable predictive tools to address concerns related to various aging processes of concrete structures and the capacity of structures subjected to age-related degradation. The goal of this report is to document the progress of the development and implementation of a fully coupled thermo-hydro-mechanical-chemical model in the GRIZZLY code, with the ultimate goal of reliably simulating and predicting the long-term performance and response of aged NPP concrete structures subjected to a number of aging mechanisms, including external chemical attacks and volume-changing chemical reactions within concrete structures induced by alkali-silica reactions and long-term exposure to irradiation. Based on a number of survey reports of concrete aging mechanisms relevant to nuclear power plants and recommendations from researchers in the concrete community, we implemented three modules in the GRIZZLY code during FY15: (1) a multi-species reactive diffusion model within cement materials; (2) a coupled moisture and heat transfer model in concrete; and (3) an anisotropic, stress-dependent, alkali-silica reaction induced swelling model. The multi-species reactive diffusion model was implemented with the objective of modeling the aging of concrete structures subjected to aggressive external chemical attacks (e.g., chloride attack, sulfate attack). It considers multiple processes relevant to external chemical attacks, such as diffusion of ions in the aqueous phase within pore spaces, equilibrium chemical speciation reactions, and kinetic mineral dissolution/precipitation. The moisture/heat transfer module was implemented to simulate long-term spatial and temporal evolutions of the moisture and temperature fields within concrete structures at both room and elevated temperatures. The ASR swelling model implemented in the GRIZZLY code can simulate anisotropic expansion of ASR gel under uniaxial, biaxial, and triaxial stress states, and can be run simultaneously with the moisture/heat transfer model and coupled with the various elastic/inelastic solid mechanics models implemented in GRIZZLY previously. This report provides detailed descriptions of the governing equations, constitutive equations, and numerical algorithms of the three modules implemented in GRIZZLY during FY15, simulation results of example problems, and model validation results comparing simulations with available experimental data reported in the literature. The close match between the experiments and simulations demonstrates the potential of the GRIZZLY code for reliable evaluation and prediction of the long-term performance and response of aged concrete structures in nuclear power plants.

  19. The neural processing of hierarchical structure in music and speech at different timescales.

    PubMed

    Farbood, Morwaread M; Heeger, David J; Marcus, Gary; Hasson, Uri; Lerner, Yulia

    2015-01-01

    Music, like speech, is a complex auditory signal that contains structures at multiple timescales, and as such is a potentially powerful entry point into the question of how the brain integrates complex streams of information. Using an experimental design modeled after previous studies that used scrambled versions of a spoken story (Lerner et al., 2011) and a silent movie (Hasson et al., 2008), we investigate whether listeners perceive hierarchical structure in music beyond short (~6 s) time windows and whether there is cortical overlap between music and language processing at multiple timescales. Experienced pianists were presented with an extended musical excerpt scrambled at multiple timescales-by measure, phrase, and section-while measuring brain activity with functional magnetic resonance imaging (fMRI). The reliability of evoked activity, as quantified by inter-subject correlation of the fMRI responses, was measured. We found that response reliability depended systematically on musical structure coherence, revealing a topographically organized hierarchy of processing timescales. Early auditory areas (at the bottom of the hierarchy) responded reliably in all conditions. For brain areas at the top of the hierarchy, the original (unscrambled) excerpt evoked more reliable responses than any of the scrambled excerpts, indicating that these brain areas process long-timescale musical structures, on the order of minutes. The topography of processing timescales was analogous with that reported previously for speech, but the timescale gradients for music and speech overlapped with one another only partially, suggesting that temporally analogous structures-words/measures, sentences/musical phrases, paragraph/sections-are processed separately.

  20. Understanding criminals' thinking: further examination of the Measure of Offender Thinking Styles-Revised.

    PubMed

    Mandracchia, Jon T; Morgan, Robert D

    2011-12-01

    The Measure of Offender Thinking Styles (MOTS) was originally developed to examine the structure of dysfunctional thinking exhibited by criminal offenders. In the initial investigation, a three-factor model of criminal thinking was obtained using the MOTS. These factors included dysfunctional thinking characterized as Control, Cognitive Immaturity, and Egocentrism. In the present investigation, the stability of the three-factor model was examined with a confirmatory factor analysis of the revised version of the MOTS (i.e., MOTS-R). In addition, the internal consistency, test-retest reliability, and convergent validity of the MOTS-R were examined. Results indicated that the three-factor model of criminal thinking was supported. In addition, the MOTS-R demonstrated reliability and convergent validity with other measures of criminal thinking and attitudes. Overall, it appears that the MOTS-R may prove to be a valuable tool for use with an offender population, particularly because of the simple, intuitive structure of dysfunctional thinking that it represents.

  1. Validity and reliability of the persian version of templer death anxiety scale in family caregivers of cancer patients.

    PubMed

    Soleimani, Mohammad Ali; Bahrami, Nasim; Yaghoobzadeh, Ameneh; Banihashemi, Hedieh; Nia, Hamid Sharif; Haghdoost, Ali Akbar

    2016-01-01

    Due to the increasing recognition of the importance of death anxiety for understanding human nature, it is important that researchers who investigate death anxiety have reliable and valid methodology to measure it. The purpose of this study was to evaluate the validity and reliability of the Persian version of the Templer Death Anxiety Scale (TDAS) in family caregivers of cancer patients. A sample of 326 caregivers of cancer patients completed a 15-item questionnaire. Principal components analysis (PCA) followed by a varimax rotation was used to assess the factor structure of the DAS. The construct validity of the scale was assessed using exploratory and confirmatory factor analyses. Convergent and discriminant validity were also examined. Reliability was assessed with Cronbach's alpha coefficients and construct reliability. Based on the results of the PCA and consideration of the meaning of our items, a three-factor solution, explaining 60.38% of the variance, was identified. A confirmatory factor analysis (CFA) then supported the adequacy of the three-domain structure of the DAS. Goodness-of-fit indices showed an acceptable fit overall for the full model: χ²(61) = 262.32, χ²/df = 2.04, adjusted goodness of fit index (AGFI) = 0.922, parsimonious comparative fit index (PCFI) = 0.703, normed fit index (NFI) = 0.912, CMIN/DF = 2.048, root mean square error of approximation (RMSEA) = 0.055. Convergent and discriminant validity of the constructs were shown to be fulfilled. The Cronbach's alpha and construct reliability coefficients were greater than 0.70. The findings show that the Persian version of the TDAS has a three-factor structure and acceptable validity and reliability.

  2. Finite element modelling and updating of a lively footbridge: The complete process

    NASA Astrophysics Data System (ADS)

    Živanović, Stana; Pavic, Aleksandar; Reynolds, Paul

    2007-03-01

    The finite element (FE) model updating technology was originally developed in the aerospace and mechanical engineering disciplines to automatically update numerical models of structures to match their experimentally measured counterparts. The process of updating identifies the drawbacks in the FE modelling and the updated FE model could be used to produce more reliable results in further dynamic analysis. In the last decade, the updating technology has been introduced into civil structural engineering. It can serve as an advanced tool for getting reliable modal properties of large structures. The updating process has four key phases: initial FE modelling, modal testing, manual model tuning and automatic updating (conducted using specialist software). However, the published literature does not connect well these phases, although this is crucial when implementing the updating technology. This paper therefore aims to clarify the importance of this linking and to describe the complete model updating process as applicable in civil structural engineering. The complete process consisting the four phases is outlined and brief theory is presented as appropriate. Then, the procedure is implemented on a lively steel box girder footbridge. It was found that even a very detailed initial FE model underestimated the natural frequencies of all seven experimentally identified modes of vibration, with the maximum error being almost 30%. Manual FE model tuning by trial and error found that flexible supports in the longitudinal direction should be introduced at the girder ends to improve correlation between the measured and FE-calculated modes. This significantly reduced the maximum frequency error to only 4%. It was demonstrated that only then could the FE model be automatically updated in a meaningful way. The automatic updating was successfully conducted by updating 22 uncertain structural parameters. Finally, a physical interpretation of all parameter changes is discussed. This interpretation is often missing in the published literature. It was found that the composite slabs were less stiff than originally assumed and that the asphalt layer contributed considerably to the deck stiffness.
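    The automatic updating phase can be pictured as a constrained least-squares fit of uncertain structural parameters to the experimentally identified modal properties. The sketch below does this for a toy two-degree-of-freedom model with two uncertain stiffnesses; the masses, target frequencies, and bounds are hypothetical and stand in for the 22 parameters updated in the footbridge study.

```python
# Simplified sketch (toy 2-DOF model, not the footbridge FE model): automatic
# model updating as a least-squares fit of uncertain stiffness parameters so
# that the model natural frequencies match the experimentally identified ones.
import numpy as np
from scipy.optimize import least_squares

M = np.diag([1000.0, 1200.0])                  # kg, assumed known masses

def natural_frequencies(k):
    k1, k2 = k
    K = np.array([[k1 + k2, -k2], [-k2, k2]])  # N/m, assumed stiffness layout
    eigvals = np.linalg.eigvals(np.linalg.inv(M) @ K)
    return np.sort(np.sqrt(np.abs(eigvals))) / (2 * np.pi)   # Hz

f_measured = np.array([2.0, 5.5])              # experimentally identified modes

def residuals(k):
    return natural_frequencies(k) - f_measured

k_initial = np.array([2.0e6, 1.0e6])           # initial FE estimates
result = least_squares(residuals, k_initial, bounds=(1e5, 1e8))
print("updated stiffnesses:", result.x)
print("updated frequencies:", natural_frequencies(result.x))
```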

  3. Measurement Structure of the Trait Hope Scale in Persons with Spinal Cord Injury: A Confirmatory Factor Analysis

    ERIC Educational Resources Information Center

    Smedema, Susan Miller; Pfaller, Joseph; Moser, Erin; Tu, Wei-Mo; Chan, Fong

    2013-01-01

    Objective: To evaluate the measurement structure of the Trait Hope Scale (THS) among individuals with spinal cord injury. Design: Confirmatory factor analysis and reliability and validity analyses were performed. Participants: 242 individuals with spinal cord injury. Results: Results support the two-factor measurement model for the THS with agency…

  4. Structural composite panel performance under long-term load

    Treesearch

    Theodore L. Laufenberg

    1988-01-01

    Information on the performance of wood-based structural composite panels under long-term load is currently needed to permit their use in engineered assemblies and systems. A broad assessment of the time-dependent properties of panels is critical for creating databases and models of the creep-rupture phenomenon that lead to reliability-based design procedures. This...

  5. Psychometric properties of the Brief Symptom Inventory-18 in a Spanish breast cancer sample.

    PubMed

    Galdón, Ma José; Durá, Estrella; Andreu, Yolanda; Ferrando, Maite; Murgui, Sergio; Pérez, Sandra; Ibañez, Elena

    2008-12-01

    The objective of this work was to study the psychometric and structural properties of the Brief Symptom Inventory-18 (BSI-18) in a sample of breast cancer patients (N=175). Confirmatory factor analyses were conducted. Two models were tested: the theoretical model with the original structure (three-dimensional), and the empirical model (a four-factor structure) obtained through exploratory factor analysis initially performed by the authors of the BSI-18. The eligible structure was the original proposal consisting of three dimensions: somatization, depression, and anxiety scores. These measures also showed good internal consistency. The results of this study support the reliability and structural validity of the BSI-18 as a standardized instrument for screening purposes in breast cancer patients, with the added benefits of simplicity and ease of application.

  6. Operations and support cost modeling of conceptual space vehicles

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles

    1994-01-01

    The University of Dayton is pleased to submit this annual report to the National Aeronautics and Space Administration (NASA) Langley Research Center which documents the development of an operations and support (O&S) cost model as part of a larger life cycle cost (LCC) structure. It is intended for use during the conceptual design of new launch vehicles and spacecraft. This research is being conducted under NASA Research Grant NAG-1-1327. This research effort changes the focus from that of the first two years in which a reliability and maintainability model was developed to the initial development of an operations and support life cycle cost model. Cost categories were initially patterned after NASA's three axis work breakdown structure consisting of a configuration axis (vehicle), a function axis, and a cost axis. A revised cost element structure (CES), which is currently under study by NASA, was used to established the basic cost elements used in the model. While the focus of the effort was on operations and maintenance costs and other recurring costs, the computerized model allowed for other cost categories such as RDT&E and production costs to be addressed. Secondary tasks performed concurrent with the development of the costing model included support and upgrades to the reliability and maintainability (R&M) model. The primary result of the current research has been a methodology and a computer implementation of the methodology to provide for timely operations and support cost analysis during the conceptual design activities.

  7. Scale of attitudes toward alcohol - Spanish version: evidences of validity and reliability 1

    PubMed Central

    Ramírez, Erika Gisseth León; de Vargas, Divane

    2017-01-01

    Objective: to validate the Spanish version of the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders. Method: methodological study involving 300 Colombian nurses. Adopting classical test theory, confirmatory factor analysis was applied without prior exploratory examination, based on the strong historical evidence for the factorial structure of the original scale, to determine the construct validity of this Spanish version. To assess reliability, Cronbach's alpha and McDonald's omega coefficients were used. Results: the confirmatory factor analysis indicated a good fit of the scale model in a four-factor distribution, with a cut-off point at 3.2, demonstrating 66.7% sensitivity. Conclusions: the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders in Spanish presented robust psychometric qualities, affirming that the instrument possesses a solid factorial structure and reliability and is capable of precisely measuring nurses' attitudes towards the proposed phenomenon. PMID:28793126

  8. Development and implementation of the Structured Training Trainer Assessment Report (STTAR) in the English National Training Programme for laparoscopic colorectal surgery.

    PubMed

    Wyles, Susannah M; Miskovic, Danilo; Ni, Zhifang; Darzi, Ara W; Valori, Roland M; Coleman, Mark G; Hanna, George B

    2016-03-01

    There is a lack of educational tools available for surgical teaching critique, particularly for advanced laparoscopic surgery. The aim was to develop and implement a tool that assesses training quality and structures feedback for trainers in the English National Training Programme for laparoscopic colorectal surgery. Semi-structured interviews were performed and analysed, and items were extracted. Through the Delphi process, essential items pertaining to desirable trainer characteristics, training structure and feedback were determined. An assessment tool (Structured Training Trainer Assessment Report-STTAR) was developed and tested for feasibility, acceptability and educational impact. Interview transcripts (29 surgical trainers, 10 trainees, four educationalists) were analysed, and item lists created and distributed for consensus opinion (11 trainers and seven trainees). The STTAR consisted of 64 factors, and its web-based version, the mini-STTAR, included 21 factors that were categorised into four groups (training structure, training behaviour, trainer attributes and role modelling) and structured around a training session timeline (beginning, middle and end). The STTAR (six trainers, 48 different assessments) demonstrated good internal consistency (α = 0.88) and inter-rater reliability (ICC = 0.75). The mini-STTAR demonstrated good inter-item reliability (α = 0.79) and intra-observer reliability on comparison of 85 different trainer/trainee combinations (r = 0.701, p = <0.001). Both were found to be feasible and acceptable. The educational report for trainers was found to be useful (4.4 out of 5). An assessment tool that evaluates training quality was developed and shown to be reliable, acceptable and of educational value. It has been successfully implemented into the English National Training Programme for laparoscopic colorectal surgery.

  9. Relating Measurement Invariance, Cross-Level Invariance, and Multilevel Reliability.

    PubMed

    Jak, Suzanne; Jorgensen, Terrence D

    2017-01-01

    Data often have a nested, multilevel structure, for example when data are collected from children in classrooms. This kind of data complicate the evaluation of reliability and measurement invariance, because several properties can be evaluated at both the individual level and the cluster level, as well as across levels. For example, cross-level invariance implies equal factor loadings across levels, which is needed to give latent variables at the two levels a similar interpretation. Reliability at a specific level refers to the ratio of true score variance over total variance at that level. This paper aims to shine light on the relation between reliability, cross-level invariance, and strong factorial invariance across clusters in multilevel data. Specifically, we will illustrate how strong factorial invariance across clusters implies cross-level invariance and perfect reliability at the between level in multilevel factor models.
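    As a compact statement of the definition used above (notation assumed, not the authors'), level-specific reliability can be written as the ratio of true-score (common-factor) variance to total variance at that level, here in composite (omega) form for a one-factor model per level:

```latex
% Sketch of the definition referenced above (notation assumed, not the authors'):
% level-specific reliability as true-score variance over total variance.
\[
  \omega_{W} =
    \frac{\bigl(\sum_i \lambda_{W,i}\bigr)^{2}\,\psi_{W}}
         {\bigl(\sum_i \lambda_{W,i}\bigr)^{2}\,\psi_{W} + \sum_i \theta_{W,ii}},
  \qquad
  \omega_{B} =
    \frac{\bigl(\sum_i \lambda_{B,i}\bigr)^{2}\,\psi_{B}}
         {\bigl(\sum_i \lambda_{B,i}\bigr)^{2}\,\psi_{B} + \sum_i \theta_{B,ii}},
\]
% with \lambda the factor loadings, \psi the factor variance, and \theta the
% residual variances at the within (W) and between (B) levels; cross-level
% invariance constrains \lambda_{W,i} = \lambda_{B,i}.
```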

  10. A road map for integrating eco-evolutionary processes into biodiversity models.

    PubMed

    Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique

    2013-05-01

    The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.

  11. A Generative Angular Model of Protein Structure Evolution

    PubMed Central

    Golden, Michael; García-Portugués, Eduardo; Sørensen, Michael; Mardia, Kanti V.; Hamelryck, Thomas; Hein, Jotun

    2017-01-01

    Recently described stochastic models of protein evolution have demonstrated that the inclusion of structural information in addition to amino acid sequences leads to a more reliable estimation of evolutionary parameters. We present a generative, evolutionary model of protein structure and sequence that is valid on a local length scale. The model concerns the local dependencies between sequence and structure evolution in a pair of homologous proteins. The evolutionary trajectory between the two structures in the protein pair is treated as a random walk in dihedral angle space, which is modeled using a novel angular diffusion process on the two-dimensional torus. Coupling sequence and structure evolution in our model allows for modeling both "smooth" conformational changes and "catastrophic" conformational jumps, conditioned on the amino acid changes. The model has interpretable parameters and is comparatively more realistic than previous stochastic models, providing new insights into the relationship between sequence and structure evolution. For example, using the trained model we were able to identify an apparent sequence–structure evolutionary motif present in a large number of homologous protein pairs. The generative nature of our model enables us to evaluate its validity and its ability to simulate aspects of protein evolution conditioned on an amino acid sequence, a related amino acid sequence, a related structure or any combination thereof. PMID:28453724
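    To give a flavor of diffusion in dihedral-angle space, the sketch below simulates a generic mean-reverting angular diffusion on the two-dimensional torus with Euler-Maruyama steps and angle wrapping. It is an illustrative toy, not the article's specific evolutionary process; the drift form, parameters, and starting angles are assumptions.

```python
# Illustrative sketch (a generic wrapped diffusion, not the article's specific
# process): simulating a drift-diffusion trajectory of a (phi, psi) dihedral
# pair on the two-dimensional torus, with angles wrapped to (-pi, pi].
import numpy as np

rng = np.random.default_rng(3)

def wrap(angle):
    return (angle + np.pi) % (2 * np.pi) - np.pi

def simulate(theta0, mu, kappa=0.5, sigma=0.4, dt=0.01, steps=2000):
    """Euler-Maruyama steps of d(theta) = -kappa*wrap(theta - mu) dt + sigma dW."""
    theta = np.array(theta0, dtype=float)
    path = [theta.copy()]
    for _ in range(steps):
        drift = -kappa * wrap(theta - mu)            # pull toward the mode mu
        theta = wrap(theta + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=2))
        path.append(theta.copy())
    return np.array(path)

# a trajectory drifting from one conformation toward a mode near another one
path = simulate(theta0=[-2.5, 2.8], mu=np.array([-1.0, 2.4]))
print(path[-1])   # final dihedral angles, in radians
```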

  12. Measurement and modeling of intrinsic transcription terminators

    PubMed Central

    Cambray, Guillaume; Guimaraes, Joao C.; Mutalik, Vivek K.; Lam, Colin; Mai, Quynh-Anh; Thimmaiah, Tim; Carothers, James M.; Arkin, Adam P.; Endy, Drew

    2013-01-01

    The reliable forward engineering of genetic systems remains limited by the ad hoc reuse of many types of basic genetic elements. Although a few intrinsic prokaryotic transcription terminators are used routinely, termination efficiencies have not been studied systematically. Here, we developed and validated a genetic architecture that enables reliable measurement of termination efficiencies. We then assembled a collection of 61 natural and synthetic terminators that collectively encode termination efficiencies across an ∼800-fold dynamic range within Escherichia coli. We simulated co-transcriptional RNA folding dynamics to identify competing secondary structures that might interfere with terminator folding kinetics or impact termination activity. We found that structures extending beyond the core terminator stem are likely to increase terminator activity. By excluding terminators encoding such context-confounding elements, we were able to develop a linear sequence-function model that can be used to estimate termination efficiencies (r = 0.9, n = 31) better than models trained on all terminators (r = 0.67, n = 54). The resulting systematically measured collection of terminators should improve the engineering of synthetic genetic systems and also advance quantitative modeling of transcription termination. PMID:23511967

  13. On how to avoid input and structural uncertainties corrupt the inference of hydrological parameters using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Hernández, Mario R.; Francés, Félix

    2015-04-01

    One phase of the implementation process of hydrological models that contributes significantly to the uncertainty of hydrological predictions is the calibration phase, in which the values of the unknown model parameters are tuned by optimizing an objective function. An unsuitable error model (e.g. Standard Least Squares, SLS) introduces noise into the estimation of the parameters; the main sources of this noise are input errors and structural deficiencies of the hydrological model. The resulting biased calibrated parameters cause the model divergence phenomenon, in which the error variance of the (spatially and temporally) forecasted flows far exceeds the error variance in the fitting period, and they cause the modeled processes to lose part or all of their physical meaning. In other words, the outcome is a calibrated hydrological model that works well, but not for the right reasons. Moreover, an unsuitable error model yields an unreliable assessment of predictive uncertainty. Hence, with the aim of preventing these undesirable effects, this research focuses on the Bayesian joint inference (BJI) of both the hydrological and the error model parameters, considering a general additive (GA) error model that allows for correlation, non-stationarity (in variance and bias) and non-normality of model residuals. The hydrological model used is a conceptual distributed model called TETIS, with a particular split structure of the effective model parameters. Bayesian inference has been performed with the aid of a Markov Chain Monte Carlo (MCMC) algorithm called DREAM-ZS, which quantifies the uncertainty of the hydrological and error model parameters by sampling their joint posterior probability distribution, conditioned on the observed flows. The BJI methodology is a very powerful and reliable tool, but it must be used correctly: if non-stationarity in error variance and bias is modeled, the laws of total expectation and total variance must be taken into account. The results of this research show that applying BJI with a GA error model improves the robustness of the hydrological parameters (diminishing the model divergence phenomenon) and improves the reliability of the streamflow predictive distribution, relative to the results obtained with an unsuitable error model such as SLS. Finally, the most likely prediction in a validation period shows similar performance for both the BJI+GA and the SLS error models.
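
    A minimal sketch, under stated assumptions, of the kind of joint likelihood that Bayesian joint inference with a heteroscedastic additive error model implies (this is not the TETIS or DREAM-ZS code; the linear variance model, constant bias term and flat priors are illustrative assumptions):

      import numpy as np

      def log_posterior(theta, simulate, observed):
          """Joint log-posterior of hydrological and error-model parameters,
          assuming Gaussian residuals whose standard deviation grows linearly
          with the simulated flow (one simple way to encode non-stationary
          variance) and a constant bias; flat priors are assumed for brevity."""
          hydro_params = theta[:-3]
          bias, sigma0, sigma1 = theta[-3:]          # error-model parameters
          if sigma0 <= 0.0 or sigma1 < 0.0:
              return -np.inf
          simulated = simulate(hydro_params)         # user-supplied model run
          sigma = sigma0 + sigma1 * simulated        # heteroscedastic st. dev.
          residuals = observed - simulated - bias
          return np.sum(-0.5 * np.log(2.0 * np.pi * sigma**2)
                        - 0.5 * (residuals / sigma) ** 2)

      # tiny usage example with a dummy one-parameter "hydrological model"
      obs = np.array([1.0, 2.1, 2.9, 4.2])
      dummy_sim = lambda p: p[0] * np.array([1.0, 2.0, 3.0, 4.0])
      print(log_posterior(np.array([1.0, 0.0, 0.1, 0.05]), dummy_sim, obs))

    In practice such a log-posterior would be explored with an MCMC sampler such as DREAM-ZS rather than evaluated at a single point.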

  14. Development of Equivalent Material Properties of Microbump for Simulating Chip Stacking Packaging

    PubMed Central

    Lee, Chang-Chun; Tzeng, Tzai-Liang; Huang, Pei-Chen

    2015-01-01

    A three-dimensional integrated circuit (3D-IC) structure with a significant scale mismatch causes difficulty in analytic model construction. This paper proposes a simulation technique that introduces an equivalent material composed of microbumps and their surrounding wafer-level underfill (WLUF). The mechanical properties of this equivalent material, including Young’s modulus (E), Poisson’s ratio, shear modulus, and coefficient of thermal expansion (CTE), are obtained directly by applying either a tensile load or a constant displacement, and by increasing the temperature during simulations, respectively. Analytic results indicate that at least eight microbumps at the outermost region of the chip stacking structure need to be modeled explicitly to obtain an accurate stress/strain contour in the region of concern. In addition, a factorial experimental design with analysis of variance is proposed to optimize chip stacking structure reliability with respect to four factors: chip thickness, substrate thickness, CTE, and E-value. Analytic results show that the most significant factor is the CTE of the WLUF, which affects microbump reliability and structural warpage under a temperature cycling load and the high-temperature bonding process. WLUF with low CTE and high E-value is recommended to enhance the assembly reliability of the 3D-IC architecture. PMID:28793495

  15. Generalized Reliability Methodology Applied to Brittle Anisotropic Single Crystals. Degree awarded by Washington Univ., 1999

    NASA Technical Reports Server (NTRS)

    Salem, Jonathan A.

    2002-01-01

    A generalized reliability model was developed for use in the design of structural components made from brittle, homogeneous anisotropic materials such as single crystals. The model is based on the Weibull distribution and incorporates a variable strength distribution and any equivalent stress failure criteria. In addition to the reliability model, an energy based failure criterion for elastically anisotropic materials was formulated. The model is different from typical Weibull-based models in that it accounts for strength anisotropy arising from fracture toughness anisotropy and thereby allows for strength and reliability predictions of brittle, anisotropic single crystals subjected to multiaxial stresses. The model is also applicable to elastically isotropic materials exhibiting strength anisotropy due to an anisotropic distribution of flaws. In order to develop and experimentally verify the model, the uniaxial and biaxial strengths of a single crystal nickel aluminide were measured. The uniaxial strengths of the <100> and <110> crystal directions were measured in three and four-point flexure. The biaxial strength was measured by subjecting <100> plates to a uniform pressure in a test apparatus that was developed and experimentally verified. The biaxial strengths of the single crystal plates were estimated by extending and verifying the displacement solution for a circular, anisotropic plate to the case of a variable radius and thickness. The best correlation between the experimental strength data and the model predictions occurred when an anisotropic stress analysis was combined with the normal stress criterion and the strength parameters associated with the <110> crystal direction.
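
    For orientation only, a generic Weibull-type reliability statement of the kind such models build on (not the specific anisotropic criterion derived in the record above): the failure probability of a brittle component of volume V under an equivalent stress field is commonly written as

        P_f = 1 - \exp\!\left[-\int_V \left(\frac{\sigma_{\mathrm{eq}}(\mathbf{x})}{\sigma_0}\right)^{m} \mathrm{d}V\right],

    where m is the Weibull modulus and \sigma_0 a scale parameter; the model described above additionally lets the strength distribution and the equivalent stress depend on crystal direction.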

  16. The Psychometric Evaluation of the Connor-Davidson Resilience Scale Using a Chinese Military Sample

    PubMed Central

    Xie, Yuanjun; Peng, Li; Zuo, Xin; Li, Min

    2016-01-01

    This study examined the psychometric properties of the Connor-Davidson Resilience Scale (CD-RISC) in a Chinese military population, with the aim of finding a suitable instrument to quantify resilience in Chinese military service members. The confirmatory factor analysis results did not support the factorial structure of the original or the Chinese community version of the CD-RISC, but the exploratory factor analysis results revealed a three-factor model (composed of Competency, Toughness, and Adaptability) that seemed to fit. Moreover, a repeated confirmatory factor analysis replicated the three-factor model. Additionally, the CD-RISC in the Chinese military sample exhibited appropriate psychometric properties, including internal consistency, test-retest reliability, and structural and concurrent validity. The revised CD-RISC for a Chinese military sample provides insight into the resilience measurement framework and could be a reliable and valid measure for evaluating resilience in a Chinese military population. PMID:26859484

  17. Psychometric properties of the Swedish PedsQL, Pediatric Quality of Life Inventory 4.0 generic core scales.

    PubMed

    Petersen, Solveig; Hägglöf, Bruno; Stenlund, Hans; Bergström, Erik

    2009-09-01

    The aim was to study the psychometric performance of the Swedish version of the Pediatric Quality of Life Inventory (PedsQL) 4.0 generic core scales in a general child population in Sweden. PedsQL forms were distributed to 2403 schoolchildren and 888 parents in two different school settings. Reliability and validity were studied for self-reports and proxy reports, full forms and short forms. Confirmatory factor analysis tested the factor structure, and multigroup confirmatory factor analysis tested measurement invariance between boys and girls. Test-retest reliability was demonstrated for all scales, and internal consistency reliability was shown with alpha values exceeding 0.70 for all scales but one (self-report short form: social functioning). Child-parent agreement was low to moderate. The four-factor structure of the PedsQL and factorial invariance across sex subgroups were confirmed for the self-report forms and for the proxy short form, while model fit indices suggested that several proxy full-form scales could be improved. The Swedish PedsQL 4.0 generic core scales are a reliable and valid tool for health-related quality of life (HRQoL) assessment in Swedish child populations. The proxy full form, however, should be used with caution. The study also supports continued use of the PedsQL as a four-factor model, capable of revealing meaningful HRQoL differences between boys and girls.

  18. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    NASA Astrophysics Data System (ADS)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resources and time allocations. This concept offers a worthwhile point of departure for three adjustments to an existing literature model, in terms of maintenance time, workforce performance and return-on-workforce investments, which together delineate the scope of our contribution. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have, for the first time, implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques to solve an optimisation problem in the maintenance of a process plant. The results obtained in this work showed a better quality of solution from the DE algorithm than from the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure. Second, the analytical treatment, framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is novel. The work provides further insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate information that is substantially helpful for practice.
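
    A hedged sketch of the differential-evolution step (DE/rand/1/bin) of the kind referred to above, applied to a generic objective; the population size, mutation factor F, crossover rate CR and the stand-in objective are illustrative assumptions, and the fuzzy-goal-programming formulation itself is not reproduced:

      import numpy as np

      def differential_evolution(objective, bounds, pop_size=20, F=0.8, CR=0.9,
                                 generations=200, seed=0):
          """Minimal DE/rand/1/bin minimiser; bounds is a list of (low, high)."""
          rng = np.random.default_rng(seed)
          lows, highs = np.array(bounds, dtype=float).T
          dim = len(bounds)
          pop = rng.uniform(lows, highs, size=(pop_size, dim))
          costs = np.array([objective(x) for x in pop])
          for _ in range(generations):
              for i in range(pop_size):
                  others = [j for j in range(pop_size) if j != i]
                  a, b, c = pop[rng.choice(others, size=3, replace=False)]
                  mutant = np.clip(a + F * (b - c), lows, highs)
                  cross = rng.random(dim) < CR
                  cross[rng.integers(dim)] = True     # keep at least one mutant gene
                  trial = np.where(cross, mutant, pop[i])
                  trial_cost = objective(trial)
                  if trial_cost <= costs[i]:          # greedy selection
                      pop[i], costs[i] = trial, trial_cost
          best = int(np.argmin(costs))
          return pop[best], costs[best]

      # example: minimise a simple quadratic as a stand-in objective
      x_best, f_best = differential_evolution(lambda x: float(np.sum(x**2)),
                                              bounds=[(-5, 5)] * 4)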

  19. Combined DFT and BS study on the exchange coupling of dinuclear sandwich-type POM: comparison of different functionals and reliability of structure modeling.

    PubMed

    Yin, Bing; Xue, GangLin; Li, JianLi; Bai, Lu; Huang, YuanHe; Wen, ZhenYi; Jiang, ZhenYi

    2012-05-01

    The exchange coupling in a group of three dinuclear sandwich-type polyoxomolybdates [MM'(AsMo7O27)2](12-) with MM' = CrCr, FeFe, FeCr is theoretically predicted using a combined DFT and broken-symmetry (BS) approach. Eight different XC functionals are utilized to calculate the exchange-coupling constant J from both the full crystalline structures and model structures of smaller size. The comparison between the theoretical values and accurate experimental results supports the applicability of the DFT-BS method to this new type of sandwich-type dinuclear polyoxomolybdate; however, a careful choice of functional is necessary to achieve the desired accuracy. The encouraging results obtained from calculations on model structures highlight the great potential of structure modeling in theoretical studies of POMs: structural modeling may not only reduce the computational cost of large POM species but also take into account the external field effect arising from solvent molecules in solution or counterions in the crystal.
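
    For context, a commonly used broken-symmetry estimate (hedged, because the record does not state which spin-projection formula or sign convention was adopted): with a Heisenberg Hamiltonian H = -2J S_1·S_2, the exchange-coupling constant is often extracted from the high-spin (HS) and broken-symmetry (BS) DFT energies via the Yamaguchi expression

        J = \frac{E_{\mathrm{BS}} - E_{\mathrm{HS}}}{\langle S^2 \rangle_{\mathrm{HS}} - \langle S^2 \rangle_{\mathrm{BS}}},

    in which J < 0 corresponds to antiferromagnetic coupling.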

  20. Quality assessment of protein model-structures using evolutionary conservation.

    PubMed

    Kalman, Matan; Ben-Tal, Nir

    2010-05-15

    Programs that evaluate the quality of a protein structural model are important both for validating the structure determination procedure and for guiding the model-building process. Such programs are based on properties of native structures that are generally not expected for faulty models. One such property, which is rarely used for automatic structure quality assessment, is the tendency for conserved residues to be located at the structural core and for variable residues to be located at the surface. We present ConQuass, a novel quality assessment program based on the consistency between the model structure and the protein's conservation pattern. We show that it can identify problematic structural models, and that the scores it assigns to the server models in CASP8 correlate with the similarity of the models to the native structure. We also show that when the conservation information is reliable, the method's performance is comparable and complementary to that of the other single-structure quality assessment methods that participated in CASP8 and that do not use additional structural information from homologs. A perl implementation of the method, as well as the various perl and R scripts used for the analysis, are available at http://bental.tau.ac.il/ConQuass/. Contact: nirb@tauex.tau.ac.il. Supplementary data are available at Bioinformatics online.

  1. Special methods for aerodynamic-moment calculations from parachute FSI modeling

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Boswell, Cody; Tsutsui, Yuki; Montel, Kenneth

    2015-06-01

    The space-time fluid-structure interaction (STFSI) methods for 3D parachute modeling are now at a level where they can bring reliable, practical analysis to some of the most complex parachute systems, such as spacecraft parachutes. The methods include the Deforming-Spatial-Domain/Stabilized ST method as the core computational technology, and a good number of special FSI methods targeting parachutes. Evaluating the stability characteristics of a parachute based on how the aerodynamic moment varies as a function of the angle of attack is one of the practical analyses that reliable parachute FSI modeling can deliver. We describe the special FSI methods we developed for this specific purpose and present the aerodynamic-moment data obtained from FSI modeling of NASA Orion spacecraft parachutes and Japan Aerospace Exploration Agency (JAXA) subscale parachutes.

  2. Rehabilitation reliability of the road pavement structure with recycled base course with foamed bitumen

    NASA Astrophysics Data System (ADS)

    Buczyński, P.

    2018-05-01

    This article presents a new approach to reliability assessment of a road structure in which the base layer is constructed by cold deep recycling with foamed bitumen. In order to properly assess the reliability of the structure with the recycled base, it is necessary to determine the distribution of stress and strain in typical pavement layer systems. The true stress and strain values were established for the particular structural layers using the complex modulus (E*) determined from the master curves. The complex modulus was determined by the direct tension-compression test on cylindrical specimens (DTC-CY) at five temperatures (-7°C, 5°C, 13°C, 25°C, 40°C) and six loading frequencies (0.1 Hz, 0.3 Hz, 1 Hz, 3 Hz, 10 Hz, 20 Hz) in accordance with EN 12697-26, in the linear viscoelasticity (LVE) range for small strains from 25 to 50 με. The master curves of the complex modulus were constructed using the Richards model for the mixtures typically incorporated in the structural layers, i.e., SMA11, AC16W, AC22P and MCAS. The moduli characterizing the particular layers were determined with the temperature distribution in the structure taken into account, for a surface temperature of 40°C. The stress distribution was established for those calculation models, and the stress values were used to evaluate the fatigue life under controlled stress conditions (IT-FT). This evaluation, with the controlled stress corresponding to that in the structure, facilitated the quality assessment of the rehabilitated recycled base course. The results showed that recycled base mixtures whose indirect tensile strength (ITSDRY) is similar to the stress in the structure under analysis need an additional fatigue life evaluation in the indirect tensile test (ITT). This approach to recycled base quality assessment will help eliminate damage induced by overloading.
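
    For orientation, a hedged generic statement of the sigmoid underlying such master curves (the coefficients fitted in the study are not reproduced): a Richards-type master curve expresses the modulus as a function of reduced frequency f_r roughly as

        \log |E^*| = \delta + \frac{\alpha}{\left[1 + \lambda\, e^{\,\beta + \gamma \log f_r}\right]^{1/\lambda}},

    where \delta is the lower asymptote, \alpha the span between asymptotes, and \beta, \gamma, \lambda shape parameters; the reduced frequency combines loading frequency and temperature through time-temperature shift factors.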

  3. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are described, and important concepts such as the optimization problem for honing operations, the optimal structure of honing working cycles, stepped and stepless honing cycles, and the simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece over a sufficiently wide operating range and can be used to operate the CC743 CNC machine.

  4. Probabilistic simulation of the human factor in structural reliability

    NASA Astrophysics Data System (ADS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-09-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).
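
    A hedged reminder of the general multifactor interaction equation (MFIE) form used in this line of work (the specific factors, exponents and limit values of the study are not reproduced): each influence is represented by a ratio term raised to an empirical exponent, and the terms are multiplied,

        \frac{S}{S_0} = \prod_{i=1}^{n} \left(\frac{A_{F,i} - A_i}{A_{F,i} - A_{0,i}}\right)^{a_i},

    where S/S_0 is the property (here, performance) ratio, A_i the current value of factor i, A_{0,i} its reference value, A_{F,i} its limiting value, and a_i an empirical exponent.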

  5. Probabilistic Simulation of the Human Factor in Structural Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Singhal, Surendra N.

    1994-01-01

    The formal approach described herein computationally simulates the probable ranges of uncertainties for the human factor in probabilistic assessments of structural reliability. Human factors such as marital status, professional status, home life, job satisfaction, work load, and health are studied by using a multifactor interaction equation (MFIE) model to demonstrate the approach. Parametric studies in conjunction with judgment are used to select reasonable values for the participating factors (primitive variables). Subsequently performed probabilistic sensitivity studies assess the suitability of the MFIE as well as the validity of the whole approach. Results show that uncertainties range from 5 to 30 percent for the most optimistic case, assuming 100 percent for no error (perfect performance).

  6. Weak data do not make a free lunch, only a cheap meal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew, E-mail: dauter@anl.gov

    2014-02-01

    Refinement and analysis of four structures with various data resolution cutoffs suggests that at present there are no reliable criteria for judging the diffraction data resolution limit and the condition I/σ(I) = 2.0 is reasonable. However, extending the limit by about 0.2 Å beyond the resolution defined by this threshold does not deteriorate the quality of refined structures and in some cases may be beneficial. Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as Rmerge and I/σ(I), optical resolution and the correlation coefficients CC1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and Rfree as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting a too conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.

  7. The Application of COMSOL Multiphysics Package on the Modelling of Complex 3-D Lithospheric Electrical Resistivity Structures - A Case Study from the Proterozoic Orogenic belt within the North China Craton

    NASA Astrophysics Data System (ADS)

    Guo, L.; Yin, Y.; Deng, M.; Guo, L.; Yan, J.

    2017-12-01

    At present, most magnetotelluric (MT) forward modelling and inversion codes are based on the finite difference method, but its structured mesh gridding cannot be well adapted to conditions with arbitrary topography or complex tectonic structures. By contrast, the finite element method is more accurate in calculating complex and irregular 3-D regions and has a lower requirement of function smoothness; however, the complexity of mesh gridding and the limitations of computer capacity have restricted its application. COMSOL Multiphysics is a cross-platform finite element analysis, solver and multiphysics full-coupling simulation package. It achieves highly accurate numerical simulations with high computational performance and outstanding multi-field bi-directional coupling analysis capability, and its AC/DC and RF modules can be used to calculate the electromagnetic responses of complex geological structures with relative ease. Using an adaptive unstructured grid, the calculation is much faster. In order to improve the discretization of the computational domain, we use the combination of Matlab and COMSOL Multiphysics to establish a general procedure for calculating the MT responses of arbitrary resistivity models. The calculated responses include the surface electric and magnetic field components, impedance components, magnetic transfer functions and phase tensors. The reliability of this procedure is then verified by 1-D, 2-D, 3-D and anisotropic forward modeling tests. Finally, we establish a 3-D lithospheric resistivity model for the Proterozoic Wutai-Hengshan Mts. within the North China Craton by fitting the real MT data collected there. The reliability of the model is also verified by induction vectors and phase tensors. Our model shows more detail and better resolution than the previously published 3-D model based on the finite difference method. In conclusion, the COMSOL Multiphysics package is suitable for modeling 3-D lithospheric resistivity structures under complex tectonic deformation backgrounds and could be a good complement to existing finite-difference inversion algorithms.

  8. Cross-cultural adaptation of the Female Genital Self-Image Scale (FGSIS) in Iranian female college students.

    PubMed

    Pakpour, Amir H; Zeidi, Isa Mohammadi; Ziaeiha, Masoumeh; Burri, Andrea

    2014-01-01

    The aim of the present study was to investigate the psychometric properties of a translated and culturally adapted Iranian version of the Female Genital Self-Image Scale (FGSIS-I) in a sample of college women. Further, the relationship between women's self-image, body appreciation, sexual functioning, and gynecological exam behavior was explored. A sample of 1,877 female students from five different universities across Qazvin and Tehran completed the Female Sexual Function Index (FSFI), the Body Appreciation Scale (BAS), the Rosenberg Self-Esteem Scale (RSES), the FGSIS-I, and a gynecological exam behavior questionnaire. Good to excellent internal consistency reliability, test-retest reliability, and convergent and construct validity were found. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) both provided a two-factor structure for the FGSIS-I. The validity of the FGSIS-I in predicting gynecological exam behavior of college women was tested using structural equation modeling (SEM). The final model accounted for 33% of the variance in gynecological exam behavior (p < 0.01). In conclusion, the FGSIS-I was found to be a highly valid and reliable instrument to assess female genital self-image in Iranian women.

  9. Thermal Cycling Life Prediction of Sn-3.0Ag-0.5Cu Solder Joint Using Type-I Censored Data

    PubMed Central

    Mi, Jinhua; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-01-01

    Because solder joint interconnections are the weak points of microelectronic packaging, their reliability has a great influence on the reliability of the entire packaging structure. Based on an accelerated life test, the reliability assessment and life prediction of lead-free solder joints using the Weibull distribution are investigated. Type-I interval-censored lifetime data were collected from a thermal cycling test implemented on microelectronic packages with lead-free ball grid array (BGA) and fine-pitch ball grid array (FBGA) interconnection structures. The number of cycles to failure of the lead-free solder joints is predicted by using a modified Engelmaier fatigue life model and a type-I censored data processing method. The Pan model is then employed to calculate the acceleration factor of this test. A comparison of life predictions between the proposed method and those calculated directly by Matlab and Minitab is conducted to demonstrate the practicability and effectiveness of the proposed method. Finally, failure analysis and microstructure evolution of the lead-free solders are carried out to provide useful guidance for regular maintenance, replacement of substructures, and subsequent processing of electronic products. PMID:25121138
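
    For context, a hedged statement of the baseline Engelmaier (Coffin-Manson type) low-cycle fatigue relation that such modified models start from (the specific modification and the Pan acceleration-factor parameters used in the study are not reproduced):

        N_f = \frac{1}{2}\left(\frac{\Delta\gamma}{2\,\varepsilon_f'}\right)^{1/c},

    where N_f is the mean number of cycles to failure, \Delta\gamma the cyclic shear-strain range, \varepsilon_f' the fatigue ductility coefficient, and c a temperature- and frequency-dependent fatigue ductility exponent.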

  10. A Psychometric Analysis of the Italian Version of the eHealth Literacy Scale Using Item Response and Classical Test Theory Methods

    PubMed Central

    Dima, Alexandra Lelia; Schulz, Peter Johannes

    2017-01-01

    Background: The eHealth Literacy Scale (eHEALS) is a tool to assess consumers’ comfort and skills in using information technologies for health. Although evidence exists for the reliability and construct validity of the scale, less agreement exists on its structural validity. Objective: The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Methods: Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items in the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. Results: CFA showed a suboptimal model fit for both models. IRT analyses confirmed that all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of the factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. Conclusions: The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers’ eHealth literacy. PMID:28400356

  11. Mathematical programming models for the economic design and assessment of wind energy conversion systems

    NASA Astrophysics Data System (ADS)

    Reinert, K. A.

    The use of linear decision rules (LDR) and chance constrained programming (CCP) to optimize the performance of wind energy conversion clusters coupled to storage systems is described. Storage is modelled by LDR and output by CCP. The linear allocation rule and linear release rule prescribe the size of, and optimize, a storage facility with a bypass. Chance constraints are introduced to treat reliability explicitly in terms of an appropriate value from an inverse cumulative distribution function. Details of the deterministic programming structure and a sample problem involving a 500 kW and a 1.5 MW WECS are provided, considering an installed cost of $1/kW. Four demand patterns and three levels of reliability are analyzed to optimize the generator choice and the storage configuration for base load and peak operating conditions. Deficiencies in the ability to predict reliability and to account for serial correlations are noted in the model, which is nonetheless concluded to be useful for narrowing WECS design options.
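
    A hedged illustration of the deterministic equivalent that chance-constrained programming relies on (generic form, not the specific WECS formulation of the record): a requirement to meet a random demand D with delivered energy g(x) at reliability level \alpha,

        \Pr\{\, g(x) \ge D \,\} \ge \alpha \quad\Longleftrightarrow\quad g(x) \ge F_D^{-1}(\alpha),

    where F_D^{-1} is the inverse cumulative distribution function of D; this is the "appropriate value from an inverse cumulative distribution function" referred to above.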

  12. Components of Mathematics Anxiety: Factor Modeling of the MARS30-Brief

    PubMed Central

    Pletzer, Belinda; Wood, Guilherme; Scherndl, Thomas; Kerschbaum, Hubert H.; Nuerk, Hans-Christoph

    2016-01-01

    Mathematics anxiety involves feelings of tension, discomfort, high arousal, and physiological reactivity that interfere with number manipulation and mathematical problem solving. Several factor analytic models indicate that mathematics anxiety is a multidimensional rather than a unitary construct. However, the factor structure of mathematics anxiety has not yet been fully clarified; this issue is addressed in the current study. The Mathematics Anxiety Rating Scale (MARS) is a reliable measure of mathematics anxiety (Richardson and Suinn, 1972), for which several reduced forms have been developed. Most recently, a shortened version of the MARS (MARS30-brief) with comparable reliability was published. Different studies suggest that mathematics anxiety involves up to seven different factors. Here we examined the factor structure of the MARS30-brief by means of confirmatory factor analysis. The best model fit was obtained by a six-factor model, splitting each of the two known general factors, “Mathematical Test Anxiety” (MTA) and “Numerical Anxiety” (NA), into three factors. However, a more parsimonious five-factor model with two sub-factors for MTA and three for NA fitted the data comparably well. Factors were differentially susceptible to sex differences and differences between majors. Measurement invariance for sex was established. PMID:26924996

  13. Components of Mathematics Anxiety: Factor Modeling of the MARS30-Brief.

    PubMed

    Pletzer, Belinda; Wood, Guilherme; Scherndl, Thomas; Kerschbaum, Hubert H; Nuerk, Hans-Christoph

    2016-01-01

    Mathematics anxiety involves feelings of tension, discomfort, high arousal, and physiological reactivity that interfere with number manipulation and mathematical problem solving. Several factor analytic models indicate that mathematics anxiety is a multidimensional rather than a unitary construct. However, the factor structure of mathematics anxiety has not yet been fully clarified; this issue is addressed in the current study. The Mathematics Anxiety Rating Scale (MARS) is a reliable measure of mathematics anxiety (Richardson and Suinn, 1972), for which several reduced forms have been developed. Most recently, a shortened version of the MARS (MARS30-brief) with comparable reliability was published. Different studies suggest that mathematics anxiety involves up to seven different factors. Here we examined the factor structure of the MARS30-brief by means of confirmatory factor analysis. The best model fit was obtained by a six-factor model, splitting each of the two known general factors, "Mathematical Test Anxiety" (MTA) and "Numerical Anxiety" (NA), into three factors. However, a more parsimonious five-factor model with two sub-factors for MTA and three for NA fitted the data comparably well. Factors were differentially susceptible to sex differences and differences between majors. Measurement invariance for sex was established.

  14. Waves at Navigation Structures

    DTIC Science & Technology

    2015-10-30

    upgrades the Coastal Modeling System (CMS) wave models CMS-Wave, a phase-averaged spectral wave model, and BOUSS-2D, a Boussinesq-type nonlinear wave... developing WaveNet and TideNet, two Web-based tool systems for wind and wave data access and processing, which provide critical data for USACE project... practical applications, resulting in optimization of navigation systems to improve safety, reliability and operations with innovative infrastructure

  15. The neural processing of hierarchical structure in music and speech at different timescales

    PubMed Central

    Farbood, Morwaread M.; Heeger, David J.; Marcus, Gary; Hasson, Uri; Lerner, Yulia

    2015-01-01

    Music, like speech, is a complex auditory signal that contains structures at multiple timescales, and as such is a potentially powerful entry point into the question of how the brain integrates complex streams of information. Using an experimental design modeled after previous studies that used scrambled versions of a spoken story (Lerner et al., 2011) and a silent movie (Hasson et al., 2008), we investigate whether listeners perceive hierarchical structure in music beyond short (~6 s) time windows and whether there is cortical overlap between music and language processing at multiple timescales. Experienced pianists were presented with an extended musical excerpt scrambled at multiple timescales—by measure, phrase, and section—while measuring brain activity with functional magnetic resonance imaging (fMRI). The reliability of evoked activity, as quantified by inter-subject correlation of the fMRI responses, was measured. We found that response reliability depended systematically on musical structure coherence, revealing a topographically organized hierarchy of processing timescales. Early auditory areas (at the bottom of the hierarchy) responded reliably in all conditions. For brain areas at the top of the hierarchy, the original (unscrambled) excerpt evoked more reliable responses than any of the scrambled excerpts, indicating that these brain areas process long-timescale musical structures, on the order of minutes. The topography of processing timescales was analogous with that reported previously for speech, but the timescale gradients for music and speech overlapped with one another only partially, suggesting that temporally analogous structures—words/measures, sentences/musical phrases, paragraph/sections—are processed separately. PMID:26029037

  16. Probabilistic structural analysis methods for improving Space Shuttle engine reliability

    NASA Technical Reports Server (NTRS)

    Boyce, L.

    1989-01-01

    Probabilistic structural analysis methods are particularly useful in the design and analysis of critical structural components and systems that operate in very severe and uncertain environments. These methods have recently found application in space propulsion systems to improve the structural reliability of Space Shuttle Main Engine (SSME) components. A computer program, NESSUS, based on a deterministic finite-element program and a method of probabilistic analysis (fast probability integration) provides probabilistic structural analysis for selected SSME components. While computationally efficient, it considers both correlated and nonnormal random variables as well as an implicit functional relationship between independent and dependent variables. The program is used to determine the response of a nickel-based superalloy SSME turbopump blade. Results include blade tip displacement statistics due to the variability in blade thickness, modulus of elasticity, Poisson's ratio or density. Modulus of elasticity significantly contributed to blade tip variability while Poisson's ratio did not. Thus, a rational method for choosing parameters to be modeled as random is provided.

  17. An examination of the psychometric structure of the Multidimensional Pain Inventory in temporomandibular disorder patients: a confirmatory factor analysis

    PubMed Central

    Andreu, Yolanda; Galdon, Maria J; Durá, Estrella; Ferrando, Maite; Pascual, Juan; Turk, Dennis C; Jiménez, Yolanda; Poveda, Rafael

    2006-01-01

    Background This paper seeks to analyse the psychometric and structural properties of the Multidimensional Pain Inventory (MPI) in a sample of temporomandibular disorder patients. Methods The internal consistency of the scales was obtained. Confirmatory Factor Analysis was carried out to test the MPI structure section by section in a sample of 114 temporomandibular disorder patients. Results Nearly all scales obtained good reliability indexes. The original structure could not be totally confirmed. However, with a few adjustments we obtained a satisfactory structural model of the MPI which was slightly different from the original: certain items and the Self control scale were eliminated; in two cases, two original scales were grouped in one factor, Solicitous and Distracting responses on the one hand, and Social activities and Away from home activities, on the other. Conclusion The MPI has been demonstrated to be a reliable tool for the assessment of pain in temporomandibular disorder patients. Some divergences to be taken into account have been clarified. PMID:17169143

  18. Modified chloride diffusion model for concrete under the coupling effect of mechanical load and chloride salt environment

    NASA Astrophysics Data System (ADS)

    Lei, Mingfeng; Lin, Dayong; Liu, Jianwen; Shi, Chenghua; Ma, Jianjun; Yang, Weichao; Yu, Xiaoniu

    2018-03-01

    For the purpose of investigating the durability of lining concrete, this study derives a modified chloride diffusion model for concrete based on the odd continuation of boundary conditions and the Fourier transform. To achieve this, the linear stress distribution over a sectional structure is considered, and detailed procedures and methods are presented for model verification and parametric analysis. Simulation results show that the chloride diffusion model can reflect, with reliable accuracy, the effects of the linear stress distribution over the sectional structure on chloride diffusivity. Reference value ranges of the model parameters are provided, together with the natural environmental characteristics of practical engineering structures. Furthermore, the chloride diffusion model is extended to consider the multi-factor coupling of linear stress distribution, chloride concentration and diffusion time. Comparison between the model simulations and typical current research results shows that the presented model accounts for more influencing factors and has greater generality.
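
    For orientation, the unmodified baseline (not the stress-coupled model derived in the record): the classical one-dimensional chloride diffusion model solves Fick's second law,

        \frac{\partial C}{\partial t} = D\,\frac{\partial^2 C}{\partial x^2},
        \qquad
        C(x,t) = C_s\left[1 - \operatorname{erf}\!\left(\frac{x}{2\sqrt{D t}}\right)\right],

    for a constant surface concentration C_s and apparent diffusion coefficient D; the modified model additionally lets the diffusivity depend on the linear stress distribution over the section.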

  19. Impact of Rating Scale Categories on Reliability and Fit Statistics of the Malay Spiritual Well-Being Scale using Rasch Analysis.

    PubMed

    Daher, Aqil Mohammad; Ahmad, Syed Hassan; Winn, Than; Selamat, Mohd Ikhsan

    2015-01-01

    Few studies have employed item response theory in examining reliability. We conducted this study to examine the effect of rating scale categories (RSCs) on the reliability and fit statistics of the Malay Spiritual Well-Being Scale, employing the Rasch model. The Malay Spiritual Well-Being Scale (SWBS), with the original six RSCs and with newly structured three- and four-category versions, was distributed randomly among three different samples of 50 participants each. The mean age of respondents in the three samples ranged between 36 and 39 years. The majority were female in all samples, and Islam was the most prevalent religion among the respondents. The predominant race was Malay, followed by Chinese and Indian. The original six RSCs indicated the best targeting (0.99) and the smallest model error (0.24). The Infit MnSq (mean square) and Zstd (standardized Z) of the six RSCs were 1.1 and -0.1, respectively. The six RSCs achieved the highest person and item reliabilities, of 0.86 and 0.85 respectively, which yielded the highest person (2.46) and item (2.38) separation indices compared with the other RSCs. The person and item reliability and, to a lesser extent, the fit statistics were better with the six RSCs than with the four and three RSCs.
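
    For reference, a hedged statement of the Andrich rating scale form of the Rasch model that such category analyses are based on (notation assumed, not quoted from the record):

        P(X_{ni} = x) = \frac{\exp\sum_{j=0}^{x}(\theta_n - \delta_i - \tau_j)}
                             {\sum_{k=0}^{m}\exp\sum_{j=0}^{k}(\theta_n - \delta_i - \tau_j)},
        \qquad \tau_0 \equiv 0,

    where \theta_n is the person measure, \delta_i the item difficulty, \tau_j the j-th category threshold shared across items, and m is one fewer than the number of rating scale categories.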

  20. NASTRAN analysis of Tokamak vacuum vessel using interactive graphics

    NASA Technical Reports Server (NTRS)

    Miller, A.; Badrian, M.

    1978-01-01

    Isoparametric quadrilateral and triangular elements were used to represent the vacuum vessel shell structure. For toroidally symmetric loadings, MPCs were employed across model boundaries and rigid format 24 was invoked. Nonsymmetric loadings required the use of the cyclic symmetry analysis available with rigid format 49. NASTRAN served as an important analysis tool in the Tokamak design effort by providing a reliable means for assessing structural integrity. Interactive graphics were employed in the finite element model generation and in the post-processing of results. It was felt that model generation and checkout with interactive graphics reduced the modelling effort and debugging man-hours significantly.

  1. Theoretical relationship between vibration transmissibility and driving-point response functions of the human body.

    PubMed

    Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z

    2013-11-25

    The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying transmissibility values used as references for calibrating the system models. As an example application, this study used the developed theory to perform a preliminary analysis of the method for calibrating models using both vibration transmissibility and DPRFs. The results of the analysis show that the combined method can theoretically result in a unique and valid solution of the model parameters, at least for linear systems. However, the validation of the method itself does not guarantee the validation of the calibrated model, because the validation of the calibration also depends on the model structure and the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
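
    Restating the summarised relationship in symbols (a hedged paraphrase, assuming the DPRFs are expressed in apparent-mass form): if the body is idealised as mass elements m_i with transmissibility functions T_i(\omega), the sum of the driving-point response functions M_k(\omega) satisfies

        \sum_k M_k(\omega) = \sum_i m_i\, T_i(\omega),

    i.e. a linear combination of the individual transmissibilities weighted by the distributed masses, which is what allows transmissibility and DPRF measurements to be used jointly to constrain model parameters.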

  2. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  3. The Psychometric Properties of the Center for Epidemiologic Studies Depression Scale in Chinese Primary Care Patients: Factor Structure, Construct Validity, Reliability, Sensitivity and Responsiveness

    PubMed Central

    2015-01-01

    Background: The Center for Epidemiologic Studies Depression Scale (CES-D) is a commonly used instrument to measure depressive symptomatology. Despite this, the evidence for its psychometric properties remains poorly established in Chinese populations. The aim of this study was to validate the use of the CES-D in Chinese primary care patients by examining its factor structure, construct validity, reliability, sensitivity and responsiveness. Methods and Results: The psychometric properties were assessed in a sample of 3686 Chinese adult primary care patients in Hong Kong. Three competing factor structure models were examined using confirmatory factor analysis. The original CES-D four-factor structure model had adequate fit; however, the data were better fitted by a bi-factor model. For internal construct validity, corrected item-total correlations were 0.4 for most items. Convergent validity was assessed by examining the correlations between the CES-D, the Patient Health Questionnaire 9 (PHQ-9) and the Short Form-12 Health Survey (version 2) Mental Component Summary (SF-12 v2 MCS); the CES-D had a strong correlation with the PHQ-9 (coefficient: 0.78) and the SF-12 v2 MCS (coefficient: -0.75). Internal consistency was assessed by McDonald’s omega hierarchical (ωH). The ωH value for the general depression factor was 0.855; the ωH values for “somatic”, “depressed affect”, “positive affect” and “interpersonal problems” were 0.434, 0.038, 0.738 and 0.730, respectively. For the two-week test-retest reliability, the intraclass correlation coefficient was 0.91. The CES-D was sensitive in detecting differences between known groups, with an AUC >0.7. Internal responsiveness of the CES-D in detecting positive and negative changes was satisfactory (p value <0.01 and all effect size statistics >0.2). The CES-D was externally responsive, with an AUC >0.7. Conclusions: The CES-D appears to be a valid, reliable, sensitive and responsive instrument for screening and monitoring depressive symptoms in adult Chinese primary care patients. In its original four-factor and bi-factor structure, the CES-D is supported for cross-cultural comparisons of depression in multi-center studies. PMID:26252739

  4. The Shutdown Dissociation Scale (Shut-D)

    PubMed Central

    Schalinski, Inga; Schauer, Maggie; Elbert, Thomas

    2015-01-01

    The evolutionary model of the defense cascade by Schauer and Elbert (2010) provides a theoretical frame for a short interview to assess problems underlying and leading to the dissociative subtype of posttraumatic stress disorder. Based on known characteristics of the defense stages “fright,” “flag,” and “faint,” we designed a structured interview to assess the vulnerability to the respective types of dissociation. Most of the scales that assess dissociative phenomena are designed as self-report questionnaires, and their items are usually selected based on heuristic considerations rather than a theoretical model, thus including anything from minor dissociative experiences to major pathological dissociation. The shutdown dissociation scale (Shut-D) was applied in several studies in patients with a history of multiple traumatic events and different disorders previously shown to be prone to symptoms of dissociation. The goal of the present investigation was to obtain the psychometric characteristics of the Shut-D (including factor structure, internal consistency, retest reliability, and predictive, convergent and criterion-related concurrent validity). A total of 225 patients and 68 healthy controls were assessed. The Shut-D appears to have sufficient internal reliability, excellent retest reliability, high convergent validity, and satisfactory predictive validity, while the summed score of the scale reliably separates patients with exposure to trauma (in different diagnostic groups) from healthy controls. The Shut-D is a brief structured interview for assessing the vulnerability to dissociate as a consequence of exposure to traumatic stressors. The scale demonstrates high-quality psychometric properties and may be useful for researchers and clinicians in assessing shutdown dissociation as well as in predicting the risk of dissociative responding. PMID:25976478

  5. The School Counseling Program Implementation Survey: Initial Instrument Development and Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Clemens, Elysia V.; Carey, John C.; Harrington, Karen M.

    2010-01-01

    This article details the initial development of the School Counseling Program Implementation Survey and psychometric results including reliability and factor structure. An exploratory factor analysis revealed a three-factor model that accounted for 54% of the variance of the intercorrelation matrix and a two-factor model that accounted for 47% of…

  6. Reliability and validity of general health questionnaire (GHQ-12) for male tannery workers: a study carried out in Kanpur, India.

    PubMed

    Kashyap, Gyan Chandra; Singh, Shri Kant

    2017-03-21

    The purpose of this study was to test the reliability, validity and factor structure of the GHQ-12 questionnaire among male tannery workers in India; three different factor models of the GHQ-12 were tested. This paper used primary data obtained from a cross-sectional household study of tannery workers from the Jajmau area of the city of Kanpur in northern India, which was conducted during January-June 2015 as part of a doctoral program. The study covered 286 tannery workers from the study area. An interview schedule containing the GHQ-12 was used for tannery workers who had completed at least 1 year in their present occupation preceding the survey. Cronbach's alpha was used to test reliability, and convergent validity was also examined. Confirmatory factor analysis was used to compare three factor structures for the GHQ-12. A total of 286 responses were analyzed in this study. The mean age of the tannery workers was 38 years (SD = 1.42). We found the alpha coefficient to be 0.93 for the complete sample, which represents acceptable internal consistency for all the groups; each item of the scale showed almost the same internal consistency of 0.93 for the male tannery workers. The correlation between factor 1 (Anxiety and Depression) and factor 2 (Social Dysfunction) was 0.92, and the correlation between factor 1 (Anxiety and Depression) and factor 3 (Loss of Confidence) was the highest, at 0.98. The comparative fit index (CFI) indicated the best fit for model III, with a value of 0.97, and the SRMR was lowest (0.031) for model III. The findings suggest that the Hindi version of the GHQ-12 is a reliable and valid tool for measuring psychological distress in male tannery workers of Kanpur city, India. The study found that the model proposed by Graetz was the best-fitting model for the data.

  7. Evaluating the Effect of Minimizing Screws on Stabilization of Symphysis Mandibular Fracture by 3D Finite Element Analysis.

    PubMed

    Kharmanda, Ghias; Kharma, Mohamed-Yaser

    2017-06-01

    The objective of this work is to integrate structural optimization and reliability concepts into the mini-plate fixation strategy used for symphysis mandibular fractures; the structural reliability levels are then estimated considering a single failure mode and multiple failure modes. A 3-dimensional finite element model is developed in order to evaluate the ability to reduce the negative effects of stabilizing the fracture. A topology optimization process is used in the conceptual design stage to predict possible fixation layouts. In the detailed design stage, suitable mini-plates are selected taking into account the resulting topology and different anatomical considerations. Several muscle forces are considered in order to obtain realistic predictions. Since some muscles can be cut or harmed during surgery and cannot operate at maximum capacity, there is a strong motivation to introduce loading uncertainties in order to obtain reliable designs. The structural reliability analysis is carried out for a single failure mode and for multiple failure modes. The results are validated with a clinical case of a male patient with a symphysis fracture, in which an upper plate with four holes was used for fixation but only two screws were applied in order to protect an adjacent vital structure; this did not affect the stability of the fracture. The proposed strategy for optimizing bone plates leads to fewer complications and second surgeries, less patient discomfort, and shorter healing time.

  8. STAMPS: development and verification of swallowing kinematic analysis software.

    PubMed

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop novel swallowing kinematic analysis software, called the spatio-temporal analyzer for motion and physiologic study (STAMPS), and to verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. The software was constructed to acquire, process, and analyze swallowing motion data. The targeted swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and using an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00 (P < 0.001) for displacement and velocity. The Bland-Altman plots showed good agreement between the measurements and the reference values. STAMPS provides precise and reliable kinematic measurements and multiple practical functionalities for spatiotemporal analysis. The software is expected to be useful for researchers who are interested in swallowing motion analysis.

  9. Advances and trends in computational structural mechanics

    NASA Technical Reports Server (NTRS)

    Noor, A. K.

    1986-01-01

    Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.

  10. Plate and butt-weld stresses beyond elastic limit, material and structural modeling

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1991-01-01

    Ultimate safety factors of high-performance structures depend on stress behavior beyond the elastic limit, a region that is not well understood. An analytical modeling approach was developed to gain fundamental insights into the inelastic responses of simple structural elements. Nonlinear material properties were expressed in engineering stress and strain variables and combined with strength-of-materials stress and strain equations, in a manner similar to a numerical piecewise-linear method; the integrations are continuous, which allows for more detailed solutions. Results include the classical combined axial tension and bending load model and the conversion of strain-gauge readings to stress beyond the elastic limit. Material discontinuity stress factors in butt-welds were also derived. This is a working-type document with analytical methods and results applicable to all industries producing high-reliability structures.

  11. Validity and Reliability of the 8-Item Work Limitations Questionnaire.

    PubMed

    Walker, Timothy J; Tullar, Jessica M; Diamond, Pamela M; Kohl, Harold W; Amick, Benjamin C

    2017-12-01

    Purpose To evaluate the factorial validity, scale reliability, test-retest reliability, convergent validity, and discriminant validity of the 8-item Work Limitations Questionnaire (WLQ) among employees from a public university system. Methods A secondary analysis of de-identified data from employees who completed an annual Health Assessment between 2009 and 2015 was used to test the research aims. Confirmatory factor analysis (CFA) (n = 10,165) tested the latent structure of the 8-item WLQ. Scale reliability was determined using a CFA-based approach, while test-retest reliability was determined using the intraclass correlation coefficient. Convergent/discriminant validity was tested by evaluating relations of the 8-item WLQ with health/performance variables for convergent validity (health-related work performance, number of chronic conditions, and general health) and with demographic variables for discriminant validity (gender and institution type). Results A 1-factor model with three correlated residuals demonstrated excellent model fit (CFI = 0.99, TLI = 0.99, RMSEA = 0.03, and SRMR = 0.01). The scale reliability was acceptable (0.69, 95% CI 0.68-0.70) and the test-retest reliability was very good (ICC = 0.78). Low-to-moderate associations were observed between the 8-item WLQ and the health/performance variables, while weak associations were observed with the demographic variables. Conclusions The 8-item WLQ demonstrated sufficient reliability and validity among employees from a public university system. Results suggest the 8-item WLQ is a usable alternative for studies when the more comprehensive 25-item WLQ is not available.
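
    The scale reliability reported above comes from a CFA-based approach rather than coefficient alpha. As a rough illustration of that idea, composite (McDonald's omega) reliability for a summed score can be computed from standardized loadings and residual variances of a one-factor model. The loadings below are invented for the example, and the sketch ignores the correlated residuals mentioned in the abstract.

    ```python
    import numpy as np

    # Hypothetical standardized loadings for an 8-item, one-factor model.
    loadings = np.array([0.55, 0.60, 0.48, 0.52, 0.58, 0.50, 0.45, 0.62])
    residuals = 1.0 - loadings**2  # residual variances under standardization

    # Composite (omega) reliability of the summed item score.
    omega = loadings.sum()**2 / (loadings.sum()**2 + residuals.sum())
    print(f"omega = {omega:.3f}")
    ```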

  12. Exploring the validity and reliability of a questionnaire for evaluating veterinary clinical teachers' supervisory skills during clinical rotations.

    PubMed

    Boerboom, T B B; Dolmans, D H J M; Jaarsma, A D C; Muijtjens, A M M; Van Beukelen, P; Scherpbier, A J J A

    2011-01-01

    Feedback to aid teachers in improving their teaching requires validated evaluation instruments. When implementing an evaluation instrument in a different context, it is important to collect validity evidence from multiple sources. We examined the validity and reliability of the Maastricht Clinical Teaching Questionnaire (MCTQ) as an instrument to evaluate individual clinical teachers during short clinical rotations in veterinary education. We examined four sources of validity evidence: (1) Content was examined based on theory of effective learning. (2) Response process was explored in a pilot study. (3) Internal structure was assessed by confirmatory factor analysis using 1086 student evaluations and reliability was examined utilizing generalizability analysis. (4) Relations with other relevant variables were examined by comparing factor scores with other outcomes. Content validity was supported by theory underlying the cognitive apprenticeship model on which the instrument is based. The pilot study resulted in an additional question about supervision time. A five-factor model showed a good fit with the data. Acceptable reliability was achievable with 10-12 questionnaires per teacher. Correlations between the factors and overall teacher judgement were strong. The MCTQ appears to be a valid and reliable instrument to evaluate clinical teachers' performance during short rotations.

  13. Recent advances in computational structural reliability analysis methods

    NASA Astrophysics Data System (ADS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  14. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  15. Surveying the factor structure and reliability of the Persian version of the Jefferson Scale of Physician Lifelong Learning (JeffSPLL) in staff of medical sciences.

    PubMed

    Karimi, Fatemeh Zahra; Alesheikh, Aytay; Pakravan, Soheila; Abdollahi, Mahbubeh; Damough, Mozhdeh; Anbaran, Zahra Khosravi; Farahani, Leila Amiri

    2017-10-01

    In medical sciences, commitment to lifelong learning has been expressed as an important element. Today, due to the rapid development of medical information and technology, lifelong learning is critical for safe medical care and development in medical research. The JeffSPLL is one of the scales for measuring lifelong learning among medical sciences staff, and it had never been used in Iran. The aim of the present study was to determine the factor structure and reliability of the Persian version of the JeffSPLL among Persian-speaking staff of universities of medical sciences in Iran. This was a methodological, cross-sectional study conducted in 2012-2013, in which 210 staff members of Birjand University of Medical Sciences were selected. The data collection tool was the Persian version of the JeffSPLL. Confirmatory factor analysis was used to investigate the factor structure of the tool, and model fit was evaluated with goodness-of-fit indices: the root mean square error of approximation (RMSEA), the ratio of chi-square to its degrees of freedom, the comparative fit index (CFI), and the root mean square residual (RMR). Cronbach's alpha was employed to investigate the reliability of the tool. Data analysis was conducted using LISREL 8.8 and SPSS 20 software. Confirmatory factor analysis showed that the RMSEA was close to 0.1, and the CFI and GFI were close to one; therefore, the four-factor model was considered appropriate. Cronbach's alpha was 0.92 for the whole tool and between 0.82 and 0.89 for the subscales. The present study verified the four-factor structure of the 19-item Persian version of the JeffSPLL, comprising professional learning beliefs and motivation, scholarly activities, attention to learning opportunities, and technical skills in information seeking. In addition, the tool has acceptable reliability and is therefore appropriate for assessing lifelong learning in the Persian-speaking staff population.
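
    The reliability figures quoted above (0.92 overall, 0.82-0.89 per subscale) are Cronbach's alpha values. A minimal sketch of how alpha is computed from an item-score matrix is given below; the scores are random placeholders, not study data, so the resulting alpha will be low.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, n_items) matrix of item scores."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    rng = np.random.default_rng(1)
    # Hypothetical 4-point responses for 19 items from 210 respondents (random,
    # uncorrelated placeholder data, so alpha will be near zero here).
    scores = rng.integers(1, 5, size=(210, 19)).astype(float)
    print(f"alpha = {cronbach_alpha(scores):.3f}")
    ```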

  16. Structural models for the design of novel antiviral agents against Greek Goat Encephalitis

    PubMed Central

    Papageorgiou, Louis; Loukatou, Styliani; Koumandou, Vassiliki Lila; Makałowski, Wojciech; Megalooikonomou, Vasileios

    2014-01-01

    The Greek Goat Encephalitis virus (GGE) belongs to the Flaviviridae family of the genus Flavivirus. The GGE virus constitutes an important pathogen of livestock that infects the goat’s central nervous system. The viral enzymes of GGE, helicase and RNA-dependent RNA polymerase (RdRP), are ideal targets for inhibitor design, since those enzymes are crucial for the virus’ survival, proliferation and transmission. In an effort to understand the molecular structure underlying the functions of those viral enzymes, the three dimensional structures of GGE NS3 helicase and NS5 RdRP have been modelled. The models were constructed in silico using conventional homology modelling techniques and the known 3D crystal structures of solved proteins from closely related species as templates. The established structural models of the GGE NS3 helicase and NS5 RdRP have been evaluated for their viability using a repertoire of in silico tools. The goal of this study is to present the 3D conformations of the GGE viral enzymes as reliable structural models that could provide the platform for the design of novel anti-GGE agents. PMID:25392762

  17. Mobility Device Quality Affects Participation Outcomes for People With Disabilities: A Structural Equation Modeling Analysis.

    PubMed

    Magasi, Susan; Wong, Alex; Miskovic, Ana; Tulsky, David; Heinemann, Allen W

    2018-01-01

    To test the effect that indicators of mobility device quality have on participation outcomes in community-dwelling adults with spinal cord injury, traumatic brain injury, and stroke by using structural equation modeling. Survey, cross-sectional study, and model testing. Clinical research space at 2 academic medical centers and 1 free-standing rehabilitation hospital. Community-dwelling adults (N=250; mean age, 48±14.3y) with spinal cord injury, traumatic brain injury, and stroke. Not applicable. The Mobility Device Impact Scale, Patient-Reported Outcomes Measurement Information System Social Function (version 2.0) scale, including Ability to Participate in Social Roles and Activities and Satisfaction with Social Roles and Activities, and the 2 Community Participation Indicators' enfranchisement scales. Details about device quality (reparability, reliability, ease of maintenance) and device type were also collected. Respondents used ambulation aids (30%), manual (34%), and power wheelchairs (30%). Indicators of device quality had a moderating effect on participation outcomes, with 3 device quality variables (repairability, ease of maintenance, device reliability) accounting for 20% of the variance in participation. Wheelchair users reported lower participation enfranchisement than did ambulation aid users. Mobility device quality plays an important role in participation outcomes. It is critical that people have access to mobility devices and that these devices be reliable. Copyright © 2017 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  18. Reliable structural interpretation of small-angle scattering data from bio-molecules in solution--the importance of quality control and a standard reporting framework.

    PubMed

    Jacques, David A; Guss, Jules Mitchell; Trewhella, Jill

    2012-05-17

    Small-angle scattering is becoming an increasingly popular tool for the study of bio-molecular structures in solution. The large number of publications with 3D-structural models generated from small-angle solution scattering data has led to a growing consensus for the need to establish a standard reporting framework for their publication. The International Union of Crystallography recently established a set of guidelines for the necessary information required for the publication of such structural models. Here we describe the rationale for these guidelines and the importance of standardising the way in which small-angle scattering data from bio-molecules and associated structural interpretations are reported.

  19. Space Flight Cable Model Development

    NASA Technical Reports Server (NTRS)

    Spak, Kaitlin

    2013-01-01

    This work continues the modeling efforts presented in last year's VSGC conference paper, "Model Development for Cable-Harnessed Beams." The focus is narrowed to modeling space-flight cables only, as a reliable damped cable model is not yet readily available and is necessary for continued modeling of cable-harnessed space structures. New experimental data are presented, eliminating the low-frequency noise that plagued the first year's efforts. The distributed transfer function method is applied to a single section of space-flight cable using Euler-Bernoulli and shear beam formulations. The work presented here will be developed into a damped cable model that can be incorporated into an interconnected beam-cable system. The overall goal is to accurately predict natural frequencies and modal damping ratios for cabled space structures.
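
    For context on the Euler-Bernoulli formulation mentioned above, the analytical natural frequencies of a uniform beam follow f_n = (beta_n L)^2 / (2 pi L^2) * sqrt(EI / (rho A)). The short sketch below evaluates this for a hypothetical cantilevered cable section; the property values are illustrative assumptions, not those of the space-flight cable tested.

    ```python
    import numpy as np

    # Hypothetical uniform beam/cable-section properties (SI units, assumed values).
    E, I = 2.0e9, 5.0e-12      # modulus (Pa) and area moment of inertia (m^4)
    rho, A = 1400.0, 2.0e-5    # density (kg/m^3) and cross-sectional area (m^2)
    L = 0.5                    # length (m)

    # First three dimensionless roots (beta_n * L) for a cantilevered Euler-Bernoulli beam.
    betaL = np.array([1.8751, 4.6941, 7.8548])

    # Natural frequencies in Hz: f_n = (beta_n L)^2 / (2*pi*L^2) * sqrt(E*I/(rho*A))
    f_n = betaL**2 / (2.0 * np.pi * L**2) * np.sqrt(E * I / (rho * A))
    print(np.round(f_n, 2))
    ```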

  20. A Model of High-Frequency Self-Mixing in Double-Barrier Rectifier

    NASA Astrophysics Data System (ADS)

    Palma, Fabrizio; Rao, R.

    2018-03-01

    In this paper, a new model of the frequency dependence of the double-barrier THz rectifier is presented. The structure is of interest because it can be realized with CMOS image sensor technology. Its application in a complex field such as that of THz receivers requires a reliable analytical model that highlights the dependence on the parameters of the physical structure. The model is based on the hydrodynamic semiconductor equations, solved in the small-signal approximation. It depicts the mechanisms of THz modulation of the charge in the depleted regions of the double-barrier device and explains the self-mixing process, the frequency dependence, and the detection capability of the structure. The model thus substantially improves on the analytical models of THz rectification available in the literature, which are mainly based on lumped equivalent circuits.

  1. A predictive framework for evaluating models of semantic organization in free recall

    PubMed Central

    Morton, Neal W; Polyn, Sean M.

    2016-01-01

    Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure: latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS), and find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization is driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search. PMID:28331243

  2. The Transition Readiness Assessment Questionnaire (TRAQ): its factor structure, reliability, and validity.

    PubMed

    Wood, David L; Sawicki, Gregory S; Miller, M David; Smotherman, Carmen; Lukens-Bull, Katryne; Livingood, William C; Ferris, Maria; Kraemer, Dale F

    2014-01-01

    National consensus statements recommend that providers regularly assess the transition readiness skills of adolescents and young adults (AYA). In 2010 we developed a 29-item version of the Transition Readiness Assessment Questionnaire (TRAQ). We reevaluated item performance and factor structure, and reassessed the TRAQ's reliability and validity. We surveyed youth from 3 academic clinics in Jacksonville, Florida; Chapel Hill, North Carolina; and Boston, Massachusetts. Participants were AYA with special health care needs aged 14 to 21 years. From a convenience sample of 306 patients, we conducted item reduction strategies and exploratory factor analysis (EFA). On a second convenience sample of 221 patients, we conducted confirmatory factor analysis (CFA). Internal reliability was assessed with Cronbach's alpha, and criterion validity was examined using the Wilcoxon rank sum test and mixed linear models. The item reduction and EFA resulted in a 20-item scale with 5 identified subscales. The CFA conducted on the second sample provided a good fit to the data. The overall scale has high reliability (Cronbach's alpha = .94), with good reliability for 4 of the 5 subscales (Cronbach's alpha ranging from .90 to .77 in the pooled sample). Each of the 5 subscale scores was significantly higher for adolescents aged 18 years and older versus those younger than 18 (P < .0001) in both univariate and multivariate analyses. The 20-item, 5-factor structure of the TRAQ is supported by EFA and CFA on independent samples and has good internal reliability and criterion validity. Additional work is needed to expand or revise the TRAQ subscales and test their predictive validity. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  3. Testing the factor structure of the Family Quality of Life Survey - 2006.

    PubMed

    Isaacs, B; Wang, M; Samuel, P; Ajuwon, P; Baum, N; Edwards, M; Rillotta, F

    2012-01-01

    Although the Family Quality of Life Survey - 2006 (FQOLS-2006) is being used in research, there is little evidence to support its hypothesised domain structure. The purpose of this study was to test the domain structure of the survey using confirmatory factor analysis. Samples from Australia, Canada, Nigeria and the USA were analysed using structural equation modelling. The data from Australia, Canada and the USA were combined on the assumption that these countries are similar, at least to some degree, in economic development, language and culture. The Nigerian data were analysed on their own. The analysis was undertaken in two phases. First, the hypothesis that each of the nine domains of the FQOLS-2006 is a unidimensional construct that can reliably measure the dimensions Importance, Opportunities, Initiative, Attainment, Stability and Satisfaction was tested. Second, the hypothesis that family quality of life (FQoL) is a single latent construct represented by the nine domains measured in the FQOLS-2006 was tested. In the first phase of the analysis, the Importance dimension was dropped because of skewness and lack of variance. The Stability dimension did not fit well within the individual domain model in either the Nigerian or the combined three countries' data. When Importance and Stability were excluded, the individual domain models showed good or acceptable fit when error variances of some dimensions were allowed to correlate. In the second phase of the analysis, the overall model, FQoL, represented by the nine domains of the FQOLS-2006, showed good fit in both data sets. The conceptual model of the FQOLS-2006 was supported with some qualifications. Each domain on the survey can be reliably measured by four dimensions: Opportunities, Initiative, Attainment and Satisfaction. The dimensions of Importance and Stability, however, did not fit. Data reported on these dimensions from past and current studies should be interpreted with caution. The construct of FQoL is also reliably measured by the domains of the FQOLS-2006. Further research into the psychometric properties of the survey, particularly from a cross-cultural perspective, is needed. © 2011 The Authors. Journal of Intellectual Disability Research © 2011 Blackwell Publishing Ltd.

  4. Comparative Reliability of Structured Versus Unstructured Interviews in the Admission Process of a Residency Program

    PubMed Central

    Blouin, Danielle; Day, Andrew G.; Pavlov, Andrey

    2011-01-01

    Background Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. Methods In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Results Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. Conclusions A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program. PMID:23205201

  5. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    PubMed

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program.

  6. Validity and reliability of bilingual English-Arabic version of Schutte self report emotional intelligence scale in an undergraduate Arab medical student sample.

    PubMed

    Naeem, Naghma; Muijtjens, Arno

    2015-04-01

    The psychological construct of emotional intelligence (EI), its theoretical models, measurement instruments and applications have been the subject of several research studies in health professions education. The objective of the current study was to investigate the factorial validity and reliability of a bilingual version of the Schutte Self Report Emotional Intelligence Scale (SSREIS) in an undergraduate Arab medical student population. The study was conducted during April-May 2012. A cross-sectional survey design was employed. A sample (n = 467) was obtained from undergraduate medical students at the male and female medical colleges of King Saud University, Riyadh, Saudi Arabia. Exploratory and confirmatory factor analysis was performed using SPSS 16.0 and AMOS 4.0 statistical software to determine the factor structure. Reliability was determined using Cronbach's alpha statistics. The results obtained from the undergraduate Arab medical student sample supported a multidimensional, three-factor structure of the SSREIS. The three factors are Optimism, Awareness-of-Emotions and Use-of-Emotions. The reliability (Cronbach's alpha) for the three subscales was 0.76, 0.72 and 0.55, respectively. Emotional intelligence is a multifactorial construct (three factors). The bilingual version of the SSREIS is a valid and reliable measure of trait emotional intelligence in an undergraduate Arab medical student population.

  7. Topology Optimization of Lightweight Lattice Structural Composites Inspired by Cuttlefish Bone

    NASA Astrophysics Data System (ADS)

    Hu, Zhong; Gadipudi, Varun Kumar; Salem, David R.

    2018-03-01

    Lattice structural composites are of great interest to various industries where lightweight multifunctionality is important, especially aerospace. However, strong coupling among the composition, microstructure, porous topology, and fabrication of such materials impedes conventional trial-and-error experimental development. In this work, a discontinuous carbon fiber reinforced polymer matrix composite was adopted for structural design. A reliable and robust design approach for developing lightweight multifunctional lattice structural composites was proposed, inspired by biomimetics and based on topology optimization. Three-dimensional periodic lattice blocks were initially designed, inspired by the cuttlefish bone microstructure. The topologies of the three-dimensional periodic blocks were further optimized by computer modeling, and the mechanical properties of the topology optimized lightweight lattice structures were characterized by computer modeling. The lattice structures with optimal performance were identified.

  8. Lewis Structures Technology, 1988. Volume 2: Structural Mechanics

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Lewis Structures Division performs research in support of aerospace engine structures and disseminates the results. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The engineering community was familiarized with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics computer codes, computational methods for dynamics, structural optimization, and applications of structural dynamics.

  9. Closed-form solution of decomposable stochastic models

    NASA Technical Reports Server (NTRS)

    Sjogren, Jon A.

    1990-01-01

    Markov and semi-Markov processes are increasingly being used in the modeling of complex reconfigurable systems (fault tolerant computers). The estimation of the reliability (or some measure of performance) of the system reduces to solving the process for its state probabilities. Such a model may exhibit numerous states and complicated transition distributions, contributing to an expensive and numerically delicate solution procedure. Thus, when a system exhibits a decomposition property, either structurally (autonomous subsystems), or behaviorally (component failure versus reconfiguration), it is desirable to exploit this decomposition in the reliability calculation. In interesting cases there can be failure states which arise from non-failure states of the subsystems. Equations are presented which allow the computation of failure probabilities of the total (combined) model without requiring a complete solution of the combined model. This material is presented within the context of closed-form functional representation of probabilities as utilized in the Symbolic Hierarchical Automated Reliability and Performance Evaluator (SHARPE) tool. The techniques adopted enable one to compute such probability functions for a much wider class of systems at a reduced computational cost. Several examples show how the method is used, especially in enhancing the versatility of the SHARPE tool.
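
    As a rough illustration of the decomposition idea described above (not the SHARPE closed-form machinery itself), the sketch below solves two small, independent continuous-time Markov subsystems and combines their state probabilities to obtain the probability that the combined system has failed, without building the full product-space generator. The rates and the failure condition are invented for the example.

    ```python
    import numpy as np
    from scipy.linalg import expm

    def state_probs(Q: np.ndarray, p0: np.ndarray, t: float) -> np.ndarray:
        """Transient state probabilities of a CTMC with generator Q at time t."""
        return p0 @ expm(Q * t)

    # Subsystem A: states (up, down), failure rate 1e-3/h, repair rate 1e-1/h.
    QA = np.array([[-1e-3, 1e-3],
                   [ 1e-1, -1e-1]])
    # Subsystem B: states (up, down), failure rate 5e-4/h, no repair.
    QB = np.array([[-5e-4, 5e-4],
                   [ 0.0,  0.0 ]])

    t = 1000.0  # mission time in hours (assumed)
    pA = state_probs(QA, np.array([1.0, 0.0]), t)
    pB = state_probs(QB, np.array([1.0, 0.0]), t)

    # Combined-system failure: the system is down if A is down OR B is down
    # (valid here because the subsystems evolve independently).
    p_fail = 1.0 - pA[0] * pB[0]
    print(f"P(system failed at t={t:.0f} h) = {p_fail:.4f}")
    ```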

  10. Implementation strategies influence the structure, process and outcome of quality systems: an empirical study of hospital departments in Sweden.

    PubMed

    Kunkel, S; Rosenqvist, U; Westerling, R

    2009-02-01

    To analyse whether the organisation of quality systems (structure, process, and outcome) is related to how these systems were implemented (implementation prerequisites, cooperation between managers and staff, and source of initiative). A questionnaire was developed, piloted and distributed to 600 hospital departments. Questions were included to reflect implementation prerequisites (adequate resources, competence, problem-solving capacity and high expectations), cooperative implementation, source of initiative (manager, staff and purchaser), structure (resources and administration), process (culture and cooperation) and outcome (goal evaluation and competence development). The adjusted response rate was 75%. Construct validity and reliability were assessed by confirmatory factor analysis, and Cronbach alpha scores were calculated. The relationships among the variables were analysed with structural equation modelling in LISREL. Implementation prerequisites were highly related to structure (0.51) and process (0.33). Cooperative implementation was associated with process (0.26) and outcome (0.34). High manager initiative was related to structure (0.19) and process (0.17). The numbers in parentheses can be interpreted as correlations. Construct validity was good, and reliability was excellent for all factors (Cronbach alpha > 0.78). The model was a good representation of reality (model fit p value = 0.082). The implementation of organisationally demanding quality systems may require managers to direct and lead the process while assuring that their staff get opportunities to contribute to the planning and designing of the new system. This would correspond to a cooperative implementation strategy rather than to top-down or bottom-up strategies. The results of this study could be used to adjust implementation processes.

  11. A Fresh Start for Flood Estimation in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Woods, R. A.

    2017-12-01

    The two standard methods for flood estimation in ungauged basins, regression-based statistical models and rainfall-runoff models using a design rainfall event, have survived relatively unchanged as the methods of choice for more than 40 years. Their technical implementation has developed greatly, but the models' representation of hydrological processes has not, despite a large volume of hydrological research. I suggest it is time to introduce more hydrology into flood estimation. The reliability of the current methods can be unsatisfactory. For example, despite the UK's relatively straightforward hydrology, regression estimates of the index flood are uncertain by +/- a factor of two (for a 95% confidence interval), an impractically large uncertainty for design. The standard error of rainfall-runoff model estimates is not usually known, but available assessments indicate poorer reliability than statistical methods. There is a practical need for improved reliability in flood estimation. Two promising candidates to supersede the existing methods are (i) continuous simulation by rainfall-runoff modelling and (ii) event-based derived distribution methods. The main challenge with continuous simulation methods in ungauged basins is to specify the model structure and parameter values when calibration data are not available. This has been an active area of research for more than a decade, and this activity is likely to continue. The major challenges for the derived distribution method in ungauged catchments include not only the correct specification of model structure and parameter values, but also antecedent conditions (e.g. seasonal soil water balance). However, a much smaller community of researchers is active in developing or applying the derived distribution approach, and as a result slower progress is being made. A change is needed: surely we have learned enough about hydrology in the last 40 years that we can make a practical hydrological advance on our methods for flood estimation! A shift to new methods for flood estimation will not be taken lightly by practitioners. However, the standard for change is clear: can we develop new methods which give significant improvements in reliability over existing methods that are demonstrably unsatisfactory?

  12. Reliability-based criteria for load and resistance factor design code for wood bridges

    Treesearch

    Chris Eamon; Andrzej S. Nowak; Michael A. Ritter; Joe Murphy

    2000-01-01

    Recently AASHTO adopted a load and resistance factor design code for highway bridges. The new code provides a rational basis for the design of steel and concrete structures. However, the calibration was not done for wood bridges. Therefore, there is a need to fill this gap. The development of statistical models for wood bridge structures is discussed. Recent test...

  13. Bioresorbable polymer coated drug eluting stent: a model study.

    PubMed

    Rossi, Filippo; Casalini, Tommaso; Raffa, Edoardo; Masi, Maurizio; Perale, Giuseppe

    2012-07-02

    In drug-eluting stent technologies, increased demand for better control, higher reliability, and enhanced performance of drug delivery systems has emerged in recent years, offering the opportunity to introduce model-based approaches aimed at overcoming the remarkable limits of trial-and-error methods. In this context a mathematical model was studied, based on detailed conservation equations and taking into account the main physical-chemical mechanisms involved in polymeric coating degradation, drug release, and restenosis inhibition. It allowed highlighting the interdependence between the factors affecting each of these phenomena and, in particular, the influence of stent design parameters on drug antirestenotic efficacy. The proposed model is therefore aimed at simulating the diffusional release under both in vitro and in vivo conditions: results were verified against various literature data, confirming the reliability of the parameter estimation procedure. The hierarchical structure of the model also allows the set of equations describing restenosis evolution to be easily modified to enhance model reliability, taking advantage of the deep understanding of the physiological mechanisms governing the different stages of smooth muscle cell growth and proliferation. In addition, thanks to its simplicity and its very low system requirements and central processing unit (CPU) time, the model provides immediate views of system behavior.

  14. Towards a chromatographic similarity index to establish localised quantitative structure-retention relationships for retention prediction. II Use of Tanimoto similarity index in ion chromatography.

    PubMed

    Park, Soo Hyun; Talebi, Mohammad; Amos, Ruth I J; Tyteca, Eva; Haddad, Paul R; Szucs, Roman; Pohl, Christopher A; Dolan, John W

    2017-11-10

    Quantitative Structure-Retention Relationships (QSRR) are used to predict retention times of compounds based only on their chemical structures, encoded by molecular descriptors. The main concern in QSRR modelling is to build models with high predictive power, allowing reliable retention prediction for unknown compounds across the chromatographic space. With the aim of enhancing the prediction power of the models, in this work our previously proposed QSRR modelling approach, called "federation of local models", is extended to ion chromatography to predict retention times of unknown ions, where a local model for each target ion (unknown) is created using only structurally similar ions from the dataset. A Tanimoto similarity (TS) score was utilised as a measure of structural similarity, and training sets were developed by including ions that were similar to the target ion, as defined by a threshold value. Prediction of the retention parameters (a- and b-values) of the linear solvent strength (LSS) model in ion chromatography, log k = a - b log[eluent], allows the prediction of retention times under all eluent concentrations. The QSRR models for a- and b-values were developed by a genetic algorithm-partial least squares method using the retention data of inorganic and small organic anions and larger organic cations (molecular mass up to 507) on four Thermo Fisher Scientific columns (AS20, AS19, AS11HC and CS17). The corresponding predicted retention times were calculated by fitting the predicted a- and b-values of the models into the LSS model equation. The predicted retention times were also plotted against the experimental values to evaluate the goodness of fit and the predictive power of the models. The application of a TS threshold of 0.6 was found to successfully produce predictive and reliable QSRR models (Q2ext(F2) > 0.8 and Mean Absolute Error < 0.1), and hence accurate retention time predictions with an average Mean Absolute Error of 0.2 min. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
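
    The retention model used here is the linear solvent strength relation log k = a - b log[eluent]; once a and b are predicted (or fitted), the retention factor k and hence the retention time follow for any eluent concentration. A minimal sketch of that final step is shown below, with invented a/b values, column dead time, and eluent concentrations rather than values from the paper.

    ```python
    import numpy as np

    def retention_time(a: float, b: float, eluent_conc_mM: float, t0_min: float) -> float:
        """LSS model: log10(k) = a - b*log10([eluent]); retention time t_R = t0*(1 + k)."""
        log_k = a - b * np.log10(eluent_conc_mM)
        k = 10.0 ** log_k
        return t0_min * (1.0 + k)

    # Hypothetical predicted retention parameters for one anion on one column.
    a, b = 2.1, 1.4      # illustrative values, not from the cited dataset
    t0 = 2.0             # column dead time in minutes (assumed)

    for conc in (10.0, 20.0, 40.0):   # eluent concentration in mM
        print(f"[eluent] = {conc:4.0f} mM  ->  t_R = {retention_time(a, b, conc, t0):.2f} min")
    ```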

  15. XAFS of short-lived reduction products of structural and functional models of the [Fe Fe] hydrogenase H-cluster

    NASA Astrophysics Data System (ADS)

    Bondin, Mark I.; Borg, Stacey J.; Cheah, Mun-Hon; Best, Stephen P.

    2006-11-01

    Thiolate-bridged diiron compounds that are related to the active site of the [Fe-Fe] hydrogenase enzyme have been shown to act as electrocatalysts for the reduction of protons. The use of XAFS for clarification of the structures of intermediates formed following reduction of related diiron carbonyl compounds is described. These measurements allow the determination of Fe-Fe and Fe-S bond lengths with good reliability and, when used in conjunction with standard bonding models, they provide a means of validating the structures proposed for longer-lived (t > 20 s at -50 °C) reaction intermediates.

  16. 3D-quantitative structure-activity relationship study for the design of novel enterovirus A71 3C protease inhibitors.

    PubMed

    Nie, Quandeng; Xu, Xiaoyi; Zhang, Qi; Ma, Yuying; Yin, Zheng; Shang, Luqing

    2018-06-07

    A three-dimensional quantitative structure-activity relationship model of enterovirus A71 3C protease inhibitors was constructed in this study. The protein-ligand interaction fingerprint was analyzed to generate a pharmacophore model. A predictive and reliable three-dimensional quantitative structure-activity relationship model was built based on the Flexible Alignment of AutoGPA. Moreover, three novel compounds (I-III) were designed and evaluated for their biochemical activity against the 3C protease and for anti-enterovirus A71 activity in vitro. Compound III exhibited excellent inhibitory activity (IC50 = 0.031 ± 0.005 μM, EC50 = 0.036 ± 0.007 μM). Thus, this study provides a useful quantitative structure-activity relationship model for developing potent inhibitors of the enterovirus A71 3C protease. This article is protected by copyright. All rights reserved.

  17. Modeling and numerical simulations of growth and morphologies of three dimensional aggregated silver films

    NASA Astrophysics Data System (ADS)

    Davis, L. J.; Boggess, M.; Kodpuak, E.; Deutsch, M.

    2012-11-01

    We report on a model for the deposition of three dimensional, aggregated nanocrystalline silver films, and an efficient numerical simulation method developed for visualizing such structures. We compare our results to a model system comprising chemically deposited silver films with morphologies ranging from dilute, uniform distributions of nanoparticles to highly porous aggregated networks. Disordered silver films grown in solution on silica substrates are characterized using digital image analysis of high resolution scanning electron micrographs. While the latter technique provides little volume information, plane-projected (two dimensional) island structure and surface coverage may be reliably determined. Three parameters governing film growth are evaluated using these data and used as inputs for the deposition model, greatly reducing computing requirements while still providing direct access to the complete (bulk) structure of the films throughout the growth process. We also show how valuable three dimensional characteristics of the deposited materials can be extracted using the simulated structures.

  18. Development of the Japanese version of the Council on Nutrition Appetite Questionnaire and its simplified versions, and evaluation of their reliability, validity, and reproducibility.

    PubMed

    Tokudome, Yuko; Okumura, Keiko; Kumagai, Yoshiko; Hirano, Hirohiko; Kim, Hunkyung; Morishita, Shiho; Watanabe, Yutaka

    2017-11-01

    Because few Japanese questionnaires assess the elderly's appetite, there is an urgent need to develop an appetite questionnaire with verified reliability, validity, and reproducibility. We translated and back-translated the Council on Nutrition Appetite Questionnaire (CNAQ), which has eight items, into Japanese (CNAQ-J), as well as the Simplified Nutritional Appetite Questionnaire (SNAQ-J), which includes four CNAQ-J-derived items. Using structural equation modeling, we examined the CNAQ-J structure based on data of 649 Japanese elderly people in 2013, including individuals having a certain degree of cognitive impairment, and we developed the SNAQ for the Japanese elderly (SNAQ-JE) according to an exploratory factor analysis. Confirmatory factor analyses on the appetite questionnaires were conducted to probe fitting to the model. We computed Cronbach's α coefficients and criterion-referenced/-related validity figures examining associations of the three appetite battery scores with body mass index (BMI) values and with nutrition-related questionnaire values. Test-retest reproducibility of appetite tools was scrutinized over an approximately 2-week interval. An exploratory factor analysis demonstrated that the CNAQ-J was constructed of one factor (appetite), yielding the SNAQ-JE, which includes four questions derived from the CNAQ-J. The three appetite instruments showed almost equivalent fitting to the model and reproducibility. The CNAQ-J and SNAQ-JE demonstrated satisfactory reliability and significant criterion-referenced/-related validity values, including BMIs, but the SNAQ-J included a low factor-loading item, exhibited less satisfactory reliability and had a non-significant relationship to BMI. The CNAQ-J and SNAQ-JE may be applied to assess the appetite of Japanese elderly, including persons with some cognitive impairment. Copyright © 2017 The Authors. Production and hosting by Elsevier B.V. All rights reserved.

  19. Structural design considerations for micromachined solid-oxide fuel cells

    NASA Astrophysics Data System (ADS)

    Srikar, V. T.; Turner, Kevin T.; Andrew Ie, Tze Yung; Spearing, S. Mark

    Micromachined solid-oxide fuel cells (μSOFCs) are among a class of devices being investigated for portable power generation. Optimization of the performance and reliability of such devices requires robust, scale-dependent, design methodologies. In this first analysis, we consider the structural design of planar, electrolyte-supported, μSOFCs from the viewpoints of electrochemical performance, mechanical stability and reliability, and thermal behavior. The effect of electrolyte thickness on fuel cell performance is evaluated using a simple analytical model. Design diagrams that account explicitly for thermal and intrinsic residual stresses are presented to identify geometries that are resistant to fracture and buckling. Analysis of energy loss due to in-plane heat conduction highlights the importance of efficient thermal isolation in microscale fuel cell design.
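
    One of the stability checks mentioned above is buckling of the thin electrolyte membrane under residual compressive stress. A minimal sketch of the classical thin-plate buckling estimate, sigma_cr = k * pi^2 * E / (12 * (1 - nu^2)) * (t/a)^2, is given below; the material properties, membrane size, and buckling coefficient are illustrative assumptions, not values from the paper.

    ```python
    import math

    # Hypothetical electrolyte membrane properties (illustrative, not from the paper).
    E = 200e9        # Young's modulus, Pa
    nu = 0.3         # Poisson's ratio
    t = 0.5e-6       # membrane thickness, m
    a = 100e-6       # membrane side length, m
    k = 4.0          # plate buckling coefficient (simply supported edges, uniaxial compression)

    # Classical critical buckling stress for a thin square plate.
    sigma_cr = k * math.pi**2 * E / (12.0 * (1.0 - nu**2)) * (t / a)**2
    print(f"critical buckling stress ~ {sigma_cr / 1e6:.1f} MPa")
    ```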

  20. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation.

    PubMed

    Finnerty, Justin John; Peyser, Alexander; Carloni, Paolo

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model based on atomistic structural information, cation hydration state and without tuned parameters that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores.

  1. Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.

    1990-01-01

    This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems demonstrating various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
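
    CARES evaluates fast-fracture failure probability from element stress results using the two-parameter Weibull distribution. The following is a highly simplified sketch of that kind of weakest-link calculation for a uniaxial stress field; the Weibull modulus, scale parameter, and element data are invented, and the Batdorf multiaxial treatment used in the real code is not reproduced here.

    ```python
    import numpy as np

    # Hypothetical two-parameter Weibull strength parameters (volume-flaw based, assumed).
    m = 10.0          # Weibull modulus
    sigma_0 = 400.0   # Weibull scale parameter (MPa, unit volume basis)

    # Hypothetical finite-element results: element volumes (m^3) and max principal stress (MPa).
    volumes = np.array([1e-7, 2e-7, 1.5e-7, 5e-8])
    stresses = np.array([250.0, 310.0, 180.0, 330.0])

    # Weakest-link (series) combination of per-element risk-of-rupture contributions.
    risk = np.sum(volumes * (np.clip(stresses, 0.0, None) / sigma_0) ** m)
    p_failure = 1.0 - np.exp(-risk)
    print(f"component fast-fracture failure probability ~ {p_failure:.3e}")
    ```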

  2. Methodology of homogeneous and non-homogeneous Markov Chains for modeling bridge element deterioration.

    DOT National Transportation Integrated Search

    2008-08-01

    Bridge management is an important activity of transportation agencies in the US : and in many other countries. A critical aspect of bridge management is to reliably predict : the deterioration of bridge structures, so that appropriate or optimal acti...
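
    As a generic illustration of the homogeneous Markov chain approach to element deterioration (the transition probabilities below are invented, not values from the report), the condition-state distribution after n years is obtained by repeated multiplication with the one-year transition matrix:

    ```python
    import numpy as np

    # Hypothetical one-year transition matrix over condition states 1 (best) .. 4 (worst).
    P = np.array([
        [0.90, 0.10, 0.00, 0.00],
        [0.00, 0.85, 0.15, 0.00],
        [0.00, 0.00, 0.80, 0.20],
        [0.00, 0.00, 0.00, 1.00],   # worst state is absorbing
    ])

    state = np.array([1.0, 0.0, 0.0, 0.0])   # element starts in condition state 1
    for years in (5, 10, 20):
        dist = state @ np.linalg.matrix_power(P, years)
        print(f"year {years:2d}: " + ", ".join(f"{p:.3f}" for p in dist))
    ```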

  3. Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.

    DOT National Transportation Integrated Search

    1985-06-01

    The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...

  4. A new modal-based approach for modelling the bump foil structure in the simultaneous solution of foil-air bearing rotor dynamic problems

    NASA Astrophysics Data System (ADS)

    Bin Hassan, M. F.; Bonello, P.

    2017-05-01

    Recently-proposed techniques for the simultaneous solution of foil-air bearing (FAB) rotor dynamic problems have been limited to a simple bump foil model in which the individual bumps were modelled as independent spring-damper (ISD) subsystems. The present paper addresses this limitation by introducing a modal model of the bump foil structure into the simultaneous solution scheme. The dynamics of the corrugated bump foil structure are first studied using the finite element (FE) technique. This study is experimentally validated using a purpose-made corrugated foil structure. Based on the findings of this study, it is proposed that the dynamics of the full foil structure, including bump interaction and foil inertia, can be represented by a modal model comprising a limited number of modes. This full foil structure modal model (FFSMM) is then adapted into the rotordynamic FAB problem solution scheme, instead of the ISD model. Preliminary results using the FFSMM under static and unbalance excitation conditions are proven to be reliable by comparison against the corresponding ISD foil model results and by cross-correlating different methods for computing the deflection of the full foil structure. The rotor-bearing model is also validated against experimental and theoretical results in the literature.

  5. Illustrated structural application of universal first-order reliability method

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide the benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases, with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, it should also be applicable to most high-performance air and surface transportation systems.
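
    For a simple stress-strength case, a first-order reliability calculation reduces to a reliability (safety) index and a corresponding failure probability. The sketch below shows that textbook form for normally distributed, independent strength and stress, with illustrative numbers rather than anything from the report.

    ```python
    from math import sqrt
    from statistics import NormalDist

    # Hypothetical normal strength (resistance) and stress (load effect), in MPa.
    mu_R, sd_R = 520.0, 30.0
    mu_S, sd_S = 400.0, 45.0

    # First-order reliability index for the limit state g = R - S.
    beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)
    p_f = NormalDist().cdf(-beta)
    print(f"beta = {beta:.2f}, P_f = {p_f:.2e}")
    ```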

  6. A method for mandibular dental arch superimposition using 3D cone beam CT and orthodontic 3D digital model

    PubMed Central

    Park, Tae-Joon; Lee, Sang-Hyun

    2012-01-01

    Objective The purpose of this study was to develop a superimposition method for the lower arch using 3-dimensional (3D) cone beam computed tomography (CBCT) images and an orthodontic 3D digital model. Methods Integrated 3D CBCT images were acquired by substituting the dental portion of the 3D CBCT images with the precise dental images of an orthodontic 3D digital model. Images were acquired before and after treatment. Two superimposition methods were designed. Surface superimposition was based on the basal bone structure of the mandible, using surface-to-surface matching (best-fit method). Plane superimposition was based on anatomical structures (the mental and lingual foramina). For the evaluation, 10 landmarks including teeth and anatomic structures were assigned, and superimpositions and measurements were performed 30 times to determine the more reproducible and reliable method. Results All landmarks demonstrated that the surface superimposition method produced relatively more consistent coordinate values. The mean distances of the measured landmark values from their means were statistically significantly lower with the surface superimposition method. Conclusions Between the 2 superimposition methods designed for the evaluation of 3D changes in the lower arch, surface superimposition was the simpler, more reproducible, and more reliable method. PMID:23112948

  7. Large space telescope engineering scale model optical design

    NASA Technical Reports Server (NTRS)

    Facey, T. A.

    1973-01-01

    The objective is to develop the detailed design and tolerance data for the LST engineering scale model optical system. This will enable MSFC to move forward to the optical element procurement phase and also to evaluate tolerances, manufacturing requirements, assembly/checkout procedures, reliability, operational complexity, stability requirements of the structure and thermal system, and the flexibility to change and grow.

  8. Internal consistency and stability of the CANTAB neuropsychological test battery in children.

    PubMed

    Syväoja, Heidi J; Tammelin, Tuija H; Ahonen, Timo; Räsänen, Pekka; Tolvanen, Asko; Kankaanpää, Anna; Kantomaa, Marko T

    2015-06-01

    The Cambridge Neuropsychological Test Automated Battery (CANTAB) is a computer-assessed test battery widely used in different populations. The internal consistency and 1-year stability of CANTAB tests were examined in school-age children. Two hundred thirty children (57% girls) from five schools in the Jyväskylä school district in Finland participated in the study in spring 2011. The children completed the following CANTAB tests: (a) visual memory (pattern recognition memory [PRM] and spatial recognition memory [SRM]), (b) executive function (spatial span [SSP], Stockings of Cambridge [SOC], and intra-extra dimensional set shift [IED]), and (c) attention (reaction time [RTI] and rapid visual information processing [RVP]). Seventy-four children participated in the follow-up measurements (64% girls) in spring 2012. Cronbach's alpha reliability coefficient was used to estimate the internal consistency of the nonhampering tests, and structural equation models were applied to examine the stability of these tests. The reliability and the stability could not be determined for IED or SSP because of the nature of these tests. The internal consistency was acceptable only in the RTI task. The 1-year stability was moderate-to-good for the PRM, RTI, and RVP. The SSP and IED showed a moderate correlation between the two measurement points. The SRM and the SOC tasks were not reliable or stable measures in this study population. For research purposes, we recommend using structural equation modeling to improve reliability. The results suggest that the reliability and the stability of computer-based test batteries should be confirmed in the target population before using them for clinical or research purposes. (c) 2015 APA, all rights reserved.
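    For reference, Cronbach's alpha is computed from the item variances and the variance of the summed score. The short sketch below uses simulated item scores (hypothetical data, not the CANTAB measurements) to show the calculation.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)      # variance of total score
    return k / (k - 1) * (1 - item_var / total_var)

# Illustrative data: 200 simulated respondents, 6 items driven by one common
# factor plus noise (hypothetical, not the study's data).
rng = np.random.default_rng(1)
ability = rng.normal(size=(200, 1))
items = ability + 0.8 * rng.normal(size=(200, 6))
print(f"alpha = {cronbach_alpha(items):.2f}")
```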

  9. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
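    A highly simplified sketch of the segment-wise idea, assuming a single additive model bias with a conjugate normal prior and a crude consistency check standing in for the paper's model-validation step; the model, data, and thresholds below are all invented for illustration.

```python
import numpy as np

# Simplified stand-in for the adaptive scheme: a model bias parameter is
# updated only from observation segments where the model is judged valid.
rng = np.random.default_rng(2)

def model_prediction(x):
    return 2.0 * x                          # hypothetical computational model

x_obs = np.linspace(0.0, 10.0, 200)
truth = 2.0 * x_obs + 1.0                   # true response: constant bias of 1
truth[x_obs > 7.5] += 5.0                   # region the model misrepresents badly
y_obs = truth + rng.normal(scale=0.5, size=x_obs.size)

mu, tau2, sigma2 = 0.0, 10.0, 0.5**2        # prior N(mu, tau2) on bias; known noise

for lo, hi in [(0, 50), (50, 100), (100, 150), (150, 200)]:
    resid = y_obs[lo:hi] - model_prediction(x_obs[lo:hi])
    n = resid.size
    # Crude validation check: is the segment's mean residual consistent with
    # the current posterior on the bias? If not, skip it (model invalid there).
    z = abs(resid.mean() - mu) / np.sqrt(tau2 + sigma2 / n)
    if z > 3.0:
        print(f"segment {lo}-{hi}: model judged invalid (z = {z:.1f}), skipped")
        continue
    post_prec = 1.0 / tau2 + n / sigma2     # conjugate normal update
    mu = (mu / tau2 + resid.sum() / sigma2) / post_prec
    tau2 = 1.0 / post_prec
    print(f"segment {lo}-{hi}: posterior bias = {mu:.2f} +/- {np.sqrt(tau2):.2f}")
```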

  10. The Norwegian Computerized Adaptive Test of Personality Disorder-Static Form (CAT-PD-SF): Reliability, Factor Structure, and Relationships With Personality Functioning.

    PubMed

    Thimm, Jens C

    2017-12-01

    The Computerized Adaptive Test of Personality Disorder-Static Form (CAT-PD-SF) is a self-report inventory developed to assess pathological personality traits. The current study explored the reliability and higher order factor structure of the Norwegian version of the CAT-PD-SF and the relationships between the CAT-PD traits and domains of personality functioning in an undergraduate student sample ( N = 375). In addition to the CAT-PD-SF, the short form of the Severity Indices of Personality Problems and the Brief Symptom Inventory were administered. The results showed that the Norwegian CAT-PD-SF has good score reliability. Factor analysis of the CAT-PD-SF scales indicated five superordinate factors that correspond to the trait domains of the alternative DSM-5 model for personality disorders. The CAT-PD traits were highly predictive of impaired personality functioning after controlling for psychological distress. It is concluded that the CAT-PD-SF is a promising tool for the assessment of personality disorder traits.

  11. Reliability assessment of slender concrete columns at the stability failure

    NASA Astrophysics Data System (ADS)

    Valašík, Adrián; Benko, Vladimír; Strauss, Alfred; Täubling, Benjamin

    2018-01-01

    The European Standard for designing concrete columns using non-linear methods shows deficiencies in terms of global reliability in cases where concrete columns fail by loss of stability. Buckling is a brittle failure that occurs without warning, and the probability of its occurrence depends on the column's slenderness. Experiments with slender concrete columns were carried out in cooperation with STRABAG Bratislava LTD in the Central Laboratory of the Faculty of Civil Engineering SUT in Bratislava. This article compares the global reliability of slender concrete columns with a slenderness of 90 and higher. The columns were designed according to methods offered by EN 1992-1-1 [1]. These experiments were used as the basis for deterministic nonlinear modelling of the columns and the subsequent probabilistic evaluation of structural response variability. The final results may be used as loading thresholds for the produced structural elements, and they present probabilistic design as less conservative than the classic partial-safety-factor-based design and the alternative ECOV method.

  12. Developing Articulated Human Models from Laser Scan Data for Use as Avatars in Real-Time Networked Virtual Environments

    DTIC Science & Technology

    2001-09-01

    structure model, motion model, physical model, and possibly many other characteristics depending on the application [Ref. 4]. While the film industry has...applications. The film industry relies on this technology almost exclusively, as it is highly reliable under controlled conditions. Since optical tracking...Wavefront. Maya has been used extensively in the film industry to provide lifelike animation, and is adept at handling 3D objects [Ref. 27]. Maya can

  13. A Psychometric Analysis of the Italian Version of the eHealth Literacy Scale Using Item Response and Classical Test Theory Methods.

    PubMed

    Diviani, Nicola; Dima, Alexandra Lelia; Schulz, Peter Johannes

    2017-04-11

    The eHealth Literacy Scale (eHEALS) is a tool to assess consumers' comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers' eHealth literacy. ©Nicola Diviani, Alexandra Lelia Dima, Peter Johannes Schulz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.04.2017.

  14. A Second-Order Confirmatory Factor Analysis of the Moral Distress Scale-Revised for Nurses.

    PubMed

    Sharif Nia, Hamid; Shafipour, Vida; Allen, Kelly-Ann; Heidari, Mohammad Reza; Yazdani-Charati, Jamshid; Zareiyan, Armin

    2017-01-01

    Moral distress is a growing problem for healthcare professionals that may lead to dissatisfaction, resignation, or occupational burnout if left unattended, and nurses experience different levels of this phenomenon. This study aims to investigate the factor structure of the Persian version of the Moral Distress Scale-Revised in intensive care and general nurses. This methodological research was conducted with 771 nurses from eight hospitals in the Mazandaran Province of Iran in 2017. Participants completed the Moral Distress Scale-Revised; data were collected, and the factor structure was assessed using construct, convergent, and divergent validity methods. The reliability of the scale was assessed using internal consistency (Cronbach's alpha, Theta, and McDonald's omega coefficients) and construct reliability. Ethical considerations: This study was approved by the Ethics Committee of Mazandaran University of Medical Sciences. The exploratory factor analysis (N = 380) showed that the Moral Distress Scale-Revised has five factors: lack of professional competence at work, ignoring ethical issues and patient conditions, futile care, carrying out the physician's orders without question and unsafe care, and providing care under personal and organizational pressures, which explained 56.62% of the overall variance. The confirmatory factor analysis (N = 391) supported the five-factor solution and the second-order latent factor model. The first-order model did not show favorable convergent and divergent validity. Ultimately, the Moral Distress Scale-Revised was found to have favorable internal consistency and construct reliability. The Moral Distress Scale-Revised was found to be a multidimensional construct. The data obtained confirmed the hypothesis of the factor structure model with a latent second-order variable. Since the convergent and divergent validity of the scale were not confirmed in this study, further assessment is necessary in future studies.

  15. Assessing the validity and reliability of family factors on physical activity: A case study in Turkey.

    PubMed

    Steenson, Sharalyn; Özcebe, Hilal; Arslan, Umut; Konşuk Ünlü, Hande; Araz, Özgür M; Yardim, Mahmut; Üner, Sarp; Bilir, Nazmi; Huang, Terry T-K

    2018-01-01

    Childhood obesity rates have been rising rapidly in developing countries. A better understanding of the risk factors and social context is necessary to inform public health interventions and policies. This paper describes the validation of several measurement scales for use in Turkey, which relate to child and parent perceptions of physical activity (PA) and enablers and barriers of physical activity in the home environment. The aim of this study was to assess the validity and reliability of several measurement scales in Turkey using a population sample across three socio-economic strata in the Turkish capital, Ankara. Surveys were conducted in Grade 4 children (mean age = 9.7 years for boys; 9.9 years for girls), and their parents, across 6 randomly selected schools, stratified by SES (n = 641 students, 483 parents). Construct validity of the scales was evaluated through exploratory and confirmatory factor analysis. Internal consistency of scales and test-retest reliability were assessed by Cronbach's alpha and intra-class correlation. The scales as a whole were found to have acceptable-to-good model fit statistics (PA Barriers: RMSEA = 0.076, SRMR = 0.0577, AGFI = 0.901; PA Outcome Expectancies: RMSEA = 0.054, SRMR = 0.0545, AGFI = 0.916, and PA Home Environment: RMSEA = 0.038, SRMR = 0.0233, AGFI = 0.976). The PA Barriers subscales showed good internal consistency and poor to fair test-retest reliability (personal α = 0.79, ICC = 0.29, environmental α = 0.73, ICC = 0.59). The PA Outcome Expectancies subscales showed good internal consistency and test-retest reliability (negative α = 0.77, ICC = 0.56; positive α = 0.74, ICC = 0.49). Only the PA Home Environment subscale on support for PA was validated in the final confirmatory model; it showed moderate internal consistency and test-retest reliability (α = 0.61, ICC = 0.48). This study is the first to validate measures of perceptions of physical activity and the physical activity home environment in Turkey. Our results support the originally hypothesized two-factor structures for Physical Activity Barriers and Physical Activity Outcome Expectancies. However, we found the one-factor rather than two-factor structure for Physical Activity Home Environment had the best model fit. This study provides general support for the use of these scales in Turkey in terms of validity, but test-retest reliability warrants further research.

  16. Fatigue Reliability of Gas Turbine Engine Structures

    NASA Technical Reports Server (NTRS)

    Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.

    1997-01-01

    The results of an investigation are described for fatigue reliability in engine structures. The description consists of two parts. Part 1 is for method development. Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure mode and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability is used to determine the system probability of failure, and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
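    The response-surface idea can be sketched as follows: a cheap surrogate is fitted to a handful of runs of an expensive fatigue-life analysis and then sampled by Monte Carlo to estimate a probability of failure. The fatigue-life function, input distributions, and design life below are invented for illustration and do not represent the engine model in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

def fatigue_life(x):
    """Hypothetical stand-in for an expensive fatigue-life analysis run.
    x = (stress amplitude, temperature factor); returns life in cycles."""
    s, t = x
    return 1.0e7 * np.exp(-0.04 * s - 1.0 * t)

# Fit a quadratic response surface (in log-life) to a small set of runs.
doe = np.array([[s, t] for s in (40.0, 60.0, 80.0) for t in (0.8, 1.0, 1.2)])
y = np.log(np.array([fatigue_life(x) for x in doe]))

def basis(X):
    s, t = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(s), s, t, s * t, s**2, t**2])

coef, *_ = np.linalg.lstsq(basis(doe), y, rcond=None)

# Monte Carlo on the cheap surrogate: failure if predicted life < design life.
n_mc, design_life = 200_000, 1.0e5
s = rng.normal(60.0, 10.0, n_mc)          # scattered stress amplitude
t = rng.normal(1.0, 0.05, n_mc)           # scattered temperature factor
life = np.exp(basis(np.column_stack([s, t])) @ coef)
pf = np.mean(life < design_life)
print(f"estimated probability of failure: {pf:.2e}")
```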

  17. Three-dimensional facial anthropometry of unilateral cleft lip infants with a structured light scanning system.

    PubMed

    Li, Guanghui; Wei, Jianhua; Wang, Xi; Wu, Guofeng; Ma, Dandan; Wang, Bo; Liu, Yanpu; Feng, Xinghua

    2013-08-01

    Cleft lip in the presence or absence of a cleft palate is a major public health problem. However, few studies have been published concerning the soft-tissue morphology of cleft lip infants. Currently, obtaining reliable three-dimensional (3D) surface models of infants remains a challenge. The aim of this study was to investigate a new way of capturing 3D images of cleft lip infants using a structured light scanning system. In addition, the accuracy and precision of the acquired facial 3D data were validated and compared with direct measurements. Ten unilateral cleft lip patients were enrolled in the study. Briefly, 3D facial images of the patients were acquired using a 3D scanner device before and after the surgery. Fourteen items were measured by direct anthropometry and 3D image software. The accuracy and precision of the 3D system were assessed by comparative analysis. The anthropometric data obtained using the 3D method were in agreement with the direct anthropometry measurements. All data calculated by the software were 'highly reliable' or 'reliable', as defined in the literature. The localisation of four landmarks was not consistent in repeated experiments of inter-observer reliability in preoperative images (P<0.05), while the intra-observer reliability in both pre- and postoperative images was good (P>0.05). The structured light scanning system is proven to be a non-invasive, accurate and precise method in cleft lip anthropometry. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  18. Qualitative Importance Measures of Systems Components - A New Approach and Its Applications

    NASA Astrophysics Data System (ADS)

    Chybowski, Leszek; Gawdzińska, Katarzyna; Wiśnicki, Bogusz

    2016-12-01

    The paper presents an improved methodology for analysing the qualitative importance of components in the functional and reliability structures of a system. We present basic importance measures, i.e., Birnbaum's structural measure, the order of the smallest minimal cut-set, the repetition count of an i-th event in the Fault Tree and the streams measure. A subsystem of circulation pumps and fuel heaters in the main engine fuel supply system of a container vessel illustrates the qualitative importance analysis. We constructed a functional model and a Fault Tree which we analysed using qualitative measures. Additionally, we compared the calculated measures and introduced corrected measures as a tool for improving the analysis. We proposed scaled measures and a common measure taking into account the location of the component in the reliability and functional structures. Finally, we proposed an area where the measures could be applied.
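    Birnbaum's importance can be illustrated on a toy series-parallel system (not the fuel-supply subsystem analysed in the paper): the importance of component i is the difference in system reliability between that component working for certain and failing for certain.

```python
from itertools import product

# Illustrative series-parallel system (assumed for illustration only):
# component 1 in series with a parallel pair {2, 3}.
def system_works(x1, x2, x3):
    return x1 and (x2 or x3)

p = {1: 0.95, 2: 0.90, 3: 0.90}          # component reliabilities (assumed)

def system_reliability(p):
    """Exact system reliability by enumerating all component states."""
    r = 0.0
    for states in product([0, 1], repeat=3):
        prob = 1.0
        for i, s in zip((1, 2, 3), states):
            prob *= p[i] if s else (1 - p[i])
        r += prob * system_works(*states)
    return r

# Birnbaum's importance: I_B(i) = h(p | p_i = 1) - h(p | p_i = 0).
for i in (1, 2, 3):
    hi = system_reliability({**p, i: 1.0})
    lo = system_reliability({**p, i: 0.0})
    print(f"component {i}: Birnbaum importance = {hi - lo:.3f}")
```

    As expected, the series component dominates, which is the kind of ranking the qualitative measures in the paper are meant to capture.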

  19. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
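    The core idea, aggregating stored low-level reliability models according to a system architecture description, can be sketched roughly as below. The component list, failure rates, and the series/parallel description format are assumptions made for illustration, not the patented generator's actual formats.

```python
import numpy as np

# Minimal sketch: low-level models give each component's reliability at time
# t (exponential lifetimes assumed); an architecture description says how
# components are interconnected, and the system model is built by recursion.
low_level = {                              # component failure rates [1/h], assumed
    "cpu": 2e-5, "mem": 1e-5, "bus_a": 5e-6, "bus_b": 5e-6,
}

architecture = ("series", ["cpu", "mem", ("parallel", ["bus_a", "bus_b"])])

def reliability(node, t):
    if isinstance(node, str):                        # leaf: low-level model
        return np.exp(-low_level[node] * t)
    kind, children = node
    r = np.array([reliability(c, t) for c in children])
    if kind == "series":
        return r.prod()
    if kind == "parallel":
        return 1.0 - (1.0 - r).prod()
    raise ValueError(kind)

for t in (1e3, 1e4, 1e5):
    print(f"R(system, t = {t:.0e} h) = {reliability(architecture, t):.4f}")
```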

  20. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds.

    PubMed

    Cruz-Marcelo, Alejandro; Ensor, Katherine B; Rosner, Gary L

    2011-06-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material.

  1. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds1

    PubMed Central

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.

    2011-01-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566

  2. Reading PDB: perception of molecules from 3D atomic coordinates.

    PubMed

    Urbaczek, Sascha; Kolodzik, Adrian; Groth, Inken; Heuser, Stefan; Rarey, Matthias

    2013-01-28

    The analysis of small molecule crystal structures is a common way to gather valuable information for drug development. The necessary structural data is usually provided in specific file formats containing only element identities and three-dimensional atomic coordinates as reliable chemical information. Consequently, the automated perception of molecular structures from atomic coordinates has become a standard task in cheminformatics. The molecules generated by such methods must be both chemically valid and reasonable to provide a reliable basis for subsequent calculations. This can be a difficult task since the provided coordinates may deviate from ideal molecular geometries due to experimental uncertainties or low resolution. Additionally, the quality of the input data often differs significantly thus making it difficult to distinguish between actual structural features and mere geometric distortions. We present a method for the generation of molecular structures from atomic coordinates based on the recently published NAOMI model. By making use of this consistent chemical description, our method is able to generate reliable results even with input data of low quality. Molecules from 363 Protein Data Bank (PDB) entries could be perceived with a success rate of 98%, a result which could not be achieved with previously described methods. The robustness of our approach has been assessed by processing all small molecules from the PDB and comparing them to reference structures. The complete data set can be processed in less than 3 min, thus showing that our approach is suitable for large scale applications.
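    A far simpler distance-based perception rule than the NAOMI-based method described above, shown only to illustrate the task of inferring bonds from element identities and coordinates; the radii and tolerance are typical textbook values, not the publication's parameters.

```python
import numpy as np

# Atoms are taken as bonded if their separation is close to the sum of their
# covalent radii (crude rule of thumb; real perception methods do far more).
COVALENT_RADII = {"H": 0.31, "C": 0.76, "N": 0.71, "O": 0.66}   # Angstrom
TOLERANCE = 0.4

def perceive_bonds(elements, coords):
    coords = np.asarray(coords)
    bonds = []
    for i in range(len(elements)):
        for j in range(i + 1, len(elements)):
            d = np.linalg.norm(coords[i] - coords[j])
            cutoff = COVALENT_RADII[elements[i]] + COVALENT_RADII[elements[j]] + TOLERANCE
            if d < cutoff:
                bonds.append((i, j, round(float(d), 2)))
    return bonds

# Formaldehyde (H2C=O), approximate coordinates in Angstrom.
elements = ["C", "O", "H", "H"]
coords = [(0.000, 0.000, 0.000), (0.000, 0.000, 1.210),
          (0.943, 0.000, -0.540), (-0.943, 0.000, -0.540)]
print(perceive_bonds(elements, coords))
```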

  3. Analyzing the effect of transmissivity uncertainty on the reliability of a model of the northwestern Sahara aquifer system

    NASA Astrophysics Data System (ADS)

    Zammouri, Mounira; Ribeiro, Luis

    2017-05-01

    A groundwater flow model of the transboundary Saharan aquifer system was developed in 2003 and is used for management and decision-making by Algeria, Tunisia and Libya. In decision-making processes, reliability plays a decisive role. This paper looks into the reliability assessment of the Saharan aquifer model. It aims to detect the shortcomings of a model considered to be properly calibrated. After presenting the calibration results of the 2003 modelling effort, the uncertainty in the model arising from the scarcity of groundwater-level and transmissivity data is analyzed using kriging and a stochastic approach. Structural analyses of the steady-state piezometry and of the logarithms of transmissivity were carried out for the Continental Intercalaire (CI) and the Complexe Terminal (CT) aquifers. The available data (piezometry and transmissivity) were compared to the calculated values using a geostatistical approach. Using a stochastic approach, 2500 realizations of a log-normal random transmissivity field of the CI aquifer were generated to assess the errors in the model output due to the uncertainty in transmissivity. Two types of poor calibration are shown. In some regions, calibration should be improved using the available data. In other areas, refining the model requires gathering new data to enhance knowledge of the aquifer system. The stochastic simulation results showed that the calculated drawdowns in 2050 could be higher than the values predicted by the calibrated model.
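    A toy one-dimensional sketch of the stochastic step, assuming an exponential covariance for log-transmissivity and a deliberately crude drawdown relation; all values are invented and no resemblance to the actual CI aquifer model is implied.

```python
import numpy as np

rng = np.random.default_rng(4)

# Correlated realizations of log-transmissivity on a 1-D grid, propagated
# through a crude drawdown relation (illustration only).
x = np.linspace(0.0, 50.0, 60)             # grid coordinates [km]
mean_logT, sd_logT, corr_len = np.log(1e-2), 0.8, 10.0

# Exponential covariance and its Cholesky factor for sampling.
C = sd_logT**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(C + 1e-10 * np.eye(x.size))

Q = 0.05                                    # pumping per cell (arbitrary units)
drawdowns = []
for _ in range(2500):
    logT = mean_logT + L @ rng.normal(size=x.size)
    T = np.exp(logT)
    drawdowns.append(np.sum(Q / T) / x.size)   # crude drawdown ~ Q / T, averaged

drawdowns = np.array(drawdowns)
print(f"mean drawdown {drawdowns.mean():.1f}, "
      f"95th percentile {np.percentile(drawdowns, 95):.1f}")
```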

  4. A Bayesian Approach for Sensor Optimisation in Impact Identification

    PubMed Central

    Mallardo, Vincenzo; Sharif Khodaei, Zahra; Aliabadi, Ferri M. H.

    2016-01-01

    This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence. PMID:28774064

  5. Psychometric evaluation of the Swedish adaptation of the Inventory for Assessing the Process of Cultural Competence Among Healthcare Professionals--Revised (IAPCC-R).

    PubMed

    Olt, Helen; Jirwe, Maria; Gustavsson, Petter; Emami, Azita

    2010-01-01

    The purpose of this study was to describe the translation, adaptation, and psychometric evaluation of the Swedish version of the instrument, Inventory for Assessing The Process of Cultural Competence Among Healthcare Professionals-Revised (IAPCC-R), in relation to its validity and reliability. Validity tests were conducted on the response processes (N = 15), the content (N = 7), and the internal structure of the instrument (N = 334). Reliability (alpha = .65 for the total scale, varying between -.01 and .65 for the different subscales) was evaluated in terms of internal consistency. Results indicated weak validity and reliability, though it is difficult to conclude whether this is related to adaptation issues or to the original construction. The testing of the response process identified problems in relation to respondents' conceptualization of cultural competence. The test of the content identified a weak correspondence between the items and the underlying model. In addition, a confirmatory factor analysis did not confirm the proposed structure of the instrument. This study concludes that this instrument is not valid and reliable for use with a Swedish population of practicing nurses or nursing students.

  6. The German Version of the Herth Hope Index (HHI-D): Development and Psychometric Properties.

    PubMed

    Geiser, Franziska; Zajackowski, Katharina; Conrad, Rupert; Imbierowicz, Katrin; Wegener, Ingo; Herth, Kaye A; Urbach, Anne Sarah

    2015-01-01

    The importance of hope is evident in clinical oncological care. Hope is associated with psychological and also physical functioning. However, there is still a dearth of empirical research on hope as a multidimensional concept. The Herth Hope Index is a reliable and valid instrument for the measurement of hope and is available in many languages. Until now no authorized German translation has been published and validated. After translation, the questionnaire was completed by 192 patients with different tumor entities in radiation therapy. Reliability, concurrent validity, and factor structure of the questionnaire were determined. Correlations were high with depression and anxiety as well as optimism and pessimism. As expected, correlations with coping styles were moderate. Internal consistency and test-retest reliability were satisfactory. We could not replicate the original 3-factor model. Application of the scree plot criterion in an exploratory factor analysis resulted in a single-factor structure. The Herth Hope Index - German Version (HHI-D) is a short, reliable, and valid instrument for the assessment of hope in patient populations. We recommend using only the HHI-D total score until further research gives more insights into possible factorial solutions and subscales. © 2015 S. Karger GmbH, Freiburg.

  7. Temperament and Character in the Child and Adolescent Twin Study in Sweden (CATSS): Comparison to the General Population, and Genetic Structure Analysis

    PubMed Central

    Garcia, Danilo; Lundström, Sebastian; Brändström, Sven; Råstam, Maria; Cloninger, C. Robert; Kerekes, Nóra; Nilsson, Thomas; Anckarsäter, Henrik

    2013-01-01

    Background The Child and Adolescent Twin Study in Sweden (CATSS) is an on-going, large population-based longitudinal twin study. We aimed (1) to investigate the reliability of two different versions (125-items and 238-items) of Cloninger's Temperament and Character Inventory (TCI) used in the CATSS and the validity of extracting the short version from the long version, (2) to compare these personality dimensions between twins and adolescents from the general population, and (3) to investigate the genetic structure of Cloninger's model. Method Reliability and correlation analyses were conducted for both TCI versions, 2,714 CATSS-twins were compared to 631 adolescents from the general population, and the genetic structure was investigated through univariate genetic analyses, using a model-fitting approach with structural equation-modeling techniques based on same-sex twin pairs from the CATSS (423 monozygotic and 408 dizygotic pairs). Results The TCI scores from the short and long versions showed comparable reliability coefficients and were strongly correlated. Twins scored about half a standard deviation higher in the character scales. Three of the four temperament dimensions (Novelty Seeking, Harm Avoidance, and Persistence) had strong genetic and non-shared environmental effects, while Reward Dependence and the three character dimensions had moderate genetic effects, and both shared and non-shared environmental effects. Conclusions Twins showed higher scores in character dimensions compared to adolescents from the general population. At least among adolescents there is a shared environmental influence for all of the character dimensions, but only for one of the temperament dimensions (i.e., Reward Dependence). This specific finding regarding the existence of shared environmental factors behind the character dimensions in adolescence, together with earlier findings showing a small shared environmental effects on character among young adults and no shared environmental effects on character among adults, suggest that there is a shift in type of environmental influence from adolescence to adulthood regarding character. PMID:23940581

  8. Developing a customized multiple interview for dental school admissions.

    PubMed

    Gardner, Karen M

    2014-04-01

    From the early 1980s until recently, the University of British Columbia Faculty of Dentistry had employed the Canadian Dental Association (CDA) Structured Interview in its Phase 2 admissions process (with those applicants invited for interviews). While this structured interview had demonstrated reliability and validity, the Faculty of Dentistry came to believe that a multiple interview process using scenarios would help it better identify applicants who would match its mission. After a literature review that investigated such interview protocols as unstructured, semi-structured, computerized, and telephone formats, a multiple interview format was chosen. This format was seen as an emerging trend, with evidence that it has been deemed fairer by applicants, more reliable by interviewers, more difficult for applicants to provide set answers for the scenarios, and not to require as many interviewers as other formats. This article describes the process undertaken to implement a customized multiple interview format for admissions and reports these outcomes of the process: a smoothly running multiple interview; effective training protocols for staff, interviewers, and applicants; and reports from successful applicants and interviewers that they felt the multiple interview was a more reliable and fairer recruiting tool than other models.

  9. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.

  10. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    PubMed Central

    Kim, Robin E.; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F.; Song, Junho

    2016-01-01

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieve a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess the network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a larger number of packets delivered within each communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily-available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved. PMID:27258270
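    A rough sketch of the probabilistic assessment idea, assuming per-link packet reception ratios stand in for the measured communication-quality data and a simple delivery-ratio threshold stands in for the empirical limit-state function; all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical stand-in for field measurements: per-link packet reception
# ratios observed during high-throughput transfers.
measured_prr = rng.beta(a=40, b=2, size=500)

n_packets = 5000          # packets per high-throughput data transfer
max_retries = 2           # retransmissions allowed per packet
threshold = 0.999         # limit state: delivery ratio below this = failure

def transfer_fails(prr):
    """One Monte Carlo realization of a data transfer over a link with the
    given packet reception ratio; failure per the assumed limit state."""
    p_lost = (1.0 - prr) ** (1 + max_retries)        # lost after all retries
    delivered = n_packets - rng.binomial(n_packets, p_lost)
    return delivered / n_packets < threshold

n_mc = 20_000
fails = sum(transfer_fails(rng.choice(measured_prr)) for _ in range(n_mc))
print(f"estimated probability of communication failure: {fails / n_mc:.3f}")
```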

  11. Reliability, validity, and interpretation of the dependence scale in mild to moderately severe Alzheimer's disease.

    PubMed

    Lenderking, William R; Wyrwich, Kathleen W; Stolar, Marilyn; Howard, Kellee A; Leibman, Chris; Buchanan, Jacqui; Lacey, Loretto; Kopp, Zoe; Stern, Yaakov

    2013-12-01

    The Dependence Scale (DS) was designed to measure dependence on others among patients with Alzheimer's disease (AD). The objectives of this research were primarily to strengthen the psychometric evidence for the use of the DS in AD studies. Patients with mild to moderately severe AD were examined in 3 study databases. Within each data set, internal consistency, validity, and responsiveness were examined, and structural equation models were fit. The DS has strong psychometric properties. The DS scores differed significantly across known groups and demonstrated moderate to strong correlations with measures hypothesized to be related to dependence (|r| ≥ .31). Structural equation modeling supported the validity of the DS concept. An anchor-based DS responder definition to interpret a treatment benefit over time was identified. The DS is a reliable, valid, and interpretable measure of dependence associated with AD and is shown to be related to, but to provide information distinct from, cognition, functioning, and behavior.

  12. Reliability, Factor Structure, and Associations With Measures of Problem Relationship and Behavior of the Personality Inventory for DSM-5 in a Sample of Italian Community-Dwelling Adolescents.

    PubMed

    Somma, Antonella; Borroni, Serena; Maffei, Cesare; Giarolli, Laura E; Markon, Kristian E; Krueger, Robert F; Fossati, Andrea

    2017-10-01

    In order to assess the reliability, factorial validity, and criterion validity of the Personality Inventory for DSM-5 (PID-5) among adolescents, 1,264 Italian high school students were administered the PID-5. Participants were also administered the Questionnaire on Relationships and Substance Use as a criterion measure. In the full sample, McDonald's ω values were adequate for the PID-5 scales (median ω = .85, SD = .06), except for Suspiciousness. However, all PID-5 scales showed average inter-item correlation values in the .20-.55 range. Exploratory structural equation modeling analyses provided moderate support for the a priori model of PID-5 trait scales. Ordinal logistic regression analyses showed that selected PID-5 trait scales predicted a significant, albeit moderate (Cox & Snell R 2 values ranged from .08 to .15, all ps < .001) amount of variance in Questionnaire on Relationships and Substance Use variables.

  13. Bem Sex Role Inventory Validation in the International Mobility in Aging Study.

    PubMed

    Ahmed, Tamer; Vafaei, Afshin; Belanger, Emmanuelle; Phillips, Susan P; Zunzunegui, Maria-Victoria

    2016-09-01

    This study investigated the measurement structure of the Bem Sex Role Inventory (BSRI) with different factor analysis methods. Most previous studies on validity applied exploratory factor analysis (EFA) to examine the BSRI. We aimed to assess the psychometric properties and construct validity of the 12-item short-form BSRI administered to a sample of 1,995 older adults from wave 1 of the International Mobility in Aging Study (IMIAS). We used Cronbach's alpha to assess internal consistency reliability and confirmatory factor analysis (CFA) to assess psychometric properties. EFA revealed a three-factor model, further confirmed by CFA and compared with the original two-factor structure model. Results revealed that a two-factor solution (instrumentality-expressiveness) has satisfactory construct validity and superior fit to data compared to the three-factor solution. The two-factor solution confirms expected gender differences in older adults. The 12-item BSRI provides a brief, psychometrically sound, and reliable instrument in international samples of older adults.

  14. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors

    PubMed Central

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-01

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning (“PDR”). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model (“GMM”), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model (“GHM”). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation. PMID:29385076

  15. Development of a Kalman Filter in the Gauss-Helmert Model for Reliability Analysis in Orientation Determination with Smartphone Sensors.

    PubMed

    Ettlinger, Andreas; Neuner, Hans; Burgess, Thomas

    2018-01-31

    The topic of indoor positioning and indoor navigation by using observations from smartphone sensors is very challenging as the determined trajectories can be subject to significant deviations compared to the route travelled in reality. Especially the calculation of the direction of movement is the critical part of pedestrian positioning approaches such as Pedestrian Dead Reckoning ("PDR"). Due to distinct systematic effects in filtered trajectories, it can be assumed that there are systematic deviations present in the observations from smartphone sensors. This article has two aims: one is to enable the estimation of partial redundancies for each observation as well as for observation groups. Partial redundancies are a measure for the reliability indicating how well systematic deviations can be detected in single observations used in PDR. The second aim is to analyze the behavior of partial redundancy by modifying the stochastic and functional model of the Kalman filter. The equations relating the observations to the orientation are condition equations, which do not exhibit the typical structure of the Gauss-Markov model ("GMM"), wherein the observations are linear and can be formulated as functions of the states. To calculate and analyze the partial redundancy of the observations from smartphone-sensors used in PDR, the system equation and the measurement equation of a Kalman filter as well as the redundancy matrix need to be derived in the Gauss-Helmert model ("GHM"). These derivations are introduced in this article and lead to a novel Kalman filter structure based on condition equations, enabling reliability assessment of each observation.
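    For orientation, partial redundancies are most easily seen in the ordinary Gauss-Markov model, where the redundancy matrix follows directly from the design and weight matrices. The small levelling-style example below is an assumption made for illustration, not the smartphone-sensor filter derived in the paper; it simply shows that the diagonal entries of the redundancy matrix sum to the overall redundancy n - u.

```python
import numpy as np

# Partial redundancies in the Gauss-Markov model l = A x + e (the paper
# derives the analogous quantities for a Kalman filter in the Gauss-Helmert
# model; this only illustrates the underlying concept).
A = np.array([[ 1.0,  0.0],      # observed h1
              [ 0.0,  1.0],      # observed h2
              [-1.0,  1.0],      # observed h2 - h1
              [ 1.0,  1.0]])     # observed h1 + h2
sigmas = np.array([1.0, 1.0, 2.0, 1.5])       # observation std. deviations
P = np.diag(1.0 / sigmas**2)                  # weight matrix

N = A.T @ P @ A
R = np.eye(A.shape[0]) - A @ np.linalg.solve(N, A.T @ P)   # redundancy matrix
r = np.diag(R)                                # partial redundancies
print("partial redundancies:", np.round(r, 3))
print("sum =", round(float(r.sum()), 3), "(equals n - u =", A.shape[0] - A.shape[1], ")")
```

    A large partial redundancy means a systematic deviation in that observation is well controlled by the others; a value near zero means it can pass undetected, which is exactly the diagnostic the article applies to smartphone-sensor observations.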

  16. Three-factor structure for Epistemic Belief Inventory: A cross-validation study

    PubMed Central

    2017-01-01

    Research on epistemic beliefs has been hampered by lack of validated models and measurement instruments. The most widely used instrument is the Epistemological Questionnaire, which has been criticized for validity, and it has been proposed a new instrument based in the Epistemological Questionnaire: the Epistemic Belief Inventory. The Spanish-language version of Epistemic Belief Inventory was applied to 1,785 Chilean high school students. Exploratory and confirmatory factor analyses in independent subsamples were performed. A three factor structure emerged and was confirmed. Reliability was comparable to other studies, and the factor structure was invariant among randomized subsamples. The structure that was found does not replicate the one proposed originally, but results are interpreted in light of embedded systemic model of epistemological beliefs. PMID:28278258

  17. Analytical model of cracking due to rebar corrosion expansion in concrete considering the structure internal force

    NASA Astrophysics Data System (ADS)

    Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng

    2017-12-01

    Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of cracking due to rebar corrosion expansion in concrete was established, which is able to consider the structure internal force. And then, by means of the complex variable function theory and series expansion technology established by Muskhelishvili, the corresponding stress component functions of concrete around the reinforcement were obtained. Also, a comparative analysis was conducted between the numerical simulation model and present model in this paper. The results show that the calculation results of both methods were consistent with each other, and the numerical deviation was less than 10%, proving that the analytical model established in this paper is reliable.

  18. Software reliability experiments data analysis and investigation

    NASA Technical Reports Server (NTRS)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
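    The reliability-gain comparison can be made concrete with a back-of-the-envelope calculation that assumes independent version failures and an independent, near-perfect acceptance test; the probabilities used are invented, and the independence assumption is exactly what the experiment shows to be optimistic in practice.

```python
# Back-of-the-envelope comparison under an independence assumption.
q = 0.01        # probability that any one version fails on a given input
a = 0.001       # probability the acceptance test wrongly accepts a bad result

# N-version programming with 2-out-of-3 majority voting fails when at least
# two of the three versions fail on the same input.
p_nvp = 3 * q**2 * (1 - q) + q**3

# Recovery block: each version runs in turn; a bad result that slips past the
# acceptance test is a failure, and so is exhausting all three versions.
p_last = q                                    # third (last) alternate
p_second = q * a + q * (1 - a) * p_last       # second alternate onwards
p_rb = q * a + q * (1 - a) * p_second         # primary onwards
print(f"N-version (2-out-of-3) failure probability: {p_nvp:.2e}")
print(f"Recovery block failure probability:         {p_rb:.2e}")
```

    Under these assumptions the recovery block comes out well ahead, consistent with the finding that it offers more reliability gain when independently failing acceptance checks are available.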

  19. DSM-5 section III personality traits and section II personality disorders in a Flemish community sample.

    PubMed

    Bastiaens, Tim; Smits, Dirk; De Hert, Marc; Vanwalleghem, Dominique; Claes, Laurence

    2016-04-30

    The Personality Inventory for DSM-5 (PID-5; Krueger et al., 2012) is a dimensional self-report questionnaire designed to measure personality pathology according to the criterion B of the DSM-5 Section III personality model. In the current issue of DSM, this dimensional Section III personality model co-exists with the Section II categorical personality model derived from DSM-IV-TR. Therefore, investigation of the inter-relatedness of both models across populations and languages is warranted. In this study, we first examined the factor structure and reliability of the PID-5 in a Flemish community sample (N=509) by means of exploratory structural equation modeling and alpha coefficients. Next, we investigated the predictive ability of section III personality traits in relation to section II personality disorders through correlations and stepwise regression analyses. Results revealed a five factor solution for the PID-5, with adequate reliability of the facet scales. The variance in Section II personality disorders could be predicted by their theoretically comprising Section III personality traits, but additional Section III personality traits augmented this prediction. Based on current results, we discuss the Section II personality disorder conceptualization and the Section III personality disorder operationalization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Risk Management and Physical Modelling for Mountainous Natural Hazards

    NASA Astrophysics Data System (ADS)

    Lehning, Michael; Wilhelm, Christian

    Population growth and climate change cause rapid changes in mountainous regions resulting in increased risks of floods, avalanches, debris flows and other natural hazards. Xevents are of particular concern, since attempts to protect against them result in exponentially growing costs. In this contribution, we suggest an integral risk management approach to dealing with natural hazards that occur in mountainous areas. Using the example of a mountain pass road, which can be protected from the danger of an avalanche by engineering (galleries) and/or organisational (road closure) measures, we show the advantage of an optimal combination of both versus the traditional approach, which is to rely solely on engineering structures. Organisational measures become especially important for Xevents because engineering structures cannot be designed for those events. However, organisational measures need a reliable and objective forecast of the hazard. Therefore, we further suggest that such forecasts should be developed using physical numerical modelling. We present the status of current approaches to using physical modelling to predict snow cover stability for avalanche warnings and peak runoff from mountain catchments for flood warnings. While detailed physical models can already predict peak runoff reliably, they are only used to support avalanche warnings. With increased process knowledge and computer power, current developments should lead to an enhanced role for detailed physical models in natural mountain hazard prediction.

  1. Structural health monitoring in composite materials using frequency response methods

    NASA Astrophysics Data System (ADS)

    Kessler, Seth S.; Spearing, S. Mark; Atalla, Mauro J.; Cesnik, Carlos E. S.; Soutis, Constantinos

    2001-08-01

    Cost effective and reliable damage detection is critical for the utilization of composite materials in structural applications. Non-destructive evaluation techniques (e.g. ultrasound, radiography, infra-red imaging) are available for use during standard repair and maintenance cycles, however by comparison to the techniques used for metals these are relatively expensive and time consuming. This paper presents part of an experimental and analytical survey of candidate methods for the detection of damage in composite materials. The experimental results are presented for the application of modal analysis techniques applied to rectangular laminated graphite/epoxy specimens containing representative damage modes, including delamination, transverse ply cracks and through-holes. Changes in natural frequencies and modes were then found using a scanning laser vibrometer, and 2-D finite element models were created for comparison with the experimental results. The models accurately predicted the response of the specimens at low frequencies, but the local excitation and coalescence of higher frequency modes make mode-dependent damage detection difficult and most likely impractical for structural applications. The frequency response method was found to be reliable for detecting even small amounts of damage in a simple composite structure; however, the potentially important information about damage type, size, location and orientation was lost using this method, since several combinations of these variables can yield identical response signatures.
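    The frequency-shift principle behind such methods can be sketched with a lumped spring-mass stand-in for a laminated specimen, where local damage is modelled as a stiffness reduction and the resulting changes in natural frequencies are compared. The model and numbers below are purely illustrative, not the 2-D FE models from the paper.

```python
import numpy as np
from scipy.linalg import eigh

# Toy lumped-parameter stand-in for a laminated specimen: ten masses in a
# chain; "damage" is modelled as a 30% stiffness loss in one spring.
n = 10
m = np.full(n, 0.05)                       # lumped masses [kg]
k_intact = np.full(n, 2.0e5)               # spring stiffnesses [N/m]

def frequencies(k):
    K = np.zeros((n, n))
    for i in range(n):
        K[i, i] += k[i]                    # spring i connects mass i to mass i-1
        if i + 1 < n:                      # (spring 0 connects mass 0 to ground)
            K[i, i] += k[i + 1]
            K[i, i + 1] = K[i + 1, i] = -k[i + 1]
    w2 = eigh(K, np.diag(m), eigvals_only=True)
    return np.sqrt(w2) / (2 * np.pi)       # natural frequencies [Hz]

k_damaged = k_intact.copy()
k_damaged[4] *= 0.7                        # local damage near mid-span

f0, f1 = frequencies(k_intact), frequencies(k_damaged)
for i in range(4):
    print(f"mode {i + 1}: {f0[i]:7.1f} Hz -> {f1[i]:7.1f} Hz "
          f"({100 * (f1[i] - f0[i]) / f0[i]:+.2f} %)")
```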

  2. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  3. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  4. Item-level and subscale-level factoring of Biggs' Learning Process Questionnaire (LPQ) in a mainland Chinese sample.

    PubMed

    Sachs, J; Gao, L

    2000-09-01

    The Learning Process Questionnaire (LPQ) has been the source of intensive cross-cultural study. However, an item-level factor analysis of all the LPQ items simultaneously has never been reported. Rather, items within each subscale have been factor analysed to establish subscale unidimensionality and justify the use of composite subscale scores. It was of major interest to see whether the six logically constructed item groups of the LPQ would be supported by empirical evidence. Additionally, it was of interest to compare the consistency of the reliability and correlational structure of the LPQ subscales in our study with those of previous cross-cultural studies. Confirmatory factor analysis was used to fit the six-factor item-level model and to fit five representative subscale-level factor models. A total of 1070 students between the ages of 15 and 18 years was drawn from a representative selection of 29 classes from within 15 secondary schools in Guangzhou, China. Males and females were almost equally represented. The six-factor item-level model of the LPQ seemed to fit reasonably well, thus supporting the six-dimensional structure of the LPQ and justifying the use of composite subscale scores for each LPQ dimension. However, the reliability of many of these subscales was low. Furthermore, only two subscale-level factor models showed marginally acceptable fit. Substantive considerations supported an oblique three-factor model. Because the LPQ subscales often show low internal consistency reliability, experimental and correlational studies that have used these subscales as dependent measures have been disappointing. It is suggested that some LPQ items should be revised and other items added to improve the inventory's overall psychometric properties.

  5. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, fault-tolerant, and self-managing. Six years of research on ISIS are reviewed, describing the model, the types of applications to which ISIS was applied, and some of the reasoning that underlies a recent effort to redesign and reimplement ISIS as a much smaller, lightweight system.

  6. Specialty fibers for fiber optic sensor application

    NASA Astrophysics Data System (ADS)

    Bennett, K.; Koh, J.; Coon, J.; Chien, C. K.; Artuso, A.; Chen, X.; Nolan, D.; Li, M.-J.

    2007-09-01

    Over the last several years, Fiber Optic Sensor (FOS) applications have seen increased acceptance in many areas, including oil & gas production monitoring, gyroscopes, current sensors, structural sensing and monitoring, and aerospace. A high level of optical and mechanical reliability of the optical fiber is necessary to guarantee reliable FOS performance. In this paper, we review recent research and development activities on new specialty fibers. We discuss fiber design concepts and present both modeling and experimental results. The main approaches to enhancing fiber attributes include new index profile design and fiber coating modification.

  7. High-Temperature Strain Sensing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Piazza, Anthony; Richards, Lance W.; Hudson, Larry D.

    2008-01-01

    Thermal protection systems (TPS) and hot structures utilize advanced materials that operate at temperatures beyond current abilities to measure structural performance. Robust strain sensors that operate accurately and reliably beyond 1800 F are needed but do not exist. These shortcomings hinder the ability to validate analysis and modeling techniques and to optimize structural designs. This presentation examines high-temperature strain sensing for aerospace applications and, more specifically, seeks to provide strain data for validating finite element models and thermal-structural analyses. Efforts have been made to develop sensor attachment techniques for relevant structural materials at the small test specimen level and to perform laboratory tests to characterize the sensors and generate corrections to apply to the indicated strains. Areas highlighted in this presentation include sensors, sensor attachment techniques, laboratory evaluation/characterization of strain measurement, and sensor use in large-scale structures.

  8. Multigroup confirmatory factor analysis and structural invariance with age of the Behavior Rating Inventory of Executive Function (BRIEF)--French version.

    PubMed

    Fournet, Nathalie; Roulin, Jean-Luc; Monnier, Catherine; Atzeni, Thierry; Cosnefroy, Olivier; Le Gall, Didier; Roy, Arnaud

    2015-01-01

    The parent and teacher forms of the French version of the Behavior Rating Inventory of Executive Function (BRIEF) were used to evaluate executive function in everyday life in a large sample of healthy children (N = 951) aged between 5 and 18. Several psychometric methods were applied, with a view to providing clinicians with tools for score interpretation. The parent and teacher forms of the BRIEF were acceptably reliable. Demographic variables (such as age and gender) were found to influence the BRIEF scores. Confirmatory factor analysis was then used to test five competing models of the BRIEF's latent structure. Two of these models (a three-factor model and a two-factor model, both based on a nine-scale structure) had a good fit. However, structural invariance with age was only obtained with the two-factor model. The French version of the BRIEF provides a useful measure of everyday executive function and can be recommended for use in clinical research and practice.

  9. [Optimization of the parameters of microcirculatory structural adaptation model based on improved quantum-behaved particle swarm optimization algorithm].

    PubMed

    Pan, Qing; Yao, Jialiang; Wang, Ruofan; Cao, Ping; Ning, Gangmin; Fang, Luping

    2017-08-01

    The vessels in the microcirculation keep adjusting their structure to meet the functional requirements of the different tissues. A previously developed theoretical model can reproduce the process of vascular structural adaptation to help the study of microcirculatory physiology. However, until now, the model has lacked appropriate methods for setting its parameter values, which has limited its further application. This study proposed an improved quantum-behaved particle swarm optimization (QPSO) algorithm for setting the parameter values in this model. The optimization was performed on a real mesenteric microvascular network of the rat. The results showed that the improved QPSO was superior to the standard particle swarm optimization, the standard QPSO and the previously reported Downhill algorithm. We conclude that the improved QPSO leads to better agreement between mathematical simulation and animal experiment, rendering the model more reliable in future physiological studies.
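
    The improved QPSO variant itself is not described in the record; the sketch below shows the standard quantum-behaved particle swarm optimizer it builds on, applied to a generic cost function. The cost function, bounds, and swarm settings are illustrative assumptions standing in for the mismatch between simulated and measured vessel properties.

        import numpy as np

        def qpso(cost, dim, n_particles=30, iters=200, lb=-5.0, ub=5.0, seed=0):
            """Minimal standard quantum-behaved particle swarm optimization (QPSO)."""
            rng = np.random.default_rng(seed)
            x = rng.uniform(lb, ub, (n_particles, dim))    # particle positions
            pbest = x.copy()                               # personal best positions
            pval = np.array([cost(p) for p in pbest])      # personal best costs
            for it in range(iters):
                g = pbest[np.argmin(pval)]                 # global best position
                mbest = pbest.mean(axis=0)                 # mean of the personal bests
                beta = 1.0 - 0.5 * it / iters              # contraction-expansion coefficient
                phi = rng.random((n_particles, dim))
                u = rng.random((n_particles, dim))
                sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
                attractor = phi * pbest + (1.0 - phi) * g
                x = np.clip(attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u), lb, ub)
                val = np.array([cost(p) for p in x])
                improved = val < pval
                pbest[improved], pval[improved] = x[improved], val[improved]
            return pbest[np.argmin(pval)], pval.min()

        # Illustrative stand-in for the model-versus-experiment mismatch.
        best_x, best_f = qpso(lambda p: float(np.sum((p - 1.0) ** 2)), dim=4)
        print(best_x, best_f)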

  10. A Comprehensive Structural Dynamic Analysis Approach for Multi Mission Earth Entry Vehicle (MMEEV) Development

    NASA Technical Reports Server (NTRS)

    Perino, Scott; Bayandor, Javid; Siddens, Aaron

    2012-01-01

    The anticipated NASA Mars Sample Return Mission (MSR) requires a simple and reliable method in which to return collected Martian samples back to earth for scientific analysis. The Multi-Mission Earth Entry Vehicle (MMEEV) is NASA's proposed solution to this MSR requirement. Key aspects of the MMEEV are its reliable and passive operation, energy absorbing foam-composite structure, and modular impact sphere (IS) design. To aid in the development of an EEV design that can be modified for various mission requirements, two fully parametric finite element models were developed. The first model was developed in an explicit finite element code and was designed to evaluate the impact response of the vehicle and payload during the final stage of the vehicle's return to earth. The second model was developed in an explicit code and was designed to evaluate the static and dynamic structural response of the vehicle during launch and reentry. In contrast to most other FE models, which are built through a Graphical User Interface (GUI) pre-processor, the current model was developed using a coding technique that allows the analyst to quickly change nearly all aspects of the model, including geometric dimensions, material properties, load and boundary conditions, mesh properties, and analysis controls. Using the developed design tool, a full range of proposed designs can quickly be analyzed numerically and thus the design trade space for the EEV can be fully understood. An engineer can then quickly reach the best design for a specific mission and also adapt and optimize the general design for different missions.

  11. Factor structure and internal reliability of an exercise health belief model scale in a Mexican population.

    PubMed

    Villar, Oscar Armando Esparza-Del; Montañez-Alvarado, Priscila; Gutiérrez-Vega, Marisela; Carrillo-Saucedo, Irene Concepción; Gurrola-Peña, Gloria Margarita; Ruvalcaba-Romero, Norma Alicia; García-Sánchez, María Dolores; Ochoa-Alcaraz, Sergio Gabriel

    2017-03-01

    Mexico is one of the countries with the highest rates of overweight and obesity around the world, with 68.8% of men and 73% of women affected. This is a public health problem, since not exercising has several health-related consequences, such as cardiovascular disease and some types of cancer. These problems can be prevented by promoting exercise, so it is important to evaluate models of health behaviors to achieve this goal. Among the several available models, the Health Belief Model is one of the most studied for promoting health-related behaviors. This study validates the first exercise scale based on the Health Belief Model (HBM) in Mexicans, with the objective of studying and analyzing this model in Mexico. Items for the scale, called the Exercise Health Belief Model Scale (EHBMS), were developed by a health research team, and the items were then applied to a sample of 746 participants, male and female, from five cities in Mexico. The factor structure of the items was analyzed with an exploratory factor analysis and the internal reliability with Cronbach's alpha. The exploratory factor analysis reported the expected factor structure based on the HBM. The KMO index (0.92) and Bartlett's sphericity test (p < 0.01) indicated an adequate and normally distributed sample. Items had adequate factor loadings, ranging from 0.31 to 0.92, and the internal consistencies of the factors were also acceptable, with alpha values ranging from 0.67 to 0.91. The EHBMS is a validated scale that can be used to measure exercise based on the HBM in Mexican populations.
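
    For reference, the internal consistency statistic used here (Cronbach's alpha) can be computed directly from an item score matrix as sketched below; the simulated six-item data are purely illustrative and unrelated to the EHBMS items.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, k_items) matrix of item scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)        # variance of each item
            total_var = items.sum(axis=1).var(ddof=1)    # variance of the summed scale
            return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(746, 1))                      # common factor
        items = latent + rng.normal(scale=1.0, size=(746, 6))   # six noisy items loading on it
        print(round(cronbach_alpha(items), 2))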

  12. Use of integrated analogue and numerical modelling to predict tridimensional fracture intensity in fault-related-folds.

    NASA Astrophysics Data System (ADS)

    Pizzati, Mattia; Cavozzi, Cristian; Magistroni, Corrado; Storti, Fabrizio

    2016-04-01

    Predicting fracture density patterns with low uncertainty is a fundamental issue for constraining fluid flow pathways in thrust-related anticlines in the frontal parts of thrust-and-fold belts and accretionary prisms, which can also provide plays for hydrocarbon exploration and development. Among the drivers that combine to determine the distribution of fractures in fold-and-thrust belts, the complex kinematic pathways of folded structures play a key role. In areas with scarce and unreliable underground information, analogue modelling can provide effective support for developing and validating reliable hypotheses on structural architectures and their evolution. In this contribution, we propose a working method that combines analogue and numerical modelling. We deformed a sand-silicone multilayer to eventually produce a non-cylindrical thrust-related anticline at the wedge toe, which was our test geological structure at the reservoir scale. We cut 60 serial cross-sections through the central part of the deformed model to analyze fault and fold geometry using dedicated software (3D Move). The cross-sections were also used to reconstruct the 3D geometry of reference surfaces that compose the mechanical stratigraphy, using the software GoCad. From the 3D model of the experimental anticline, by using 3D Move it was possible to calculate the cumulative stress and strain undergone by the deformed reference layers at the end of the deformation and also in incremental steps of fold growth. Based on these model outputs it was also possible to predict the orientation of three main fracture sets (joints and conjugate shear fractures) and their occurrence and density on model surfaces. The next step was the upscaling of the fracture network to the entire digital model volume, to create DFNs.

  13. Monte Carlo simulation methodology for the reliability of aircraft structures under damage tolerance considerations

    NASA Astrophysics Data System (ADS)

    Rambalakos, Andreas

    Current federal aviation regulations in the United States and around the world mandate the need for aircraft structures to meet damage tolerance requirements throughout the service life. These requirements imply that the damaged aircraft structure must maintain adequate residual strength in order to sustain its integrity, which is ensured by a continuous inspection program. The multifold objective of this research is to develop a methodology based on a direct Monte Carlo simulation process and to assess the reliability of aircraft structures. Initially, the structure is modeled as a parallel system with active redundancy comprised of elements with uncorrelated (statistically independent) strengths and subjected to an equal load distribution. Closed form expressions for the system capacity cumulative distribution function (CDF) are developed by expanding the current expression for the capacity CDF of a parallel system comprised of three elements to a parallel system comprised of up to six elements. These newly developed expressions will be used to check the accuracy of the implementation of a Monte Carlo simulation algorithm to determine the probability of failure of a parallel system comprised of an arbitrary number of statistically independent elements. The second objective of this work is to compute the probability of failure of a fuselage skin lap joint under static load conditions through a Monte Carlo simulation scheme by utilizing the residual strength of the fasteners subjected to various initial load distributions and then subjected to a new unequal load distribution resulting from subsequent fastener sequential failures. The final and main objective of this thesis is to present a methodology for computing the resulting gradual deterioration of the reliability of an aircraft structural component by employing a direct Monte Carlo simulation approach. The uncertainties associated with the time to crack initiation, the probability of crack detection, the exponent in the crack propagation rate (Paris equation) and the yield strength of the elements are considered in the analytical model. The structural component is assumed to consist of a prescribed number of elements. This Monte Carlo simulation methodology is used to determine the required non-periodic inspections so that the reliability of the structural component will not fall below a prescribed minimum level. A sensitivity analysis is conducted to determine the effect of three key parameters on the specification of the non-periodic inspection intervals: namely, a parameter associated with the time to crack initiation, the applied nominal stress fluctuation and the minimum acceptable reliability level.
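
    The closed-form CDF expressions of the thesis are not reproduced in the abstract; the fragment below sketches only the basic Monte Carlo step for an equal-load-sharing parallel system of brittle elements with independent random strengths. The normal strength distribution and the applied load value are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        n_el, n_mc = 6, 200_000
        load = 4.5                                           # hypothetical total applied load
        strengths = rng.normal(1.0, 0.15, (n_mc, n_el))      # i.i.d. element strengths

        # Equal-load-sharing parallel system of brittle elements: after each failure the load
        # is redistributed equally among survivors, so the system capacity of one sample is
        # max_k (n - k + 1) * R_(k), with the element strengths sorted in ascending order.
        sorted_r = np.sort(strengths, axis=1)
        capacity = np.max(sorted_r * (n_el - np.arange(n_el)), axis=1)

        pf = (capacity < load).mean()
        print("estimated P(system failure) =", pf)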

  14. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and failure mode is also considered in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
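
    The closed-form methods and the actual distributions fitted in the report are not given in the abstract; the sketch below illustrates the final numerical-integration step for a single failure mode, P(failure) = ∫ F_R(s) f_S(s) ds, with assumed lognormal strength and normal load-effect distributions.

        import numpy as np
        from scipy import stats
        from scipy.integrate import quad

        # Assumed (illustrative) distributions: lognormal strength R, normal applied stress S.
        R = stats.lognorm(s=0.10, scale=500.0)   # median strength 500 MPa, roughly 10% COV
        S = stats.norm(loc=350.0, scale=40.0)    # applied stress, MPa

        # P(failure) = P(R < S) = integral of F_R(s) * f_S(s) over the stress range.
        pf, _ = quad(lambda s: R.cdf(s) * S.pdf(s), 0.0, 1500.0)
        print("P(failure) =", pf, " reliability =", 1.0 - pf)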

  15. Wafer level reliability for high-performance VLSI design

    NASA Technical Reports Server (NTRS)

    Root, Bryan J.; Seefeldt, James D.

    1987-01-01

    As very large scale integration architecture requires higher package density, reliability of these devices has approached a critical level. Previous processing techniques allowed a large window for varying reliability. However, as scaling and higher current densities push reliability to its limit, tighter control and instant feedback become critical. Several test structures developed to monitor reliability at the wafer level are described. For example, a test structure was developed to monitor metal integrity in seconds as opposed to weeks or months for conventional testing. Another structure monitors mobile ion contamination at critical steps in the process. Thus the reliability jeopardy can be assessed during fabrication, preventing defective devices from ever being placed in the field. Most importantly, the reliability can be assessed on each wafer as opposed to an occasional sample.

  16. Estimating the reliability of repeatedly measured endpoints based on linear mixed-effects models. A tutorial.

    PubMed

    Van der Elst, Wim; Molenberghs, Geert; Hilgers, Ralf-Dieter; Verbeke, Geert; Heussen, Nicole

    2016-11-01

    There are various settings in which researchers are interested in the assessment of the correlation between repeated measurements that are taken within the same subject (i.e., reliability). For example, the same rating scale may be used to assess the symptom severity of the same patients by multiple physicians, or the same outcome may be measured repeatedly over time in the same patients. Reliability can be estimated in various ways, for example, using the classical Pearson correlation or the intra-class correlation in clustered data. However, contemporary data often have a complex structure that goes well beyond the restrictive assumptions that are needed with the more conventional methods to estimate reliability. In the current paper, we propose a general and flexible modeling approach that allows for the derivation of reliability estimates, standard errors, and confidence intervals - appropriately taking hierarchies and covariates in the data into account. Our methodology is developed for continuous outcomes together with covariates of an arbitrary type. The methodology is illustrated in a case study, and a Web Appendix is provided which details the computations using the R package CorrMixed and the SAS software. Copyright © 2016 John Wiley & Sons, Ltd.
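
    The computations in the paper use the R package CorrMixed and SAS; as a rough Python analogue of the simplest special case (test-retest reliability as an intra-class correlation from a one-way random-intercept model, without covariates), one might write something like the following on simulated data.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(0)
        n_subj, n_rep = 100, 3
        subject = np.repeat(np.arange(n_subj), n_rep)
        level = rng.normal(0.0, 1.0, n_subj)                         # between-subject variation
        y = level[subject] + rng.normal(0.0, 0.7, n_subj * n_rep)    # plus measurement error
        data = pd.DataFrame({"y": y, "subject": subject})

        # Random-intercept model: y_ij = mu + b_i + e_ij
        fit = smf.mixedlm("y ~ 1", data, groups=data["subject"]).fit()
        var_between = float(fit.cov_re.iloc[0, 0])   # variance of the subject random effect
        var_within = fit.scale                       # residual (within-subject) variance

        icc = var_between / (var_between + var_within)   # reliability of a single measurement
        print("estimated reliability (ICC) =", round(icc, 3))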

  17. Homology modeling a fast tool for drug discovery: current perspectives.

    PubMed

    Vyas, V K; Ukawala, R D; Ghate, M; Chintha, C

    2012-01-01

    A major goal of structural biology is the characterization of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. An understanding of protein-ligand interactions is therefore very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improving modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling exploits the similarity of environmental residues at topologically corresponding positions in reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. The recent advances in homology modeling, particularly in detecting and aligning sequences with template structures and distant homologues, modeling of loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at the different stages of drug design and discovery.

  18. Homology Modeling a Fast Tool for Drug Discovery: Current Perspectives

    PubMed Central

    Vyas, V. K.; Ukawala, R. D.; Ghate, M.; Chintha, C.

    2012-01-01

    A major goal of structural biology is the characterization of protein-ligand complexes, in which the protein molecules act energetically in the course of binding. An understanding of protein-ligand interactions is therefore very important for structure-based drug design. Lack of knowledge of 3D structures has hindered efforts to understand the binding specificities of ligands with proteins. With improving modeling software and the growing number of known protein structures, homology modeling is rapidly becoming the method of choice for obtaining 3D coordinates of proteins. Homology modeling exploits the similarity of environmental residues at topologically corresponding positions in reference proteins. In the absence of experimental data, model building on the basis of a known 3D structure of a homologous protein is at present the only reliable method to obtain structural information. Knowledge of the 3D structures of proteins provides invaluable insights into the molecular basis of their functions. The recent advances in homology modeling, particularly in detecting and aligning sequences with template structures and distant homologues, modeling of loops and side chains, and detecting errors in a model, have contributed to consistent prediction of protein structure, which was not possible even several years ago. This review focuses on the features and role of homology modeling in predicting protein structure and describes current developments in this field, with successful applications at the different stages of drug design and discovery. PMID:23204616

  19. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensuring reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  20. Ultimate strength performance of tankers associated with industry corrosion addition practices

    NASA Astrophysics Data System (ADS)

    Kim, Do Kyun; Kim, Han Byul; Zhang, Xiaoming; Li, Chen Guang; Paik, Jeom Kee

    2014-09-01

    In ship and offshore structure design, age-related problems such as corrosion damage, local denting, and fatigue damage are important factors to be considered in building a reliable structure, as they have a significant influence on the residual structural capacity. In shipping, corrosion addition methods are widely adopted in structural design to prevent structural capacity degradation. The present study focuses on the historical trend of corrosion addition rules for ship structural design and investigates their effects on the ultimate strength performance of the hull girder and stiffened panels of double hull oil tankers. Three types of rule-based corrosion addition models, namely the historic corrosion rules (pre-CSR), the Common Structural Rules (CSR), and the harmonised Common Structural Rules (CSRH), are considered and compared with two other corrosion models, namely the UGS model suggested by the Union of Greek Shipowners (UGS) and the Time-Dependent Corrosion Wastage Model (TDCWM). To identify the general trend in the effects of corrosion damage on the ultimate longitudinal strength performance, the corrosion addition rules are applied to four representative sizes of double hull oil tankers, namely Panamax, Aframax, Suezmax, and VLCC. The results are helpful in understanding the trend of corrosion additions for tanker structures.

  1. Generating a Dynamic Synthetic Population – Using an Age-Structured Two-Sex Model for Household Dynamics

    PubMed Central

    Namazi-Rad, Mohammad-Reza; Mokhtarian, Payam; Perez, Pascal

    2014-01-01

    Generating a reliable computer-simulated synthetic population is necessary for knowledge processing and decision-making analysis in agent-based systems in order to measure, interpret and describe each target area and the human activity patterns within it. In this paper, both synthetic reconstruction (SR) and combinatorial optimisation (CO) techniques are discussed for generating a reliable synthetic population for a certain geographic region (in Australia) using aggregated- and disaggregated-level information available for such an area. A CO algorithm using the quadratic function of population estimators is presented in this paper in order to generate a synthetic population while considering a two-fold nested structure for the individuals and households within the target areas. The baseline population in this study is generated from the confidentialised unit record files (CURFs) and 2006 Australian census tables. The dynamics of the created population are then projected over five years using a dynamic micro-simulation model for individual- and household-level demographic transitions. This projection is then compared with the 2011 Australian census. A prediction interval is provided for the population estimates obtained by the bootstrapping method, by which the variability structure of a predictor can be replicated in a bootstrap distribution. PMID:24733522
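
    The CO algorithm and the micro-simulation model are not reproduced here; the bootstrap step mentioned at the end, replicating the variability of a population estimator to obtain an interval, reduces to resampling as sketched below. The household-size data and the estimator are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical household sizes sampled for a synthetic small area.
        hh_size = rng.choice([1, 2, 3, 4, 5, 6], size=800, p=[.25, .33, .16, .15, .07, .04])

        def estimator(sample):
            return sample.mean() * sample.size       # illustrative population-count estimator

        # Bootstrap: resample with replacement, recompute the estimator, take percentiles.
        boot = np.array([estimator(rng.choice(hh_size, hh_size.size, replace=True))
                         for _ in range(2000)])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print("point estimate:", estimator(hh_size), " 95% bootstrap interval:", (lo, hi))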

  2. On the matter of the reliability of the chemical monitoring system based on the modern control and monitoring devices

    NASA Astrophysics Data System (ADS)

    Andriushin, A. V.; Dolbikova, N. S.; Kiet, S. V.; Merzlikina, E. I.; Nikitina, I. S.

    2017-11-01

    The reliability of the main equipment of any power station depends on correct water chemistry. In order to provide it, it is necessary to monitor the heat carrier quality, which, in turn, is provided by the chemical monitoring system. Thus, the monitoring system reliability plays an important part in providing reliability of the main equipment. The monitoring system reliability is determined by the reliability and structure of its hardware and software, consisting of sensors, controllers, HMI and so on [1,2]. Power plant workers dealing with the measuring equipment must be informed promptly about any breakdowns in the monitoring system so that they can remove the fault quickly. A computer consultant system for personnel maintaining the sensors and other chemical monitoring equipment can help to notice faults quickly and identify their possible causes. Some technical solutions for such a system are considered in the present paper. The experimental results were obtained on a laboratory workbench representing a physical model of part of the chemical monitoring system.

  3. Incorporation of prior information on parameters into nonlinear regression groundwater flow models: 1. Theory

    USGS Publications Warehouse

    Cooley, Richard L.

    1982-01-01

    Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.

  4. Structure-based Markov random field model for representing evolutionary constraints on functional sites.

    PubMed

    Jeong, Chan-Seok; Kim, Dongsup

    2016-02-24

    Elucidating the cooperative mechanism of interconnected residues is an important component toward understanding the biological function of a protein. Coevolution analysis has been developed to model the coevolutionary information reflecting structural and functional constraints. Recently, several methods have been developed based on a probabilistic graphical model called the Markov random field (MRF), which have led to significant improvements for coevolution analysis; however, thus far, the performance of these models has mainly been assessed by focusing on the aspect of protein structure. In this study, we built an MRF model whose graphical topology is determined by the residue proximity in the protein structure, and derived a novel positional coevolution estimate utilizing the node weight of the MRF model. This structure-based MRF method was evaluated for three data sets, each of which annotates catalytic site, allosteric site, and comprehensively determined functional site information. We demonstrate that the structure-based MRF architecture can encode the evolutionary information associated with biological function. Furthermore, we show that the node weight can more accurately represent positional coevolution information compared to the edge weight. Lastly, we demonstrate that the structure-based MRF model can be reliably built with only a few aligned sequences in linear time. The results show that adoption of a structure-based architecture could be an acceptable approximation for coevolution modeling with efficient computation complexity.

  5. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing and thermal-vacuum testing.

  6. Molecular modeling of the AhR structure and interactions can shed light on ligand-dependent activation and transformation mechanisms.

    PubMed

    Bonati, Laura; Corrada, Dario; Tagliabue, Sara Giani; Motta, Stefano

    2017-02-01

    Molecular modeling has given important contributions to elucidation of the main stages in the AhR signal transduction pathway. Despite the lack of experimentally determined structures of the AhR functional domains, information derived from homologous systems has been exploited for modeling their structure and interactions. Homology models of the AhR PASB domain have provided information on the binding cavity and contributed to elucidate species-specific differences in ligand binding. Molecular Docking simulations of the ligand binding process have given insights into differences in binding of diverse agonists, antagonists, and selective AhR modulators, and their application to virtual screening of large databases of compounds have allowed identification of novel AhR ligands. Recently available structural information on protein-protein and protein-DNA complexes of other bHLH-PAS systems has opened the way for modeling the AhR:ARNT dimer structure and investigating the mechanisms of AhR transformation and DNA binding. Future research directions should include simulation of the protein dynamics to obtain a more reliable description of intermolecular interactions involved in signal transmission.

  7. Comparative Analysis of the Reliability of Steel Structure with Pinned and Rigid Nodes Subjected to Fire

    NASA Astrophysics Data System (ADS)

    Kubicka, Katarzyna; Radoń, Urszula; Szaniec, Waldemar; Pawlak, Urszula

    2017-10-01

    The paper concerns the reliability analysis of steel structures subjected to the high temperatures of fire gases. Two types of spatial structures were analysed, namely with pinned and with rigid nodes. The fire analysis was carried out according to the prescriptions of Eurocode. The static-strength analysis was conducted using the finite element method (FEM). The MES3D program, developed by Szaniec (Kielce University of Technology, Poland), was used for this purpose. The results received from MES3D made it possible to carry out the reliability analysis using the Numpress Explore program that was developed at the Institute of Fundamental Technological Research of the Polish Academy of Sciences [9]. The measure of structural reliability used is the Hasofer-Lind reliability index (β). The reliability analysis was carried out using approximation (FORM, SORM) and simulation (Importance Sampling, Monte Carlo) methods. As the fire progresses, the value of the reliability index decreases. The analysis conducted for the study made it possible to evaluate the impact of node types on those changes. In real structures, it is often difficult to correctly define the node types, so some simplifications are made. The presented analysis contributes to the recognition of the consequences of such assumptions for the safety of structures subjected to fire.
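
    The FORM/SORM and simulation machinery of Numpress Explore is not shown in the record; for the simplest case of a linear limit state with two independent normal variables, the Hasofer-Lind index has the closed form sketched below, cross-checked with a crude Monte Carlo run. All numerical values are illustrative, not taken from the paper.

        import numpy as np
        from scipy.stats import norm

        # Limit state g = R - S with independent normal resistance R and load effect S
        # (illustrative values; in a fire analysis these would degrade with gas temperature).
        mu_R, sig_R = 420.0, 35.0
        mu_S, sig_S = 300.0, 30.0

        beta = (mu_R - mu_S) / np.hypot(sig_R, sig_S)   # Hasofer-Lind index for a linear g
        pf_form = norm.cdf(-beta)

        rng = np.random.default_rng(0)
        n = 1_000_000
        g = rng.normal(mu_R, sig_R, n) - rng.normal(mu_S, sig_S, n)
        pf_mc = (g <= 0.0).mean()
        print("beta =", round(beta, 3), " pf(FORM) =", pf_form, " pf(MC) =", pf_mc)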

  8. Assessing spatial uncertainty in reservoir characterization for carbon sequestration planning using public well-log data: A case study

    USGS Publications Warehouse

    Venteris, E.R.; Carter, K.M.

    2009-01-01

    Mapping and characterization of potential geologic reservoirs are key components in planning carbon dioxide (CO2) injection projects. The geometry of target and confining layers is vital to ensure that the injected CO2 remains in a supercritical state and is confined to the target layer. Also, maps of injection volume (porosity) are necessary to estimate sequestration capacity at undrilled locations. Our study uses publicly filed geophysical logs and geostatistical modeling methods to investigate the reliability of spatial prediction for oil and gas plays in the Medina Group (sandstone and shale facies) in northwestern Pennsylvania. Specifically, the modeling focused on two targets: the Grimsby Formation and Whirlpool Sandstone. For each layer, thousands of data points were available to model structure and thickness but only hundreds were available to support volumetric modeling because of the rarity of density-porosity logs in the public records. Geostatistical analysis based on this data resulted in accurate structure models, less accurate isopach models, and inconsistent models of pore volume. Of the two layers studied, only the Whirlpool Sandstone data provided for a useful spatial model of pore volume. Where reliable models for spatial prediction are absent, the best predictor available for unsampled locations is the mean value of the data, and potential sequestration sites should be planned as close as possible to existing wells with volumetric data. ?? 2009. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.

  9. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE PAGES

    Mittal, Sparsh; Vetter, Jeffrey S.

    2015-04-24

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  10. A Survey of Techniques for Modeling and Improving Reliability of Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Sparsh; Vetter, Jeffrey S.

    Recent trends of aggressive technology scaling have greatly exacerbated the occurrences and impact of faults in computing systems. This has made `reliability' a first-order design constraint. To address the challenges of reliability, several techniques have been proposed. In this study, we provide a survey of architectural techniques for improving resilience of computing systems. We especially focus on techniques proposed for microarchitectural components, such as processor registers, functional units, cache and main memory etc. In addition, we discuss techniques proposed for non-volatile memory, GPUs and 3D-stacked processors. To underscore the similarities and differences of the techniques, we classify them based on their key characteristics. We also review the metrics proposed to quantify vulnerability of processor structures. Finally, we believe that this survey will help researchers, system-architects and processor designers in gaining insights into the techniques for improving reliability of computing systems.

  11. Turbulence and Solar p-Mode Oscillations

    NASA Astrophysics Data System (ADS)

    Bi, S. L.; Xu, H. Y.

    The discrepancy between observed and theoretical mode frequencies can be used to examine the reliability of the standard solar model as a faithful representation of the real Sun. With the help of an improved time-dependent convective model that takes into account the contribution of the full spatial and temporal turbulent energy spectrum, we study the influence of turbulent pressure on the structure and the solar p-mode frequencies. For the radial modes we find that the Reynolds stress produces significant modifications in the structure and the p-mode spectrum. Compared with an adiabatic approximation, the discrepancy is largely removed by the turbulent correction.

  12. Cation Selectivity in Biological Cation Channels Using Experimental Structural Information and Statistical Mechanical Simulation

    PubMed Central

    Finnerty, Justin John

    2015-01-01

    Cation selective channels constitute the gate for ion currents through the cell membrane. Here we present an improved statistical mechanical model, based on atomistic structural information and cation hydration state and without tuned parameters, that reproduces the selectivity of biological Na+ and Ca2+ ion channels. The importance of the inclusion of step-wise cation hydration in these results confirms the essential role partial dehydration plays in the bacterial Na+ channels. The model, proven reliable against experimental data, could be straightforwardly used for designing Na+ and Ca2+ selective nanopores. PMID:26460827

  13. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calderer, Antoni; Yang, Xiaolei; Angelidis, Dionysios

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  14. [Factor structure validity of the social capital scale used at baseline in the ELSA-Brasil study].

    PubMed

    Souto, Ester Paiva; Vasconcelos, Ana Glória Godoi; Chor, Dora; Reichenheim, Michael E; Griep, Rosane Härter

    2016-07-21

    This study aims to analyze the factor structure of the Brazilian version of the Resource Generator (RG) scale, using baseline data from the Brazilian Longitudinal Health Study in Adults (ELSA-Brasil). Cross-validation was performed in three random subsamples. Exploratory factor analysis using exploratory structural equation models was conducted in the first two subsamples to diagnose the factor structure, and confirmatory factor analysis was used in the third to corroborate the model defined by the exploratory analyses. Based on the 31 initial items, the model with the best fit included 25 items distributed across three dimensions. They all presented satisfactory convergent validity (values greater than 0.50 for the extracted variance) and precision (values greater than 0.70 for composite reliability). All factor correlations were below 0.85, indicating full discriminant factor validity. The RG scale presents acceptable psychometric properties and can be used in populations with similar characteristics.

  15. Damage evaluation by a guided wave-hidden Markov model based method

    NASA Astrophysics Data System (ADS)

    Mei, Hanfei; Yuan, Shenfang; Qiu, Lei; Zhang, Jinjin

    2016-02-01

    Guided wave based structural health monitoring has shown great potential in aerospace applications. However, one of the key challenges for practical engineering applications is the accurate interpretation of the guided wave signals under time-varying environmental and operational conditions. This paper presents a guided wave-hidden Markov model based method to improve the damage evaluation reliability of real aircraft structures under time-varying conditions. In the proposed approach, an HMM based unweighted moving average trend estimation method, which can capture the trend of damage propagation from the posterior probability obtained by HMM modeling, is used to achieve a probabilistic evaluation of the structural damage. To validate the developed method, experiments are performed on a hole-edge crack specimen under fatigue loading conditions and on a real aircraft wing spar under changing structural boundary conditions. Experimental results show the advantage of the proposed method.
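
    The paper's specific HMM formulation is not available in the abstract; the fragment below sketches the general idea, assuming the hmmlearn package and a synthetic damage-sensitive feature: fit a two-state Gaussian HMM, take the posterior probability of the "damaged" state, and smooth it with an unweighted moving average to estimate the damage-propagation trend.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM

        rng = np.random.default_rng(0)
        # Synthetic damage-sensitive feature drifting upward as a crack grows, with noise
        # standing in for time-varying environmental and operational conditions.
        feature = np.concatenate([rng.normal(0.0, 0.3, 150),
                                  np.linspace(0.0, 2.0, 150) + rng.normal(0.0, 0.4, 150)])
        X = feature.reshape(-1, 1)

        # Two hidden states: "healthy" and "damaged".
        model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200, random_state=0)
        model.fit(X)
        posterior = model.predict_proba(X)
        damaged = int(np.argmax(model.means_.ravel()))    # state with the larger feature mean
        p_damage = posterior[:, damaged]

        # Unweighted moving average of the posterior probability: the damage-propagation trend.
        w = 15
        trend = np.convolve(p_damage, np.ones(w) / w, mode="same")
        print(trend[::30].round(2))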

  16. Development of a probabilistic analysis methodology for structural reliability estimation

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.

    1991-01-01

    The novel probabilistic analysis method presented for assessing structural reliability combines fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, it establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
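
    The paper's fast-convolution implementation is not described in the record; the fragment below illustrates only the underlying operation for the simplest case, obtaining the distribution of a linearized performance function g = R - S by convolving the densities of the independent terms on a grid. The normal inputs and grid settings are illustrative assumptions.

        import numpy as np
        from scipy.stats import norm

        # Illustrative independent terms of a linearized performance function g = R - S.
        R = norm(loc=10.0, scale=1.2)
        S = norm(loc=7.0, scale=1.5)

        dx = 0.01
        r_grid = np.arange(2.0, 18.0, dx)       # grid covering the support of R
        s_grid = np.arange(-15.0, 1.0, dx)      # grid covering the support of -S
        f_r = R.pdf(r_grid)
        f_ms = S.pdf(-s_grid)                   # density of -S on its grid

        # Density of g = R + (-S): discrete convolution of the two densities.
        f_g = np.convolve(f_r, f_ms) * dx
        g_grid = r_grid[0] + s_grid[0] + dx * np.arange(f_g.size)

        pf = np.sum(f_g[g_grid <= 0.0]) * dx    # P(g <= 0)
        exact = norm.cdf(0.0, loc=3.0, scale=np.hypot(1.2, 1.5))
        print("pf(convolution) =", pf, " pf(exact) =", exact)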

  17. Application of Advanced Fracture Mechanics Technology to Ensure Structural Reliability in Critical Titanium Structures,

    DTIC Science & Technology

    1982-11-22

    Westinghouse Research and Development Center, Pittsburgh, Pennsylvania 15235. (Only the report title and performing organization are legible in this scanned DTIC record.)

  18. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure.

    PubMed

    Hall, William J

    2016-11-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability testing, and confirmatory factor analysis. A sample of 275 middle school students was used to examine the psychometric properties and factor structure of the BullyHARM, which consists of 22 items and 6 subscales: physical bullying, verbal bullying, social/relational bullying, cyber-bullying, property bullying, and sexual bullying. First-order and second-order factor models were evaluated. Results demonstrate that the first-order factor model had superior fit. Results of reliability testing indicate that the BullyHARM scale and subscales have very good internal consistency reliability. Findings indicate that the BullyHARM has good properties regarding content validation and respondent-related validation and is a promising instrument for measuring bullying victimization in school.

  19. Initial Development and Validation of the BullyHARM: The Bullying, Harassment, and Aggression Receipt Measure

    PubMed Central

    Hall, William J.

    2017-01-01

    This article describes the development and preliminary validation of the Bullying, Harassment, and Aggression Receipt Measure (BullyHARM). The development of the BullyHARM involved a number of steps and methods, including a literature review, expert review, cognitive testing, readability testing, data collection from a large sample, reliability testing, and confirmatory factor analysis. A sample of 275 middle school students was used to examine the psychometric properties and factor structure of the BullyHARM, which consists of 22 items and 6 subscales: physical bullying, verbal bullying, social/relational bullying, cyber-bullying, property bullying, and sexual bullying. First-order and second-order factor models were evaluated. Results demonstrate that the first-order factor model had superior fit. Results of reliability testing indicate that the BullyHARM scale and subscales have very good internal consistency reliability. Findings indicate that the BullyHARM has good properties regarding content validation and respondent-related validation and is a promising instrument for measuring bullying victimization in school. PMID:28194041

  20. Forward Skirt Structural Testing on the Space Launch System (SLS) Program

    NASA Technical Reports Server (NTRS)

    Lohrer, J. D.; Wright, R. D.

    2016-01-01

    Structural testing was performed to evaluate heritage forward skirts from the Space Shuttle program for use on the NASA Space Launch System (SLS) program. Testing was needed because SLS ascent loads are 35% higher than Space Shuttle loads. Objectives of testing were to determine margins of safety, demonstrate reliability, and validate analytical models. Testing combined with analysis was able to show heritage forward skirts were acceptable to use on the SLS program.

  1. Adaptive optical microscope for brain imaging in vivo

    NASA Astrophysics Data System (ADS)

    Wang, Kai

    2017-04-01

    The optical heterogeneity of biological tissue imposes a major limitation on acquiring detailed structural and functional information deep in biological specimens using conventional microscopes. To restore optimal imaging performance, we developed an adaptive optical microscope based on a direct wavefront sensing technique. This microscope can reliably measure and correct aberrations induced by biological samples. We demonstrated its performance and application in structural and functional brain imaging in various animal models, including fruit fly, zebrafish and mouse.

  2. New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks

    NASA Astrophysics Data System (ADS)

    Kurtz, Nolan Scot

    The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people that depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude-MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Since these networks already exist, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, have been created from analytics. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, chloride-induced deterioration network effects were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to the network size. Special network topologies may be more or less computationally difficult, while the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and were at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may also reveal the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective of such network analysis; however, it may only indicate which bridges to inspect or repair first, and little else. High correlations influence such component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component- and regional-based optimal decisions using information from both network function and the further effects of infrastructure deterioration.
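
    The adaptive scheme developed in the dissertation is not described in enough detail here to reproduce; the fragment below sketches plain (non-adaptive) importance sampling in standard normal space with the sampling density centered at an assumed design point, which is the quantity an adaptive variant would keep updating from accumulated samples. The limit-state function and design point are illustrative assumptions.

        import numpy as np
        from scipy.stats import norm, multivariate_normal

        dim = 2

        def g(u):
            """Illustrative linear limit-state function in standard normal space; g <= 0 is failure."""
            return 3.0 - (u[:, 0] + 0.5 * u[:, 1]) / np.sqrt(1.25)

        # Assumed design point (most probable failure point) used to center the sampling density.
        u_star = 3.0 * np.array([1.0, 0.5]) / np.sqrt(1.25)

        n = 20_000
        h = multivariate_normal(mean=u_star, cov=np.eye(dim))          # sampling density
        f = multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim))   # original standard normal
        u = h.rvs(size=n, random_state=0)

        weights = f.pdf(u) / h.pdf(u)                  # likelihood ratios
        pf = np.mean((g(u) <= 0.0) * weights)          # importance-sampling estimate of P(failure)
        print("pf(importance sampling) =", pf, " pf(exact) =", norm.cdf(-3.0))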

  3. Commentary on Coefficient Alpha: A Cautionary Tale

    ERIC Educational Resources Information Center

    Green, Samuel B.; Yang, Yanyun

    2009-01-01

    The general use of coefficient alpha to assess reliability should be discouraged on a number of grounds. The assumptions underlying coefficient alpha are unlikely to hold in practice, and violation of these assumptions can result in nontrivial negative or positive bias. Structural equation modeling was discussed as an informative process both to…

  4. Cross-cultural adaptation, reliability and construct validity of the Tampa scale for kinesiophobia for temporomandibular disorders (TSK/TMD-Br) into Brazilian Portuguese.

    PubMed

    Aguiar, A S; Bataglion, C; Visscher, C M; Bevilaqua Grossi, D; Chaves, T C

    2017-07-01

    Fear of movement (kinesiophobia) seems to play an important role in the development of chronic pain. However, for temporomandibular disorders (TMD), there is a scarcity of studies about this topic. The Tampa Scale for Kinesiophobia for TMD (TSK/TMD) is the most widely used instrument to measure fear of movement, and it is not available in Brazilian Portuguese. The purpose of this study was to culturally adapt the TSK/TMD to Brazilian Portuguese and to assess its psychometric properties regarding internal consistency, reliability, and construct and structural validity. A total of 100 female patients with chronic TMD participated in the validation process of the TSK/TMD-Br. The intraclass correlation coefficient (ICC) was used for statistical analysis of reliability (test-retest), Cronbach's alpha for internal consistency, Spearman's rank correlation for construct validity and confirmatory factor analysis (CFA) for structural validity. CFA endorsed the pre-specified model with two domains and 12 items (Activity Avoidance - AA/Somatic Focus - SF) and all items obtained a factor loading greater than 0.4. Acceptable levels of reliability were found (ICC > 0.75) for all questions and domains of the TSK/TMD-Br. For internal consistency, a Cronbach's alpha of 0.78 was found for both domains. Moderate correlations (0.40 < r < 0.60) were observed for 84% of the analyses conducted between TSK/TMD-Br scores and catastrophising, depression and jaw functional limitation. The 12-item, two-factor TSK/TMD-Br demonstrated sound psychometric properties (transcultural validity, reliability, internal consistency and structural validity). As such, the instrument can be used in clinical settings and for research purposes. © 2017 John Wiley & Sons Ltd.
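
    For reference, the Cronbach's alpha reported above can be computed directly from an item-score matrix; the sketch below uses simulated responses in place of the TSK/TMD-Br data (the item count matches the 12-item scale, but all values are assumptions).

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents, n_items) array of item scores."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

        rng = np.random.default_rng(1)
        common = rng.normal(size=(100, 1))                       # shared "kinesiophobia" factor
        items = common + rng.normal(scale=0.8, size=(100, 12))   # 12 simulated items
        print(f"alpha = {cronbach_alpha(items):.2f}")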

  5. NASA Langley developments in response calculations needed for failure and life prediction

    NASA Technical Reports Server (NTRS)

    Housner, Jerrold M.

    1993-01-01

    NASA Langley developments in response calculations needed for failure and life predictions are discussed. Topics covered include: structural failure analysis in concurrent engineering; accuracy of independent regional modeling demonstrated on a classical example; a functional interface method that accurately joins incompatible finite element models; the interface method for insertion of local detail modeling extended to a curved pressurized fuselage window panel; an interface concept for joining structural regions; motivation for coupled 2D-3D analysis; a compression panel with discontinuous stiffener coupled 2D-3D model and axial surface strains at the middle of the hat stiffener; use of adaptive refinement with multiple methods; adaptive mesh refinement; and studies quantifying the effect of bow-type initial imperfections on the reliability of stiffened panels.

  6. Use of Anecdotal Occurrence Data in Species Distribution Models: An Example Based on the White-Nosed Coati (Nasua narica) in the American Southwest

    PubMed Central

    Frey, Jennifer K.; Lewis, Jeremy C.; Guy, Rachel K.; Stuart, James N.

    2013-01-01

    Simple Summary We evaluated the influence of occurrence records with different reliability on predicted distribution of a unique, rare mammal in the American Southwest, the white-nosed coati (Nasua narica). We concluded that occurrence datasets that include anecdotal records can be used to infer species distributions, providing such data are used only for easily-identifiable species and based on robust modeling methods such as maximum entropy. Use of a reliability rating system is critical for using anecdotal data. Abstract Species distributions are usually inferred from occurrence records. However, these records are prone to errors in spatial precision and reliability. Although influence of spatial errors has been fairly well studied, there is little information on impacts of poor reliability. Reliability of an occurrence record can be influenced by characteristics of the species, conditions during the observation, and observer’s knowledge. Some studies have advocated use of anecdotal data, while others have advocated more stringent evidentiary standards such as only accepting records verified by physical evidence, at least for rare or elusive species. Our goal was to evaluate the influence of occurrence records with different reliability on species distribution models (SDMs) of a unique mammal, the white-nosed coati (Nasua narica) in the American Southwest. We compared SDMs developed using maximum entropy analysis of combined bioclimatic and biophysical variables and based on seven subsets of occurrence records that varied in reliability and spatial precision. We found that the predicted distribution of the coati based on datasets that included anecdotal occurrence records were similar to those based on datasets that only included physical evidence. Coati distribution in the American Southwest was predicted to occur in southwestern New Mexico and southeastern Arizona and was defined primarily by evenness of climate and Madrean woodland and chaparral land-cover types. Coati distribution patterns in this region suggest a good model for understanding the biogeographic structure of range margins. We concluded that occurrence datasets that include anecdotal records can be used to infer species distributions, providing such data are used only for easily-identifiable species and based on robust modeling methods such as maximum entropy. Use of a reliability rating system is critical for using anecdotal data. PMID:26487405

  7. The Modified Reasons for Smoking Scale: factorial structure, validity and reliability in pregnant smokers.

    PubMed

    De Wilde, Katrien Sophie; Tency, Inge; Boudrez, Hedwig; Temmerman, Marleen; Maes, Lea; Clays, Els

    2016-06-01

    Smoking during pregnancy can cause several maternal and neonatal health risks, yet a considerable number of pregnant women continue to smoke. The objectives of this study were to test the factorial structure, validity and reliability of the Dutch version of the Modified Reasons for Smoking Scale (MRSS) in a sample of smoking pregnant women and to understand reasons for continued smoking during pregnancy. A longitudinal design was used. Data of 97 pregnant smokers were collected during prenatal consultation. Structural equation modelling was performed to assess the construct validity of the MRSS: an exploratory factor analysis was conducted, followed by a confirmatory factor analysis. Test-retest reliability (<16 weeks and 32-34 weeks pregnancy) and internal consistency were assessed using the intraclass correlation coefficient and Cronbach's alpha, respectively. To verify concurrent validity, Mann-Whitney U-tests were performed examining associations between the MRSS subscales and nicotine dependence, daily consumption, depressive symptoms and intention to quit. We found a factorial structure for the MRSS of 11 items within five subscales in order of importance: tension reduction, addiction, pleasure, habit and social function. Results for internal consistency and test-retest reliability were good to acceptable. There were significant associations of nicotine dependence with tension reduction and addiction and of daily consumption with addiction and habit. Validity and reliability of the MRSS were shown in a sample of pregnant smokers. Tension reduction was the most important reason for continued smoking, followed by pleasure and addiction. Although the score for nicotine dependence was low, addiction was an important reason for continued smoking during pregnancy; therefore, nicotine replacement therapy could be considered. Half of the respondents experienced depressive symptoms. Hence, it is important to identify those women who need more specialized care, which can include not only smoking cessation counselling but also treatment for depression. © 2016 John Wiley & Sons, Ltd.

  8. The Social Anxiety and Depression Life Interference—24 Inventory: Classical and modern psychometric evaluations

    PubMed Central

    Berzins, Tiffany L.; Garcia, Antonio F.; Acosta, Melina; Osman, Augustine

    2017-01-01

    Two instrument validation studies broadened the research literature exploring the factor structure, internal consistency reliability, and concurrent validity of scores on the Social Anxiety and Depression Life Interference—24 Inventory (SADLI-24; Osman, Bagge, Freedenthal, Guiterrez, & Emmerich, 2011). Study 1 (N = 1065) was undertaken to concurrently appraise three competing factor models for the instrument: a unidimensional model, a two-factor oblique model and a bifactor model. The bifactor model provided the best fit to the study sample data. Study 2 (N = 220) extended the results from Study 1 with an investigation of the convergent and discriminant validity for the bifactor model of the SADLI-24 with multiple regression analyses and scale-level exploratory structural equation modeling. This project yields data that augments the initial instrument development investigations for the target measure. PMID:28781401

  9. Lewis Structures Technology, 1988. Volume 3: Structural Integrity Fatigue and Fracture Wind Turbines HOST

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The charter of the Structures Division is to perform research in support of aerospace engine structures and to disseminate the results. These results have a wide range of applicability to practitioners of structural engineering mechanics beyond the aerospace arena. The specific purpose of the symposium was to familiarize the engineering structures community with the depth and range of research performed by the division and its academic and industrial partners. Sessions covered vibration control, fracture mechanics, ceramic component reliability, parallel computing, nondestructive evaluation, constitutive models and experimental capabilities, dynamic systems, fatigue and damage, wind turbines, hot section technology (HOST), aeroelasticity, structural mechanics codes, computational methods for dynamics, structural optimization, applications of structural dynamics, and structural mechanics computer codes.

  10. Damage tolerance and structural monitoring for wind turbine blades

    PubMed Central

    McGugan, M.; Pereira, G.; Sørensen, B. F.; Toftegaard, H.; Branner, K.

    2015-01-01

    The paper proposes a methodology for reliable design and maintenance of wind turbine rotor blades using a condition monitoring approach and a damage tolerance index coupling the material and structure. By improving the understanding of material properties that control damage propagation it will be possible to combine damage tolerant structural design, monitoring systems, inspection techniques and modelling to manage the life cycle of the structures. This will allow an efficient operation of the wind turbine in terms of load alleviation, limited maintenance and repair leading to a more effective exploitation of offshore wind. PMID:25583858

  11. Reliable critical sized defect rodent model for cleft palate research.

    PubMed

    Mostafa, Nesrine Z; Doschak, Michael R; Major, Paul W; Talwar, Reena

    2014-12-01

    Suitable animal models are necessary to test the efficacy of new bone grafting therapies in cleft palate surgery. Rodent models of cleft palate are available but have limitations. This study compared and modified mid-palate cleft (MPC) and alveolar cleft (AC) models to determine the most reliable and reproducible model for bone grafting studies. Published MPC model (9 × 5 × 3 mm³) lacked sufficient information for tested rats. Our initial studies utilizing AC model (7 × 4 × 3 mm³) in 8 and 16 weeks old Sprague Dawley (SD) rats revealed injury to adjacent structures. After comparing anteroposterior and transverse maxillary dimensions in 16 weeks old SD and Wistar rats, virtual planning was performed to modify MPC and AC defects dimensions, taking the adjacent structures into consideration. Modified MPC (7 × 2.5 × 1 mm³) and AC (5 × 2.5 × 1 mm³) defects were employed in 16 weeks old Wistar rats and healing was monitored by micro-computed tomography and histology. Maxillary dimensions in SD and Wistar rats were not significantly different. Preoperative virtual planning enhanced postoperative surgical outcomes. Bone healing occurred at defect margin leaving central bone void confirming the critical size nature of the modified MPC and AC defects. Presented modifications for MPC and AC models created clinically relevant and reproducible defects. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  12. Reliability, precision, and measurement in the context of data from ability tests, surveys, and assessments

    NASA Astrophysics Data System (ADS)

    Fisher, W. P., Jr.; Elbaum, B.; Coulter, A.

    2010-07-01

    Reliability coefficients indicate the proportion of total variance attributable to differences among measures separated along a quantitative continuum by a testing, survey, or assessment instrument. Reliability is usually considered to be influenced by both the internal consistency of a data set and the number of items, though textbooks and research papers rarely evaluate the extent to which these factors independently affect the data in question. Probabilistic formulations of the requirements for unidimensional measurement separate consistency from error by modelling individual response processes instead of group-level variation. The utility of this separation is illustrated via analyses of small sets of simulated data, and of subsets of data from a 78-item survey of over 2,500 parents of children with disabilities. Measurement reliability ultimately concerns the structural invariance specified in models requiring sufficient statistics, parameter separation, unidimensionality, and other qualities that historically have made quantification simple, practical, and convenient for end users. The paper concludes with suggestions for a research program aimed at focusing measurement research more on the calibration and wide dissemination of tools applicable to individuals, and less on the statistical study of inter-variable relations in large data sets.
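
    The probabilistic formulation of unidimensional measurement alluded to above is exemplified by the Rasch model for dichotomous items, under which raw scores are sufficient statistics for ability; the sketch below simulates responses from that model (person abilities and item difficulties are assumed values chosen only for illustration).

        import numpy as np

        def rasch_prob(theta, b):
            """Rasch model: P(X = 1) = exp(theta - b) / (1 + exp(theta - b))."""
            return 1.0 / (1.0 + np.exp(-(theta - b)))

        rng = np.random.default_rng(5)
        thetas = rng.normal(size=500)                      # person abilities (assumed)
        items = np.linspace(-2.0, 2.0, 20)                 # item difficulties (assumed)
        p = rasch_prob(thetas[:, None], items[None, :])    # 500 x 20 response probabilities
        responses = (rng.random(p.shape) < p).astype(int)  # simulated dichotomous responses
        print("mean raw score:", responses.sum(axis=1).mean())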

  13. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
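
    The stochastic-strength part of such an analysis is commonly expressed as a two-parameter Weibull weakest-link model; the sketch below evaluates that generic form summed over finite elements (it is not the CARES/Life implementation, and the Weibull modulus, scale parameter, element stresses and volumes are assumed values).

        import numpy as np

        def weibull_failure_probability(stress, volume, m, sigma_0):
            """Weakest-link model: P_f = 1 - exp(-sum_i V_i * (sigma_i / sigma_0)**m),
            counting only tensile stresses."""
            stress = np.clip(np.asarray(stress, dtype=float), 0.0, None)
            risk = float(np.sum(np.asarray(volume, dtype=float) * (stress / sigma_0) ** m))
            return 1.0 - np.exp(-risk)

        stress = np.array([120.0, 180.0, 240.0, 90.0])   # element stresses, MPa (assumed)
        volume = np.array([2.0, 1.5, 0.5, 3.0])          # element volumes, mm^3 (assumed)
        pf = weibull_failure_probability(stress, volume, m=10.0, sigma_0=300.0)
        print(f"P_f = {pf:.3e}")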

  14. Identification of small molecules capable of regulating conformational changes of telomeric G-quadruplex

    NASA Astrophysics Data System (ADS)

    Chen, Shuo-Bin; Liu, Guo-Cai; Gu, Lian-Quan; Huang, Zhi-Shu; Tan, Jia-Heng

    2018-02-01

    Design of small molecules targeted at human telomeric G-quadruplex DNA is an extremely active research area. Interestingly, the telomeric G-quadruplex is a highly polymorphic structure. Changes in its conformation upon small molecule binding may be a powerful method to achieve a desired biological effect. However, the rational development of small molecules capable of regulating conformational change of telomeric G-quadruplex structures is still challenging. In this study, we developed a reliable ligand-based pharmacophore model based on isaindigotone derivatives with conformational change activity toward telomeric G-quadruplex DNA. Furthermore, virtual screening of a database was conducted using this pharmacophore model, and benzopyranopyrimidine derivatives in the database were identified as strong inducers of the telomeric G-quadruplex DNA conformational change, transforming it from a hybrid-type structure to a parallel structure.

  15. Characterising RNA secondary structure space using information entropy

    PubMed Central

    2013-01-01

    Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
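
    In its basic form, the information entropy in question is the Shannon entropy of the probability distribution over secondary structures; the sketch below evaluates that definition for a small, explicitly enumerated toy ensemble (the paper computes the same quantity for the full phylo-SCFG ensemble without enumeration).

        import numpy as np

        def shannon_entropy(probs, base=2.0):
            """H = -sum_s p(s) * log p(s) over a distribution of structures."""
            p = np.asarray(probs, dtype=float)
            p = p[p > 0]                                   # ignore zero-probability structures
            return float(-(p * np.log(p)).sum() / np.log(base))

        # Toy ensemble of four candidate structures with assumed probabilities.
        ensemble = {"((..))": 0.70, "(....)": 0.20, ".(..).": 0.07, "......": 0.03}
        print(f"H = {shannon_entropy(list(ensemble.values())):.3f} bits")

    A low entropy indicates that the distribution is concentrated on a few structures, which is one way to read a high prediction reliability score.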

  16. Docking-based classification models for exploratory toxicology ...

    EPA Pesticide Factsheets

    Background: Exploratory toxicology is a new emerging research area whose ultimate mission is that of protecting human health and environment from risks posed by chemicals. In this regard, the ethical and practical limitation of animal testing has encouraged the promotion of computational methods for the fast screening of huge collections of chemicals available on the market. Results: We derived 24 reliable docking-based classification models able to predict the estrogenic potential of a large collection of chemicals having high quality experimental data, kindly provided by the U.S. Environmental Protection Agency (EPA). The predictive power of our docking-based models was supported by values of AUC, EF1% (EFmax = 7.1), -LR (at SE = 0.75) and +LR (at SE = 0.25) ranging from 0.63 to 0.72, from 2.5 to 6.2, from 0.35 to 0.67 and from 2.05 to 9.84, respectively. In addition, external predictions were successfully made on some representative known estrogenic chemicals. Conclusion: We show how structure-based methods, widely applied to drug discovery programs, can be adapted to meet the conditions of the regulatory context. Importantly, these methods enable one to employ the physicochemical information contained in the X-ray solved biological target and to screen structurally-unrelated chemicals.
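
    The screening statistics quoted above (AUC, enrichment factor, likelihood ratios) can be reproduced for any scored and labelled chemical set; the sketch below applies their standard definitions to randomly generated toy data standing in for the EPA set (the paper's exact conventions are not reproduced here).

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        labels = rng.integers(0, 2, size=1000)                     # 1 = estrogenic (toy labels)
        scores = labels * 0.8 + rng.normal(scale=1.0, size=1000)   # docking-like scores (toy)

        auc = roc_auc_score(labels, scores)

        # Enrichment factor in the top 1% of the ranked list.
        top = np.argsort(scores)[::-1][: max(1, len(scores) // 100)]
        ef1 = labels[top].mean() / labels.mean()

        # Likelihood ratios at an arbitrary score threshold.
        thr = np.quantile(scores, 0.75)
        pred = scores >= thr
        se = (pred & (labels == 1)).sum() / (labels == 1).sum()    # sensitivity
        sp = (~pred & (labels == 0)).sum() / (labels == 0).sum()   # specificity
        lr_plus, lr_minus = se / (1.0 - sp), (1.0 - se) / sp
        print(f"AUC={auc:.2f}  EF1%={ef1:.1f}  +LR={lr_plus:.2f}  -LR={lr_minus:.2f}")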

  17. Weak data do not make a free lunch, only a cheap meal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Zhipu; Rajashankar, Kanagalaghatta; Dauter, Zbigniew

    2014-01-17

    Four data sets were processed at resolutions significantly exceeding the criteria traditionally used for estimating the diffraction data resolution limit. The analysis of these data and the corresponding model-quality indicators suggests that the criteria of resolution limits widely adopted in the past may be somewhat conservative. Various parameters, such as Rmerge and I/σ(I), optical resolution and the correlation coefficients CC1/2 and CC*, can be used for judging the internal data quality, whereas the reliability factors R and Rfree as well as the maximum-likelihood target values and real-space map correlation coefficients can be used to estimate the agreement between the data and the refined model. However, none of these criteria provide a reliable estimate of the data resolution cutoff limit. The analysis suggests that extension of the maximum resolution by about 0.2 Å beyond the currently adopted limit where the I/σ(I) value drops to 2.0 does not degrade the quality of the refined structural models, but may sometimes be advantageous. Such an extension may be particularly beneficial for significantly anisotropic diffraction. Extension of the maximum resolution at the stage of data collection and structure refinement is cheap in terms of the required effort and is definitely more advisable than accepting a too conservative resolution cutoff, which is unfortunately quite frequent among the crystal structures deposited in the Protein Data Bank.
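
    For context, two of the internal-quality indicators named above can be computed directly from repeated intensity measurements; the sketch below evaluates Rmerge, CC1/2 (random half-dataset correlation) and the derived CC* for simulated multiplicity-4 data (the synthetic intensities and noise level are assumptions).

        import numpy as np

        rng = np.random.default_rng(3)
        n_refl, multiplicity = 500, 4
        true_I = rng.gamma(shape=2.0, scale=100.0, size=n_refl)          # toy true intensities
        obs = true_I[:, None] + rng.normal(scale=25.0, size=(n_refl, multiplicity))

        # Rmerge = sum_hkl sum_i |I_i - <I>| / sum_hkl sum_i I_i
        mean_I = obs.mean(axis=1, keepdims=True)
        r_merge = float(np.abs(obs - mean_I).sum() / obs.sum())

        # CC1/2: correlation between mean intensities of two half-datasets,
        # and CC* = sqrt(2*CC1/2 / (1 + CC1/2)).
        half1 = obs[:, : multiplicity // 2].mean(axis=1)
        half2 = obs[:, multiplicity // 2 :].mean(axis=1)
        cc_half = float(np.corrcoef(half1, half2)[0, 1])
        cc_star = float(np.sqrt(2.0 * cc_half / (1.0 + cc_half)))
        print(f"Rmerge = {r_merge:.3f}   CC1/2 = {cc_half:.3f}   CC* = {cc_star:.3f}")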

  18. Advanced Stirling Convertor Heater Head Durability and Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Shah, Ashwin R.; Korovaichuk, Igor; Kalluri, Sreeramesh

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has identified the high efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for long duration Science missions, such as lunar applications, Mars rovers, and deep space missions, that require reliable design lifetimes of up to 17 years. Resistance to creep deformation of the MarM-247 heater head (HH), a structurally critical component of the ASRG Advanced Stirling Convertor (ASC), under high temperatures (up to 850 C) is a key design driver for durability. Inherent uncertainties in the creep behavior of the thin-walled HH and the variations in the wall thickness, control temperature, and working gas pressure need to be accounted for in the life and reliability prediction. Due to the availability of very limited test data, assuring life and reliability of the HH is a challenging task. The NASA Glenn Research Center (GRC) has adopted an integrated approach combining available uniaxial MarM-247 material behavior testing, HH benchmark testing and advanced analysis in order to demonstrate the integrity, life and reliability of the HH under expected mission conditions. The proposed paper describes analytical aspects of the deterministic and probabilistic approaches and results. The deterministic approach involves development of the creep constitutive model for the MarM-247 (akin to the Oak Ridge National Laboratory master curve model used previously for Inconel 718 (Special Metals Corporation)) and nonlinear finite element analysis to predict the mean life. The probabilistic approach includes evaluation of the effect of design variable uncertainties in material creep behavior, geometry and operating conditions on life and reliability for the expected life. The sensitivity of the uncertainties in the design variables on the HH reliability is also quantified, and guidelines to improve reliability are discussed.

  19. Method for evaluating the reliability of compressor impeller of turbocharger for vehicle application in plateau area

    NASA Astrophysics Data System (ADS)

    Wang, Zheng; Wang, Zengquan; Wang, A.-na; Zhuang, Li; Wang, Jinwei

    2016-10-01

    As turbocharged diesel engines for vehicle applications are used in plateau (high-altitude) areas, the environmental adaptability of engines has drawn more attention. Existing studies of this environmental adaptability problem focus mainly on optimizing the performance match between the turbocharger and the engine, while the reliability of the turbocharger itself is largely ignored. This paper studies the reliability of the compressor impeller of a vehicle turbocharger when the diesel engine operates at high altitude. Firstly, the rule by which the rotational speed of the turbocharger changes with altitude is presented, and the potential failure modes of the compressor impeller are analyzed. Then, failure behavior models of the compressor impeller are built, and reliability models for the compressor impeller operating in plateau areas are developed. Finally, the rule by which the reliability of the compressor impeller changes with altitude is studied, and measures for improving the reliability of compressor impellers of turbochargers operating in plateau areas are given. The results indicate that, for a given engine operating speed, the rotational speed of the turbocharger increases with altitude, and the failure risk of the compressor impeller due to hub fatigue and blade resonance increases. The reliability of the compressor impeller decreases with increasing altitude, and it also decreases with the number of engine mission profile cycles. The proposed method can be used not only to evaluate the reliability of the compressor impeller when diesel engines operate at high altitude but also to guide the structural optimization of the compressor impeller.

  20. A reliability analysis of the revised competitiveness index.

    PubMed

    Harris, Paul B; Houston, John M

    2010-06-01

    This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, inter-item reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high inter-item reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.

  1. Modelling low velocity impact induced damage in composite laminates

    NASA Astrophysics Data System (ADS)

    Shi, Yu; Soutis, Constantinos

    2017-12-01

    The paper presents recent progress on modelling low velocity impact induced damage in fibre reinforced composite laminates. It is important to understand the mechanisms of barely visible impact damage (BVID) and how it affects structural performance. To reduce labour intensive testing, the development of finite element (FE) techniques for simulating impact damage becomes essential, and recent effort by the composites research community is reviewed in this work. The FE-predicted damage initiation and propagation can be validated by Non-Destructive Techniques (NDT), which gives confidence in the developed numerical damage models. A reliable damage simulation can assist the design process to optimise laminate configurations, reduce weight and improve performance of components and structures used in aircraft construction.

  2. Precision and reliability of periodically and quasiperiodically driven integrate-and-fire neurons.

    PubMed

    Tiesinga, P H E

    2002-04-01

    Neurons in the brain communicate via trains of all-or-none electric events known as spikes. How the brain encodes information using spikes-the neural code-remains elusive. Here the robustness against noise of stimulus-induced neural spike trains is studied in terms of attractors and bifurcations. The dynamics of model neurons converges after a transient onto an attractor yielding a reproducible sequence of spike times. At a bifurcation point the spike times on the attractor change discontinuously when a parameter is varied. Reliability, the stability of the attractor against noise, is reduced when the neuron operates close to a bifurcation point. We determined using analytical spike-time maps the attractor and bifurcation structure of an integrate-and-fire model neuron driven by a periodic or a quasiperiodic piecewise constant current and investigated the stability of attractors against noise. The integrate-and-fire model neuron became mode locked to the periodic current with a rational winding number p/q and produced p spikes per q cycles. There were q attractors. p:q mode-locking regions formed Arnold tongues. In the model, reliability was the highest during 1:1 mode locking when there was only one attractor, as was also observed in recent experiments. The quasiperiodically driven neuron mode locked to either one of the two drive periods, or to a linear combination of both of them. Mode-locking regions were organized in Arnold tongues and reliability was again highest when there was only one attractor. These results show that neuronal reliability in response to the rhythmic drive generated by synchronized networks of neurons is profoundly influenced by the location of the Arnold tongues in parameter space.
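
    To make the mode-locking idea concrete, the sketch below integrates a leaky integrate-and-fire neuron driven by a periodic piecewise-constant current and reports the average number of spikes per drive cycle (the winding number p/q); all parameter values are illustrative assumptions rather than those used in the study.

        import numpy as np

        def lif_spike_times(T=2000.0, dt=0.01, tau=10.0, v_th=1.0, v_reset=0.0,
                            period=25.0, i_high=1.4, i_low=0.6):
            """Leaky integrate-and-fire neuron, dV/dt = (-V + I(t)) / tau, driven by a
            piecewise-constant periodic current (I = i_high during the first half period)."""
            t, v, spikes = 0.0, 0.0, []
            while t < T:
                i_t = i_high if (t % period) < period / 2.0 else i_low
                v += dt * (-v + i_t) / tau
                if v >= v_th:
                    spikes.append(t)
                    v = v_reset
                t += dt
            return np.array(spikes)

        spikes = lif_spike_times()
        n_cycles = 2000.0 / 25.0
        print(f"spikes per drive cycle (p/q) ~ {len(spikes) / n_cycles:.2f}")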

  3. Spirituality as a Scientific Construct: Testing Its Universality across Cultures and Languages

    PubMed Central

    MacDonald, Douglas A.; Friedman, Harris L.; Brewczynski, Jacek; Holland, Daniel; Salagame, Kiran Kumar K.; Mohan, K. Krishna; Gubrij, Zuzana Ondriasova; Cheong, Hye Wook

    2015-01-01

    Using data obtained from 4004 participants across eight countries (Canada, India, Japan, Korea, Poland, Slovakia, Uganda, and the U.S.), the factorial reliability, validity and structural/measurement invariance of a 30-item version of the Expressions of Spirituality Inventory (ESI-R) were evaluated. The ESI-R measures a five factor model of spirituality developed through the conjoint factor analysis of several extant measures of spiritual constructs. Exploratory factor analyses of pooled data provided evidence that the five ESI-R factors are reliable. Confirmatory analyses comparing four and five factor models revealed that the five dimensional model demonstrates superior goodness-of-fit with all cultural samples and suggest that the ESI-R may be viewed as structurally invariant. Measurement invariance, however, was not supported as manifested in significant differences in item and dimension scores and in significantly poorer fit when factor loadings were constrained to equality across all samples. Exploratory analyses with a second adjective measure of spirituality using American, Indian, and Ugandan samples identified three replicable factors which correlated with ESI-R dimensions in a manner supportive of convergent validity. The paper concludes with a discussion of the meaning of the findings and directions needed for future research. PMID:25734921

  4. 3D Documentation and BIM Modeling of Cultural Heritage Structures Using UAVs: The Case of the Foinikaria Church

    NASA Astrophysics Data System (ADS)

    Themistocleous, K.; Agapiou, A.; Hadjimitsis, D.

    2016-10-01

    The documentation of architectural cultural heritage sites has traditionally been expensive and labor-intensive. New innovative technologies, such as Unmanned Aerial Vehicles (UAVs), provide an affordable, reliable and straightforward method of capturing cultural heritage sites, thereby providing a more efficient and sustainable approach to documentation of cultural heritage structures. In this study, hundreds of images of the Panagia Chryseleousa church in Foinikaria, Cyprus were taken using a UAV with an attached high resolution camera. The images were processed to generate an accurate digital 3D model by using Structure from Motion techniques. A Building Information Model (BIM) was then used to generate drawings of the church. The methodology described in the paper provides an accurate, simple and cost-effective method of documenting cultural heritage sites and generating digital 3D models using novel techniques and innovative methods.

  5. The reliability of molecular dynamics simulations of the multidrug transporter P-glycoprotein in a membrane environment

    PubMed Central

    Condic-Jurkic, Karmen; Subramanian, Nandhitha; Mark, Alan E.

    2018-01-01

    Despite decades of research, the mechanism of action of the ABC multidrug transporter P-glycoprotein (P-gp) remains elusive. Due to experimental limitations, many researchers have turned to molecular dynamics simulation studies in order to investigate different aspects of P-gp function. However, such studies are challenging and caution is required when interpreting the results. P-gp is highly flexible and the time scale on which it can be simulated is limited. There is also uncertainty regarding the accuracy of the various crystal structures available, let alone the structure of the protein in a physiologically relevant environment. In this study, three alternative structural models of mouse P-gp (3G5U, 4KSB, 4M1M), all resolved to 3.8 Å, were used to initiate sets of simulations of P-gp in a membrane environment in order to determine: a) the sensitivity of the results to differences in the starting configuration; and b) the extent to which converged results could be expected on the times scales commonly simulated for this system. The simulations suggest that the arrangement of the nucleotide binding domains (NBDs) observed in the crystal structures is not stable in a membrane environment. In all simulations, the NBDs rapidly associated (within 10 ns) and changes within the transmembrane helices were observed. The secondary structure within the transmembrane domain was best preserved in the 4M1M model under the simulation conditions used. However, the extent to which replicate simulations diverged on a 100 to 200 ns timescale meant that it was not possible to draw definitive conclusions as to which structure overall was most stable, or to obtain converged and reliable results for any of the properties examined. The work brings into question the reliability of conclusions made in regard to the nature of specific interactions inferred from previous simulation studies on this system involving similar sampling times. It also highlights the need to demonstrate the statistical significance of any results obtained in simulations of large flexible proteins, especially where the initial structure is uncertain. PMID:29370310

  6. The reliability of molecular dynamics simulations of the multidrug transporter P-glycoprotein in a membrane environment.

    PubMed

    Condic-Jurkic, Karmen; Subramanian, Nandhitha; Mark, Alan E; O'Mara, Megan L

    2018-01-01

    Despite decades of research, the mechanism of action of the ABC multidrug transporter P-glycoprotein (P-gp) remains elusive. Due to experimental limitations, many researchers have turned to molecular dynamics simulation studies in order to investigate different aspects of P-gp function. However, such studies are challenging and caution is required when interpreting the results. P-gp is highly flexible and the time scale on which it can be simulated is limited. There is also uncertainty regarding the accuracy of the various crystal structures available, let alone the structure of the protein in a physiologically relevant environment. In this study, three alternative structural models of mouse P-gp (3G5U, 4KSB, 4M1M), all resolved to 3.8 Å, were used to initiate sets of simulations of P-gp in a membrane environment in order to determine: a) the sensitivity of the results to differences in the starting configuration; and b) the extent to which converged results could be expected on the times scales commonly simulated for this system. The simulations suggest that the arrangement of the nucleotide binding domains (NBDs) observed in the crystal structures is not stable in a membrane environment. In all simulations, the NBDs rapidly associated (within 10 ns) and changes within the transmembrane helices were observed. The secondary structure within the transmembrane domain was best preserved in the 4M1M model under the simulation conditions used. However, the extent to which replicate simulations diverged on a 100 to 200 ns timescale meant that it was not possible to draw definitive conclusions as to which structure overall was most stable, or to obtain converged and reliable results for any of the properties examined. The work brings into question the reliability of conclusions made in regard to the nature of specific interactions inferred from previous simulation studies on this system involving similar sampling times. It also highlights the need to demonstrate the statistical significance of any results obtained in simulations of large flexible proteins, especially where the initial structure is uncertain.

  7. ModeRNA: a tool for comparative modeling of RNA 3D structure

    PubMed Central

    Rother, Magdalena; Rother, Kristian; Puton, Tomasz; Bujnicki, Janusz M.

    2011-01-01

    RNA is a large group of functionally important biomacromolecules. In striking analogy to proteins, the function of RNA depends on its structure and dynamics, which in turn is encoded in the linear sequence. However, while there are numerous methods for computational prediction of protein three-dimensional (3D) structure from sequence, with comparative modeling being the most reliable approach, there are very few such methods for RNA. Here, we present ModeRNA, a software tool for comparative modeling of RNA 3D structures. As an input, ModeRNA requires a 3D structure of a template RNA molecule, and a sequence alignment between the target to be modeled and the template. It must be emphasized that a good alignment is required for successful modeling, and for large and complex RNA molecules the development of a good alignment usually requires manual adjustments of the input data based on previous expertise of the respective RNA family. ModeRNA can model post-transcriptional modifications, a functionally important feature analogous to post-translational modifications in proteins. ModeRNA can also model DNA structures or use them as templates. It is equipped with many functions for merging fragments of different nucleic acid structures into a single model and analyzing their geometry. Windows and UNIX implementations of ModeRNA with comprehensive documentation and a tutorial are freely available. PMID:21300639

  8. Toward smart aerospace structures: design of a piezoelectric sensor and its analog interface for flaw detection.

    PubMed

    Boukabache, Hamza; Escriba, Christophe; Fourniols, Jean-Yves

    2014-10-31

    Structural health monitoring using noninvasive methods is one of the major challenges that aerospace manufacturers face in this decade. Our work in this field focuses on the development and the system integration of millimetric piezoelectric sensors/actuators to generate and measure specific guided waves. The aim of the application is to detect mechanical flaws on complex composite and alloy structures to quantify efficiently the global structures' reliability. The study begins with a physical and analytical analysis of a piezoelectric patch. To preserve the structure's integrity, the transducers are directly pasted onto the surface, which leads to a critical issue concerning the interfacing layer. In order to improve the reliability and mitigate the influence of the interfacing layer, the global equations of piezoelectricity are coupled with a load transfer model. Thus we can determine precisely the shear strain developed on the surface of the structure. To exploit the generated signal, a high precision analog charge amplifier coupled to a double-T notch filter were designed and scaled. A novel joint time-frequency analysis based on a wavelet decomposition algorithm is then used to extract relevant structural signatures. Finally, this paper provides examples of application on aircraft structure specimens, and the feasibility of the system is thus demonstrated.
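
    As a rough illustration of the wavelet-decomposition step used to extract structural signatures (the full joint time-frequency analysis and the analog signal chain described above are not reproduced), the sketch below decomposes a toy guided-wave-like tone burst with PyWavelets; the wavelet family, decomposition level and synthetic signal are assumptions.

        import numpy as np
        import pywt

        fs = 1_000_000                                   # 1 MHz sampling rate (assumed)
        t = np.arange(0, 2e-3, 1.0 / fs)
        # Toy "guided wave" echo: a Gaussian-windowed 150 kHz tone burst plus noise.
        burst = np.sin(2 * np.pi * 150e3 * t) * np.exp(-((t - 0.5e-3) ** 2) / (2 * (0.05e-3) ** 2))
        signal = burst + 0.05 * np.random.default_rng(4).normal(size=t.size)

        # Multi-level discrete wavelet decomposition; sub-band energies act as a signature.
        coeffs = pywt.wavedec(signal, "db4", level=5)
        energies = [float(np.sum(c ** 2)) for c in coeffs]
        print("energy per sub-band (approximation, then detail levels 5..1):")
        print(["%.3g" % e for e in energies])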

  9. Toward Smart Aerospace Structures: Design of a Piezoelectric Sensor and Its Analog Interface for Flaw Detection

    PubMed Central

    Boukabache, Hamza; Escriba, Christophe; Fourniols, Jean-Yves

    2014-01-01

    Structural health monitoring using noninvasive methods is one of the major challenges that aerospace manufacturers face in this decade. Our work in this field focuses on the development and the system integration of millimetric piezoelectric sensors/actuators to generate and measure specific guided waves. The aim of the application is to detect mechanical flaws on complex composite and alloy structures to quantify efficiently the global structures' reliability. The study begins with a physical and analytical analysis of a piezoelectric patch. To preserve the structure's integrity, the transducers are directly pasted onto the surface, which leads to a critical issue concerning the interfacing layer. In order to improve the reliability and mitigate the influence of the interfacing layer, the global equations of piezoelectricity are coupled with a load transfer model. Thus we can determine precisely the shear strain developed on the surface of the structure. To exploit the generated signal, a high precision analog charge amplifier coupled to a double-T notch filter were designed and scaled. A novel joint time-frequency analysis based on a wavelet decomposition algorithm is then used to extract relevant structural signatures. Finally, this paper provides examples of application on aircraft structure specimens, and the feasibility of the system is thus demonstrated. PMID:25365457

  10. MEMS Reliability Assurance Activities at JPL

    NASA Technical Reports Server (NTRS)

    Kayali, S.; Lawton, R.; Stark, B.

    2000-01-01

    An overview of Microelectromechanical Systems (MEMS) reliability assurance and qualification activities at JPL is presented, along with a discussion of the characterization of MEMS structures implemented on single crystal silicon, polycrystalline silicon, CMOS, and LIGA processes. Additionally, common failure modes and mechanisms affecting MEMS structures, including radiation effects, are discussed. Common reliability and qualification practices contained in the MEMS Reliability Assurance Guideline are also presented.

  11. A Pseudo-Atomic Model of the COPII Cage Obtained from CryoEM and Mass Spectrometry Analyses

    PubMed Central

    Noble, Alex J.; Zhang, Qian; O’Donnell, Jason; Hariri, Hanaa; Bhattacharya, Nilakshee; Marshall, Alan G.

    2012-01-01

    COPII vesicles transport proteins from the ER to the Golgi apparatus. Previous cryoEM structures of the COPII cage lacked the resolution necessary to determine the residues of Sec13 and Sec31 that mediate assembly and flexibility of the COPII cage. Here we present a 12Å-resolution structure of the COPII cage, where the tertiary structure of Sec13 and Sec31 is clearly identifiable. We employ this structure and a homology model of the Sec13-Sec31 complex to create a reliable pseudo-atomic model of the COPII cage. We combined this model with hydrogen/deuterium exchange mass spectrometry analysis to characterize four distinct contact regions at the vertices of the COPII cage. Furthermore, we found that the 2-fold symmetry of the Sec31 dimeric region of Sec13-31 is broken on cage formation, and that the resulting hinge is essential to form the proper edge geometry in COPII cages. PMID:23262493

  12. Template-based protein structure modeling using the RaptorX web server.

    PubMed

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2012-07-19

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world.

  13. The factor structure of the 12-item general health questionnaire (GHQ-12) in young Chinese civil servants.

    PubMed

    Liang, Ying; Wang, Lei; Yin, Xican

    2016-09-26

    The 12-item General Health Questionnaire (GHQ-12) is a commonly used screening instrument for measuring mental disorders. However, few studies have measured the mental health of Chinese professionals or explored the factor structure of the GHQ-12 in young Chinese civil servants. This study analyses the factor structure of the GHQ-12 among young Chinese civil servants. Respondents include 1051 participants from six cities in eastern China. Exploratory Factor Analysis (EFA) is used to identify the potential factor structure of the GHQ-12, and Confirmatory Factor Analysis (CFA) models from previous studies are referred to for model fitting. The results indicate that the GHQ-12 has very good reliability and validity; all ten CFA models are feasible and fit the actual data equally well. The Chinese version of the GHQ-12 is suitable for professional groups and can serve as a screening tool to detect anxiety and psychiatric disorders.

  14. Template-based protein structure modeling using the RaptorX web server

    PubMed Central

    Källberg, Morten; Wang, Haipeng; Wang, Sheng; Peng, Jian; Wang, Zhiyong; Lu, Hui; Xu, Jinbo

    2016-01-01

    A key challenge of modern biology is to uncover the functional role of the protein entities that compose cellular proteomes. To this end, the availability of reliable three-dimensional atomic models of proteins is often crucial. This protocol presents a community-wide web-based method using RaptorX (http://raptorx.uchicago.edu/) for protein secondary structure prediction, template-based tertiary structure modeling, alignment quality assessment and sophisticated probabilistic alignment sampling. RaptorX distinguishes itself from other servers by the quality of the alignment between a target sequence and one or multiple distantly related template proteins (especially those with sparse sequence profiles) and by a novel nonlinear scoring function and a probabilistic-consistency algorithm. Consequently, RaptorX delivers high-quality structural models for many targets with only remote templates. At present, it takes RaptorX ~35 min to finish processing a sequence of 200 amino acids. Since its official release in August 2011, RaptorX has processed ~6,000 sequences submitted by ~1,600 users from around the world. PMID:22814390

  15. RICOR's new development of a highly reliable integral rotary cooler: engineering and reliability aspects

    NASA Astrophysics Data System (ADS)

    Filis, Avishai; Pundak, Nachman; Barak, Moshe; Porat, Ze'ev; Jaeger, Mordechai

    2011-06-01

    The growing demand for EO applications that work around the clock, 24 hours a day, 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and decreased integrated system Life Cycle (ILS) cost. In order to meet this need, RICOR has developed a new rotary Stirling cryocooler, model K508N, intended to double the K508's operating MTTF and achieve 20,000 operating hours MTTF. The K508N employs RICOR's latest mechanical design technologies, such as optimized bearings and greases, bearing preloading, advanced seals, a laser-welded cold finger and a robust structure with increased natural frequency compared to the K508 model. The cooler's enhanced MTTF was demonstrated by a Validation and Verification (V&V) plan comprising analytical means and a comparative accelerated life test between the standard K508 and the K508N models. In particular, a point estimate and confidence interval for the MTTF improvement factor were calculated periodically during and after the test. The V&V effort revealed that the K508N meets its MTTF design goal. The paper will focus on the technical and engineering aspects of the new design. In addition, it will discuss the market needs and expectations, investigate the reliability data of the present reference K508 model, and report the accelerated life test data and the statistical analysis methodology as well as its underlying assumptions and results.
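
    One common way to turn life-test data into the point estimate and confidence interval mentioned above is the chi-square interval for an exponential (constant failure rate) lifetime model; the sketch below assumes a failure-truncated test and made-up pooled operating hours, and is not RICOR's data or necessarily the statistical methodology used in the paper.

        from scipy.stats import chi2

        def mttf_confidence_interval(total_time, failures, confidence=0.90):
            """Exponential-lifetime MTTF estimate with a two-sided chi-square confidence
            interval for a failure-truncated (Type-II censored) test."""
            alpha = 1.0 - confidence
            mttf_hat = total_time / failures
            lower = 2.0 * total_time / chi2.ppf(1.0 - alpha / 2.0, 2 * failures)
            upper = 2.0 * total_time / chi2.ppf(alpha / 2.0, 2 * failures)
            return mttf_hat, lower, upper

        # Assumed pooled operating hours and observed failures (illustration only).
        mttf, lo, hi = mttf_confidence_interval(total_time=120_000.0, failures=5)
        print(f"MTTF ~ {mttf:,.0f} h, 90% CI [{lo:,.0f}, {hi:,.0f}] h")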

  16. Statistical Models and Inference Procedures for Structural and Materials Reliability

    DTIC Science & Technology

    1990-12-01

    Some general stress-strength models were also developed and applied to the failure of systems subject to cyclic loading. Involved in the failure of... process control ideas and sequential design and analysis methods. Finally, smooth nonparametric quantile function estimators were studied. All of

  17. Reliability issues in active control of large flexible space structures

    NASA Technical Reports Server (NTRS)

    Vandervelde, W. E.

    1986-01-01

    Efforts in this reporting period were centered on four research tasks: design of failure detection filters for robust performance in the presence of modeling errors, design of generalized parity relations for robust performance in the presence of modeling errors, design of failure sensitive observers using the geometric system theory of Wonham, and computational techniques for evaluation of the performance of control systems with fault tolerance and redundancy management.

  18. A Jones matrix formalism for simulating three-dimensional polarized light imaging of brain tissue.

    PubMed

    Menzel, M; Michielsen, K; De Raedt, H; Reckfort, J; Amunts, K; Axer, M

    2015-10-06

    The neuroimaging technique three-dimensional polarized light imaging (3D-PLI) provides a high-resolution reconstruction of nerve fibres in human post-mortem brains. The orientations of the fibres are derived from birefringence measurements of histological brain sections assuming that the nerve fibres—consisting of an axon and a surrounding myelin sheath—are uniaxial birefringent and that the measured optic axis is oriented in the direction of the nerve fibres (macroscopic model). Although experimental studies support this assumption, the molecular structure of the myelin sheath suggests that the birefringence of a nerve fibre can be described more precisely by multiple optic axes oriented radially around the fibre axis (microscopic model). In this paper, we compare the use of the macroscopic and the microscopic model for simulating 3D-PLI by means of the Jones matrix formalism. The simulations show that the macroscopic model ensures a reliable estimation of the fibre orientations as long as the polarimeter does not resolve structures smaller than the diameter of single fibres. In the case of fibre bundles, polarimeters with even higher resolutions can be used without losing reliability. When taking the myelin density into account, the derived fibre orientations are considerably improved. © 2015 The Author(s).
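
    To make the Jones-calculus machinery concrete, the sketch below propagates light through a linear retarder (a stand-in for a birefringent nerve-fibre section) placed between crossed, co-rotating polarizers and recovers the sinusoidal intensity profile whose phase encodes the in-plane fibre direction; this simplified optical layout and the chosen retardation are assumptions, not the paper's full macroscopic or microscopic model of the 3D-PLI polarimeter.

        import numpy as np

        def rot(theta):
            c, s = np.cos(theta), np.sin(theta)
            return np.array([[c, -s], [s, c]])

        def polarizer(theta):
            """Jones matrix of an ideal linear polarizer with its axis at angle theta."""
            return rot(theta) @ np.array([[1.0, 0.0], [0.0, 0.0]]) @ rot(-theta)

        def retarder(theta, delta):
            """Jones matrix of a linear retarder (retardation delta, fast axis at theta)."""
            d = np.diag([np.exp(-1j * delta / 2.0), np.exp(1j * delta / 2.0)])
            return rot(theta) @ d @ rot(-theta)

        phi, delta = np.deg2rad(30.0), np.deg2rad(60.0)   # assumed fibre direction / retardation
        intensities = []
        for rho in np.deg2rad(np.arange(0.0, 180.0, 10.0)):
            e_in = np.array([np.cos(rho), np.sin(rho)])   # unit-intensity light from the polarizer
            e_out = polarizer(rho + np.pi / 2.0) @ retarder(phi, delta) @ e_in
            intensities.append(float(np.sum(np.abs(e_out) ** 2)))

        # Profile follows ~ sin^2(2*(phi - rho)) * sin^2(delta/2); its phase gives the direction.
        print(np.round(intensities, 3))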

  19. On-chip high frequency reliability and failure test structures

    DOEpatents

    Snyder, Eric S.; Campbell, David V.

    1997-01-01

    Self-stressing test structures for realistic high frequency reliability characterizations. An on-chip high frequency oscillator, controlled by DC signals from off-chip, provides a range of high frequency pulses to test structures. The test structures provide information with regard to a variety of reliability failure mechanisms, including hot-carriers, electromigration, and oxide breakdown. The system is normally integrated at the wafer level to predict the failure mechanisms of the production integrated circuits on the same wafer.

  20. The reliability of multidimensional neuropsychological measures: from alpha to omega.

    PubMed

    Watkins, Marley W

    To demonstrate that coefficient omega, a model-based estimate, is a more appropriate index of reliability than coefficient alpha for the multidimensional scales that are commonly employed by neuropsychologists. As an illustration, a structural model of an overarching general factor and four first-order factors for the WAIS-IV based on the standardization sample of 2200 participants was identified and omega coefficients were subsequently computed for WAIS-IV composite scores. Alpha coefficients were ≥ .90 and omega coefficients ranged from .75 to .88 for WAIS-IV factor index scores, indicating that the blend of general and group factor variance in each index score created a reliable multidimensional composite. However, the amalgam of variance from general and group factors did not allow the precision of Full Scale IQ (FSIQ) and factor index scores to be disentangled. In contrast, omega hierarchical coefficients were low for all four factor index scores (.10-.41), indicating that most of the reliable variance of each factor index score was due to the general intelligence factor. In contrast, the omega hierarchical coefficient for the FSIQ score was .84. Meaningful interpretation of WAIS-IV factor index scores as unambiguous indicators of group factors is imprecise, thereby fostering unreliable identification of neurocognitive strengths and weaknesses, whereas the WAIS-IV FSIQ score can be interpreted as a reliable measure of general intelligence. It was concluded that neuropsychologists should base their clinical decisions on reliable scores as indexed by coefficient omega.
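
    The omega and omega-hierarchical values discussed above follow directly from the loadings of a fitted factor model; the sketch below applies the standard composite-score formulas to made-up loadings for a small bifactor structure (illustrative numbers only, not the WAIS-IV estimates).

        import numpy as np

        def omega_coefficients(general, groups, resid_var):
            """Omega-total and omega-hierarchical for a summed score under a bifactor model.
            general: general-factor loadings; groups: list of group-factor loading arrays
            (zeros for items not loading on that group); resid_var: item residual variances."""
            var_general = np.sum(general) ** 2
            var_groups = sum(np.sum(g) ** 2 for g in groups)
            var_total = var_general + var_groups + np.sum(resid_var)
            omega_t = (var_general + var_groups) / var_total
            omega_h = var_general / var_total
            return omega_t, omega_h

        # Six items, one general and two group factors (assumed standardized loadings).
        general = np.array([0.6, 0.6, 0.6, 0.5, 0.5, 0.5])
        groups = [np.array([0.4, 0.4, 0.4, 0.0, 0.0, 0.0]),
                  np.array([0.0, 0.0, 0.0, 0.3, 0.3, 0.3])]
        resid = 1.0 - general ** 2 - sum(g ** 2 for g in groups)
        omega_t, omega_h = omega_coefficients(general, groups, resid)
        print(f"omega_total = {omega_t:.2f}, omega_hierarchical = {omega_h:.2f}")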

  1. Linking Structural Equation Modelling with Bayesian Network and Coastal Phytoplankton Dynamics in Bohai Bay

    NASA Astrophysics Data System (ADS)

    Chu, Jiangtao; Yang, Yue

    2018-06-01

    Bayesian networks (BN) have many advantages over other methods in ecological modelling and have become an increasingly popular modelling tool. However, BN are flawed in regard to building models based on inadequate existing knowledge. To overcome this limitation, we propose a new method that links BN with structural equation modelling (SEM). In this method, SEM is used to improve the model structure for BN. This method was used to simulate coastal phytoplankton dynamics in Bohai Bay. We demonstrate that this hybrid approach minimizes the need for expert elicitation, generates more reasonable structures for BN models and increases the BN model's accuracy and reliability. These results suggest that the inclusion of SEM for testing and verifying the theoretical structure during the initial construction stage improves the effectiveness of BN models, especially for complex eco-environment systems. The results also demonstrate that in Bohai Bay, while phytoplankton biomass has the greatest influence on phytoplankton dynamics, the impact of nutrients on phytoplankton dynamics is larger than the influence of the physical environment in summer. Furthermore, despite the Redfield ratio indicating that phosphorus should be the primary nutrient limiting factor, our results indicate that silicate plays the most important role in regulating phytoplankton dynamics in Bohai Bay.

  2. Refinement of protein termini in template-based modeling using conformational space annealing.

    PubMed

    Park, Hahnbeom; Ko, Junsu; Joo, Keehyoung; Lee, Julian; Seok, Chaok; Lee, Jooyoung

    2011-09-01

    The rapid increase in the number of experimentally determined protein structures in recent years enables us to obtain more reliable protein tertiary structure models than ever by template-based modeling. However, refinement of template-based models beyond the limit available from the best templates is still needed for understanding protein function in atomic detail. In this work, we develop a new method for protein terminus modeling that can be applied to refinement of models with unreliable terminus structures. The energy function for terminus modeling consists of both physics-based and knowledge-based potential terms with carefully optimized relative weights. Effective sampling of both the framework and terminus is performed using the conformational space annealing technique. This method has been tested on a set of termini derived from a nonredundant structure database and two sets of termini from the CASP8 targets. The performance of the terminus modeling method is significantly improved over our previous method that does not employ terminus refinement. It is also comparable or superior to the best server methods tested in CASP8. The success of the current approach suggests that similar strategy may be applied to other types of refinement problems such as loop modeling or secondary structure rearrangement. Copyright © 2011 Wiley-Liss, Inc.

  3. The Reliability of Psychiatric Diagnosis Revisited

    PubMed Central

    Rankin, Eric; France, Cheryl; El-Missiry, Ahmed; John, Collin

    2006-01-01

    Background: The authors reviewed the topic of reliability of psychiatric diagnosis from the turn of the 20th century to the present. The objectives of this paper are to explore the reasons for the unreliability of psychiatric diagnosis and propose ways to improve it. Method: The authors reviewed the literature on the concept of reliability of psychiatric diagnosis with emphasis on the impact of interviewing skills, use of diagnostic criteria, and structured interviews on the reliability of psychiatric diagnosis. Results: Causes of diagnostic unreliability are attributed to the patient, the clinician and psychiatric nomenclature. The reliability of psychiatric diagnosis can be enhanced by using diagnostic criteria, defining psychiatric symptoms and structuring the interviews. Conclusions: The authors propose the acronym ‘DR.SED,' which stands for diagnostic criteria, reference definitions, structuring the interview, clinical experience, and data. The authors recommend that clinicians use the DR.SED paradigm to improve the reliability of psychiatric diagnoses. PMID:21103149

  4. On the design of high-rise buildings with a specified level of reliability

    NASA Astrophysics Data System (ADS)

    Dolganov, Andrey; Kagan, Pavel

    2018-03-01

    High-rise buildings have specific features that significantly distinguish them from traditional multi-storey buildings. Steel structures are advisable for high-rise buildings in earthquake-prone regions, since steel, owing to its plasticity, damps the kinetic energy of seismic impacts. These aspects should be taken into account when choosing a structural scheme for a high-rise building and designing its load-bearing structures. Modern regulatory documents currently do not quantify the reliability of structures, although the problem of assigning an optimal level of reliability has existed for a long time. The article shows the possibility of designing the metal structures of high-rise buildings with a specified reliability. It is proposed to set the reliability at 0.99865 (3σ) for buildings and structures of a normal level of responsibility in calculations for the first group of limiting states. For increased (high-rise construction) and reduced levels of responsibility, load-bearing capacity reliabilities of 0.99997 (4σ) and 0.97725 (2σ), respectively, are proposed. The utilization coefficients of the cross section of a metal beam for the different reliability levels are given.
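
    The quoted reliability targets can be reproduced by treating each sigma level as a one-sided standard-normal probability, which appears to be the convention used here (an assumption); a short check with SciPy:

      from scipy.stats import norm

      # One-sided standard-normal probabilities for the quoted sigma levels.
      for sigma in (2, 3, 4):
          print(f"{sigma} sigma -> reliability {norm.cdf(sigma):.5f}")
      # 2 sigma -> 0.97725, 3 sigma -> 0.99865, 4 sigma -> 0.99997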

  5. The role of population inertia in predicting the outcome of stage-structured biological invasions.

    PubMed

    Guiver, Chris; Dreiwi, Hanan; Filannino, Donna-Maria; Hodgson, Dave; Lloyd, Stephanie; Townley, Stuart

    2015-07-01

    Deterministic dynamic models for coupled resident and invader populations are considered with the purpose of finding quantities that are effective at predicting when the invasive population will become established asymptotically. A key feature of the models considered is the stage structure, meaning that the populations are described by vectors of discrete developmental stage or age classes. The vector structure permits exotic transient behaviour: phenomena not encountered in scalar models. Analysis using a linear Lyapunov function demonstrates that, for the class of population models considered, a large so-called population inertia is indicative of successful invasion. Population inertia is an indicator of transient growth or decline. Furthermore, for the class of models considered, we find that the so-called invasion exponent, an existing index used in models for invasion, is not always a reliable comparative indicator of successful invasion. We highlight these findings through numerical examples, and a biological interpretation of why this might be the case is discussed. Copyright © 2015. Published by Elsevier Inc.
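
    For readers wanting a concrete calculation, the sketch below computes population inertia for a stage-structured projection matrix using one common definition from transient demography (asymptotic abundance relative to a population started on the stable stage distribution). The definition, the two-stage matrix and the starting vector are illustrative assumptions, not values taken from the paper:

      import numpy as np

      def population_inertia(A, n0):
          # Right eigenvector of the leading eigenvalue: stable stage distribution.
          # Left eigenvector: reproductive values.
          A, n0 = np.asarray(A, float), np.asarray(n0, float)
          vals, right = np.linalg.eig(A)
          w = np.abs(right[:, np.argmax(vals.real)].real)
          vals_t, left = np.linalg.eig(A.T)
          v = np.abs(left[:, np.argmax(vals_t.real)].real)
          return (v @ n0) * w.sum() / ((v @ w) * n0.sum())

      # Hypothetical juvenile/adult projection matrix and a juvenile-heavy start:
      A = [[0.0, 1.5],
           [0.4, 0.8]]
      print(population_inertia(A, [10.0, 1.0]))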

  6. Hysteretic Models Considering Axial-Shear-Flexure Interaction

    NASA Astrophysics Data System (ADS)

    Ceresa, Paola; Negrisoli, Giorgio

    2017-10-01

    Most of the existing numerical models implemented in finite element (FE) software are, at the current state of the art, not capable of describing with sufficient reliability the interaction between axial, shear and flexural actions under cyclic loading (e.g. seismic actions), and thus neglect effects that are crucial for predicting the nature of the collapse of reinforced concrete (RC) structural elements. A few existing 3D volume models or fibre beam models can produce a fairly accurate response, but they are still computationally inefficient for typical applications in earthquake engineering and are characterized by very complex formulations. Discrete models with lumped plasticity hinges may therefore be the preferred choice for modelling the hysteretic behaviour under cyclic loading conditions, in particular with reference to implementation in a commercial software package. These considerations led to this research work, which is focused on the development of a model for RC beam-column elements able to consider degradation effects and the interaction between the actions under cyclic loading conditions. In order to develop a general 3D discrete hinge element able to take into account the axial-shear-flexural interaction, it is necessary to provide an implementation that involves a predictor-corrector iterative scheme. Furthermore, a reliable constitutive model based on damage plasticity theory is formulated and implemented for its numerical validation. The aim of this research work is to provide the formulation of a numerical model that allows implementation within a FE software package for nonlinear cyclic analysis of RC structural members. The developed model accounts for stiffness degradation and stiffness recovery upon load reversal.

  7. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
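
    The modularization idea, combining independently assessed module reliabilities into a system-level figure, can be illustrated with a toy series/parallel calculation. This is a generic sketch assuming independent failures; it is not RML syntax and does not reproduce the FMES message-passing machinery:

      def series(*reliabilities):
          # System works only if every module works (independent modules).
          r = 1.0
          for ri in reliabilities:
              r *= ri
          return r

      def parallel(*reliabilities):
          # System works if at least one redundant module works.
          q = 1.0
          for ri in reliabilities:
              q *= 1.0 - ri
          return 1.0 - q

      # Hypothetical channel: two redundant sensors feeding one processor.
      sensor, processor = 0.999, 0.9995
      print(series(parallel(sensor, sensor), processor))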

  8. Software reliability models for fault-tolerant avionics computers and related topics

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1987-01-01

    Software reliability research is briefly described. General research topics are reliability growth models, quality of software reliability prediction, the complete monotonicity property of reliability growth, conceptual modelling of software failure behavior, assurance of ultrahigh reliability, and analysis techniques for fault-tolerant systems.

  9. Modeling the Biodegradability of Chemical Compounds Using the Online CHEmical Modeling Environment (OCHEM)

    PubMed Central

    Vorberg, Susann

    2013-01-01

    Biodegradability describes the capacity of substances to be mineralized by free-living bacteria. It is a crucial property in estimating a compound's long-term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to the unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined into a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660. PMID:27485201

  10. 3D electron density distributions in the solar corona during solar minima: assessment for more realistic solar wind modeling

    NASA Astrophysics Data System (ADS)

    de Patoul, J.; Foullon, C.; Riley, P.

    2015-12-01

    Knowledge of the electron density distribution in the solar corona puts constraints on the magnetic field configurations for coronal modeling and on initial conditions for solar wind modeling. We work with polarized SOHO/LASCO-C2 images from the last two minima of solar activity (1996-1997 and 2008-2010), devoid of coronal mass ejections. We derive the 4D electron density distributions in the corona by applying a newly developed time-dependent tomographic reconstruction method. First, we compare the density distributions obtained from tomography with magnetohydrodynamic (MHD) solutions. The tomography provides more accurate distributions of electron densities in the polar regions, and we find that the observed density varies with the solar cycle in both polar and equatorial regions. Second, we find that the highest-density structures do not always correspond to the predicted large-scale heliospheric current sheet or its helmet streamer but can follow the locations of pseudo-streamers. We conclude that tomography offers reliable density distributions in the corona, reproducing the slow time evolution of coronal structures, without prior knowledge of the coronal magnetic field over a full rotation. Finally, we suggest that the highest-density structures show a differential rotation well above the surface, depending on how they are magnetically connected to the surface. Such valuable information on the rotation of large-scale structures could help to connect the sources of the solar wind to their in-situ counterparts in future missions such as Solar Orbiter and Solar Probe Plus. This research, combined with MHD coronal modeling efforts, has the potential to increase the reliability of future space weather forecasting.

  11. Probing the Detailed Seismic Velocity Structure of Subduction Zones Using Advanced Seismic Tomography Methods

    NASA Astrophysics Data System (ADS)

    Zhang, H.; Thurber, C. H.

    2005-12-01

    Subduction zones are one of the most important components of the Earth's plate tectonic system. Knowing the detailed seismic velocity structure within and around subducting slabs is vital to understanding the constitution of the slab, the cause of intermediate-depth earthquakes inside the slab, the fluid distribution and recycling, and tremor occurrence [Hacker et al., 2001; Obara, 2002]. Thanks to the ability of double-difference tomography [Zhang and Thurber, 2003] to resolve the fine-scale structure near the source region and the favorable seismicity distribution inside many subducting slabs, it is now possible to characterize the fine details of the velocity structure and earthquake locations inside the slab, as shown in the study of the Japan subduction zone [Zhang et al., 2004]. We further develop the double-difference tomography method in two aspects: the first improvement is to use an adaptive inversion mesh rather than a regular inversion grid, and the second improvement is to determine a reliable Vp/Vs structure using various strategies rather than directly from Vp and Vs [see our abstract "Strategies to solve for a better Vp/Vs model using P and S arrival time" at Session T29]. The adaptive mesh seismic tomography method is based on tetrahedral diagrams and can automatically adjust the inversion mesh according to the ray distribution so that the inversion mesh nodes are denser where there are more rays and vice versa [Zhang and Thurber, 2005]. As a result, the number of inversion mesh nodes is greatly reduced compared to a regular inversion grid with comparable spatial resolution, and the tomographic system is more stable and better conditioned. This improvement is quite valuable for characterizing the fine structure of the subduction zone considering the highly uneven distribution of earthquakes within and around the subducting slab. The second improvement, to determine a reliable Vp/Vs model, lies in jointly inverting Vp, Vs, and Vp/Vs using P, S, and S-P times in a manner similar to double-difference tomography. Obtaining a reliable Vp/Vs model of the subduction zone is helpful for understanding its mechanical and petrologic properties. Our applications of the original version of double-difference tomography to several subduction zones beneath northern Honshu, Japan, the Wellington region, New Zealand, and Alaska, United States, have shown evident velocity variations within and around the subducting slab, which is likely evidence of dehydration reactions of various hydrous minerals that are hypothesized to be responsible for intermediate-depth earthquakes. We will show the new velocity models for these subduction zones by applying our advanced tomographic methods.

  12. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  13. Reliability-based optimization of maintenance scheduling of mechanical components under fatigue

    PubMed Central

    Beaurepaire, P.; Valdebenito, M.A.; Schuëller, G.I.; Jensen, H.A.

    2012-01-01

    This study presents the optimization of the maintenance scheduling of mechanical components under fatigue loading. The cracks of damaged structures may be detected during non-destructive inspection and subsequently repaired. Fatigue crack initiation and growth show inherent variability, as does the outcome of inspection activities. The problem is addressed within the framework of reliability-based optimization. The initiation and propagation of fatigue cracks are efficiently modeled using cohesive zone elements. The applicability of the method is demonstrated by a numerical example, which involves a plate with two holes subject to alternating stress. PMID:23564979

  14. On the Genesis of Reliability Models.

    DTIC Science & Technology

    1982-07-01

    [Abstract text not recoverable: the record contains only OCR-garbled equation and citation fragments, including A. O. Payne, "A reliability approach to the fatigue of structures," ASTM STP 511, pp. 106-155, 1972.]

  15. Homology modeling and molecular dynamics simulation of the HIF2α degradation-related HIF2α-VHL complex.

    PubMed

    Dong, Xiaotian; Su, Xiaoru; Yu, Jiong; Liu, Jingqi; Shi, Xiaowei; Pan, Qiaoling; Yang, Jinfeng; Chen, Jiajia; Li, Lanjuan; Cao, Hongcui

    2017-01-01

    Hypoxia-inducible factor 2 alpha (HIF2α), prolyl hydroxylase domain protein 2 (PHD2), and the von Hippel Lindau tumor suppressor protein (pVHL) are three principal proteins in the oxygen-sensing pathway. Under normoxic conditions, a conserved proline in HIF2α is hydroxylated by PHD2 in an oxygen-dependent manner, and then pVHL binds and promotes the degradation of HIF2α. However, the crystal structure of the HIF2α-pVHL complex has not yet been established, and this has limited research on the interaction between HIF and pVHL. Here, we constructed a structural model of a 23-residue HIF2α peptide (528-550)-pVHL-ElonginB-ElonginC complex by using homology modeling and molecular dynamics simulations. We also applied these methods to HIF2α mutants (HYP531PRO, F540L, A530V, A530T, and G537R) to reveal structural defects that explain how these mutations weaken the interaction with pVHL. Homology modeling and molecular dynamics simulations were used to construct a three-dimensional (3D) structural model of the HIF2α-VHL complex. Subsequently, MolProbity, an active validation tool, was used to analyze the reliability of the model. Molecular mechanics energies combined with the generalized Born and surface area continuum solvation (MM-GBSA) and solvated interaction energy (SIE) methods were used to calculate the binding free energy between HIF2α and pVHL, and the stability of the simulation system was evaluated by using root mean square deviation (RMSD) analysis. We also determined the secondary structure of the system by using the definition of secondary structure of proteins (DSSP) algorithm. Finally, we investigated the structural significance of specific point mutations known to have clinical implications. We established a reliable structural model of the HIF2α-pVHL complex, which is similar to the crystal structure of HIF1α in 1LQB. Furthermore, we compared the structural model of the HIF2α-pVHL complex and the HIF2α (HYP531P, F540L, A530V, A530T, and G537R)-pVHL mutants on the basis of RMSD, DSSP, binding free energy, and hydrogen bonding. The experimental data indicate that the stability of the structural model of the HIF2α-pVHL complex is higher than that of the mutants, consistent with clinical observations. The structural model of the HIF2α-pVHL complex presented in this study enhances understanding of how HIF2α is captured by pVHL. Moreover, the important contact amino acids that we identified may be useful in the development of drugs to treat HIF2α-related diseases. Copyright © 2016 Elsevier Inc. All rights reserved.
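
    As a small illustration of the RMSD measure used above to judge the stability of the simulated system, the function below computes the RMSD between two already superimposed conformations. It is a generic textbook formula with hypothetical coordinates, and does not reproduce the authors' MD or alignment pipeline:

      import numpy as np

      def rmsd(coords_a, coords_b):
          # Root mean square deviation between two (n_atoms, 3) coordinate sets
          # that have already been superimposed onto each other.
          a, b = np.asarray(coords_a, float), np.asarray(coords_b, float)
          return np.sqrt(((a - b) ** 2).sum(axis=1).mean())

      # Hypothetical three-atom fragment in two snapshots:
      frame_0 = [[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [1.5, 1.5, 0.0]]
      frame_t = [[0.1, 0.0, 0.0], [1.4, 0.2, 0.0], [1.6, 1.4, 0.1]]
      print(rmsd(frame_0, frame_t))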

  16. Vibration analysis in reciprocating compressors

    NASA Astrophysics Data System (ADS)

    Kacani, V.

    2017-08-01

    This paper presents the influence of modelling on the mechanical natural frequencies, the effect of inertia loads on structure vibration, and the impact of crank gear damping on speed fluctuation, with the aim of ensuring safe operation and increasing the reliability of reciprocating compressors. It is shown that the conventional way of modelling is not sufficient; for best results it is necessary to include the whole system (bare block, frame, coupling, main driver, vessels, pipe work, etc.) in the model (see results in Table 1).

  17. Modeling Self-Heating Effects in Nanoscale Devices

    NASA Astrophysics Data System (ADS)

    Raleva, K.; Shaik, A. R.; Vasileska, D.; Goodnick, S. M.

    2017-08-01

    Accurate thermal modeling and the design of microelectronic devices and thin film structures at the micro- and nanoscales pose a challenge to electrical engineers who are less familiar with the basic concepts and ideas of sub-continuum heat transport. This book aims to bridge that gap. Efficient heat removal methods are necessary to increase device performance and reliability. The authors provide readers with a combination of nanoscale experimental techniques and accurate modeling methods that must be employed in order to determine a device's temperature profile.

  18. Appearance motives to tan and not tan: evidence for validity and reliability of a new scale.

    PubMed

    Cafri, Guy; Thompson, J Kevin; Roehrig, Megan; Rojas, Ariz; Sperry, Steffanie; Jacobsen, Paul B; Hillhouse, Joel

    2008-04-01

    Risk for skin cancer is increased by UV exposure and decreased by sun protection. Appearance reasons to tan and not tan have consistently been shown to be related to UV exposure and protection intentions and behaviors. This study was designed to determine the factor structure of appearance motives to tan and not tan, evaluate the extent to which this factor structure is gender invariant, test for mean differences in the identified factors, and evaluate internal consistency, temporal stability, and criterion-related validity. Five hundred eighty-nine female and 335 male college students were used to test confirmatory factor analysis models within and across gender groups, estimate latent mean differences, and use the correlation coefficient and Cronbach's alpha to further evaluate the reliability and validity of the identified factors. A measurement invariant (i.e., factor-loading invariant) model was identified with three higher-order factors: sociocultural influences to tan (lower-order factors: media, friends, family, significant others), appearance reasons to tan (general, acne, body shape), and appearance reasons not to tan (skin aging, immediate skin damage). Females had significantly higher means than males on all higher-order factors. All subscales had evidence of internal consistency, temporal stability, and criterion-related validity. This study offers a framework and measurement instrument that has evidence of validity and reliability for evaluating appearance-based motives to tan and not tan.

  19. Measuring chronic condition self-management in an Australian community: factor structure of the revised Partners in Health (PIH) scale.

    PubMed

    Smith, David; Harvey, Peter; Lawn, Sharon; Harris, Melanie; Battersby, Malcolm

    2017-01-01

    To evaluate the factor structure of the revised Partners in Health (PIH) scale for measuring chronic condition self-management in a representative sample from the Australian community. A series of consultations between clinical groups underpinned the revision of the PIH. The factors in the revised instrument were proposed to be: knowledge of illness and treatment, patient-health professional partnership, recognition and management of symptoms and coping with chronic illness. Participants (N = 904) reporting having a chronic illness completed the revised 12-item scale. Two a priori models, the 4-factor and bi-factor models were then evaluated using Bayesian confirmatory factor analysis (BCFA). Final model selection was established on model complexity, posterior predictive p values and deviance information criterion. Both 4-factor and bi-factor BCFA models with small informative priors for cross-loadings provided an acceptable fit with the data. The 4-factor model was shown to provide a better and more parsimonious fit with the observed data in terms of substantive theory. McDonald's omega coefficients indicated that the reliability of subscale raw scores was mostly in the acceptable range. The findings showed that the PIH scale is a relevant and structurally valid instrument for measuring chronic condition self-management in an Australian community. The PIH scale may help health professionals to introduce the concept of self-management to their patients and provide assessment of areas of self-management. A limitation is the narrow range of validated PIH measurement properties to date. Further research is needed to evaluate other important properties such as test-retest reliability, responsiveness over time and content validity.

  20. Optoelectronic transport properties in amorphous/crystalline silicon solar cell heterojunctions measured by frequency-domain photocarrier radiometry: multi-parameter measurement reliability and precision studies.

    PubMed

    Zhang, Y; Melnikov, A; Mandelis, A; Halliop, B; Kherani, N P; Zhu, R

    2015-03-01

    A theoretical one-dimensional two-layer linear photocarrier radiometry (PCR) model including the presence of effective interface carrier traps was used to evaluate the transport parameters of p-type hydrogenated amorphous silicon (a-Si:H) and n-type crystalline silicon (c-Si) passivated by an intrinsic hydrogenated amorphous silicon (i-layer) nanolayer. Several crystalline Si heterojunction structures were examined to investigate the influence of the i-layer thickness and the doping concentration of the a-Si:H layer. The experimental data of a series of heterojunction structures with intrinsic thin layers were fitted to PCR theory to gain insight into the transport properties of these devices. The quantitative multi-parameter results were studied with regard to measurement reliability (uniqueness) and precision using two independent computational best-fit programs. Two key parameters that can limit the performance of amorphous thin-film solar cells, namely the doping concentration of the a-Si:H layer and the i-layer thickness, were shown to have a considerable influence on the transport properties of the entire structure. It was shown that PCR can be applied to the non-destructive characterization of a-Si:H/c-Si heterojunction solar cells, yielding reliable measurements of the key parameters.

  1. Optoelectronic transport properties in amorphous/crystalline silicon solar cell heterojunctions measured by frequency-domain photocarrier radiometry: Multi-parameter measurement reliability and precision studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Y.; Institute of Electronic Engineering and Optoelectronic Technology, Nanjing University of Science and Technology, Nanjing, Jiangsu 210094; Melnikov, A.

    2015-03-15

    A theoretical one-dimensional two-layer linear photocarrier radiometry (PCR) model including the presence of effective interface carrier traps was used to evaluate the transport parameters of p-type hydrogenated amorphous silicon (a-Si:H) and n-type crystalline silicon (c-Si) passivated by an intrinsic hydrogenated amorphous silicon (i-layer) nanolayer. Several crystalline Si heterojunction structures were examined to investigate the influence of the i-layer thickness and the doping concentration of the a-Si:H layer. The experimental data of a series of heterojunction structures with intrinsic thin layers were fitted to PCR theory to gain insight into the transport properties of these devices. The quantitative multi-parameter results were studied with regard to measurement reliability (uniqueness) and precision using two independent computational best-fit programs. Two key parameters that can limit the performance of amorphous thin-film solar cells, namely the doping concentration of the a-Si:H layer and the i-layer thickness, were shown to have a considerable influence on the transport properties of the entire structure. It was shown that PCR can be applied to the non-destructive characterization of a-Si:H/c-Si heterojunction solar cells, yielding reliable measurements of the key parameters.

  2. Tensile failure criteria for fiber composite materials

    NASA Technical Reports Server (NTRS)

    Rosen, B. W.; Zweben, C. H.

    1972-01-01

    The analysis provides insight into the failure mechanics of these materials and defines criteria which serve as tools for preliminary design material selection and for material reliability assessment. The model incorporates both dispersed and propagation type failures and includes the influence of material heterogeneity. The important effects of localized matrix damage and post-failure matrix shear stress transfer are included in the treatment. The model is used to evaluate the influence of key parameters on the failure of several commonly used fiber-matrix systems. Analyses of three possible failure modes were developed. These modes are the fiber break propagation mode, the cumulative group fracture mode, and the weakest link mode. Application of the new model to composite material systems has indicated several results which require attention in the development of reliable structural composites. Prominent among these are the size effect and the influence of fiber strength variability.

  3. Flow Channel Influence of a Collision-Based Piezoelectric Jetting Dispenser on Jet Performance

    PubMed Central

    Deng, Guiling; Li, Junhui; Duan, Ji’an

    2018-01-01

    To improve the jet performance of a bi-piezoelectric jet dispenser, mathematical and simulation models were established according to the operating principle. In order to improve the accuracy and reliability of the simulation, the fluid viscosity model was fitted as a fifth-order function of shear rate based on rheological test data, and the needle displacement model was fitted as a ninth-order function of time based on real-time displacement test data. The results show that jet performance is related to the diameter of the nozzle outlet and the cone angle of the nozzle, and the impact of the flow channel structure was confirmed. The numerical simulation approach is confirmed by droplet volume test results. This work provides a reliable simulation platform for mechanical collision-based jet dispensing and a theoretical basis for micro jet valve design and improvement. PMID:29677140

  4. Population-based validation of a German version of the Brief Resilience Scale

    PubMed Central

    Wenzel, Mario; Stieglitz, Rolf-Dieter; Kunzler, Angela; Bagusat, Christiana; Helmreich, Isabella; Gerlicher, Anna; Kampa, Miriam; Kubiak, Thomas; Kalisch, Raffael; Lieb, Klaus; Tüscher, Oliver

    2018-01-01

    Smith and colleagues developed the Brief Resilience Scale (BRS) to assess the individual ability to recover from stress despite significant adversity. This study aimed to validate the German version of the BRS. We used data from a population-based (sample 1: n = 1,481) and a representative (sample 2: n = 1,128) sample of participants from the German general population (age ≥ 18) to assess reliability and validity. Confirmatory factor analyses (CFA) were conducted to compare one- and two-factorial models from previous studies with a method-factor model that specifically accounts for the wording of the items. Reliability was analyzed, and convergent validity was measured by correlating BRS scores with mental health measures, coping, social support, and optimism. Reliability was good (α = .85, ω = .85 for both samples). The method-factor model showed excellent model fit (sample 1: χ2/df = 7.544; RMSEA = .07; CFI = .99; SRMR = .02; sample 2: χ2/df = 1.166; RMSEA = .01; CFI = 1.00; SRMR = .01), which was significantly better than the one-factor model (Δχ2(4) = 172.71, p < .001) or the two-factor model (Δχ2(3) = 31.16, p < .001). The BRS was positively correlated with well-being, social support, optimism, and the coping strategies active coping, positive reframing, acceptance, and humor. It was negatively correlated with somatic symptoms, anxiety and insomnia, social dysfunction, depression, and the coping strategies religion, denial, venting, substance use, and self-blame. To conclude, our results provide evidence for the reliability and validity of the German adaptation of the BRS as well as the unidimensional structure of the scale once method effects are accounted for. PMID:29438435
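
    The model comparisons reported above are chi-square difference (likelihood-ratio) tests between nested CFA models; the reported values can be turned into p-values directly. A short check using the figures from the abstract:

      from scipy.stats import chi2

      # Method-factor model vs. one-factor and two-factor models.
      for delta_chi2, delta_df in ((172.71, 4), (31.16, 3)):
          p = chi2.sf(delta_chi2, delta_df)
          print(f"delta-chi2({delta_df}) = {delta_chi2} -> p = {p:.2e}")
      # Both p-values fall well below .001, favouring the method-factor model.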

  5. The inverse power law model for the lifetime of a mylar-polyurethane laminated dc hv insulating structure

    NASA Astrophysics Data System (ADS)

    Kalkanis, G.; Rosso, E.

    1989-09-01

    Results of an accelerated test on the lifetime of a mylar-polyurethane laminated dc high voltage insulating structure are reported. This structure consists of mylar ribbons placed side by side in a number of layers, staggered and glued together with a polyurethane adhesive. The lifetime until breakdown as a function of extremely high values of voltage stress is measured and represented by a mathematical model, the inverse power law model with a 2-parameter Weibull lifetime distribution. The statistical treatment of the data — either by graphical or by analytical methods — allowed us to estimate the lifetime distribution and confidence bounds for any required normal voltage stress. The laminated structure under consideration is, according to the analysis, a very reliable dc hv insulating material, with a very good life performance according to the inverse power law model, and with an exponent of voltage stress equal to 6. A large insulator of cylindrical shape with this kind of laminated structure can be constructed by winding helically a mylar ribbon in a number of layers.
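
    A minimal sketch of the life model described above: an inverse power law for the Weibull characteristic life as a function of voltage stress (exponent 6, as reported), combined with a two-parameter Weibull reliability function. The scale constant and shape parameter are hypothetical placeholders, not the fitted values:

      import numpy as np

      def characteristic_life(voltage, k, n=6.0):
          # Inverse power law: characteristic life drops as voltage stress rises.
          return k / voltage ** n

      def weibull_reliability(t, eta, beta):
          # Two-parameter Weibull reliability (survival probability) at time t.
          return np.exp(-(t / eta) ** beta)

      k, beta = 1.0e12, 1.2          # hypothetical scale constant and shape
      for v in (20.0, 30.0, 40.0):   # voltage stress levels (arbitrary units)
          eta = characteristic_life(v, k)
          print(v, round(eta, 1), round(weibull_reliability(1000.0, eta, beta), 4))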

  6. On-chip high frequency reliability and failure test structures

    DOEpatents

    Snyder, E.S.; Campbell, D.V.

    1997-04-29

    Self-stressing test structures for realistic high frequency reliability characterizations. An on-chip high frequency oscillator, controlled by DC signals from off-chip, provides a range of high frequency pulses to test structures. The test structures provide information with regard to a variety of reliability failure mechanisms, including hot-carriers, electromigration, and oxide breakdown. The system is normally integrated at the wafer level to predict the failure mechanisms of the production integrated circuits on the same wafer. 22 figs.

  7. Structural system reliability calculation using a probabilistic fault tree analysis method

    NASA Technical Reports Server (NTRS)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
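
    The flavour of the bottom-event/fault-tree formulation can be conveyed with a plain Monte Carlo estimate of system failure probability (the paper itself uses approximation functions and adaptive importance sampling, which are not reproduced here). The gate logic and limit-state margins below are hypothetical:

      import numpy as np

      rng = np.random.default_rng(0)

      def system_fails(margins):
          # Hypothetical fault tree: failure if bottom events 1 AND 2 occur,
          # OR bottom event 3 occurs (an event occurs when its margin < 0).
          e = margins < 0.0
          return (e[:, 0] & e[:, 1]) | e[:, 2]

      # Hypothetical normally distributed safety margins for the bottom events.
      means, stds = np.array([3.0, 2.5, 4.0]), np.array([1.0, 1.0, 1.0])
      samples = rng.normal(means, stds, size=(200_000, 3))
      print(f"estimated system failure probability: {system_fails(samples).mean():.2e}")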

  8. Versatile Micromechanics Model for Multiscale Analysis of Composite Structures

    NASA Astrophysics Data System (ADS)

    Kwon, Y. W.; Park, M. S.

    2013-08-01

    A general-purpose micromechanics model was developed so that the model could be applied to various composite materials, such as those reinforced by particles, long fibers and short fibers, as well as those containing micro voids. Additionally, the model can be used with hierarchical composite materials. The micromechanics model can be used to compute effective material properties such as elastic moduli, shear moduli, Poisson's ratios, and coefficients of thermal expansion for the various composite materials. The model can also calculate the strains and stresses at the constituent material level (fibers, particles, and whiskers) from the composite-level stresses and strains. The model was implemented into ABAQUS using the UMAT option for multiscale analysis. An extensive set of examples is presented to demonstrate the reliability and accuracy of the developed micromechanics model for different kinds of composite materials. Another set of examples is provided to study the multiscale analysis of composite structures.

  9. Time-dependent reliability analysis of ceramic engine components

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    1993-01-01

    The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
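
    To make the Weibull/PIA machinery concrete, the sketch below evaluates a fast-fracture failure probability for element-wise principal stresses under the principle of independent action with a two-parameter Weibull strength distribution. It is a simplified volume-flaw version with hypothetical material parameters, not the CARES/LIFE implementation (no subcritical crack growth, Batdorf theory, or parameter estimation):

      import numpy as np

      def pia_failure_probability(principal_stresses, volumes, sigma_0, m):
          # Two-parameter Weibull, volume flaws, principle of independent action:
          # only tensile principal stresses contribute to the risk of rupture.
          s = np.clip(np.asarray(principal_stresses, float), 0.0, None)
          risk = np.asarray(volumes, float) * ((s / sigma_0) ** m).sum(axis=1)
          return 1.0 - np.exp(-risk.sum())

      # Hypothetical two-element model (stresses in MPa, volumes in m^3):
      print(pia_failure_probability([[120.0, 40.0, -10.0],
                                     [200.0, 10.0, 0.0]],
                                    volumes=[1.0e-6, 2.0e-6],
                                    sigma_0=350.0, m=10.0))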

  10. Depressive symptoms following natural disaster in Korea: psychometric properties of the Center for Epidemiologic Studies Depression Scale.

    PubMed

    Cho, Sungkun; Cho, Yongrae

    2017-11-28

    Depressive symptoms have been recognized as one of the most frequent complaints among natural disaster survivors. One of the most frequently used self-report measures of depressive symptoms is the Center for Epidemiologic Studies Depression Scale (CES-D). To our knowledge, no study has yet examined the factor structure, reliability, and validity of the CES-D in a sample of natural disaster survivors. Thus, the present study investigated the factor structure, reliability, and validity of a Korean language version of the CES-D (KCES-D) for natural disaster survivors. We utilized two archived datasets collected independently for two different periods in 2008 in the same region of Korea (n = 192 for sample 1; n = 148 for sample 2). Participants were survivors of torrential rains in the mid-eastern region of the Korean peninsula. For analysis, Samples 1 and 2 were merged (N = 340). Confirmatory factor analysis was performed to evaluate the one-factor model, the four-factor model, and the bi-factor models, as well as the second-order factor model. Composite reliability was computed to examine the internal consistency of the KCES-D total and subscale scores. Finally, Pearson's r was computed to examine the relationship between the KCES-D and the trauma-related measures. The four-factor model provided the best fit to the data among the alternatives. The KCES-D showed adequate internal consistency, except for the 'interpersonal difficulties' subscale. Also regarding concurrent validity, weak to moderate positive correlations were observed between the KCES-D and the trauma-related measures. The results support the four-factor model and indicate that the KCES-D has adequate psychometric properties for natural disaster survivors. If these findings are further confirmed, the KCES-D can be used as a useful, rapid, and inexpensive screening tool for assessing depressive symptoms in natural disaster survivors.

  11. The Positive and Negative Syndrome Scale (PANSS): A Three-Factor Model of Psychopathology in Marginally Housed Persons with Substance Dependence and Psychiatric Illness.

    PubMed

    Giesbrecht, Chantelle J; O'Rourke, Norm; Leonova, Olga; Strehlau, Verena; Paquet, Karine; Vila-Rodriguez, Fidel; Panenka, William J; MacEwan, G William; Smith, Geoffrey N; Thornton, Allen E; Honer, William G

    2016-01-01

    Rates of psychopathology are elevated in marginalized and unstably housed persons, underscoring the need for applicable clinical measures for these populations. The Positive and Negative Syndrome Scale (PANSS) is a clinical instrument principally developed for use in schizophrenia to identify the presence and severity of psychopathology symptoms. The current study investigates whether a reliable and valid PANSS factor structure emerges in a marginally housed, heterogeneous sample recruited from the Downtown Eastside of Vancouver where substance use disorders and psychiatric illness are pervasive. Participants (n = 270) underwent structured clinical assessments including the PANSS and then were randomly assigned to either exploratory (EFA) or confirmatory factor analytic (CFA) subsamples. EFA pointed to a novel three factor PANSS. This solution was supported by CFA. All retained items (28 out of 30) load significantly upon hypothesized factors and model goodness of fit analyses are in the acceptable to good range. Each of the three first-order factor constructs, labeled Psychosis/Disorganized, Negative Symptoms/Hostility, and Insight/Awareness, contributed significantly to measurement of a higher-order psychopathology construct. Further, the latent structure of this 3-factor solution appears temporally consistent over one-year. This PANSS factor structure appears valid and reliable for use in persons with multimorbidity, including substance use disorders. The structure is somewhat distinct from existing solutions likely due to the unique characteristics of this marginally housed sample.

  12. The Positive and Negative Syndrome Scale (PANSS): A Three-Factor Model of Psychopathology in Marginally Housed Persons with Substance Dependence and Psychiatric Illness

    PubMed Central

    Giesbrecht, Chantelle J.; O’Rourke, Norm; Leonova, Olga; Strehlau, Verena; Paquet, Karine; Vila-Rodriguez, Fidel; Panenka, William J.; MacEwan, G. William; Smith, Geoffrey N.; Thornton, Allen E.; Honer, William G.

    2016-01-01

    Rates of psychopathology are elevated in marginalized and unstably housed persons, underscoring the need for applicable clinical measures for these populations. The Positive and Negative Syndrome Scale (PANSS) is a clinical instrument principally developed for use in schizophrenia to identify the presence and severity of psychopathology symptoms. The current study investigates whether a reliable and valid PANSS factor structure emerges in a marginally housed, heterogeneous sample recruited from the Downtown Eastside of Vancouver where substance use disorders and psychiatric illness are pervasive. Participants (n = 270) underwent structured clinical assessments including the PANSS and then were randomly assigned to either exploratory (EFA) or confirmatory factor analytic (CFA) subsamples. EFA pointed to a novel three factor PANSS. This solution was supported by CFA. All retained items (28 out of 30) load significantly upon hypothesized factors and model goodness of fit analyses are in the acceptable to good range. Each of the three first-order factor constructs, labeled Psychosis/Disorganized, Negative Symptoms/Hostility, and Insight/Awareness, contributed significantly to measurement of a higher-order psychopathology construct. Further, the latent structure of this 3-factor solution appears temporally consistent over one-year. This PANSS factor structure appears valid and reliable for use in persons with multimorbidity, including substance use disorders. The structure is somewhat distinct from existing solutions likely due to the unique characteristics of this marginally housed sample. PMID:26999280

  13. Integration of system identification and finite element modelling of nonlinear vibrating structures

    NASA Astrophysics Data System (ADS)

    Cooper, Samson B.; DiMaio, Dario; Ewins, David J.

    2018-03-01

    The Finite Element Method (FEM), experimental modal analysis (EMA) and other linear analysis techniques have been established as reliable tools for the dynamic analysis of engineering structures. They are often used to provide solutions for small and large structures and a variety of other cases in structural dynamics, even those exhibiting a certain degree of nonlinearity. Unfortunately, when the nonlinear effects are substantial or the accuracy of the predicted response is of vital importance, a linear finite element model will generally prove to be unsatisfactory. As a result, the validated linear FE model requires further enhancement so that it can represent and predict the nonlinear behaviour exhibited by the structure. In this paper, a pragmatic approach to integrating test-based system identification and FE modelling of a nonlinear structure is presented. This integration is based on three different phases: the first phase involves the derivation of an Underlying Linear Model (ULM) of the structure, the second phase includes experiment-based nonlinear identification using measured time series, and the third phase covers augmenting the linear FE model and experimental validation of the nonlinear FE model. The proposed case study is demonstrated on a twin cantilever beam assembly coupled with a flexible arch-shaped beam. In this case, polynomial-type nonlinearities are identified and validated with force-controlled stepped-sine test data at several excitation levels.

  14. Delamination Modeling of Composites for Improved Crash Analysis

    NASA Technical Reports Server (NTRS)

    Fleming, David C.

    1999-01-01

    Finite element crash modeling of composite structures is limited by the inability of current commercial crash codes to accurately model delamination growth. Efforts are made to implement and assess delamination modeling techniques using a current finite element crash code, MSC/DYTRAN. Three methods are evaluated, including a straightforward method based on monitoring forces in elements or constraints representing an interface; a cohesive fracture model proposed in the literature; and the virtual crack closure technique commonly used in fracture mechanics. Results are compared with dynamic double cantilever beam test data from the literature. Examples show that it is possible to accurately model delamination propagation in this case. However, the computational demands required for an accurate solution are great, and reliable property data may not be available to support general crash modeling efforts. Additional examples are modeled, including an impact-loaded beam, damage initiation in laminated crushing specimens, and a scaled aircraft subfloor structure in which composite sandwich structures are used as energy-absorbing elements. These examples illustrate some of the difficulties in modeling delamination as part of a finite element crash analysis.

  15. Global remote sensing of water-chlorophyll ratio in terrestrial plant leaves.

    PubMed

    Kushida, Keiji

    2012-10-01

    I evaluated the use of global remote sensing techniques for estimating plant leaf chlorophyll a + b (C(ab); μg cm(-2)) and water (C(w); mg cm(-2)) concentrations as well as the ratio of C(w)/C(ab) with the PROSAIL model under possible distributions for leaf and soil spectra, leaf area index (LAI), canopy geometric structure, and leaf size. First, I estimated LAI from the normalized difference vegetation index. I found that, at LAI values <2, C(ab), C(w), and C(w)/C(ab) could not be reliably estimated. At LAI values >2, C(ab) and C(w) could be estimated for only restricted ranges of the canopy structure; however, the ratio of C(w)/C(ab) could be reliably estimated for a variety of possible canopy structures with coefficients of determination (R(2)) ranging from 0.56 to 0.90. The remote estimation of the C(w)/C(ab) ratio from satellites offers information on plant condition at a global scale.
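
    The first step described above, computing the normalized difference vegetation index from which LAI is estimated, is a one-line formula; the reflectance values below are hypothetical, and the PROSAIL-based NDVI-to-LAI inversion itself is not reproduced:

      import numpy as np

      def ndvi(nir, red):
          # Normalized difference vegetation index from NIR and red reflectances.
          nir, red = np.asarray(nir, float), np.asarray(red, float)
          return (nir - red) / (nir + red)

      # Hypothetical dense-canopy and sparse-canopy pixels:
      print(ndvi([0.45, 0.30], [0.05, 0.15]))   # ~0.80 and ~0.33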

  16. The Reliability and Validity of the Power-Load-Margin Inventory: A Rasch Analysis.

    PubMed

    Hardigan, Patrick C; Cohen, Stanley R; Hagen, Kathleen P

    2015-01-01

    Margin is a function of the relationship of stress to strength. The greater the margin, the more likely students are able to successfully navigate academic structures. This study examined the psychometric properties of a newly created instrument designed to measure margin - the Power-Load-Margin Inventory (PLMI). The PLMI was created using eight domains: (A) Student's aptitude and ability, (B) Course structure, (C) External motivation, (D) Student health, (E) Instructor style, (F) Internal motivation, (G) Life opportunities, and (H) University support structure. A three-point response scale was used to measure the domains: (1) stress, (2) neither stress nor strength, and (3) strength. The PLMI was administered to 586 medical, dental, and pharmacy students. A Rasch rating scale model was used to examine the psychometric properties of the PLMI. The PLMI demonstrated acceptable psychometric properties for use with pharmacy, dental, and medical students. The PLMI's primary weakness was with the subscales' reliability. We attribute this to the small number of items per subscale.

  17. An experimental investigation of fault tolerant software structures in an avionics application

    NASA Technical Reports Server (NTRS)

    Caglayan, Alper K.; Eckhardt, Dave E., Jr.

    1989-01-01

    The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.

  18. Large-scale structure prediction by improved contact predictions and model quality assessment.

    PubMed

    Michel, Mirco; Menéndez Hurtado, David; Uziela, Karolis; Elofsson, Arne

    2017-07-15

    Accurate contact predictions can be used for predicting the structure of proteins. Until recently these methods were limited to very big protein families, decreasing their utility. However, recent progress by combining direct coupling analysis with machine learning methods has made it possible to predict accurate contact maps for smaller families. To what extent these predictions can be used to produce accurate models of the families is not known. We present the PconsFold2 pipeline that uses contact predictions from PconsC3, the CONFOLD folding algorithm and model quality estimations to predict the structure of a protein. We show that the model quality estimation significantly increases the number of models that reliably can be identified. Finally, we apply PconsFold2 to 6379 Pfam families of unknown structure and find that PconsFold2 can, with an estimated 90% specificity, predict the structure of up to 558 Pfam families of unknown structure. Out of these, 415 have not been reported before. Datasets as well as models of all the 558 Pfam families are available at http://c3.pcons.net/ . All programs used here are freely available. arne@bioinfo.se. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  19. Road safety forecasts in five European countries using structural time series models.

    PubMed

    Antoniou, Constantinos; Papadimitriou, Eleonora; Yannis, George

    2014-01-01

    Modeling road safety development is a complex task and needs to consider both the quantifiable impact of specific parameters as well as the underlying trends that cannot always be measured or observed. The objective of this research is to apply structural time series models for obtaining reliable medium- to long-term forecasts of road traffic fatality risk using data from 5 countries with different characteristics from all over Europe (Cyprus, Greece, Hungary, Norway, and Switzerland). Two structural time series models are considered: (1) the local linear trend model and the (2) latent risk time series model. Furthermore, a structured decision tree for the selection of the applicable model for each situation (developed within the Road Safety Data, Collection, Transfer and Analysis [DaCoTA] research project, cofunded by the European Commission) is outlined. First, the fatality and exposure data that are used for the development of the models are presented and explored. Then, the modeling process is presented, including the model selection process, introduction of intervention variables, and development of mobility scenarios. The forecasts using the developed models appear to be realistic and within acceptable confidence intervals. The proposed methodology is proved to be very efficient for handling different cases of data availability and quality, providing an appropriate alternative from the family of structural time series models in each country. A concluding section providing perspectives and directions for future research is presented.
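
    A minimal sketch of the first of the two models mentioned, the local linear trend model, fitted with statsmodels' UnobservedComponents on a hypothetical fatality series (the latent risk model, intervention variables, and exposure data are not reproduced):

      import numpy as np
      import statsmodels.api as sm

      # Hypothetical annual fatality counts (the study models fatality risk
      # with exposure data; here we simply forecast log-fatalities).
      fatalities = np.array([1550, 1490, 1420, 1380, 1300, 1265, 1190, 1120,
                             1055, 990, 940, 880, 830, 790, 750, 705], float)

      # Local linear trend: stochastic level plus stochastic slope.
      model = sm.tsa.UnobservedComponents(np.log(fatalities),
                                          level="local linear trend")
      result = model.fit(disp=False)
      print(np.exp(result.forecast(steps=5)).round(0))   # medium-term forecast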

  20. Protein-Protein Interface Predictions by Data-Driven Methods: A Review

    PubMed Central

    Xue, Li C; Dobbs, Drena; Bonvin, Alexandre M.J.J.; Honavar, Vasant

    2015-01-01

    Reliably pinpointing which specific amino acid residues form the interface(s) between a protein and its binding partner(s) is critical for understanding the structural and physicochemical determinants of protein recognition and binding affinity, and has wide applications in modeling and validating protein interactions predicted by high-throughput methods, in engineering proteins, and in prioritizing drug targets. Here, we review the basic concepts, principles and recent advances in computational approaches to the analysis and prediction of protein-protein interfaces. We point out caveats for objectively evaluating interface predictors, and discuss various applications of data-driven interface predictors for improving energy model-driven protein-protein docking. Finally, we stress the importance of exploiting binding partner information in reliably predicting interfaces and highlight recent advances in this emerging direction. PMID:26460190

  1. CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

    NASA Technical Reports Server (NTRS)

    Nemeth, N. N.

    1994-01-01

    The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental results. For comparison, Griffith's maximum tensile stress theory, the principle of independent action, and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. A more limited program, CARES/PC (COSMIC number LEW-15248) runs on a personal computer and estimates ceramic material properties from three-point bend bar data. CARES/PC does not perform fast fracture reliability estimation. CARES is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and on IBM 370 series computers under VM/CMS. On a VAX, CARES requires 10Mb of main memory. Five MSC/NASTRAN example problems and two ANSYS example problems are provided. There are two versions of CARES supplied on the distribution tape, CARES1 and CARES2. CARES2 contains sub-elements and CARES1 does not. CARES is available on a 9-track 1600 BPI VAX FILES-11 format magnetic tape (standard media) or in VAX BACKUP format on a TK50 tape cartridge. The program requires a FORTRAN 77 compiler and about 12Mb memory. CARES was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. IBM 370 is a trademark of International Business Machines. 
MSC/NASTRAN is a trademark of MacNeal-Schwendler Corporation. ANSYS is a trademark of Swanson Analysis Systems, Inc.
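
    The Weibull and Batdorf calculations described above reduce, in their simplest volume-flaw form, to summing a risk-of-rupture term over the finite elements. The sketch below is a minimal, hypothetical illustration of that two-parameter Weibull, principle-of-independent-action style calculation; the Weibull modulus, scale parameter, and element stress/volume values are invented for illustration and are not taken from CARES or its example problems.

    ```python
    import numpy as np

    # Hypothetical Weibull material parameters (not from the CARES distribution):
    m = 10.0          # Weibull modulus (unitless)
    sigma_0 = 400.0   # Weibull scale parameter for unit volume, MPa

    # Hypothetical finite element output: max principal stress (MPa) and volume (mm^3)
    elem_stress = np.array([250.0, 310.0, 180.0, 90.0])
    elem_volume = np.array([12.0, 8.5, 20.0, 15.0])

    # Two-parameter Weibull, volume-flaw, principle-of-independent-action style:
    # only tensile stresses contribute to the risk of rupture.
    tensile = np.clip(elem_stress, 0.0, None)
    risk_of_rupture = elem_volume * (tensile / sigma_0) ** m

    # Component survival is the product of element survival probabilities.
    p_survival = np.exp(-risk_of_rupture.sum())
    p_failure = 1.0 - p_survival
    print(f"Fast-fracture failure probability: {p_failure:.3e}")
    ```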

  2. Rapid and reliable protein structure determination via chemical shift threading.

    PubMed

    Hafsa, Noor E; Berjanskii, Mark V; Arndt, David; Wishart, David S

    2018-01-01

    Protein structure determination using nuclear magnetic resonance (NMR) spectroscopy can be both time-consuming and labor intensive. Here we demonstrate how chemical shift threading can permit rapid, robust, and accurate protein structure determination using only chemical shift data. Threading is a relatively old bioinformatics technique that uses a combination of sequence information and predicted (or experimentally acquired) low-resolution structural data to generate high-resolution 3D protein structures. The key motivations behind using NMR chemical shifts for protein threading lie in the fact that they are easy to measure, they are available prior to 3D structure determination, and they contain vital structural information. The method we have developed uses not only sequence and chemical shift similarity but also chemical shift-derived secondary structure, shift-derived super-secondary structure, and shift-derived accessible surface area to generate a high quality protein structure regardless of the sequence similarity (or lack thereof) to a known structure already in the PDB. The method (called E-Thrifty) was found to be very fast (often < 10 min/structure) and to significantly outperform other shift-based or threading-based structure determination methods in terms of top template model accuracy, with an average TM-score of 0.68 (vs. 0.50-0.62 for other methods). Coupled with recent developments in chemical shift refinement, these results suggest that protein structure determination, using only NMR chemical shifts, is becoming increasingly practical and reliable. E-Thrifty is available as a web server at http://ethrifty.ca.
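
    The TM-score quoted above is a standard template-quality metric rather than anything specific to E-Thrifty. As a rough illustration of how that score is computed for a given alignment (the real metric is maximized over superpositions), here is a small sketch with invented alignment distances and an invented protein length:

    ```python
    import numpy as np

    def tm_score(distances_angstrom: np.ndarray, target_length: int) -> float:
        """TM-score contribution of a set of aligned residue-pair distances.

        distances_angstrom: distances between aligned C-alpha pairs after superposition.
        target_length: length of the target (reference) protein, > 15 residues.
        """
        # Length-dependent distance scale d0 (Zhang & Skolnick definition).
        d0 = 1.24 * (target_length - 15) ** (1.0 / 3.0) - 1.8
        per_residue = 1.0 / (1.0 + (distances_angstrom / d0) ** 2)
        return per_residue.sum() / target_length

    # Hypothetical alignment: 95 of 120 target residues aligned with these distances.
    rng = np.random.default_rng(0)
    d = rng.uniform(0.5, 5.0, size=95)
    print(f"TM-score ~ {tm_score(d, 120):.2f}")
    ```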

  3. Development, validity and reliability of the short multidimensional positive mental health instrument.

    PubMed

    Vaingankar, Janhavi Ajit; Subramaniam, Mythily; Abdin, Edimansyah; Picco, Louisa; Chua, Boon Yiang; Eng, Goi Khia; Sambasivam, Rajeswari; Shafie, Saleha; Zhang, Yunjue; Chong, Siow Ann

    2014-06-01

    The 47-item positive mental health (PMH) instrument measures the level of PMH in multiethnic adult Asian populations. This study aimed to (1) develop a short PMH instrument and (2) establish its validity and reliability among the adult Singapore population. Two separate studies were conducted among adult community-dwelling Singapore residents of Chinese, Malay or Indian ethnicity where participants completed self-administered questionnaires. In the first study, secondary data analysis was conducted using confirmatory factor analysis (CFA) to shorten the PMH instrument. In the second study, the newly developed short PMH instrument and other scales were administered to 201 residents to establish its factor structure, validity and reliability. A 20-item short PMH instrument fulfilling a higher-order six-factor structure was developed following secondary analysis. The mean age of the participants in the second study was 41 years and about 53% were women. One item with poor factor loading was further removed to generate a 19-item version of the PMH instrument. CFA demonstrated a first-order six-factor model of the short PMH instrument. The PMH-19 instrument and its subscales fulfilled criterion validity hypotheses. Internal consistency and test-retest reliability of the PMH-19 instrument were high (Cronbach's α coefficient = 0.87; intraclass correlation coefficient = 0.93, respectively). The 19-item PMH instrument is multidimensional, valid and reliable, and most importantly, with its reduced administration time, the short PMH instrument can be used to measure and evaluate PMH in Asian communities.
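
    For readers unfamiliar with the reported statistics, Cronbach's alpha for a summed scale can be computed directly from an item-response matrix. The sketch below uses simulated responses (201 hypothetical respondents, 19 items) purely to illustrate the formula; it does not reproduce the study data or its value of 0.87.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

    # Hypothetical responses: 201 respondents x 19 items on a 1-6 scale.
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(201, 1))
    scores = np.clip(np.rint(3.5 + latent + rng.normal(scale=0.8, size=(201, 19))), 1, 6)
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```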

  4. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  5. Best Practices for Reliable and Robust Spacecraft Structures

    NASA Technical Reports Server (NTRS)

    Raju, Ivatury S.; Murthy, P. L. N.; Patel, Naresh R.; Bonacuse, Peter J.; Elliott, Kenny B.; Gordon, S. A.; Gyekenyesi, J. P.; Daso, E. O.; Aggarwal, P.; Tillman, R. F.

    2007-01-01

    A study was undertaken to capture the best practices for the development of reliable and robust spacecraft structures for NASA's next generation cargo and crewed launch vehicles. In this study, the NASA heritage programs such as Mercury, Gemini, Apollo, and the Space Shuttle program were examined. A series of lessons learned during the NASA and DoD heritage programs are captured. The processes that "make the right structural system" are examined along with the processes to "make the structural system right". The impact of technology advancements in materials and analysis and testing methods on reliability and robustness of spacecraft structures is studied. The best practices and lessons learned are extracted from these studies. Since the first human space flight, the best practices for reliable and robust spacecraft structures appear to be well established, understood, and articulated by each generation of designers and engineers. However, these best practices apparently have not always been followed. When the best practices are ignored or shortcuts are taken, risks accumulate, and reliability suffers. Thus, program managers need to be vigilant of circumstances and situations that tend to violate best practices. Adherence to the best practices may help develop spacecraft systems with high reliability and robustness against certain anomalies and unforeseen events.

  6. Development and Validation of Two Scales to Measure Elaboration and Behaviors Associated with Stewardship in Children

    ERIC Educational Resources Information Center

    Vezeau, Susan Lynn; Powell, Robert B.; Stern, Marc J.; Moore, D. DeWayne; Wright, Brett A.

    2017-01-01

    This investigation examines the development of two scales that measure elaboration and behaviors associated with stewardship in children. The scales were developed using confirmatory factor analysis to investigate their construct validity, reliability, and psychometric properties. Results suggest that a second-order factor model structure provides…

  7. High Power Microwave Tube Reliability Study

    DTIC Science & Technology

    1976-08-01

    Environmental factors considered include ground fixed and ground mobile installations. The reliability models include not only tube structure and operating parameters as factors but also environment and application. Initially, the tubes to be included in ... installations. Mobile ground-based and seagoing systems have minimum restrictions, spacecraft systems the maximum, and airborne systems ... restrictions.

  8. A support architecture for reliable distributed computing systems

    NASA Technical Reports Server (NTRS)

    Dasgupta, Partha; Leblanc, Richard J., Jr.

    1988-01-01

    The Clouds project is well underway toward its goal of building a unified distributed operating system supporting the object model. The operating system design uses the object concept to structure software at all levels of the system. The basic operating system has been developed, and work is in progress to build a usable system.

  9. Health Sciences-Evidence Based Practice questionnaire (HS-EBP) for measuring transprofessional evidence-based practice: Creation, development and psychometric validation

    PubMed Central

    Fernández-Domínguez, Juan Carlos; de Pedro-Gómez, Joan Ernest; Morales-Asencio, José Miguel; Sastre-Fullana, Pedro; Sesé-Abad, Albert

    2017-01-01

    Introduction: Most of the EBP measuring instruments available to date present limitations both in the operationalisation of the construct and also in the rigour of their psychometric development, as revealed in the literature review performed. The aim of this paper is to provide rigorous and adequate reliability and validity evidence of the scores of a new transdisciplinary psychometric tool, the Health Sciences Evidence-Based Practice (HS-EBP), for measuring the construct EBP in Health Sciences professionals. Methods: A pilot study and a subsequent two-stage validation test sample were conducted to progressively refine the instrument until a reduced 60-item version with a five-factor latent structure was reached. Reliability was analysed through both Cronbach's alpha coefficient and intraclass correlations (ICC). Latent structure was contrasted using confirmatory factor analysis (CFA) following a model comparison approach. Evidence of criterion validity of the scores obtained was achieved by considering attitudinal resistance to change, burnout, and quality of professional life as criterion variables; while convergent validity was assessed using the Spanish version of the Evidence-Based Practice Questionnaire (EBPQ-19). Results: Adequate evidence of both reliability and ICC was obtained for the five dimensions of the questionnaire. According to the CFA model comparison, the best fit corresponded to the five-factor model (RMSEA = 0.049; CI 90% RMSEA = [0.047; 0.050]; CFI = 0.99). Adequate criterion and convergent validity evidence was also provided. Finally, the HS-EBP showed the capability to find differences between EBP training levels as important evidence of decision validity. Conclusions: Reliability and validity evidence obtained regarding the HS-EBP confirm the adequate operationalisation of the EBP construct as a process put into practice to respond to every clinical situation arising in the daily practice of professionals in health sciences (transprofessional). The tool could be useful for EBP individual assessment and for evaluating the impact of specific interventions to improve EBP. PMID:28486533
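
    The RMSEA and CFI values reported above come from the CFA software, but both indices follow simple closed forms based on the model and baseline chi-square statistics. The sketch below shows those formulas with invented chi-square, degrees-of-freedom, and sample-size numbers; they are not the study's actual statistics.

    ```python
    import math

    def rmsea(chi2: float, df: int, n: int) -> float:
        """Root mean square error of approximation from a model chi-square."""
        return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

    def cfi(chi2_m: float, df_m: int, chi2_b: float, df_b: int) -> float:
        """Comparative fit index relative to the baseline (independence) model."""
        d_m = max(chi2_m - df_m, 0.0)
        d_b = max(chi2_b - df_b, d_m)
        return 1.0 - d_m / d_b

    # Hypothetical fit statistics for a five-factor model (not the paper's values):
    print(f"RMSEA = {rmsea(chi2=3950.0, df=1690, n=575):.3f}")
    print(f"CFI   = {cfi(chi2_m=3950.0, df_m=1690, chi2_b=45000.0, df_b=1770):.2f}")
    ```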

  10. Health Sciences-Evidence Based Practice questionnaire (HS-EBP) for measuring transprofessional evidence-based practice: Creation, development and psychometric validation.

    PubMed

    Fernández-Domínguez, Juan Carlos; de Pedro-Gómez, Joan Ernest; Morales-Asencio, José Miguel; Bennasar-Veny, Miquel; Sastre-Fullana, Pedro; Sesé-Abad, Albert

    2017-01-01

    Most of the EBP measuring instruments available to date present limitations both in the operationalisation of the construct and also in the rigour of their psychometric development, as revealed in the literature review performed. The aim of this paper is to provide rigorous and adequate reliability and validity evidence of the scores of a new transdisciplinary psychometric tool, the Health Sciences Evidence-Based Practice (HS-EBP), for measuring the construct EBP in Health Sciences professionals. A pilot study and a subsequent two-stage validation test sample were conducted to progressively refine the instrument until a reduced 60-item version with a five-factor latent structure was reached. Reliability was analysed through both Cronbach's alpha coefficient and intraclass correlations (ICC). Latent structure was contrasted using confirmatory factor analysis (CFA) following a model comparison approach. Evidence of criterion validity of the scores obtained was achieved by considering attitudinal resistance to change, burnout, and quality of professional life as criterion variables; while convergent validity was assessed using the Spanish version of the Evidence-Based Practice Questionnaire (EBPQ-19). Adequate evidence of both reliability and ICC was obtained for the five dimensions of the questionnaire. According to the CFA model comparison, the best fit corresponded to the five-factor model (RMSEA = 0.049; CI 90% RMSEA = [0.047; 0.050]; CFI = 0.99). Adequate criterion and convergent validity evidence was also provided. Finally, the HS-EBP showed the capability to find differences between EBP training levels as important evidence of decision validity. Reliability and validity evidence obtained regarding the HS-EBP confirm the adequate operationalisation of the EBP construct as a process put into practice to respond to every clinical situation arising in the daily practice of professionals in health sciences (transprofessional). The tool could be useful for EBP individual assessment and for evaluating the impact of specific interventions to improve EBP.

  11. Reliability models: the influence of model specification in generation expansion planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stremel, J.P.

    1982-10-01

    This paper is a critical evaluation of reliability methods used for generation expansion planning. It is shown that the methods for treating uncertainty are critical for determining the relative reliability value of expansion alternatives. It is also shown that the specification of the reliability model will not favor all expansion options equally. Consequently, the model is biased. In addition, reliability models should be augmented with an economic value of reliability (such as the cost of emergency procedures or energy not served). Generation expansion evaluations which ignore the economic value of excess reliability can be shown to be inconsistent. The conclusions are that, in general, a reliability model simplifies generation expansion planning evaluations. However, for a thorough analysis, the expansion options should be reviewed for candidates which may be unduly rejected because of the bias of the reliability model. And this implies that for a consistent formulation in an optimization framework, the reliability model should be replaced with a full economic optimization which includes the costs of emergency procedures and interruptions in the objective function.
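
    As a minimal sketch of attaching an economic value to reliability in the way the paper argues for, the snippet below enumerates the capacity-outage states of a handful of hypothetical units, computes expected energy not served against a stepwise load-duration curve, and prices it with a value of lost load. All unit data, loads, and costs are invented; a real expansion study would use far more detailed models.

    ```python
    from itertools import product

    # Hypothetical expansion candidate: units as (capacity MW, forced outage rate).
    units = [(200, 0.05), (200, 0.05), (150, 0.08), (100, 0.10)]

    # Stepwise load-duration curve: (load level MW, hours per year at that level).
    load_blocks = [(550, 500), (450, 2500), (350, 5760)]

    voll = 4000.0  # hypothetical value of lost load, $/MWh
    eens = 0.0     # expected energy not served, MWh/yr

    # Exact enumeration of unit availability states (fine for a handful of units).
    for state in product([0, 1], repeat=len(units)):   # 1 = unit available
        prob = 1.0
        capacity = 0.0
        for available, (cap, q) in zip(state, units):
            prob *= (1.0 - q) if available else q
            capacity += cap if available else 0.0
        for load, hours in load_blocks:
            eens += prob * hours * max(load - capacity, 0.0)

    print(f"EENS = {eens:.0f} MWh/yr, reliability cost = ${eens * voll:,.0f}/yr")
    ```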

  12. Experimental Protocol to Determine the Chloride Threshold Value for Corrosion in Samples Taken from Reinforced Concrete Structures

    PubMed Central

    Angst, Ueli M.; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard

    2017-01-01

    The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. This method can potentially be used in material testing of civil infrastructures, similar to established methods used for mechanical testing. PMID:28892023
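
    The end product of the protocol is a statistical distribution of Ccrit for the tested structure, which can then feed a probabilistic initiation model. A minimal sketch of that last step is given below, assuming a lognormal fit to a handful of invented Ccrit measurements and an invented chloride content at rebar depth; none of the values come from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical Ccrit measurements (% chloride by weight of cement) from one structure.
    ccrit_samples = np.array([0.42, 0.55, 0.61, 0.48, 0.73, 0.39, 0.66, 0.58, 0.51, 0.70])

    # Fit a lognormal distribution to the sample (location fixed at zero).
    shape, loc, scale = stats.lognorm.fit(ccrit_samples, floc=0.0)
    ccrit_dist = stats.lognorm(shape, loc=loc, scale=scale)

    # Probability of corrosion initiation for a given chloride content at rebar depth.
    chloride_at_rebar = 0.45
    p_initiation = ccrit_dist.cdf(chloride_at_rebar)
    print(f"P(corrosion initiation | C = {chloride_at_rebar}) = {p_initiation:.2f}")
    ```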

  13. Experimental Protocol to Determine the Chloride Threshold Value for Corrosion in Samples Taken from Reinforced Concrete Structures.

    PubMed

    Angst, Ueli M; Boschmann, Carolina; Wagner, Matthias; Elsener, Bernhard

    2017-08-31

    The aging of reinforced concrete infrastructure in developed countries imposes an urgent need for methods to reliably assess the condition of these structures. Corrosion of the embedded reinforcing steel is the most frequent cause for degradation. While it is well known that the ability of a structure to withstand corrosion depends strongly on factors such as the materials used or the age, it is common practice to rely on threshold values stipulated in standards or textbooks. These threshold values for corrosion initiation (Ccrit) are independent of the actual properties of a certain structure, which clearly limits the accuracy of condition assessments and service life predictions. The practice of using tabulated values can be traced to the lack of reliable methods to determine Ccrit on-site and in the laboratory. Here, an experimental protocol to determine Ccrit for individual engineering structures or structural members is presented. A number of reinforced concrete samples are taken from structures and laboratory corrosion testing is performed. The main advantage of this method is that it ensures real conditions concerning parameters that are well known to greatly influence Ccrit, such as the steel-concrete interface, which cannot be representatively mimicked in laboratory-produced samples. At the same time, the accelerated corrosion test in the laboratory permits the reliable determination of Ccrit prior to corrosion initiation on the tested structure; this is a major advantage over all common condition assessment methods that only permit estimating the conditions for corrosion after initiation, i.e., when the structure is already damaged. The protocol yields the statistical distribution of Ccrit for the tested structure. This serves as a basis for probabilistic prediction models for the remaining time to corrosion, which is needed for maintenance planning. This method can potentially be used in material testing of civil infrastructures, similar to established methods used for mechanical testing.

  14. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new models for the software component developed in the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.
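
    As a concrete example of the kind of classical model that tools such as SMERFS implement, the sketch below fits the Goel-Okumoto NHPP software reliability growth model, m(t) = a(1 - e^(-bt)), to invented cumulative-failure data by nonlinear least squares. The data, parameter values, and the choice of this particular model are illustrative assumptions, not SMERFS'3 output.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def goel_okumoto(t, a, b):
        """Expected cumulative failures at time t for the Goel-Okumoto NHPP model."""
        return a * (1.0 - np.exp(-b * t))

    # Hypothetical test data: cumulative failures observed at weekly intervals.
    weeks = np.arange(1, 13, dtype=float)
    cum_failures = np.array([5, 9, 14, 17, 20, 22, 24, 25, 26, 27, 27, 28], dtype=float)

    # Fit model parameters (a: total expected failures, b: failure detection rate).
    (a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=[30.0, 0.2])

    remaining = a_hat - goel_okumoto(weeks[-1], a_hat, b_hat)
    print(f"a = {a_hat:.1f}, b = {b_hat:.3f}, expected remaining failures = {remaining:.1f}")
    ```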

  15. Study of structural reliability of existing concrete structures

    NASA Astrophysics Data System (ADS)

    Druķis, P.; Gaile, L.; Valtere, K.; Pakrastiņš, L.; Goremikins, V.

    2017-10-01

    Structural reliability of buildings has become an important issue after the collapse of a shopping center in Riga on 21.11.2013, which caused the death of 54 people. The reliability of a building is the practice of designing, constructing, operating, maintaining and removing buildings in ways that maintain the health of their users and ward off injury or death due to use of the building. Evaluation and improvement of existing buildings is becoming more and more important. For a large part of existing buildings, the design life has been reached or will be reached in the near future. The structures of these buildings need to be reassessed in order to find out whether the safety requirements are met. The safety requirements provided by the Eurocodes are a starting point for the assessment of safety. However, it would be uneconomical to require all existing buildings and structures to comply fully with these new codes and corresponding safety levels; therefore, the assessment of existing buildings differs with each design situation. This case study describes a simple and practical procedure for determining the minimal reliability index β of existing concrete structures designed to codes other than the Eurocodes, and allows the actual reliability level of different structural elements of existing buildings under design load to be reassessed.
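
    For readers unfamiliar with the reliability index β used in the study, the simplest (Cornell, first-order second-moment) form for a linear limit state R - S with normally distributed resistance and load effect is sketched below; the distributions and numbers are hypothetical and are not taken from the Riga case study.

    ```python
    import math
    from scipy.stats import norm

    # Hypothetical resistance R and load effect S, both normal (units: kN).
    mu_R, sigma_R = 320.0, 32.0
    mu_S, sigma_S = 210.0, 42.0

    # Cornell (first-order second-moment) reliability index for the margin R - S.
    beta = (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)
    p_failure = norm.cdf(-beta)
    print(f"beta = {beta:.2f}, P_f = {p_failure:.2e}")
    ```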

  16. A damage mechanics based approach to structural deterioration and reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, B.; Ellingwood, B.

    1998-02-01

    Structural deterioration often occurs without perceptible manifestation. Continuum damage mechanics defines structural damage in terms of the material microstructure, and relates the damage variable to the macroscopic strength or stiffness of the structure. This enables one to predict the state of damage prior to the initiation of a macroscopic flaw, and allows one to estimate the residual strength/service life of an existing structure. The accumulation of damage is a dissipative process that is governed by the laws of thermodynamics. Partial differential equations for damage growth in terms of the Helmholtz free energy are derived from fundamental thermodynamical conditions. Closed-form solutions to the equations are obtained under uniaxial loading for ductile deformation damage as a function of plastic strain, for creep damage as a function of time, and for fatigue damage as a function of the number of cycles. The proposed damage growth model is extended into the stochastic domain by considering fluctuations in the free energy, and closed-form solutions of the resulting stochastic differential equation are obtained in each of the three cases mentioned above. A reliability analysis of a ring-stiffened cylindrical steel shell subjected to corrosion, accidental pressure, and temperature is performed.
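
    The closed-form creep-damage solution mentioned above is not reproduced in the abstract, but a Kachanov-type damage growth law gives a flavour of what such a solution looks like. The sketch below evaluates the closed-form solution of dD/dt = A * sigma^n / (1 - D)^n with invented material constants; it illustrates the general damage-mechanics approach rather than the paper's Helmholtz free-energy formulation.

    ```python
    import numpy as np

    # Kachanov-type creep damage: dD/dt = A * (sigma / (1 - D))**n, D(0) = 0.
    # Closed form: D(t) = 1 - (1 - t / t_R)**(1 / (n + 1)), with rupture time
    # t_R = 1 / ((n + 1) * A * sigma**n).
    A, n = 1.0e-18, 6.0        # hypothetical material constants
    sigma = 80.0               # applied stress, MPa

    t_R = 1.0 / ((n + 1) * A * sigma**n)
    t = np.linspace(0.0, 0.99 * t_R, 5)
    D = 1.0 - (1.0 - t / t_R) ** (1.0 / (n + 1))
    for ti, Di in zip(t, D):
        print(f"t = {ti:10.3e} h  D = {Di:.3f}")
    ```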

  17. Development and reliability of a structured interview guide for the Montgomery Asberg Depression Rating Scale (SIGMA).

    PubMed

    Williams, Janet B W; Kobak, Kenneth A

    2008-01-01

    The Montgomery-Asberg Depression Rating Scale (MADRS) is often used in clinical trials to select patients and to assess treatment efficacy. The scale was originally published without suggested questions for clinicians to use in gathering the information necessary to rate the items. Structured and semi-structured interview guides have been found to improve reliability with other scales. To describe the development and test-retest reliability of a structured interview guide for the MADRS (SIGMA). A total of 162 test-retest interviews were conducted by 81 rater pairs. Each patient was interviewed twice, once by each rater conducting an independent interview. The intraclass correlation for total score between raters using the SIGMA was r=0.93, P<0.0001. All ten items had good to excellent interrater reliability. Use of the SIGMA can result in high reliability of MADRS scores in evaluating patients with depression.
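
    The interrater statistic reported above is an intraclass correlation over rater pairs. As an illustration of how a two-way random-effects, single-rater ICC (Shrout and Fleiss ICC(2,1)) is computed from paired ratings, here is a small sketch with invented MADRS total scores; the abstract does not state which ICC variant was used, so that choice is an assumption.

    ```python
    import numpy as np

    def icc_2_1(scores: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement, single rater.

        scores: (n_subjects, k_raters) matrix of ratings.
        """
        n, k = scores.shape
        grand = scores.mean()
        row_means = scores.mean(axis=1, keepdims=True)
        col_means = scores.mean(axis=0, keepdims=True)

        msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between subjects
        msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between raters
        mse = ((scores - row_means - col_means + grand) ** 2).sum() / ((n - 1) * (k - 1))

        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Hypothetical MADRS total scores from two independent interviews per patient.
    rater_a = np.array([22, 31, 18, 27, 35, 14, 29, 24, 20, 33], dtype=float)
    rater_b = np.array([24, 30, 17, 29, 34, 16, 27, 25, 22, 31], dtype=float)
    print(f"ICC(2,1) = {icc_2_1(np.column_stack([rater_a, rater_b])):.2f}")
    ```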

  18. A New Calibration Method for Commercial RGB-D Sensors.

    PubMed

    Darwish, Walid; Tang, Shenjun; Li, Wenbin; Chen, Wu

    2017-05-24

    Commercial RGB-D sensors such as Kinect and Structure Sensors have been widely used in the game industry, where geometric fidelity is not of utmost importance. For applications in which high quality 3D is required, i.e., 3D building models of centimeter‑level accuracy, accurate and reliable calibrations of these sensors are required. This paper presents a new model for calibrating the depth measurements of RGB-D sensors based on the structured light concept. Additionally, a new automatic method is proposed for the calibration of all RGB-D parameters, including internal calibration parameters for all cameras, the baseline between the infrared and RGB cameras, and the depth error model. When compared with traditional calibration methods, this new model shows a significant improvement in depth precision for both near and far ranges.
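
    The paper's depth-error model is derived from the structured-light geometry; as a generic stand-in, the sketch below fits a simple distance-dependent polynomial correction to hypothetical raw-versus-reference depth pairs (structured-light depth error typically grows roughly with the square of distance, which motivates the quadratic term). Neither the numbers nor the functional form come from the paper.

    ```python
    import numpy as np

    # Hypothetical calibration data: sensor depth vs. reference depth, metres.
    raw_depth = np.array([0.8, 1.2, 1.6, 2.0, 2.5, 3.0, 3.5, 4.0])
    ref_depth = np.array([0.802, 1.205, 1.612, 2.021, 2.534, 3.048, 3.566, 4.089])

    # Fit a distance-dependent depth error model: error(d) = c2*d^2 + c1*d + c0.
    coeffs = np.polyfit(raw_depth, ref_depth - raw_depth, deg=2)

    def correct_depth(d):
        """Apply the fitted error model to a raw depth measurement."""
        return d + np.polyval(coeffs, d)

    print(f"corrected depth at 3.2 m raw: {correct_depth(3.2):.3f} m")
    ```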

  19. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  20. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault tolerant software reliability models and their related issues, (2) proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.
