Neurology objective structured clinical examination reliability using generalizability theory.
Blood, Angela D; Park, Yoon Soo; Lukas, Rimas V; Brorson, James R
2015-11-03
This study examines factors affecting reliability, or consistency of assessment scores, from an objective structured clinical examination (OSCE) in neurology through generalizability theory (G theory). Data include assessments from a multistation OSCE taken by 194 medical students at the completion of a neurology clerkship. Facets evaluated in this study include cases, domains, and items. Domains refer to areas of skill (or constructs) that the OSCE measures. G theory is used to estimate variance components associated with each facet, derive reliability, and project the number of cases required to obtain a reliable (consistent, precise) score. Reliability using G theory is moderate (Φ coefficient = 0.61, G coefficient = 0.64). Performance is similar across cases but differs by the particular domain, such that the majority of variance is attributed to the domain. Projections in reliability estimates reveal that students need to participate in 3 OSCE cases in order to increase reliability beyond the 0.70 threshold. This novel use of G theory in evaluating an OSCE in neurology provides meaningful measurement characteristics of the assessment. Differing from prior work in other medical specialties, the cases students were randomly assigned did not influence their OSCE score; rather, scores varied in expected fashion by domain assessed. © 2015 American Academy of Neurology.
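To make the projection step concrete, the Python sketch below shows how G and Φ coefficients follow from person, case, and person-by-case variance components and how reliability grows with the number of cases in a decision study. The variance components are illustrative placeholders, not the study's actual estimates.

```python
# Minimal D-study sketch: project OSCE reliability as the number of cases grows.
# Variance components below are illustrative placeholders, not the study's estimates.

def g_coefficient(var_p, var_pc, n_cases):
    """Relative (G) coefficient: person variance over person + relative error."""
    return var_p / (var_p + var_pc / n_cases)

def phi_coefficient(var_p, var_c, var_pc, n_cases):
    """Absolute (Phi) coefficient: absolute error also counts case main effects."""
    return var_p / (var_p + (var_c + var_pc) / n_cases)

var_p, var_c, var_pc = 0.30, 0.05, 0.40   # person, case, person-x-case (illustrative)

for n in range(1, 6):
    print(f"{n} case(s): G = {g_coefficient(var_p, var_pc, n):.2f}, "
          f"Phi = {phi_coefficient(var_p, var_c, var_pc, n):.2f}")
```

With placeholders like these, both coefficients climb toward the 0.70 threshold as cases are added, which is the kind of projection the abstract reports.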
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method is proposed based on an active learning kriging model that needs only to predict the sign of the performance function correctly. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
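The Python sketch below illustrates the sign-driven active-learning idea on a toy performance function, using scikit-learn's Gaussian process as the kriging model and the common U learning function. The toy g, thresholds, and sample sizes are assumptions for illustration, not the paper's algorithmic details.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x):                      # toy performance function; failure when g < 0
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

pool = rng.standard_normal((5000, 2))          # Monte Carlo population
idx = rng.choice(len(pool), 12, replace=False) # small initial design
X, y = pool[idx], g(pool[idx])

gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
for _ in range(40):
    gp.fit(X, y)
    mu, sd = gp.predict(pool, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)     # U learning function
    if U.min() > 2.0:                          # sign of g is trusted everywhere
        break
    j = int(np.argmin(U))                      # most sign-ambiguous point
    X, y = np.vstack([X, pool[j]]), np.append(y, g(pool[j:j + 1]))

pf = float(np.mean(gp.predict(pool) < 0.0))    # failure probability estimate
print(f"estimated Pf = {pf:.4f} after {len(X)} model evaluations")
```

The point of the U function is exactly the sign argument above: only points whose predicted sign is uncertain get an expensive model evaluation.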
Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method
NASA Astrophysics Data System (ADS)
Zhang, Xiangnan
2018-03-01
Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
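As a minimal illustration of the RBDO formulation (not this paper's algorithm), the sketch below minimizes a cost equal to the design variable subject to a failure-probability constraint, which is available in closed form for normal load and resistance. All distributions and numbers are assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

# Toy RBDO: choose mean resistance d to minimize cost (= d) subject to a
# reliability constraint Pf <= 1e-3, with normal load S and resistance R.
mu_S, sd_S, sd_R = 10.0, 2.0, 1.0

def pf(d):                               # closed-form failure probability
    beta = (d - mu_S) / np.hypot(sd_R, sd_S)
    return norm.cdf(-beta)

res = minimize(lambda d: d[0], x0=[18.0],
               constraints=[{"type": "ineq", "fun": lambda d: 1e-3 - pf(d[0])}])
print(f"optimal design d* = {res.x[0]:.3f}, Pf = {pf(res.x[0]):.2e}")
```

The optimizer drives the design down to the reliability constraint boundary, the defining behavior of RBDO.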
Exploring the reliability and validity of the social-moral awareness test.
Livesey, Alexandra; Dodd, Karen; Pote, Helen; Marlow, Elizabeth
2012-11-01
The aim of the study was to explore the validity of the social-moral awareness test (SMAT), a measure designed for assessing socio-moral rule knowledge and reasoning in people with learning disabilities. Comparisons between Theory of Mind and socio-moral reasoning allowed the exploration of the construct validity of the tool. Factor structure, reliability and discriminant validity were also assessed. Seventy-one participants with mild-moderate learning disabilities completed the two scales of the SMAT and two False Belief Tasks for Theory of Mind. Reliability of the SMAT was very good, and the scales were shown to be uni-dimensional in factor structure. There was a significant positive relationship between Theory of Mind and both SMAT scales. There is early evidence of the construct validity and reliability of the SMAT. Further assessment of the validity of the SMAT will be required. © 2012 Blackwell Publishing Ltd.
Reliability analysis of the objective structured clinical examination using generalizability theory.
Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián
2016-01-01
The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.
Software reliability experiments data analysis and investigation
NASA Technical Reports Server (NTRS)
Walker, J. Leslie; Caglayan, Alper K.
1991-01-01
The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
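A back-of-envelope comparison helps fix ideas. The Python sketch below contrasts 2-out-of-3 N-version voting with a recovery block under an independence assumption — exactly the assumption the study shows coincident failures violate — with illustrative per-version success probability p and acceptance-test coverage c.

```python
# Back-of-envelope comparison under an independence assumption (the study's
# point is precisely that coincident failures violate this in practice).

def r_nvp3(p):
    """2-out-of-3 N-version programming: majority of three versions correct."""
    return p**3 + 3 * p**2 * (1 - p)

def r_rb(p, c, n=3):
    """Recovery block: try alternates in turn; the acceptance test detects a
    faulty result with probability c (failing independently of the versions)."""
    r, reach = 0.0, 1.0
    for _ in range(n):
        r += reach * p            # this alternate is correct and accepted
        reach *= (1 - p) * c      # faulty result caught -> next alternate runs
    return r

for p in (0.90, 0.95, 0.99):
    print(f"p={p:.2f}  NVP(2oo3)={r_nvp3(p):.4f}  RB(c=0.99)={r_rb(p, 0.99):.4f}")
```

Under these idealized numbers the recovery block edges out majority voting, consistent with the abstract's finding when acceptance checks fail independently of the software components.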
How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.
Gray, Kurt
2017-09-01
Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification theory suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.
Toward Predictive Theories of Nuclear Reactions Across the Isotopic Chart: Web Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escher, J. E.; Blackmon, J.; Elster, C.
Recent years have seen exciting new developments and progress in nuclear structure theory, reaction theory, and experimental techniques that allow us to move towards a description of exotic systems and environments, setting the stage for new discoveries. The purpose of the 5-week program was to bring together physicists from the low-energy nuclear structure and reaction communities to identify avenues for achieving reliable and predictive descriptions of reactions involving nuclei across the isotopic chart. The 4-day embedded workshop focused on connecting theory developments to experimental advances and data needs for astrophysics and other applications. Nuclear theory must address phenomena from laboratory experiments to stellar environments, from stable nuclei to weakly-bound and exotic isotopes. Expanding the reach of theory to these regimes requires a comprehensive understanding of the reaction mechanisms involved as well as detailed knowledge of nuclear structure. A recurring theme throughout the program was the desire to produce reliable predictions rooted in either ab initio or microscopic approaches. At the same time it was recognized that some applications involving heavy nuclei away from stability, e.g. those involving fission fragments, may need to rely on simple parameterizations of incomplete data for the foreseeable future. The goal here, however, is to subsequently improve and refine the descriptions, moving to phenomenological, then microscopic approaches. There was overarching consensus that future work should also focus on reliable estimates of errors in theoretical descriptions.
Krejsa, Martin; Janas, Petr; Yilmaz, Işık; Marschalko, Marian; Bouchal, Tomas
2013-01-01
The load-carrying system of each construction should fulfill several conditions which represent reliable criteria in the assessment procedure. It is the theory of structural reliability which determines the probability of keeping the required properties of constructions. Using this theory, it is possible to apply probabilistic computations based on probability theory and mathematical statistics. Development of these methods has become more and more popular; they are used, in particular, in designs of load-carrying structures with the required level of reliability when at least some input variables in the design are random. The objective of this paper is to indicate the current scope which might be covered by the new method—Direct Optimized Probabilistic Calculation (DOProC)—in assessments of the reliability of load-carrying structures. DOProC uses a purely numerical approach without any simulation techniques. This provides more accurate solutions to probabilistic tasks, and, in some cases, such an approach results in considerably faster completion of computations. DOProC can be used to solve efficiently a number of probabilistic computations. A very good sphere of application for DOProC is the assessment of bolt reinforcement in underground and mining workings. For the purposes above, a special software application—"Anchor"—has been developed. PMID:23935412
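In the same simulation-free spirit as DOProC, though greatly simplified, failure probability can be obtained by direct numerical integration of the load-resistance convolution. The sketch below uses illustrative distributions, not values from the paper.

```python
import numpy as np
from scipy.stats import norm, lognorm
from scipy.integrate import trapezoid

# Purely numerical evaluation of Pf = P(R < S) = integral of F_R(s) * f_S(s) ds,
# with no simulation. Distributions are illustrative, not taken from the paper.
R = lognorm(s=0.1, scale=60.0)        # resistance
S = norm(loc=40.0, scale=5.0)         # load effect

s = np.linspace(S.ppf(1e-9), S.ppf(1 - 1e-9), 20001)
pf = trapezoid(R.cdf(s) * S.pdf(s), s)
print(f"Pf = {pf:.3e}")
```

Because the integrand is evaluated on a deterministic grid, the answer carries no Monte Carlo sampling noise — the key selling point of direct probabilistic calculation.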
NASA Technical Reports Server (NTRS)
Rodal, J. J. A.; Witmer, E. A.
1979-01-01
A method of analysis for thin structures is developed that incorporates finite-strain, elastic-plastic, strain-hardening, time-dependent material behavior, is implemented with respect to a fixed configuration, and is consistently valid for finite strains and finite rotations. The theory is formulated systematically in a body-fixed system of convected coordinates with materially embedded vectors that deform in common with the continuum. Tensors are considered as linear vector functions, and use is made of the dyadic representation. The kinematics of a deformable continuum is treated in detail, precisely defining all quantities necessary for the analysis. The finite strain theory developed gives much better predictions and agreement with experiment than does the traditional small strain theory, and at practically no additional cost. This represents a very significant advance in the capability for the reliable prediction of nonlinear transient structural responses, including the reliable prediction of strains large enough to produce ductile metal rupture.
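For reference, the standard finite-strain kinematic quantities underlying such a formulation can be stated as follows; this is a textbook form, and the report's convected-coordinate notation differs in detail.

```latex
% Finite-strain kinematics (standard form). The deformation gradient F maps the
% reference configuration to the deformed one; the Green-Lagrange strain E is
% exact for arbitrarily large strains and rotations:
\[
  \mathbf{F} = \frac{\partial \mathbf{x}}{\partial \mathbf{X}}, \qquad
  \mathbf{C} = \mathbf{F}^{\mathsf{T}}\mathbf{F}, \qquad
  \mathbf{E} = \tfrac{1}{2}\left(\mathbf{C} - \mathbf{I}\right),
\]
% which reduces to the usual small-strain tensor when displacement gradients
% are small -- the regime where the traditional theory stops being adequate.
```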
CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
2003-01-01
This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
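A minimal sketch of the fast-fracture building block used by such codes — two-parameter Weibull statistics combined with the principle of independent action (PIA) over per-element stress output — is shown below. The element stresses, volumes, and Weibull parameters are illustrative placeholders, not CARES/LIFE internals.

```python
import numpy as np

# Two-parameter Weibull fast-fracture model with the principle of
# independent action (PIA) for multiaxial stresses. Element data stand in
# for finite element analysis output; all numbers are illustrative.
m, sigma_0 = 10.0, 300.0                # Weibull modulus and scale (illustrative)

def element_survival(principal_stresses, volume):
    """PIA: each tensile principal stress contributes independently."""
    s = np.clip(np.asarray(principal_stresses), 0.0, None)  # compression ignored
    risk = volume * np.sum((s / sigma_0) ** m)              # risk of rupture
    return np.exp(-risk)

elements = [([250.0, 120.0, 0.0], 1e-3), ([180.0, 60.0, -40.0], 2e-3)]
R = np.prod([element_survival(s, v) for s, v in elements])
print(f"component fast-fracture reliability = {R:.5f}")
```

The component reliability is the product of element survival probabilities, which is why codes like CARES/LIFE work directly from element-level finite element results.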
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems and studying the underlying network model, the interactions and relationships between components, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Besides, based on percolation theory, we also study the effect of cascading failures and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning components in interdependent smart grid systems. Our simulation results also show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse. Also we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
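The toy simulation below sketches a percolation-style cascade on two interdependent random graphs, where node i in each network depends on node i in the other. The network sizes, degrees, and one-to-one dependency are simplifying assumptions, not the paper's model.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(1)
N, p_attack = 500, 0.3

# Two interdependent random networks; node i in A depends on node i in B.
A = nx.erdos_renyi_graph(N, 4.0 / N, seed=1)
B = nx.erdos_renyi_graph(N, 4.0 / N, seed=2)

alive = np.ones(N, dtype=bool)
alive[rng.choice(N, int(p_attack * N), replace=False)] = False  # initial attack

def giant(G, alive):
    """Largest connected component among still-functioning nodes."""
    H = G.subgraph([n for n in G if alive[n]])
    if H.number_of_nodes() == 0:
        return set()
    return max(nx.connected_components(H), key=len)

while True:   # cascade: a node must sit in the giant component of BOTH networks
    ga, gb = giant(A, alive), giant(B, alive)
    ok = np.zeros(N, dtype=bool)
    for n in range(N):
        ok[n] = alive[n] and (n in ga) and (n in gb)
    if np.array_equal(ok, alive):
        break
    alive = ok

print(f"surviving fraction after cascade: {alive.mean():.3f}")
```

Sweeping p_attack reveals the threshold behavior the abstract describes: below a critical attack fraction most of the system survives, above it the giant component collapses abruptly.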
Ali Morowatisharifabad, Mohammad; Abdolkarimi, Mahdi; Asadpour, Mohammad; Fathollahi, Mahmood Sheikh; Balaee, Parisa
2018-04-15
Theory-based education tailored to target behaviour and group can be effective in promoting physical activity. The purpose of this study was to examine the predictive power of Protection Motivation Theory on the intent and behaviour of physical activity in patients with type 2 diabetes. This descriptive study was conducted on 250 patients in Rafsanjan, Iran. To examine the scores of protection motivation theory structures, a researcher-made questionnaire was used. Its validity and reliability were confirmed. The level of physical activity was also measured by the International Short-Form Physical Activity Inventory. Its validity and reliability were also approved. Data were analysed by statistical tests including correlation coefficient, chi-square, logistic regression and linear regression. The results revealed that there was a significant correlation between all the protection motivation theory constructs and the intention to do physical activity. The results showed that the theory structures were able to predict 60% of the variance of physical activity intention. The results of logistic regression demonstrated that increases in the scores of physical activity intent and self-efficacy increased the chance of a higher level of physical activity by 3.4 and 1.5 times, respectively (OR = 3.39 and 1.54). Considering the ability of protection motivation theory structures to explain physical activity behaviour, interventional designs are suggested based on the structures of this theory, especially to improve self-efficacy as the most powerful factor in predicting physical activity intention and behaviour.
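Odds ratios of this kind come from exponentiating logistic regression coefficients, OR = exp(β). The sketch below reproduces that computation on synthetic data; the variables and coefficients are invented for illustration and are not the study's data.

```python
import numpy as np
import statsmodels.api as sm

# Illustration of where reported odds ratios come from: OR = exp(beta) in a
# logistic regression of high activity level on intention and self-efficacy.
rng = np.random.default_rng(0)
n = 250
intent = rng.normal(size=n)
self_eff = rng.normal(size=n)
logit_p = -0.5 + 1.2 * intent + 0.4 * self_eff          # synthetic ground truth
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(np.column_stack([intent, self_eff]))
fit = sm.Logit(y.astype(float), X).fit(disp=False)
print("odds ratios per 1-unit score increase:", np.exp(fit.params[1:]))
```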
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Hanagud, S.
1974-01-01
The design criteria and test options for aerospace structural reliability were investigated. A decision methodology was developed for selecting a combination of structural tests and structural design factors. The decision method involves the use of Bayesian statistics and statistical decision theory. Procedures are discussed for obtaining and updating data-based probabilistic strength distributions for aerospace structures when test information is available and for obtaining subjective distributions when data are not available. The techniques used in developing the distributions are explained.
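A minimal example of the Bayesian updating step — refining a subjective prior on mean strength with structural test data, using the conjugate normal model with known scatter — is sketched below with illustrative numbers, not values from the report.

```python
import numpy as np

# Conjugate normal update: a subjective prior on mean strength (used when no
# data exist) is refined as structural test results arrive.
mu0, tau0 = 50.0, 5.0          # prior mean and prior std of the mean strength
sigma = 4.0                    # known test-to-test scatter
tests = np.array([54.2, 51.8, 56.1])

n = len(tests)
tau_n2 = 1.0 / (1.0 / tau0**2 + n / sigma**2)        # posterior variance of mean
mu_n = tau_n2 * (mu0 / tau0**2 + tests.sum() / sigma**2)
print(f"posterior mean strength: {mu_n:.2f} +/- {np.sqrt(tau_n2):.2f}")
```

The same posterior can then feed a decision-theoretic choice of design factor versus additional testing, which is the trade-off the report formalizes.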
Theory of reliable systems. [systems analysis and design
NASA Technical Reports Server (NTRS)
Meyer, J. F.
1973-01-01
The analysis and design of reliable systems are discussed. The attributes of system reliability studied are fault tolerance, diagnosability, and reconfigurability. Objectives of the study include: to determine properties of system structure that are conducive to a particular attribute; to determine methods for obtaining reliable realizations of a given system; and to determine how properties of system behavior relate to the complexity of fault tolerant realizations. A list of 34 references is included.
NASA Astrophysics Data System (ADS)
Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang
2018-03-01
A reliability mathematical model for the high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of key dangerous components and the fatigue sensitivity curve of each component are calculated and analyzed by these means, combining the fatigue life analysis of the control valve with the reliability theory of the control valve model. The proportional impact of each component on the fatigue failure of the control valve system was obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life expectancy of the main pressure parts accords with the technical requirements, and that the valve body and the sleeve have an obvious influence on control system reliability; the stress concentration in key parts of the control valve can be reduced in the design process by improving the structure.
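For normal stress and strength, the stress-strength interference calculation with a temperature correction factor reduces to a closed form; the sketch below uses illustrative values, not the valve's actual data.

```python
from math import sqrt
from scipy.stats import norm

# Stress-strength interference with a temperature correction factor k_T
# applied to the room-temperature fatigue limit (all values illustrative).
mu_stress, sd_stress = 320.0, 30.0        # MPa, operating stress
mu_limit, sd_limit = 480.0, 40.0          # MPa, fatigue limit at room temperature
k_T = 0.85                                # derating at service temperature

mu_R, sd_R = k_T * mu_limit, k_T * sd_limit
beta = (mu_R - mu_stress) / sqrt(sd_R**2 + sd_stress**2)
print(f"reliability R = {norm.cdf(beta):.4f} (beta = {beta:.2f})")
```

The derating shows why the correction matters: without k_T the same numbers give a noticeably higher, and unconservative, reliability.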
Hackett, Paul M. W.
2016-01-01
When behavior is interpreted in a reliable manner (i.e., robustly across different situations and times) its explained meaning may be seen to possess hermeneutic consistency. In this essay I present an evaluation of the hermeneutic consistency that I propose may be present when the research tool known as the mapping sentence is used to create generic structural ontologies. I also claim that theoretical and empirical validity is a likely result of employing the mapping sentence in research design and interpretation. These claims are non-contentious within the realm of quantitative psychological and behavioral research. However, I extend the scope of both facet theory based research and claims for its structural utility, reliability and validity to philosophical and qualitative investigations. I assert that the hermeneutic consistency of a structural ontology is a product of a structural representation's ontological components and the mereological relationships between these ontological sub-units: the mapping sentence seminally allows for the depiction of such structure. PMID:27065932
Study on safety level of RC beam bridges under earthquake
NASA Astrophysics Data System (ADS)
Zhao, Jun; Lin, Junqi; Liu, Jinlong; Li, Jia
2017-08-01
Based on reliability theory, this study considers uncertainties in material strengths and in modeling, which have important effects on structural resistance. After analyzing the destruction mechanism of an RC bridge, structural functions and the reliability were given; then the safety level of the piers of a reinforced concrete continuous girder bridge with stochastic structural parameters against earthquake was analyzed. Using the response surface method to calculate the failure probabilities of bridge piers under high-level earthquakes, their seismic reliability for different damage states within the design reference period was calculated applying a two-stage design, which describes to some extent the seismic safety level of the built bridges.
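A bare-bones response surface method looks like the sketch below: fit a quadratic surrogate to a small design of experiments on the limit-state function, then run cheap Monte Carlo on the surrogate. The limit state and distributions are toys, not the bridge model.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x1, x2):                            # expensive limit state (toy stand-in)
    return 8.0 - x1**2 - x2 - 0.2 * x1 * x2

# Small design of experiments around the mean point, then a least-squares
# quadratic fit (the "response surface").
X = rng.normal(size=(60, 2)) * 1.5
y = g(X[:, 0], X[:, 1])
basis = lambda x1, x2: np.column_stack(
    [np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(basis(X[:, 0], X[:, 1]), y, rcond=None)

# Cheap Monte Carlo on the fitted surface instead of the expensive model.
mc = rng.normal(size=(200_000, 2))
g_hat = basis(mc[:, 0], mc[:, 1]) @ coef
print(f"Pf = {np.mean(g_hat < 0.0):.4f}")
```

The 60 "expensive" evaluations buy 200,000 surrogate evaluations, which is the whole economy of the method when each true evaluation is a nonlinear finite element run.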
Factor Structure and Reliability of Test Items for Saudi Teacher Licence Assessment
ERIC Educational Resources Information Center
Alsadaawi, Abdullah Saleh
2017-01-01
The Saudi National Assessment Centre administers the Computer Science Teacher Test for teacher certification. The aim of this study is to explore gender differences in candidates' scores, and investigate dimensionality, reliability, and differential item functioning using confirmatory factor analysis and item response theory. The confirmatory…
Monolithic ceramic analysis using the SCARE program
NASA Technical Reports Server (NTRS)
Manderscheid, Jane M.
1988-01-01
The Structural Ceramics Analysis and Reliability Evaluation (SCARE) computer program calculates the fast fracture reliability of monolithic ceramic components. The code is a post-processor to the MSC/NASTRAN general purpose finite element program. The SCARE program automatically accepts the MSC/NASTRAN output necessary to compute reliability. This includes element stresses, temperatures, volumes, and areas. The SCARE program computes two-parameter Weibull strength distributions from input fracture data for both volume and surface flaws. The distributions can then be used to calculate the reliability of geometrically complex components subjected to multiaxial stress states. Several fracture criteria and flaw types are available for selection by the user, including out-of-plane crack extension theories. The theoretical basis for the reliability calculations was proposed by Batdorf. These models combine linear elastic fracture mechanics (LEFM) with Weibull statistics to provide a mechanistic failure criterion. Other fracture theories included in SCARE are the normal stress averaging technique and the principle of independent action. The objective of this presentation is to summarize these theories, including their limitations and advantages, and to provide a general description of the SCARE program, along with example problems.
Critical Social Class Theory for Music Education
ERIC Educational Resources Information Center
Bates, Vincent C.
2017-01-01
This work of critical social theory explores how formal music education in modern capitalist societies mirrors the hierarchical, means-ends, one-dimensional structures of capitalism. So, rather than consistently or reliably empowering and emancipating children musically, school music can tend to marginalize, exploit, repress, and alienate. The…
Probabilistic sizing of laminates with uncertainties
NASA Technical Reports Server (NTRS)
Shah, A. R.; Liaw, D. G.; Chamis, C. C.
1993-01-01
A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are propagated to predicted macroscopic behavior using probabilistic theory. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into a computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). Versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
Mass Media Theory, Leveraging Relationships, and Reliable Strategic Communication Effects
2008-03-19
…other people who are in the same social and cultural groups. Families respond to patriarchs and matriarchs, congregations respond to pastors, and teens… media to self-correct behavior in order to make society seem more "normal."
Verbal and Written Message-Centric Theories (premise of each theory):
- Magic…Effects
- Harmony and Balance: people gravitate toward information they already believe.
- Structural Functionalism: when society begins to seem…
Reliability analysis of structural ceramics subjected to biaxial flexure
NASA Technical Reports Server (NTRS)
Chao, Luen-Yuan; Shetty, Dinesh K.
1991-01-01
The reliability of alumina disks subjected to biaxial flexure is predicted on the basis of statistical fracture theory using a critical strain energy release rate fracture criterion. Results on a sintered silicon nitride are consistent with reliability predictions based on pore-initiated penny-shaped cracks with preferred orientation normal to the maximum principal stress. Assumptions with regard to flaw types and their orientations in each ceramic can be justified by fractography. It is shown that there are no universal guidelines for selecting fracture criteria or assuming flaw orientations in reliability analyses.
F + H2 potential energy surface: the ecstasy and the agony
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaefer, H.F. III
1985-12-05
This account surveys 14 years of more or less continuing theoretical research on the FH2 potential energy hypersurface. Early encouragement concerning the ability of theory to reliably characterize the entrance barrier for F + H2 → FH + H has more recently been sobered by the realization that very high levels of theory are required for this task. The importance of zero-point vibrational corrections and tunneling corrections in reliable predictions of the same activation energy is discussed. In contrast, the barrier height of the H + FH → HF + H three-center exchange stands as a prominent early success of ab initio molecular electronic structure theory. 90 references, 4 figures, 6 tables.
Reliability analysis applied to structural tests
NASA Technical Reports Server (NTRS)
Diamond, P.; Payne, A. O.
1972-01-01
The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.
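The crack-growth side of such an analysis is commonly driven by a Paris-type law; the sketch below integrates da/dN = C(ΔK)^m to estimate cycles from an initial to a critical crack size. The constants are illustrative, not values from the report.

```python
import numpy as np
from scipy.integrate import quad

# Cycles for a crack to grow from a0 to ac under the Paris law
# da/dN = C * (dK)^m, with dK = Y * dsigma * sqrt(pi * a). Constants illustrative.
C, m = 1e-11, 3.0          # Paris constants (SI units: m/cycle, MPa*sqrt(m))
Y, dsigma = 1.12, 100.0    # geometry factor, stress range (MPa)
a0, ac = 1e-3, 2e-2        # initial and critical crack sizes (m)

dK = lambda a: Y * dsigma * np.sqrt(np.pi * a)
N, _ = quad(lambda a: 1.0 / (C * dK(a) ** m), a0, ac)
print(f"predicted crack-growth life: {N:,.0f} cycles")
```

Treating a0, C, and dsigma as random variables and repeating this integration is precisely how a deterministic life estimate becomes the risk-of-failure curve the abstract describes.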
Li, Haibin; He, Yun; Nie, Xiaobo
2018-01-01
Structural reliability analysis under uncertainty has received wide attention from engineers and scholars because it reflects structural characteristics and actual load-bearing conditions. The direct integration method, which starts from the definition in reliability theory, is easy to understand, but mathematical difficulties remain in calculating the multiple integrals involved. Therefore, a dual neural network method is proposed for calculating multiple integrals in this paper. The dual neural network consists of two neural networks: neural network A is used to learn the integrand function, and neural network B is used to simulate the original (antiderivative) function. According to the derivative relationship between the network output and the network input, neural network B is derived from neural network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean value first-order second moment method demonstrate that the proposed method is an efficient and accurate reliability method for structural reliability problems.
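As a simplified one-dimensional stand-in for the idea — the paper's dual network constructs network B analytically so that its derivative recovers network A, whereas here the learned integrand is simply integrated numerically — consider the following surrogate-integration sketch.

```python
import numpy as np
from scipy.integrate import trapezoid
from sklearn.neural_network import MLPRegressor

# Simplified 1-D stand-in for surrogate-based integration: network A learns
# the integrand, and the surrogate is then integrated. (The paper's dual
# network instead builds B analytically with B' = A; that step is omitted.)
f = lambda x: np.exp(-x**2)                     # integrand to learn

rng = np.random.default_rng(0)
x_train = rng.uniform(-3, 3, 400)
net_a = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
net_a.fit(x_train.reshape(-1, 1), f(x_train))

x = np.linspace(-3, 3, 2001)
approx = trapezoid(net_a.predict(x.reshape(-1, 1)), x)
reference = trapezoid(f(x), x)                  # numerical reference value
print(f"surrogate integral = {approx:.4f}, reference = {reference:.4f}")
```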
Issen, Laurel; Woodcock, Thomas; McNicholas, Christopher; Lennox, Laura; Reed, Julie E
2018-04-09
Despite criticisms that many quality improvement (QI) initiatives fail due to incomplete programme theory, there is no defined way to evaluate how programme theory has been articulated. The objective of this research was to develop, and assess the usability and reliability of, scoring criteria to evaluate programme theory diagrams. Criteria development was informed by published literature and QI experts. Inter-rater reliability was tested between two evaluators. In total, 63 programme theory diagrams (42 driver diagrams and 21 action-effect diagrams) were reviewed to establish whether the criteria could support comparative analysis of different approaches to constructing diagrams. Components of the scoring criteria include: assessment of overall aim, logical overview, clarity of components, cause-effect relationships, evidence and measurement. Independent reviewers had 78% inter-rater reliability. Scoring enabled direct comparison of different approaches to developing programme theory; action-effect diagrams were found to have a statistically significant but moderate improvement in programme theory quality over driver diagrams; no significant differences were observed based on the setting in which driver diagrams were developed. The scoring criteria summarise the necessary components of programme theory that are thought to contribute to successful QI projects. The viability of the scoring criteria for practical application was demonstrated. Future uses include assessment of individual programme theory diagrams and comparison of different approaches (e.g. methodological, teaching or other QI support) to produce programme theory. The criteria can be used as a tool to guide the production of better programme theory diagrams, and also highlight where additional support for QI teams may be needed.
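Raw percent agreement of the kind reported (78%) is often complemented by a chance-corrected statistic; the sketch below computes both on invented rater labels, not the study's data.

```python
from sklearn.metrics import cohen_kappa_score

# Two raters scoring the same ten diagrams on a 1-3 scale (labels invented).
rater_1 = [2, 3, 1, 2, 2, 3, 1, 1, 2, 3]
rater_2 = [2, 3, 1, 2, 1, 3, 1, 2, 2, 3]

agreement = sum(a == b for a, b in zip(rater_1, rater_2)) / len(rater_1)
kappa = cohen_kappa_score(rater_1, rater_2)       # corrects for chance agreement
print(f"raw agreement = {agreement:.0%}, Cohen's kappa = {kappa:.2f}")
```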
Ammonia-water cation and ammonia dimer cation.
Kim, Hahn; Lee, Han Myoung
2009-06-25
We have investigated the structure, interaction energy, electronic properties, and IR spectra of the ammonia-water cation (NH3·H2O)+ using density functional theory (DFT) and high-level ab initio theory. The ammonia-water cation has three minimum-energy structures: (a) H2NH+···OH2, (b) H3N+···OH2, and (c) H3NH+···OH. The lowest-energy structure is (a), followed by (c) and (b). The ammonia dimer cation has two minimum-energy structures [the lowest H3NH+···NH2 structure and the second lowest (H3N···NH3)+ structure]. The minimum transition barrier for the interconversion between (a), (b), and (c) is approximately 6 kcal/mol. Most DFT calculations with various functionals, except a few cases, overstabilize the N···O and N···N binding, predicting different structures from Møller-Plesset second-order perturbation (MP2) theory and the most reliable complete basis set (CBS) limit of coupled cluster theory with single, double, and perturbative triple excitations [CCSD(T)]. Thus, a validity test of the DFT functionals for these ionized molecular systems would be of importance.
Reliability Assessment of Graphite Specimens under Multiaxial Stresses
NASA Technical Reports Server (NTRS)
Sookdeo, Steven; Nemeth, Noel N.; Bratton, Robert L.
2008-01-01
An investigation was conducted to predict the failure strength response of IG-100 nuclear grade graphite exposed to multiaxial stresses. As part of this effort, a review of failure criteria accounting for the stochastic strength response is provided. The experimental work was performed in the early 1990s at the Oak Ridge National Laboratory (ORNL) on hollow graphite tubes under the action of axial tensile loading and internal pressurization. As part of the investigation, finite-element analysis (FEA) was performed and compared with results of FEA from the original ORNL report. The new analysis generally compared well with the original analysis, although some discrepancies in the location of peak stresses were noted. The Ceramics Analysis and Reliability Evaluation of Structures Life prediction code (CARES/Life) was used with the FEA results to predict the quadrant I (tensile-tensile) and quadrant IV (compression-tension) strength response of the graphite tubes for the principle of independent action (PIA), the Weibull normal stress averaging (NSA), and the Batdorf multiaxial failure theories. The CARES/Life reliability analysis showed that all three failure theories gave similar results in quadrant I but that in quadrant IV, the PIA and Weibull normal stress-averaging theories were not conservative, whereas the Batdorf theory was able to correlate well with experimental results. The conclusion of the study was that the Batdorf theory should generally be used to predict the reliability response of graphite and brittle materials in multiaxial loading situations.
Clayson, Peter E; Miller, Gregory A
2017-01-01
Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
Paige, Samantha R; Krieger, Janice L; Stellefson, Michael; Alber, Julia M
2017-02-01
Chronic disease patients are affected by low computer and health literacy, which negatively affects their ability to benefit from access to online health information. To estimate reliability and confirm model specifications for eHealth Literacy Scale (eHEALS) scores among chronic disease patients using Classical Test Theory (CTT) and Item Response Theory techniques. A stratified sample of Black/African American (N=341) and Caucasian (N=343) adults with chronic disease completed an online survey including the eHEALS. Item discrimination was explored using bi-variate correlations, and Cronbach's alpha was used for internal consistency. A categorical confirmatory factor analysis tested a one-factor structure of eHEALS scores. Item characteristic curves, in-fit/outfit statistics, the omega coefficient, and item reliability and separation estimates were computed. A one-factor structure of the eHEALS was confirmed by statistically significant standardized item loadings, acceptable model fit indices (CFI/TLI>0.90), and 70% of variance explained by the model. Item response categories increased with higher theta levels, and there was evidence of acceptable reliability (ω=0.94; item reliability=.89; item separation=8.54). eHEALS scores are a valid and reliable measure of self-reported eHealth literacy among Internet-using chronic disease patients. Providers can use the eHEALS to help identify patients' eHealth literacy skills. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
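For the CTT side, internal consistency is typically summarized by Cronbach's alpha; a self-contained computation on synthetic item scores (not eHEALS data) is sketched below.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(300, 1))                      # shared latent trait
scores = ability + rng.normal(scale=0.8, size=(300, 8))  # 8 eHEALS-like items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```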
NASA Astrophysics Data System (ADS)
Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen
2018-05-01
To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on fuzzy neural network regression (FR), called DCFRM, is proposed through the integration of the distributed collaborative response surface method and a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea with DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated by considering fluid-structure interaction with the proposed method. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. Comparison of methods shows that the DCFRM reshapes the probabilistic analysis for multi-failure structures and improves computing efficiency while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby also enriches the theory and methods of mechanical reliability design.
Generalizability Theory as a Unifying Framework of Measurement Reliability in Adolescent Research
ERIC Educational Resources Information Center
Fan, Xitao; Sun, Shaojing
2014-01-01
In adolescence research, the treatment of measurement reliability is often fragmented, and it is not always clear how different reliability coefficients are related. We show that generalizability theory (G-theory) is a comprehensive framework of measurement reliability, encompassing all other reliability methods (e.g., Pearson "r,"…
Lifetime Reliability Evaluation of Structural Ceramic Parts with the CARES/LIFE Computer Program
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), Weibull's normal stress averaging method (NSA), or Batdorf's theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating cyclic fatigue parameter estimation and component reliability analysis with proof testing are included.
Probabilistic structural mechanics research for parallel processing computers
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Martin, William R.
1991-01-01
Aerospace structures and spacecraft are a complex assemblage of structural components that are subjected to a variety of complex, cyclic, and transient loading conditions. Significant modeling uncertainties are present in these structures, in addition to the inherent randomness of material properties and loads. To properly account for these uncertainties in evaluating and assessing the reliability of these components and structures, probabilistic structural mechanics (PSM) procedures must be used. Much research has focused on basic theory development and the development of approximate analytic solution methods in random vibrations and structural reliability. Practical application of PSM methods was hampered by their computationally intense nature. Solution of PSM problems requires repeated analyses of structures that are often large, and exhibit nonlinear and/or dynamic response behavior. These methods are all inherently parallel and ideally suited to implementation on parallel processing computers. New hardware architectures and innovative control software and solution methodologies are needed to make solution of large scale PSM problems practical.
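Because each Monte Carlo structural analysis is independent, the workload parallelizes trivially across processors; the sketch below distributes sampling batches across worker processes, with a cheap closed-form limit state standing in for a real structural analysis.

```python
import numpy as np
from multiprocessing import Pool

# PSM Monte Carlo is embarrassingly parallel: each worker runs an independent
# batch of samples (a closed-form limit state stands in for a real analysis).
def batch_failures(args):
    seed, n = args
    rng = np.random.default_rng(seed)
    R = rng.normal(60.0, 6.0, n)          # resistance
    S = rng.normal(40.0, 5.0, n)          # load effect
    return int(np.sum(R < S))

if __name__ == "__main__":
    n_workers, n_per = 8, 250_000
    with Pool(n_workers) as pool:
        fails = pool.map(batch_failures, [(s, n_per) for s in range(n_workers)])
    print(f"Pf = {sum(fails) / (n_workers * n_per):.2e}")
```

When the limit state is a full nonlinear or dynamic finite element run instead of two lines of arithmetic, this same structure is what makes large-scale PSM feasible.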
Barone, Vincenzo; Biczysko, Malgorzata; Bloino, Julien; Egidi, Franco; Puzzarini, Cristina
2015-01-01
The CCSD(T) model coupled with extrapolation to the complete basis-set limit and additive approaches represents the "gold standard" for the structural and spectroscopic characterization of building blocks of biomolecules and nanosystems. However, when open-shell systems are considered, additional problems related to both specific computational difficulties and the need to obtain spin-dependent properties appear. In this contribution, we present a comprehensive study of the molecular structure and spectroscopic (IR, Raman, EPR) properties of the phenyl radical with the aim of validating an accurate computational protocol able to deal with conjugated open-shell species. We succeeded in obtaining reliable and accurate results, thus confirming and, partly, extending the available experimental data. The main issue to be pointed out is the need to go beyond the CCSD(T) level by including a full treatment of triple excitations in order to fulfil the accuracy requirements. On the other hand, the reliability of density functional theory in properly treating open-shell systems has been further confirmed. PMID:23802956
Neural-network-enhanced evolutionary algorithm applied to supported metal nanoparticles
NASA Astrophysics Data System (ADS)
Kolsbjerg, E. L.; Peterson, A. A.; Hammer, B.
2018-05-01
We show that approximate structural relaxation with a neural network enables orders of magnitude faster global optimization with an evolutionary algorithm in a density functional theory framework. The increased speed facilitates reliable identification of global minimum energy structures, as exemplified by our finding of a hollow Pt13 nanoparticle on an MgO support. We highlight the importance of knowing the correct structure when studying the catalytic reactivity of the different particle shapes. The computational speedup further enables screening of hundreds of different pathways in the search for optimum kinetic transitions between low-energy conformers and hence pushes the limits of the insight into thermal ensembles that can be obtained from theory.
Hartwell, Christopher J; Campion, Michael A
2016-06-01
This study explores normative feedback as a way to reduce rating errors and increase the reliability and validity of structured interview ratings. Based in control theory and social comparison theory, we propose a model of normative feedback interventions (NFIs) in the context of structured interviews and test our model using data from over 20,000 interviews conducted by more than 100 interviewers over a period of more than 4 years. Results indicate that lenient and severe interviewers reduced discrepancies between their ratings and the overall normative mean rating after receipt of normative feedback, though changes were greater for lenient interviewers. When various waves of feedback were presented in later NFIs, the combined normative mean rating over multiple time periods was more predictive of subsequent rating changes than the normative mean rating from the most recent time period. Mean within-interviewer rating variance, along with interrater agreement and interrater reliability, increased after the initial NFI, but results from later NFIs were more complex and revealed that feedback interventions may lose effectiveness over time. A second study using simulated data indicated that leniency and severity errors did not impact rating validity, but did affect which applicants were hired. We conclude that giving normative feedback to interviewers will aid in minimizing interviewer rating differences and enhance the reliability of structured interview ratings. We suggest that interviewer feedback might be considered as a potential new component of interview structure, though future research is needed before a definitive conclusion can be drawn. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.
2018-03-01
This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resources and time allocations. This concept offers a worthwhile point of departure encompassing three adjustments to the literature model, in terms of maintenance time, workforce performance and return-on-workforce investments, which together frame our results. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better quality of solution from the DE algorithm compared with those of the genetic algorithm and the particle swarm optimisation algorithm, thus expressing the superiority of the proposed procedure over them. Second, the analytical discourse, which was framed on stochastic theory, focusing on a specific application to a process plant in Nigeria, is a novelty. The work provides more insight into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
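For the DE step, SciPy's implementation suffices for a toy version of the trade-off; the objective below is invented for illustration and is not the paper's stochastic model.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Toy stand-in for the DE step: choose overtime hours and rework staffing to
# minimise total maintenance cost, trading direct cost against downtime.
def cost(x):
    overtime, rework_staff = x
    direct = 120.0 * overtime + 90.0 * rework_staff
    downtime = 5000.0 / (1.0 + 0.8 * overtime + 1.1 * rework_staff)
    return direct + downtime

result = differential_evolution(cost, bounds=[(0, 20), (0, 20)], seed=1)
print(f"optimum x = {np.round(result.x, 2)}, cost = {result.fun:.1f}")
```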
Cordier, Reinie; Speyer, Renée; Schindler, Antonio; Michou, Emilia; Heijnen, Bas Joris; Baijens, Laura; Karaduman, Ayşe; Swan, Katina; Clavé, Pere; Joosten, Annette Veronica
2018-02-01
The Swallowing Quality of Life questionnaire (SWAL-QOL) is widely used clinically and in research to evaluate quality of life related to swallowing difficulties. It has been described as a valid and reliable tool, but was developed and tested using classical test theory. This study describes the reliability and validity of the SWAL-QOL using item response theory (IRT; Rasch analysis). SWAL-QOL data were gathered from 507 participants at risk of oropharyngeal dysphagia (OD) across four European countries. OD was confirmed in 75.7% of participants via videofluoroscopy and/or fiberoptic endoscopic evaluation, or a clinical diagnosis based on meeting selected criteria. Patients with esophageal dysphagia were excluded. Data were analysed using Rasch analysis. Item and person reliability was good for all the items combined. However, person reliability was poor for 8 subscales and item reliability was poor for one subscale. Eight subscales exhibited poor person separation and two exhibited poor item separation. Overall item and person fit statistics were acceptable. However, at an individual item fit level results indicated unpredictable item responses for 28 items, and item redundancy for 10 items. The item-person dimensionality map confirmed these findings. Results from the overall Rasch model fit and Principal Component Analysis were suggestive of a second dimension. For all the items combined, none of the item categories were 'category', 'threshold' or 'step' disordered; however, all subscales demonstrated category disordered functioning. Findings suggest an urgent need to further investigate the underlying structure of the SWAL-QOL and its psychometric characteristics using IRT.
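For reference, the dichotomous Rasch model assumed by such an analysis is stated below; polytomous SWAL-QOL items use the rating-scale extension with ordered category thresholds, whose "disordering" is what the category analysis checks.

```latex
% Dichotomous Rasch model: the probability that person n endorses item i
% depends only on person ability theta_n and item difficulty b_i:
\[
  P(X_{ni} = 1 \mid \theta_n, b_i)
    = \frac{\exp(\theta_n - b_i)}{1 + \exp(\theta_n - b_i)}.
\]
```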
Boerboom, T B B; Dolmans, D H J M; Jaarsma, A D C; Muijtjens, A M M; Van Beukelen, P; Scherpbier, A J J A
2011-01-01
Feedback to aid teachers in improving their teaching requires validated evaluation instruments. When implementing an evaluation instrument in a different context, it is important to collect validity evidence from multiple sources. We examined the validity and reliability of the Maastricht Clinical Teaching Questionnaire (MCTQ) as an instrument to evaluate individual clinical teachers during short clinical rotations in veterinary education. We examined four sources of validity evidence: (1) Content was examined based on theory of effective learning. (2) Response process was explored in a pilot study. (3) Internal structure was assessed by confirmatory factor analysis using 1086 student evaluations and reliability was examined utilizing generalizability analysis. (4) Relations with other relevant variables were examined by comparing factor scores with other outcomes. Content validity was supported by theory underlying the cognitive apprenticeship model on which the instrument is based. The pilot study resulted in an additional question about supervision time. A five-factor model showed a good fit with the data. Acceptable reliability was achievable with 10-12 questionnaires per teacher. Correlations between the factors and overall teacher judgement were strong. The MCTQ appears to be a valid and reliable instrument to evaluate clinical teachers' performance during short rotations.
Cherif, Alhaji; Barley, Kamal
2010-01-01
Quantification of historical sociological processes has recently gained attention among theoreticians in the effort to provide a solid theoretical understanding of the behaviors and regularities present in socio-political dynamics. Here we present a reliability theory of polity processes with emphasis on the individual political dynamics of African countries. We found that the structural properties of polity failure rates successfully capture the risk of political vulnerability and instability, in which differing proportions of the countries with monotonically increasing, unimodal, U-shaped and monotonically decreasing polity failure rates, respectively, have high levels of state fragility indices. The quasi-U-shaped relationship between average polity duration and regime types corroborates historical precedents and explains the stability of autocracies and democracies. PMID:21206911
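The four failure-rate shapes the authors classify can be reproduced with textbook hazard functions: a Weibull hazard h(t) = (k/lam)(t/lam)^(k-1) is monotonically increasing for k > 1 and decreasing for k < 1, and a sum of the two gives a U-shaped ("bathtub") profile. A small sketch with made-up parameters, not the paper's polity data:

```python
import numpy as np

def weibull_hazard(t, k, lam):
    """Hazard of a Weibull(k, lam) lifetime: increasing for k > 1, decreasing for k < 1."""
    return (k / lam) * (t / lam) ** (k - 1)

def bathtub_hazard(t, k_dec=0.5, k_inc=2.0, lam=10.0):
    """A simple U-shaped ('bathtub') hazard: decreasing plus increasing Weibull terms."""
    return weibull_hazard(t, k_dec, lam) + weibull_hazard(t, k_inc, lam)

t = np.linspace(0.1, 30, 100)           # polity age, arbitrary units
increasing = weibull_hazard(t, 2.0, 10.0)
decreasing = weibull_hazard(t, 0.5, 10.0)
u_shaped = bathtub_hazard(t)
```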
Mkanya, Anele; Pellicane, Giuseppe; Pini, Davide; Caccamo, Carlo
2017-09-13
We report extensive calculations, based on the modified hypernetted chain (MHNC) theory, on the hierarchical reference theory (HRT), and on Monte Carlo (MC) simulations, of thermodynamical, structural and phase coexistence properties of symmetric binary hard-core Yukawa mixtures (HCYM) with attractive interactions at equal species concentration. The results obtained are compared throughout with those available in the literature for the same systems. It turns out that the MHNC predictions for thermodynamic and structural quantities are quite accurate in comparison with the MC data. The HRT is equally accurate for thermodynamics, and slightly less accurate for structure. Liquid-vapor (LV) and liquid-liquid (LL) consolute coexistence conditions, as emerging from simulations, are also reproduced highly satisfactorily by both the MHNC and HRT for relatively long-ranged potentials. When the potential range reduces, the MHNC faces problems in determining the LV binodal line; however, the LL consolute line and the critical end point (CEP) temperature and density turn out to be still satisfactorily predicted within this theory. The HRT also predicts the CEP position with good accuracy. The possibility of employing liquid state theories of HCYM for the purpose of reliably determining phase equilibria in multicomponent colloidal fluids of current technological interest is discussed.
Zhang, Dengke; Pang, Yanxia; Cai, Weixiong; Fazio, Rachel L; Ge, Jianrong; Su, Qiaorong; Xu, Shuiqin; Pan, Yinan; Chen, Sanmei; Zhang, Hongwei
2016-08-01
Impairment of theory of mind (ToM) is a common phenomenon following traumatic brain injury (TBI) that has clear effects on patients' social functioning. A growing body of research has focused on this area, and several methods have been developed to assess ToM deficiency. Although an informant assessment scale would be useful for examining individuals with TBI, very few studies have adopted this approach. The purpose of the present study was to develop an informant assessment scale of ToM for adults with traumatic brain injury (IASToM-aTBI) and to test its reliability and validity with 196 adults with TBI and 80 normal adults. A 44-item scale was developed following a literature review, interviews with patient informants, consultations with experts, item analysis, and exploratory factor analysis (EFA). The following three common factors were extracted: social interaction, understanding of beliefs, and understanding of emotions. The psychometric analyses indicate that the scale has good internal consistency reliability, split-half reliability, test-retest reliability, inter-rater reliability, structural validity, discriminant validity and criterion validity. These results provide preliminary evidence supporting the reliability and validity of the IASToM-aTBI as a ToM assessment tool for adults with TBI.
NASA Astrophysics Data System (ADS)
Shi, Yu-Fang; Ma, Yi-Yi; Song, Ping-Ping
2018-03-01
System reliability theory has been a research hotspot in management science and systems engineering in recent years, and construction reliability is useful for the quantitative evaluation of project management level. According to reliability theory and the target system of engineering project management, a definition of construction reliability is given. Based on fuzzy mathematics theory and language operators, the value space of construction reliability is divided into seven fuzzy subsets; correspondingly, seven membership functions and fuzzy evaluation intervals are obtained through the operation of the language operators, which provides the methodological basis and parameters for the evaluation of construction reliability. This method is shown to be scientific and reasonable under construction conditions and is a useful attempt at theory and method research on engineering project system reliability.
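The paper's seven membership functions are not reproduced in this record. As a hedged sketch of the general construction, seven overlapping triangular fuzzy subsets partitioning a [0, 1] construction-reliability scale could be defined as follows; the labels, peak positions, and widths are illustrative assumptions, not the paper's values.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    x = np.asarray(x, dtype=float)
    left = np.clip((x - a) / (b - a), 0.0, 1.0)
    right = np.clip((c - x) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

# Seven evenly spaced fuzzy subsets over a [0, 1] construction-reliability scale
# (linguistic labels and breakpoints are illustrative only).
labels = ["very low", "low", "fairly low", "medium", "fairly high", "high", "very high"]
peaks = np.linspace(0.0, 1.0, 7)

def memberships(x, width=1.0 / 6.0):
    """Degree of membership of reliability value(s) x in each fuzzy subset."""
    return {lab: triangular(x, p - width, p, p + width) for lab, p in zip(labels, peaks)}

grades = memberships(0.72)   # e.g., mostly "fairly high" with some "high"
```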
Ardestani, M S; Niknami, S; Hidarnia, A; Hajizadeh, E
2016-08-18
This research examined the validity and reliability of a researcher-developed questionnaire based on Social Cognitive Theory (SCT) to assess the physical activity behaviour of Iranian adolescent girls (SCT-PAIAGS). Psychometric properties of the SCT-PAIAGS were assessed by determining its face validity, content and construct validity, as well as its reliability. In order to evaluate the factor structure, cross-sectional research was conducted on 400 high-school girls in Tehran. The content validity index, content validity ratio and impact score for the SCT-PAIAGS varied between 0.97-1, 0.91-1 and 4.6-4.9, respectively. Confirmatory factor analysis confirmed a six-factor structure comprising self-efficacy, self-regulation, family support, friend support, outcome expectancy and self-efficacy for overcoming impediments. Factor loadings, t-values and fit indices showed that the SCT model fit the data. Cronbach's α-coefficient ranged from 0.78 to 0.85 and the intraclass correlation coefficient from 0.73 to 0.90.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liakh, Dmitry I
While the formalism of multiresolution analysis (MRA), based on wavelets and adaptive integral representations of operators, is actively progressing in electronic structure theory (mostly on the independent-particle level and, recently, second-order perturbation theory), the concepts of multiresolution and adaptivity can also be utilized within the traditional formulation of correlated (many-particle) theory which is based on second quantization and the corresponding (generally nonorthogonal) tensor algebra. In this paper, we present a formalism called scale-adaptive tensor algebra (SATA) which exploits an adaptive representation of tensors of many-body operators via the local adjustment of the basis set quality. Given a series of locally supported fragment bases of a progressively lower quality, we formulate the explicit rules for tensor algebra operations dealing with adaptively resolved tensor operands. The formalism suggested is expected to enhance the applicability and reliability of local correlated many-body methods of electronic structure theory, especially those directly based on atomic orbitals (or any other localized basis functions).
Adsorption structures and energetics of molecules on metal surfaces: Bridging experiment and theory
NASA Astrophysics Data System (ADS)
Maurer, Reinhard J.; Ruiz, Victor G.; Camarillo-Cisneros, Javier; Liu, Wei; Ferri, Nicola; Reuter, Karsten; Tkatchenko, Alexandre
2016-05-01
Adsorption geometry and stability of organic molecules on surfaces are key parameters that determine the observable properties and functions of hybrid inorganic/organic systems (HIOSs). Despite many recent advances in precise experimental characterization and improvements in first-principles electronic structure methods, reliable databases of structures and energetics for large adsorbed molecules are largely amiss. In this review, we present such a database for a range of molecules adsorbed on metal single-crystal surfaces. The systems we analyze include noble-gas atoms, conjugated aromatic molecules, carbon nanostructures, and heteroaromatic compounds adsorbed on five different metal surfaces. The overall objective is to establish a diverse benchmark dataset that enables an assessment of current and future electronic structure methods, and motivates further experimental studies that provide ever more reliable data. Specifically, the benchmark structures and energetics from experiment are here compared with the recently developed van der Waals (vdW) inclusive density-functional theory (DFT) method, DFT + vdWsurf. In comparison to 23 adsorption heights and 17 adsorption energies from experiment we find a mean average deviation of 0.06 Å and 0.16 eV, respectively. This confirms the DFT + vdWsurf method as an accurate and efficient approach to treat HIOSs. A detailed discussion identifies remaining challenges to be addressed in future development of electronic structure methods, for which the here presented benchmark database may serve as an important reference.
NASA Technical Reports Server (NTRS)
Gyekenyesi, John P.; Nemeth, Noel N.
1987-01-01
The SCARE (Structural Ceramics Analysis and Reliability Evaluation) computer program on statistical fast fracture reliability analysis with quadratic elements for volume distributed imperfections is enhanced to include the use of linear finite elements and the capability of designing against concurrent surface flaw induced ceramic component failure. The SCARE code is presently coupled as a postprocessor to the MSC/NASTRAN general purpose, finite element analysis program. The improved version now includes the Weibull and Batdorf statistical failure theories for both surface and volume flaw based reliability analysis. The program uses the two-parameter Weibull fracture strength cumulative failure probability distribution model with the principle of independent action for poly-axial stress states, and Batdorf's shear-sensitive as well as shear-insensitive statistical theories. The shear-sensitive surface crack configurations include the Griffith crack and Griffith notch geometries, using the total critical coplanar strain energy release rate criterion to predict mixed-mode fracture. Weibull material parameters based on both surface and volume flaw induced fracture can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and grouped fracture data. The statistical fast fracture theories for surface flaw induced failure, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
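The fast-fracture kernel of such a code, the two-parameter Weibull model combined with the principle of independent action (PIA) for volume-distributed flaws, reduces for element-wise finite element output to a sum over elements: P_f = 1 - exp(-sum_e V_e sum_j (sigma_j/sigma_0)^m) over the tensile principal stresses sigma_j. A schematic version is sketched below; the array layout and names are assumptions for illustration, not the SCARE/NASTRAN interface.

```python
import numpy as np

def pia_failure_probability(principal_stresses, volumes, m, sigma_0):
    """Two-parameter Weibull / PIA fast-fracture failure probability for
    volume-distributed flaws.
    principal_stresses: (n_elements, 3) principal stresses per element
    volumes:            (n_elements,) element volumes
    m:                  Weibull modulus
    sigma_0:            Weibull scale parameter (strength x volume^(1/m))."""
    s = np.maximum(principal_stresses, 0.0)          # only tensile stresses contribute
    risk = (volumes * ((s / sigma_0) ** m).sum(axis=1)).sum()  # risk-of-rupture integral
    return 1.0 - np.exp(-risk)

# Toy two-element model (stresses in MPa, volumes in mm^3; values are made up):
stresses = np.array([[120.0, 40.0, -30.0], [200.0, 10.0, 5.0]])
p_f = pia_failure_probability(stresses, volumes=np.array([2.0, 1.5]), m=10.0, sigma_0=350.0)
```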
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
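As a generic illustration of the reliability-estimation step (not the specific JSC tools), a widely used model of this kind is the Goel-Okumoto NHPP, with expected cumulative failures m(t) = a(1 - exp(-b t)) and failure intensity lambda(t) = a b exp(-b t); the probability of failure-free operation over (t, t + x] is then exp(-(m(t+x) - m(t))). A sketch, with parameter values assumed for illustration:

```python
import numpy as np

def goel_okumoto(t, a, b):
    """NHPP software-reliability growth model.
    a: total expected number of failures; b: per-failure detection rate.
    Returns expected cumulative failures m(t) and failure intensity lambda(t)."""
    m = a * (1.0 - np.exp(-b * t))
    lam = a * b * np.exp(-b * t)
    return m, lam

def reliability(t, x, a, b):
    """Probability of failure-free operation over (t, t + x] under the NHPP."""
    m_t, _ = goel_okumoto(t, a, b)
    m_tx, _ = goel_okumoto(t + x, a, b)
    return np.exp(-(m_tx - m_t))

# Toy usage: after 100 hours of testing, chance of surviving the next 10 hours.
r = reliability(t=100.0, x=10.0, a=150.0, b=0.02)
```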
A two-factor theory for concussion assessment using ImPACT: memory and speed.
Schatz, Philip; Maerlender, Arthur
2013-12-01
We present the initial validation of a two-factor structure of Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) using ImPACT composite scores and document the reliability and validity of this factor structure. Factor analyses were conducted for baseline (N = 21,537) and post-concussion (N = 560) data, yielding "Memory" (Verbal and Visual) and "Speed" (Visual Motor Speed and Reaction Time) Factors; inclusion of Total Symptom Scores resulted in a third discrete factor. Speed and Memory z-scores were calculated, and test-retest reliability (using intra-class correlation coefficients) at 1 month (0.88/0.81), 1 year (0.85/0.75), and 2 years (0.76/0.74) were higher than published data using Composite scores. Speed and Memory scores yielded 89% sensitivity and 70% specificity, which was higher than composites (80%/62%) and comparable with subscales (91%/69%). This emergent two-factor structure has improved test-retest reliability with no loss of sensitivity/specificity and may improve understanding and interpretability of ImPACT test results.
Optimization of life support systems and their systems reliability
NASA Technical Reports Server (NTRS)
Fan, L. T.; Hwang, C. L.; Erickson, L. E.
1971-01-01
The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or a mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation, and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.
Ziatabar Ahmadi, Seyyede Zohreh; Jalaie, Shohreh; Ashayeri, Hassan
2015-09-01
Theory of mind (ToM) or mindreading is an aspect of social cognition that evaluates mental states and beliefs of oneself and others. Validity and reliability are very important criteria when evaluating standard tests; without them, these tests are not usable. The aim of this study was to systematically review the validity and reliability of published English comprehensive ToM tests developed for normal preschool children. We searched the MEDLINE (PubMed interface), Web of Science, ScienceDirect, PsycINFO, and evidence-based medicine (The Cochrane Library) databases from 1990 to June 2015. The search strategy was the Latin transcription of 'Theory of Mind' AND test AND children. We also manually studied the reference lists of all finally retrieved articles and carried out a search of their references. Inclusion criteria were: valid and reliable diagnostic ToM tests published from 1990 to June 2015 for normal preschool children. Exclusion criteria were: studies that only used ToM tests and single tasks (false belief tasks) for ToM assessment and/or had no description of the structure, validity or reliability of their tests. Methodological quality of the selected articles was assessed using the Critical Appraisal Skills Programme (CASP). In the primary search, we found 1237 articles across all databases. After removing duplicates and applying all inclusion and exclusion criteria, we selected 11 tests for this systematic review. There were few valid, reliable and comprehensive ToM tests for normal preschool children. However, we had limitations concerning the included articles: the identified ToM tests differed in populations, tasks, modes of presentation, scoring, modes of response, times and other variables, and they had various validities and reliabilities. Therefore, it is recommended that researchers and clinicians select ToM tests according to their psychometric characteristics, validity and reliability.
NASA Astrophysics Data System (ADS)
Yankovskii, A. P.
2018-01-01
On the basis of the constitutive equations of the Rabotnov nonlinear hereditary theory of creep, the problem of the rheonomic flexural behavior of layered plates with a regular structure is formulated. Equations allowing one to describe, with different degrees of accuracy, the stress-strain state of such plates with account of their weakened resistance to transverse shear were obtained. From them, the relations of the nonclassical Reissner- and Reddy-type theories can be found. For axially loaded annular plates clamped at one edge and loaded quasistatically on the other edge, a simplified version of the refined theory, whose complexity is comparable to that of the Reissner and Reddy theories, is developed. The flexural strains of such metal-composite annular plates in short-term and long-term loadings at different levels of heat action are calculated. It is shown that, for plates with a relative thickness of order 1/10, neither the classical theory nor the traditional nonclassical Reissner and Reddy theories guarantee reliable results for deflections even with a rough 10% accuracy. The accuracy of these theories decreases at elevated temperatures and with time under long-term loadings of structures. On the basis of the relations of the refined theory, it is revealed that, in bending of layered metal-composite heat-sensitive plates under elevated temperatures, marked edge effects arise in the neighborhood of the supported edge, which characterize the shear of these structures in the transverse direction.
NASA Astrophysics Data System (ADS)
Karri, Naveen K.; Mo, Changki
2018-06-01
Structural reliability of thermoelectric generation (TEG) systems still remains an issue, especially for applications such as large-scale industrial or automobile exhaust heat recovery, in which TEG systems are subject to dynamic loads and thermal cycling. Traditional thermoelectric (TE) system design and optimization techniques, focused on performance alone, could result in designs that may fail during operation as the geometric requirements for optimal performance (especially the power) are often in conflict with the requirements for mechanical reliability. This study focused on reducing the thermomechanical stresses in a TEG system without compromising the optimized system performance. Finite element simulations were carried out to study the effect of TE element (leg) geometry such as leg length and cross-sectional shape under constrained material volume requirements. Results indicated that the element length has a major influence on the element stresses whereas regular cross-sectional shapes have minor influence. The impact of TE element stresses on the mechanical reliability is evaluated using brittle material failure theory based on Weibull analysis. An alternate couple configuration that relies on the industry practice of redundant element design is investigated. Results showed that the alternate configuration considerably reduced the TE element and metallization stresses, thereby enhancing the structural reliability, with little trade-off in the optimized performance. The proposed alternate configuration could serve as a potential design modification for improving the reliability of systems optimized for thermoelectric performance.
Leckelt, Marius; Wetzel, Eunike; Gerlach, Tanja M; Ackerman, Robert A; Miller, Joshua D; Chopik, William J; Penke, Lars; Geukes, Katharina; Küfner, Albrecht C P; Hutteman, Roos; Richter, David; Renner, Karl-Heinz; Allroggen, Marc; Brecheen, Courtney; Campbell, W Keith; Grossmann, Igor; Back, Mitja D
2018-01-01
Due to increased empirical interest in narcissism across the social sciences, there is a need for inventories that can be administered quickly while also reliably measuring both the agentic and antagonistic aspects of grandiose narcissism. In this study, we sought to validate the factor structure, provide representative descriptive data and reliability estimates, assess the reliability across the trait spectrum, and examine the nomological network of the short version of the Narcissistic Admiration and Rivalry Questionnaire (NARQ-S; Back et al., 2013). We used data from a large convenience sample (total N = 11,937) as well as data from a large representative sample (total N = 4,433) that included responses to other narcissism measures as well as related constructs, including the other Dark Triad traits, Big Five personality traits, and self-esteem. Confirmatory factor analysis and item response theory were used to validate the factor structure and estimate the reliability across the latent trait spectrum, respectively. Results suggest that the NARQ-S shows a robust factor structure and is a reliable and valid short measure of the agentic and antagonistic aspects of grandiose narcissism. We also discuss future directions and applications of the NARQ-S as a short and comprehensive measure of grandiose narcissism. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
[Reliability theory based on quality risk network analysis for Chinese medicine injection].
Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui
2014-08-01
A new risk analysis method based upon reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach, thus transforming the risk analysis results from failure mode and effect analysis (FMEA) onto a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNIe and AgenaRisk, we are able to find the nodes that are most critical in influencing the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
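As a hedged sketch of the Bayesian-network evaluation step (the paper's actual nodes and conditional probabilities are not given in this record; every number below is made up), a two-cause, one-effect risk fragment can be marginalized by exhaustive enumeration, and node criticality read off as the risk reduction from eliminating each cause:

```python
from itertools import product

# Hypothetical quality-risk fragment: two independent cause nodes, one effect node.
p_cause = {"contamination": 0.02, "process_drift": 0.05}
# P(batch quality failure | contamination, drift): made-up conditional table.
p_fail = {(0, 0): 0.001, (0, 1): 0.05, (1, 0): 0.10, (1, 1): 0.60}

def marginal_failure(p_cont, p_drift):
    """Marginal failure probability, summing over all cause-node states."""
    total = 0.0
    for c, d in product((0, 1), repeat=2):
        pc = p_cont if c else 1 - p_cont
        pd = p_drift if d else 1 - p_drift
        total += pc * pd * p_fail[(c, d)]
    return total

base = marginal_failure(p_cause["contamination"], p_cause["process_drift"])
# Node criticality: how much eliminating each cause would reduce the overall risk.
crit_contamination = base - marginal_failure(0.0, p_cause["process_drift"])
crit_drift = base - marginal_failure(p_cause["contamination"], 0.0)
```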
Interactive Reliability Model for Whisker-toughened Ceramics
NASA Technical Reports Server (NTRS)
Palko, Joseph L.
1993-01-01
Wider use of ceramic matrix composites (CMC) will require the development of advanced structural analysis technologies. The use of an interactive model to predict the time-independent reliability of a component subjected to multiaxial loads is discussed. The deterministic, three-parameter Willam-Warnke failure criterion serves as the theoretical basis for the reliability model. The strength parameters defining the model are assumed to be random variables, thereby transforming the deterministic failure criterion into a probabilistic criterion. The ability of the model to account for multiaxial stress states with the same unified theory is an improvement over existing models. The new model was coupled with a public-domain finite element program through an integrated design program. This allows a design engineer to predict the probability of failure of a component. A simple structural problem is analyzed using the new model, and the results are compared to existing models.
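A Monte Carlo sketch of the core idea, treating the strength parameters of the failure criterion as random variables, is given below; for brevity it uses a simple maximum-stress criterion with lognormal strengths rather than the three-parameter Willam-Warnke surface of the paper, and all distributions and numbers are assumptions.

```python
import numpy as np

def mc_failure_probability(stress, n_samples=100_000, seed=1):
    """Monte Carlo failure probability when the strength parameters of the
    criterion are random variables. Illustrative max-stress criterion only,
    not the Willam-Warnke surface used in the paper.
    stress: (max principal, min principal) applied stresses in MPa."""
    rng = np.random.default_rng(seed)
    # Hypothetical lognormal tensile and compressive strengths (MPa).
    s_t = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n_samples)
    s_c = rng.lognormal(mean=np.log(900.0), sigma=0.10, size=n_samples)
    sig_max, sig_min = stress
    fails = (sig_max > s_t) | (-sig_min > s_c)   # tensile or compressive failure
    return fails.mean()

p_f = mc_failure_probability(stress=(250.0, -700.0))
```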
Construction and Validation of the Perceived Opportunity to Craft Scale.
van Wingerden, Jessica; Niks, Irene M W
2017-01-01
We developed and validated a scale to measure employees' perceived opportunity to craft (POC) in two separate studies conducted in the Netherlands (total N = 2329). POC is defined as employees' perception of their opportunity to craft their job. In Study 1, the perceived opportunity to craft scale (POCS) was developed and tested for its factor structure and reliability in an explorative way. Study 2 consisted of confirmatory analyses of the factor structure and reliability of the scale as well as examination of the discriminant and criterion-related validity of the POCS. The results indicated that the scale consists of one dimension and could be reliably measured with five items. Evidence was found for the discriminant validity of the POCS. The scale also showed criterion-related validity when correlated with job crafting (+), job resources (autonomy +; opportunities for professional development +), work engagement (+), and the inactive construct cynicism (-). We discuss the implications of these findings for theory and practice.
Scale of attitudes toward alcohol - Spanish version: evidences of validity and reliability
Ramírez, Erika Gisseth León; de Vargas, Divane
2017-01-01
ABSTRACT Objective: validate the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders in its Spanish version. Method: methodological study involving 300 Colombian nurses. Adopting classical test theory, confirmatory factor analysis was applied without prior exploratory examination, based on the strong historical evidence of the factorial structure of the original scale, to determine the construct validity of this Spanish version. To assess the reliability, Cronbach's alpha and McDonald's omega coefficients were used. Results: the confirmatory factor analysis indicated a good fit of the scale model in a four-factor distribution, with a cut-off point at 3.2, demonstrating 66.7% sensitivity. Conclusions: the Scale of attitudes toward alcohol, alcoholism and individuals with alcohol use disorders in Spanish presented robust psychometric qualities, affirming that the instrument possesses a solid factorial structure and reliability and is capable of precisely measuring nurses' attitudes toward the phenomenon proposed. PMID:28793126
An Introduction to Structural Reliability Theory
1989-01-01
[Table: test-sample statistics for structural steels and alloys: mean yield, tension and compression strengths (psi), coefficients of variation, and assumed distributions (e.g., lognormal), covering mill-test data for an SA537 GrB containment vessel and cold-straightened, annealed, and quenched specimens.]
Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L
2013-09-26
Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children's physical activity.
Comparison of Reliability Measures under Factor Analysis and Item Response Theory
ERIC Educational Resources Information Center
Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng
2012-01-01
Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…
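For reference, the sum-score omega named in the abstract has, under a one-factor model with loadings lambda_i and uniquenesses theta_i, the closed form omega = (sum lambda_i)^2 / ((sum lambda_i)^2 + sum theta_i); a minimal sketch with illustrative loadings:

```python
import numpy as np

def coefficient_omega(loadings, uniquenesses):
    """Omega reliability of an unweighted sum score under a one-factor model."""
    lam_sum = np.sum(loadings)
    return lam_sum ** 2 / (lam_sum ** 2 + np.sum(uniquenesses))

# Example: five standardized items loading about 0.7 on a single factor.
lam = np.array([0.70, 0.65, 0.72, 0.68, 0.71])
omega = coefficient_omega(lam, 1.0 - lam ** 2)   # uniqueness = 1 - loading^2
```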
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weigend, Florian, E-mail: florian.weigend@kit.edu
2014-10-07
Energy surfaces of metal clusters usually show a large variety of local minima. For homo-metallic species the energetically lowest can be found reliably with genetic algorithms, in combination with density functional theory without system-specific parameters. For mixed-metallic clusters this is much more difficult, as for a given arrangement of nuclei one has to find additionally the best of many possibilities of assigning different metal types to the individual positions. In the framework of electronic structure methods this second issue is treatable at comparably low cost, at least for elements with similar atomic number, by means of first-order perturbation theory, as shown previously [F. Weigend, C. Schrodt, and R. Ahlrichs, J. Chem. Phys. 121, 10380 (2004)]. In the present contribution the extension of a genetic algorithm with the re-assignment of atom types to atom sites is proposed and tested for the search of the global minima of PtHf12 and [LaPb7Bi7]4-. For both cases the (putative) global minimum is reliably found with the extended technique, which is not the case for the "pure" genetic algorithm.
NASA Technical Reports Server (NTRS)
Hooke, F. H.
1972-01-01
Both the conventional and reliability analyses for determining safe fatigue life are predicated on a population having a specified (usually log-normal) distribution of life to collapse under a fatigue test load. Under a random service load spectrum, random occurrences of loads larger than the fatigue test load may confront, and cause collapse of, structures which are weakened, though not yet to the fatigue test load. These collapses are included in the reliability analysis but excluded in the conventional analysis. The theory of risk determination by each method is given, and several reasonably typical examples have been worked out, in which it transpires that, if one excludes collapse through exceedance of the uncracked strength, the reliability and conventional analyses give virtually identical probabilities of failure or survival.
Reliability and Validity of the Sexual Pressure Scale for Women-Revised
Jones, Rachel; Gulick, Elsie
2008-01-01
Sexual pressure among young urban women represents adherence to gender stereotypical expectations to engage in sex. Revision of the original 5-factor Sexual Pressure Scale was undertaken in two studies to improve reliabilities in two of the five factors. In Study 1 the reliability of the Sexual Pressure Scale for Women-Revised (SPSW-R) was tested, and principal components analysis was performed in a sample of 325 young, urban women. A parsimonious 18-item, 4-factor model explained 61% of the variance. In Study 2 the theory underlying sexual pressure was supported by confirmatory factor analysis using structural equation modeling in a sample of 181 women. Reliabilities of the SPSW-R total and subscales were very satisfactory, suggesting it may be used in intervention research. PMID:18666222
Reliability-Based Design Optimization of a Composite Airframe Component
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Coroneos, Rula; Patnaik, Surya N.
2011-01-01
A stochastic optimization methodology (SDO) has been developed to design airframe structural components made of metallic and composite materials. The design method accommodates uncertainties in load, strength, and material properties that are defined by distribution functions with mean values and standard deviations. A response parameter, such as a failure mode, thereby becomes a function of reliability. The primitive variables, like thermomechanical loads, material properties, and failure theories, as well as variables like the depth of a beam or the thickness of a membrane, are considered random parameters with specified distribution functions defined by mean values and standard deviations.
Structural Reliability Analysis and Optimization: Use of Approximations
NASA Technical Reports Server (NTRS)
Grandhi, Ramana V.; Wang, Liping
1999-01-01
This report is intended to demonstrate function approximation concepts and their applicability in reliability analysis and design. In particular, approximations in the calculation of the safety index, the failure probability, and structural optimization (modification of design variables) are developed. With this scope in mind, extensive details on probability theory are avoided; definitions relevant to the stated objectives have been taken from standard textbooks. The idea of function approximation is to minimize the repetitive use of computationally intensive calculations by replacing them with simpler closed-form equations, which could be nonlinear. Typically, the approximations provide good accuracy around the points where they are constructed, and they need to be periodically updated to extend their utility. Two kinds of approximation enter the calculation of the failure probability of a limit state function. The first, which is most commonly discussed, is how the limit state is approximated at the design point: most of the time this is a first-order Taylor series expansion, known as the First Order Reliability Method (FORM), or a second-order Taylor series expansion (paraboloid), known as the Second Order Reliability Method (SORM). From the computational procedure point of view, this step comes after the design point identification; however, the order of approximation for the probability of failure calculation is discussed first, and it is denoted by either FORM or SORM. The other approximation of interest is how the design point, or most probable failure point (MPP), is identified: for iteratively finding this point, the limit state is again approximated. The accuracy and efficiency of the approximations make the search process practical for analysis-intensive approaches such as finite element methods; the crux of this research is therefore to develop excellent approximations for MPP identification, as well as different approximations, including higher-order reliability methods (HORM), for representing the failure surface. This report is divided into several parts to emphasize different segments of structural reliability analysis and design; broadly, it consists of mathematical foundations, methods, and applications. Chapter 1 discusses the fundamental definitions of probability theory, which are mostly available in standard textbooks; probability density function descriptions relevant to this work are addressed. In Chapter 2, the concept and utility of function approximation are discussed for general application in engineering analysis, and various forms of function representation and the latest developments in nonlinear adaptive approximations are presented with comparison studies. Research work accomplished in reliability analysis is presented in Chapter 3: the definitions of the safety index and the most probable point of failure are introduced, and efficient ways of computing the safety index with fewer iterations are emphasized. In Chapter 4, the prediction of the probability of failure is presented using first-order, second-order, and higher-order methods. System reliability methods are discussed in Chapter 5. Chapter 6 presents optimization techniques for the modification and redistribution of structural sizes to improve structural reliability. The report also contains several appendices on probability parameters.
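A minimal sketch of the FORM machinery described above: the HL-RF iteration locates the most probable failure point u* of a limit state g(u) = 0 in standard normal space, the safety index is beta = |u*|, and the first-order failure probability is Phi(-beta). Variables are assumed independent standard normal and gradients are taken by finite differences; this is an illustration under those assumptions, not the report's code.

```python
import numpy as np
from scipy.stats import norm

def form_beta(g, dim, iters=50, eps=1e-6):
    """HL-RF iteration for the safety index beta = |u*|, where u* is the most
    probable point of the limit state g(u) = 0 in standard normal space."""
    u = np.zeros(dim)
    for _ in range(iters):
        g0 = g(u)
        grad = np.array([(g(u + eps * e) - g0) / eps for e in np.eye(dim)])
        u_new = grad * (grad @ u - g0) / (grad @ grad)   # HL-RF update
        if np.linalg.norm(u_new - u) < 1e-8:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta)     # safety index and FORM failure probability

# Toy limit state g(u) = 3 - u1 - u2 (linear, so FORM is exact here):
beta, p_f = form_beta(lambda u: 3.0 - u[0] - u[1], dim=2)
# beta = 3/sqrt(2) ~ 2.121, p_f ~ 0.0169
```

For a linear limit state the iteration converges in one step; for curved limit states SORM corrects the FORM estimate with the curvature of the failure surface at u*.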
A new theory for X-ray diffraction.
Fewster, Paul F
2014-05-01
This article proposes a new theory of X-ray scattering that has particular relevance to powder diffraction. The underlying concept of this theory is that the scattering from a crystal or crystallite is distributed throughout space: this leads to the effect that enhanced scatter can be observed at the `Bragg position' even if the `Bragg condition' is not satisfied. The scatter from a single crystal or crystallite, in any fixed orientation, has the fascinating property of contributing simultaneously to many `Bragg positions'. It also explains why diffraction peaks are obtained from samples with very few crystallites, which cannot be explained with the conventional theory. The intensity ratios for an Si powder sample are predicted with greater accuracy and the temperature factors are more realistic. Another consequence is that this new theory predicts a reliability in the intensity measurements which agrees much more closely with experimental observations compared to conventional theory that is based on `Bragg-type' scatter. The role of dynamical effects (extinction etc.) is discussed and how they are suppressed with diffuse scattering. An alternative explanation for the Lorentz factor is presented that is more general and based on the capture volume in diffraction space. This theory, when applied to the scattering from powders, will evaluate the full scattering profile, including peak widths and the `background'. The theory should provide an increased understanding of the reliability of powder diffraction measurements, and may also have wider implications for the analysis of powder diffraction data, by increasing the accuracy of intensities predicted from structural models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, T., E-mail: xietao@ustc.edu.cn; Key Laboratory of Geospace Environment, CAS, Hefei, Anhui 230026; Qin, H.
A unified ballooning theory, constructed on the basis of two special theories [Zhang et al., Phys. Fluids B 4, 2729 (1992); Y. Z. Zhang and T. Xie, Nucl. Fusion Plasma Phys. 33, 193 (2013)], shows that a weak up-down asymmetric mode structure is normally formed in an up-down symmetric equilibrium; the weak up-down asymmetry in mode structure is the manifestation of non-trivial higher order effects beyond the standard ballooning equation. It is shown that the asymmetric mode may have an even higher growth rate than symmetric modes. The salient features of the theory are illustrated by investigating a fluid model for the ion temperature gradient (ITG) mode. The two-dimensional (2D) analytical form of the ITG mode, solved in the ballooning representation, is then converted into radial-poloidal space to provide the natural boundary condition for solving the 2D mathematical local eigenmode problem. We find that the analytical expression of the mode structure is in good agreement with the finite difference solution. This sets a reliable framework for quasi-linear computation.
Hempler, Daniela; Schmidt, Martin U; van de Streek, Jacco
2017-08-01
More than 600 molecular crystal structures with correct, incorrect and uncertain space-group symmetry were energy-minimized with dispersion-corrected density functional theory (DFT-D, PBE-D3). For the purpose of determining the correct space-group symmetry the required tolerance on the atomic coordinates of all non-H atoms is established to be 0.2 Å. For 98.5% of 200 molecular crystal structures published with missed symmetry, the correct space group is identified; there are no false positives. Very small, very symmetrical molecules can end up in artificially high space groups upon energy minimization, although this is easily detected through visual inspection. If the space group of a crystal structure determined from powder diffraction data is ambiguous, energy minimization with DFT-D provides a fast and reliable method to select the correct space group.
High reliability and implications for nursing leaders.
Riley, William
2009-03-01
To review high reliability theory and discuss its implications for the nursing leader. A high reliability organization (HRO) is one that achieves measurable, near-perfect performance for quality and safety. The author reviews the literature, discusses research findings that contribute to improving reliability in health care organizations, and makes five recommendations for how nursing leaders can create high reliability organizations. Health care is not a safe industry, and unintended patient harm occurs at epidemic levels. Health care can learn from high reliability theory and practice developed in other high-risk industries. Viewed by HRO standards, unintended patient injury in health care is excessively high and quality is distressingly low. HRO theory and practice can be successfully applied in health care using advanced interdisciplinary teamwork training and deliberate process design techniques. Nursing has a primary leadership function in ensuring patient safety and achieving high quality in health care organizations. Learning HRO theory and methods for achieving high reliability is a foremost opportunity for nursing leaders.
ERIC Educational Resources Information Center
Yelboga, Atilla; Tavsancil, Ezel
2010-01-01
In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…
NASA Technical Reports Server (NTRS)
Gyekenyesi, J. P.
1985-01-01
A computer program was developed for calculating the statistical fast fracture reliability and failure probability of ceramic components. The program includes the two-parameter Weibull material fracture strength distribution model, using the principle of independent action for polyaxial stress states and Batdorf's shear-sensitive as well as shear-insensitive crack theories, all for volume distributed flaws in macroscopically isotropic solids. Both penny-shaped cracks and Griffith cracks are included in the Batdorf shear-sensitive crack response calculations, using Griffith's maximum tensile stress or critical coplanar strain energy release rate criteria to predict mixed mode fracture. Weibull material parameters can also be calculated from modulus of rupture bar tests, using the least squares method with known specimen geometry and fracture data. The reliability prediction analysis uses MSC/NASTRAN stress, temperature and volume output, obtained from the use of three-dimensional, quadratic, isoparametric, or axisymmetric finite elements. The statistical fast fracture theories employed, along with selected input and output formats and options, are summarized. An example problem to demonstrate various features of the program is included.
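The Weibull parameter estimation mentioned (least squares on modulus-of-rupture bar data) is commonly done by linearising the two-parameter CDF, ln ln[1/(1 - F)] = m ln(sigma) - m ln(sigma_0), with median-rank estimates of F; a hedged sketch with made-up strengths:

```python
import numpy as np

def weibull_ls_fit(strengths):
    """Least-squares fit of the two-parameter Weibull strength distribution
    (modulus m, scale sigma_0) using Bernard's median-rank estimator."""
    s = np.sort(np.asarray(strengths, dtype=float))
    n = len(s)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median ranks
    x = np.log(s)
    y = np.log(-np.log(1.0 - F))
    m, c = np.polyfit(x, y, 1)                    # fit y = m*x + c, c = -m*ln(sigma_0)
    sigma_0 = np.exp(-c / m)
    return m, sigma_0

# Example with hypothetical modulus-of-rupture strengths (MPa):
m, sigma_0 = weibull_ls_fit([312, 335, 347, 355, 362, 371, 380, 395, 410, 428])
```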
Answering the call: a tool that measures functional breast cancer literacy.
Williams, Karen Patricia; Templin, Thomas N; Hines, Resche D
2013-01-01
There is a need for health care providers and health care educators to ensure that the messages they communicate are understood. The purpose of this research was to test the reliability and validity, in a culturally diverse sample of women, of a revised Breast Cancer Literacy Assessment Tool (Breast-CLAT) designed to measure functional understanding of breast cancer in English, Spanish, and Arabic. Community health workers verbally administered the 35-item Breast-CLAT to 543 Black, Latina, and Arab American women. A confirmatory factor analysis using a 2-parameter item response theory model was used to test the proposed 3-factor Breast-CLAT (awareness, screening and knowledge, and prevention and control). The confirmatory factor analysis using a 2-parameter item response theory model had a good fit (TLI = .91, RMSEA = .04) to the proposed 3-factor structure. The total scale reliability ranged from .80 for Black participants to .73 for total culturally diverse sample. The three subscales were differentially predictive of family history of cancer. The revised Breast-CLAT scales demonstrated internal consistency reliability and validity in this multiethnic, community-based sample.
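For reference, the 2-parameter IRT model used here specifies the probability of a correct (endorsed) response at ability theta as P(theta) = 1 / (1 + exp(-a(theta - b))), with discrimination a and difficulty b; a minimal sketch with illustrative parameter values:

```python
import numpy as np

def two_pl(theta, a, b):
    """2-parameter logistic item response function: probability of a correct
    response at ability theta, given discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A discriminating item (a = 1.8) of moderate difficulty (b = 0.0):
theta = np.linspace(-3, 3, 61)
p = two_pl(theta, a=1.8, b=0.0)
```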
Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-01-01
Background The eHealth Literacy Scale (eHEALS) is a tool to assess consumers’ comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. Objective The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Methods Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. Results CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. Conclusions The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers’ eHealth literacy. PMID:28400356
NASA Astrophysics Data System (ADS)
Zhong, Yu-Xi; Guo, Yuan-Ru; Pan, Qing-Jiang
2016-02-01
Relativistic density functional theory was used to explore the structural and redox properties of 18 prototypical actinyl silylamides including a variation of metals (U, Np and Pu), metal oxidation states (VI and V) and equatorial ligands. A theoretical approach associated with implicit solvation and spin-orbit/multiplet corrections was proved to be reliable. A marked shift of reduction potentials of actinyl silylamides caused by changes of equatorial coordination ligands and implicit solvation was elucidated by analyses of electronic structures and single-electron reduction mechanism.
Report on INT Program INT-17-1a
DOE Office of Scientific and Technical Information (OSTI.GOV)
Escher, J. E.; Blackmon, J.; Elster, C.
The purpose of the 5-week program was to bring together physicists from the low-energy nuclear structure and reaction communities to identify avenues for achieving reliable and predictive descriptions of reactions involving nuclei across the isotopic chart. The 4-day embedded workshop focused on connecting theory developments to experimental advances and data needs for astrophysics and other applications.
Theories of risk and safety: what is their relevance to nursing?
Cooke, Hannah
2009-03-01
The aim of this paper is to review key theories of risk and safety and their implications for nursing. The concept of patient safety has only recently risen to prominence as an organising principle in healthcare. The paper considers the wider social context in which contemporary concepts of risk and safety have developed. In particular, it looks at sociological debates about the rise of risk culture and the risk society and their influence on the patient safety movement. The paper discusses three bodies of theory that have attempted to explain the management of risk and safety in organisations: normal accident theory, high reliability theory, and grid-group cultural theory. It examines debates between these theories and their implications for healthcare. It discusses reasons for the dominance of high reliability theory in healthcare and its strengths and limitations. The paper suggests that high reliability theory has particular difficulties in explaining some aspects of organisational culture. It also suggests that the implementation of high reliability theory in healthcare has involved over-reliance on numerical indicators. It suggests that patient safety could be improved by openness to a wider range of theoretical perspectives.
The influence of various test plans on mission reliability. [for Shuttle Spacelab payloads
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Young, J. P.; Keegan, W. B.
1977-01-01
Methods have been developed for the evaluation of cost effective vibroacoustic test plans for Shuttle Spacelab payloads. The shock and vibration environments of components have been statistically represented, and statistical decision theory has been used to evaluate the cost effectiveness of five basic test plans with structural test options for two of the plans. Component, subassembly, and payload testing have been performed for each plan along with calculations of optimum test levels and expected costs. The tests have been ranked according to both minimizing expected project costs and vibroacoustic reliability. It was found that optimum costs may vary up to $6 million with the lowest plan eliminating component testing and maintaining flight vibration reliability via subassembly tests at high acoustic levels.
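The decision-theoretic comparison reduces, per test plan, to an expected total cost: the cost of testing plus the cost of a flight failure weighted by the residual probability of a vibroacoustic failure after that plan. A toy sketch follows; every plan name and number below is hypothetical, not from the study.

```python
# Expected-cost ranking of vibroacoustic test plans (all values hypothetical).
plans = {
    # plan name: (test cost in $M, residual probability of flight failure)
    "component + subassembly + payload": (4.0, 0.010),
    "high-level subassembly + payload":  (2.5, 0.015),
    "payload test only":                 (1.0, 0.050),
}
COST_OF_FLIGHT_FAILURE = 60.0   # $M, hypothetical consequence cost

def expected_cost(test_cost, p_fail):
    """Expected total project cost of a test plan."""
    return test_cost + p_fail * COST_OF_FLIGHT_FAILURE

ranked = sorted(plans.items(), key=lambda kv: expected_cost(*kv[1]))
best_plan, (best_cost, best_p) = ranked[0]
```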
Test Theories, Educational Priorities and Reliability of Public Examinations in England
ERIC Educational Resources Information Center
Baird, Jo-Anne; Black, Paul
2013-01-01
Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…
Immodest Witnesses: Reliability and Writing Assessment
ERIC Educational Resources Information Center
Gallagher, Chris W.
2014-01-01
This article offers a survey of three reliability theories in writing assessment: positivist, hermeneutic, and rhetorical. Drawing on an interdisciplinary investigation of the notion of "witnessing," this survey emphasizes the kinds of readers and readings each theory of reliability produces and the epistemological grounds on which it…
Lifetime Reliability Prediction of Ceramic Structures Under Transient Thermomechanical Loads
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Jadaan, Osama J.; Gyekenyesi, John P.
2005-01-01
An analytical methodology is developed to predict the probability of survival (reliability) of ceramic components subjected to harsh thermomechanical loads that can vary with time (transient reliability analysis). This capability enables more accurate prediction of ceramic component integrity against fracture in situations such as turbine startup and shutdown, operational vibrations, atmospheric reentry, or other rapid heating or cooling situations (thermal shock). The transient reliability analysis methodology developed herein incorporates the following features: fast-fracture transient analysis (reliability analysis without slow crack growth, SCG); transient analysis with SCG (reliability analysis with time-dependent damage due to SCG); a computationally efficient algorithm to compute the reliability for components subjected to repeated transient loading (block loading); cyclic fatigue modeling using a combined SCG and Walker fatigue law; proof testing for transient loads; and Weibull and fatigue parameters that are allowed to vary with temperature or time. Component-to-component variation in strength (stochastic strength response) is accounted for with the Weibull distribution, and either the principle of independent action or the Batdorf theory is used to predict the effect of multiaxial stresses on reliability. The reliability analysis can be performed either as a function of the component surface (for surface-distributed flaws) or component volume (for volume-distributed flaws). The transient reliability analysis capability has been added to the NASA CARES/Life (Ceramic Analysis and Reliability Evaluation of Structures/Life) code. CARES/Life was also updated to interface with commercially available finite element analysis software, such as ANSYS, when used to model the effects of transient load histories. Examples are provided to demonstrate the features of the methodology as implemented in the CARES/Life program.
NASA Astrophysics Data System (ADS)
Wagler, Amy; Wagler, Ron
2013-09-01
The Measure of Acceptance of the Theory of Evolution (MATE) was constructed to be a single-factor instrument that assesses an individual's overall acceptance of evolutionary theory. The MATE was validated and the scores resulting from the MATE were found to be reliable for the population of inservice high school biology teachers. However, many studies have utilized the MATE for different populations, such as university students enrolled in a biology or genetics course, high school students, and preservice teachers. This is problematic because the dimensionality and reliability of the MATE may not be consistent across populations. It is not uncommon in science education research to find examples where scales are applied to novel populations without proper assessment of the validity and reliability. In order to illustrate this issue, a case study is presented where the dimensionality of the MATE is evaluated for a population of non-science major preservice elementary teachers. With this objective in mind, factor analytic and item response models are fit to the observed data to provide evidence for or against a one-dimensional latent structure and to detect which items do not conform to the theoretical construct for this population. The results of this study call into question any findings and conclusions made using the MATE for a Hispanic population of preservice teachers and point out the error of assuming invariance across substantively different populations.
Using Item Response Theory Methods with the Brazilian Temperament Scale for Students
ERIC Educational Resources Information Center
Primi, Ricardo; Wechsler, Solange Muglia; de Cassia Nakano, Tatiana; Oakland, Thomas; Guzzo, Raquel Souza Lobo
2014-01-01
The development and validation of the Brazilian Temperament Scale for Students (BTSS) are examined through the use of data from 1,258 children and adolescents, ages 10 through 21 (M = 15.0, SD = 2.1, 56% females). Three psychometric properties of BTSS are reported: its internal structure (e.g., validity), its reliability, and cut points to best…
ERIC Educational Resources Information Center
Kapuza, A. V.; Tyumeneva, Yu. A.
2017-01-01
One of the ways of controlling for the influence of social expectations on the answers given by survey respondents is to use a social desirability scale together with the main questions. The social desirability scale, which was included in the Teaching and Learning International Survey (TALIS) international comparative study for this purpose, was…
ERIC Educational Resources Information Center
Lee, Guemin; Park, In-Yong
2012-01-01
Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…
Constitutive Theory Developed for Monolithic Ceramic Materials
NASA Technical Reports Server (NTRS)
Janosik, Lesley A.
1998-01-01
With the increasing use of advanced ceramic materials in high-temperature structural applications such as advanced heat engine components, the need arises to accurately predict thermomechanical behavior that is inherently time-dependent and that is hereditary in the sense that the current behavior depends not only on current conditions but also on the material's thermomechanical history. Most current analytical life prediction methods for both subcritical crack growth and creep models use elastic stress fields to predict the time-dependent reliability response of components subjected to elevated service temperatures. Inelastic response at high temperatures has been well documented in the materials science literature for these material systems, but this issue has been ignored by the engineering design community. From a design engineer's perspective, it is imperative to emphasize that accurate predictions of time-dependent reliability demand accurate stress field information. Ceramic materials exhibit different time-dependent behavior in tension and compression. Thus, inelastic deformation models for ceramics must be constructed in a fashion that admits both sensitivity to hydrostatic stress and differing behavior in tension and compression. A number of constitutive theories for materials that exhibit sensitivity to the hydrostatic component of stress have been proposed that characterize deformation using time-independent classical plasticity as a foundation. However, none of these theories allow different behavior in tension and compression. In addition, these theories are somewhat lacking in that they are unable to capture the creep, relaxation, and rate-sensitive phenomena exhibited by ceramic materials at high temperatures. The objective of this effort at the NASA Lewis Research Center has been to formulate a macroscopic continuum theory that captures these time-dependent phenomena. Specifically, the effort has focused on inelastic deformation behavior associated with these service conditions by developing a multiaxial viscoplastic constitutive model that accounts for time-dependent hereditary material deformation (such as creep and stress relaxation) in monolithic structural ceramics. Using continuum principles of engineering mechanics, we derived the complete viscoplastic theory from a scalar dissipative potential function.
2013-01-01
Background: Understanding children’s physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children’s physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Methods: Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. Results: The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children’s minutes in moderate-to-vigorous physical activity. Conclusions: The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. Children’s motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children’s physical activity. PMID:24067078
Reliability of Test Scores in Nonparametric Item Response Theory.
ERIC Educational Resources Information Center
Sijtsma, Klaas; Molenaar, Ivo W.
1987-01-01
Three methods for estimating reliability are studied within the context of nonparametric item response theory. Two were proposed originally by Mokken and a third is developed in this paper. Using a Monte Carlo strategy, these three estimation methods are compared with four "classical" lower bounds to reliability.
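For the "classical lower bounds" side of such comparisons, two of the standard bounds can be computed directly from an item-score matrix. This is a minimal sketch of Cronbach's alpha and Guttman's lambda-2, not of the Mokken estimators studied in the paper.

    import numpy as np

    def classical_lower_bounds(scores):
        """Cronbach's alpha and Guttman's lambda-2 from an
        (n_persons x k_items) numpy score matrix."""
        k = scores.shape[1]
        C = np.cov(scores, rowvar=False)   # item covariance matrix
        total_var = C.sum()                # variance of the sum score
        off = C - np.diag(np.diag(C))      # off-diagonal covariances
        alpha = k / (k - 1) * (1 - np.trace(C) / total_var)
        lambda2 = (off.sum() + np.sqrt(k / (k - 1) * (off ** 2).sum())) / total_var
        return alpha, lambda2

Both are guaranteed lower bounds to reliability under classical test theory, with lambda-2 never falling below alpha.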
The Reliability of Criterion-Referenced Measures.
ERIC Educational Resources Information Center
Livingston, Samuel A.
The assumptions of the classical test-theory model are used to develop a theory of reliability for criterion-referenced measures which parallels that for norm-referenced measures. It is shown that the Spearman-Brown formula holds for criterion-referenced measures and that the criterion-referenced reliability coefficient can be used to correct…
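The Spearman-Brown projection invoked here is simple enough to state as code; a one-liner under the classical assumption that the added test parts are parallel.

    def spearman_brown(r, factor):
        """Projected reliability when test length is multiplied by
        `factor`, assuming the added parts are parallel."""
        return factor * r / (1 + (factor - 1) * r)

    print(spearman_brown(0.60, 2))  # doubling a 0.60-reliable test -> 0.75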
Representations of spacetime: Formalism and ontological commitment
NASA Astrophysics Data System (ADS)
Bain, Jonathan Stanley
This dissertation consists of two parts. The first is on the relation between formalism and ontological commitment in the context of theories of spacetime, and the second is on scientific realism. The first part begins with a look at how the substantivalist/relationist debate over the ontological status of spacetime has been influenced by a particular mathematical formalism, that of tensor analysis on differential manifolds (TADM). This formalism has motivated the substantivalist position known as manifold substantivalism. Chapter 1 focuses on the hole argument which maintains that manifold substantivalism is incompatible with determinism. I claim that the realist motivations underlying manifold substantivalism can be upheld, and the hole argument avoided, by adopting structural realism with respect to spacetime. In this context, this is the claim that it is the structure that spacetime points enter into that warrants belief and not the points themselves. In Chapter 2, an elimination principle is defined by means of which a distinction can be made between surplus structure and essential structure with respect to formulations of a theory in two distinct mathematical formulations and some prior ontological commitments. This principle is then used to demonstrate that manifold points may be considered surplus structure in the formulation of field theories. This suggests that, if we are disposed to read field theories literally, then, at most, it should be the essential structure common to all alternative formulations of such theories that should be taken literally. I also investigate how the adoption of alternative formalisms informs other issues in the philosophy of spacetime. Chapter 3 offers a realist position which takes a semantic moral from the preceding investigation and an epistemic moral from work done on reliability. The semantic moral advises us to read only the essential structure of our theories literally. The epistemic moral shows us that such structure is robust under theory change, given an adequate reliabilist notion of epistemic warrant. I call the realist position that subscribes to these morals structural realism and attempt to demonstrate that it is immune to the semantic and epistemic versions of the underdetermination argument posed by the anti-realist.
Development and validation of the Alcohol Myopia Scale.
Lac, Andrew; Berger, Dale E
2013-09-01
Alcohol myopia theory conceptualizes the ability of alcohol to narrow attention and how this demand on mental resources produces the impairments of self-inflation, relief, and excess. The current research was designed to develop and validate a scale based on this framework. People who were alcohol users rated items representing myopic experiences arising from drinking episodes in the past month. In Study 1 (N = 260), the preliminary 3-factor structure was supported by exploratory factor analysis. In Study 2 (N = 289), the 3-factor structure was substantiated with confirmatory factor analysis, and it was superior in fit to an empirically indefensible 1-factor structure. The final 14-item scale was evaluated with internal consistency reliability, discriminant validity, convergent validity, criterion validity, and incremental validity. The alcohol myopia scale (AMS) illuminates conceptual underpinnings of this theory and yields insights for understanding the tunnel vision that arises from intoxication.
ERIC Educational Resources Information Center
Guler, Nese; Gelbal, Selahattin
2010-01-01
In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a mathematics achievement measure. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the spring semester of 2007. The internal consistency of the scores was 0.92. For…
Martin, Andrea E.
2016-01-01
I argue that cue integration, a psychophysiological mechanism from vision and multisensory perception, offers a computational linking hypothesis between psycholinguistic theory and neurobiological models of language. I propose that this mechanism, which incorporates probabilistic estimates of a cue's reliability, might function in language processing from the perception of a phoneme to the comprehension of a phrase structure. I briefly consider the implications of the cue integration hypothesis for an integrated theory of language that includes acquisition, production, dialogue and bilingualism, while grounding the hypothesis in canonical neural computation. PMID:26909051
Brueckner-AMD Study of Light Nuclei
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kato, Kiyoshi; Yamamoto, Yuhei; Togashi, Tomoaki
2011-06-28
We applied the Brueckner theory to the Antisymmetrized Molecular Dynamics (AMD) and examined the reliability of the AMD calculations based on realistic nuclear interactions. In this method, the Bethe-Goldstone equation in the Brueckner theory is solved for every nucleon pair described by wave packets of AMD, and the G-matrix is calculated with single-particle orbits in AMD self-consistently. We apply this framework to not only α-nuclei but also N≠Z nuclei with A≈10. It is confirmed that these results present the description of reasonable cluster structures and energy-level schemes comparable with the experimental ones in light nuclei.
Diviani, Nicola; Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-04-11
The eHealth Literacy Scale (eHEALS) is a tool to assess consumers' comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers' eHealth literacy. ©Nicola Diviani, Alexandra Lelia Dima, Peter Johannes Schulz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.04.2017.
Branscum, Paul; Lora, Karina R
2016-06-02
Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of child's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine nutrition-related behaviors that mothers found as most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetable and SSB). Twenty semi-structured interviews with mothers were conducted next to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis), and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables, and SSB.
The molten glass sewing machine
Inamura, Chikara; Lizardo, Daniel; Franchin, Giorgia; Stern, Michael; Houk, Peter; Oxman, Neri
2017-01-01
We present a fluid-instability-based approach for digitally fabricating geometrically complex uniformly sized structures in molten glass. Formed by mathematically defined and physically characterized instability patterns, such structures are produced via the additive manufacturing of optically transparent glass, and result from the coiling of an extruded glass thread. We propose a minimal geometrical model—and a methodology—to reliably control the morphology of patterns, so that these building blocks can be assembled into larger structures with tailored functionally and optically tunable properties. This article is part of the themed issue ‘Patterning through instabilities in complex media: theory and applications’. PMID:28373379
Time-dependent reliability analysis of ceramic engine components
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
1993-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing either the power or Paris law relations. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Two example problems demonstrating proof testing and fatigue parameter estimation are given.
ERIC Educational Resources Information Center
Stenner, A. Jackson; Rohlf, Richard J.
The merits of generalizability theory in the formulation of construct definitions and in the determination of reliability estimates are discussed. The broadened conceptualization of reliability brought about by Cronbach's generalizability theory is reviewed. Career Maturity Inventory data from a sample of 60 ninth grade students is used to…
NASA Astrophysics Data System (ADS)
Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng
2017-12-01
In recent years, with the gradual extension of reliability research, the study of production system reliability has become a hot topic across industries. A man-machine-environment system is a complex system composed of human factors, machinery equipment, and environment. The reliability of each individual factor must be analyzed first in order to transition gradually to research on three-factor reliability. Meanwhile, the dynamic relationships among man, machine, and environment should be considered in order to establish an effective fuzzy evaluation mechanism that truly and effectively analyzes the reliability of such systems. In this paper, based on systems engineering, fuzzy theory, reliability theory, and theories of human error, environmental impact, and machinery equipment failure, the reliabilities of the human factor, machinery equipment, and environment of a chemical production system were studied by the method of fuzzy evaluation. Finally, the reliability of the man-machine-environment system was calculated to obtain the weighted result, which indicated that the reliability value of this chemical production system was 86.29. From the given evaluation domain it can be seen that the reliability of the integrated man-machine-environment system is in good status, and effective measures for further improvement were proposed according to the fuzzy calculation results.
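The weighted fuzzy-evaluation step can be illustrated with a minimal sketch; the weights, membership degrees, and grade values below are hypothetical and chosen only to show the mechanics, not to reproduce the paper's 86.29.

    import numpy as np

    # Hypothetical weights for the three factors: human, machine, environment
    W = np.array([0.40, 0.35, 0.25])

    # Hypothetical membership degrees of each factor in four reliability
    # grades (excellent, good, fair, poor); each row sums to 1
    R = np.array([
        [0.50, 0.30, 0.15, 0.05],   # human
        [0.60, 0.25, 0.10, 0.05],   # machine
        [0.40, 0.35, 0.20, 0.05],   # environment
    ])

    B = W @ R                                  # fuzzy composition, weighted-average operator
    grade_scores = np.array([95, 85, 70, 50])  # hypothetical scores attached to grades
    overall = float(B @ grade_scores)          # B already sums to 1 here
    print(B, overall)

The weighted-average M(*, +) operator is one common choice of composition; max-min operators are another, and which is appropriate depends on how much compensation between factors the analyst wants to allow.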
Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients
ERIC Educational Resources Information Center
Andersson, Björn; Xin, Tao
2018-01-01
In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…
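As context for why such standard errors matter, here is one common sample estimate of IRT score reliability together with a generic bootstrap interval. The bootstrap is a stand-in for illustration only, not the analytic large-sample intervals derived in the paper, and the estimator shown is one of several in use.

    import numpy as np

    def empirical_reliability(theta_hat, se):
        """Estimated true-score variance over observed variance of IRT
        ability estimates (numpy arrays of estimates and standard errors)."""
        obs_var = np.var(theta_hat, ddof=1)
        return max(0.0, obs_var - np.mean(se ** 2)) / obs_var

    def bootstrap_ci(theta_hat, se, n_boot=2000, level=0.95, seed=0):
        """Nonparametric bootstrap percentile interval."""
        rng = np.random.default_rng(seed)
        n, reps = len(theta_hat), []
        for _ in range(n_boot):
            idx = rng.integers(0, n, n)   # resample persons with replacement
            reps.append(empirical_reliability(theta_hat[idx], se[idx]))
        q = (1 - level) / 2
        return np.quantile(reps, [q, 1 - q])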
ERIC Educational Resources Information Center
Md Desa, Zairul Nor Deana
2012-01-01
In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…
Shortell, Stephen M
2016-12-01
This commentary highlights the key arguments and contributions of institutional theory, transaction cost economics (TCE) theory, high reliability theory, and organizational learning theory to understanding the development and evolution of Accountable Care Organizations (ACOs). Institutional theory and TCE theory primarily emphasize the external influences shaping ACOs, while high reliability theory and organizational learning theory underscore the internal factors influencing ACO performance. A framework based on Implementation Science is proposed to consider the multiple perspectives on ACOs and, in particular, their ability to innovate to achieve desired cost, quality, and population health goals. © The Author(s) 2016.
The biomedical disciplines and the structure of biomedical and clinical knowledge.
Nederbragt, H
2000-11-01
The relation between biomedical knowledge and clinical knowledge is discussed by comparing their respective structures. The knowledge of a disease as a biological phenomenon is constructed by the interaction of facts and theories from the main biomedical disciplines: epidemiology, diagnostics, clinical trial, therapy development and pathogenesis. Although these facts and theories are based on probabilities and extrapolations, the interaction provides a reliable and coherent structure, comparable to a Kuhnian paradigm. In the structure of clinical knowledge, i.e. knowledge of the patient with the disease, not only biomedical knowledge contributes to the structure but also economic and social relations, ethics and personal experience. However, the interaction between each of the participating "knowledges" in clinical knowledge is not based on mutual dependency and accumulation of different arguments from each, as in biomedical knowledge, but on competition and partial exclusion. Therefore, the structure of biomedical knowledge is different from that of clinical knowledge. This difference is used as the basis for a discussion in which the place of technology, evidence-based medicine and the gap between scientific and clinical knowledge are evaluated.
Experiment and density functional theory analyses of GdTaO4 single crystal
NASA Astrophysics Data System (ADS)
Ding, Shoujun; Kinross, Ashlie; Wang, Xiaofei; Yang, Huajun; Zhang, Qingli; Liu, Wenpeng; Sun, Dunlu
2018-05-01
GdTaO4, an excellent material that can be used as a scintillator, laser matrix, and self-activated phosphor, has generated significant interest, but its band structure, electronic structure, and optical properties still need elucidation. To address this problem, a high-quality GdTaO4 single crystal (M-type) was successfully grown using the Czochralski method, and its structure and optical properties were determined experimentally. Moreover, systematic theoretical calculations based on density functional theory methods were performed on M-type and M′-type GdTaO4, and their band structures, densities of states, and optical properties were obtained. Combined with the experimental results, the calculations were shown to be highly reliable. Hence, the calculated results obtained in this work provide a deeper understanding of GdTaO4, which is also useful for further investigation of this material.
NASA Technical Reports Server (NTRS)
Irwin, R. Dennis
1988-01-01
The applicability of H infinity control theory to the problems of large space structure (LSS) control was investigated. A complete evaluation of any technique as a candidate for large space structure control involves analytical evaluation, algorithmic evaluation, evaluation via simulation studies, and experimental evaluation. The results of analytical and algorithmic evaluations are documented. The analytical evaluation involves the determination of the appropriateness of the underlying assumptions inherent in the H infinity theory, the determination of the capability of the H infinity theory to achieve the design goals likely to be imposed on an LSS control design, and the identification of any LSS-specific simplifications or complications of the theory. The results of the analytical evaluation are presented in the form of a tutorial on the subject of H infinity control theory with the LSS control designer in mind. The algorithmic evaluation of H infinity for LSS control pertains to the identification of general, high-level algorithms for effecting the application of H infinity to LSS control problems, the identification of specific, numerically reliable algorithms necessary for a computer implementation of the general algorithms, the recommendation of a flexible software system for implementing the H infinity design steps, and ultimately the actual development of the necessary computer codes. Finally, the state of the art in H infinity applications is summarized with a brief outline of the most promising areas of current research.
Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza
2014-01-01
The importance of elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves especially in non-homogeneous waveguides such as functionally graded plates are so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, the simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. Therefore, in the current research, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, comparing the results with those based on a well recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of volume fractions of constituents. In all analyses, spatial Fourier transform together with modal analysis are applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, solution process in wave number-time domain based on modal analysis approach is not only sufficient but also efficient for finding the transient waveforms in FGM plates. Copyright © 2013 Elsevier B.V. All rights reserved.
Wagenlehner, Florian Martin Erich; Fröhlich, Oliver; Bschleipfer, Thomas; Weidner, Wolfgang; Perletti, Gianpaolo
2014-06-01
Anatomical damage to pelvic floor structures may cause multiple symptoms. The Integral Theory System Questionnaire (ITSQ) is a holistic questionnaire that uses symptoms to help locate damage in specific connective tissue structures as a guide to reconstructive surgery. It is based on the integral theory, which states that pelvic floor symptoms and prolapse are both caused by lax suspensory ligaments. The aim of the present study was to psychometrically validate the ITSQ. Established psychometric properties including validity, reliability, and responsiveness were considered for evaluation. Criterion validity was assessed in a cohort of 110 women with pelvic floor dysfunctions by analyzing the correlation of questionnaire responses with objective clinical data. Test-retest was performed with questionnaires from 47 patients. Cronbach's alpha and "split-half" reliability coefficients were calculated for inner consistency analysis. Psychometric properties of the ITSQ were comparable to those of previously validated pelvic floor questionnaires. Face validity and content validity were approved by an expert group of the International Collaboration of Pelvic Floor surgeons. Convergent validity assessed using a Bayesian method was at least as accurate as the expert assessment of anatomical defects. Objective data measurement in patients demonstrated significant correlations with ITSQ domains, fulfilling criterion validity. Internal consistency values ranged from 0.85 to 0.89 in different scenarios. The ITSQ proved accurate and is able to serve as a holistic pelvic floor questionnaire directing symptoms to site-specific pelvic floor reconstructive surgery.
Sample Size for Estimation of G and Phi Coefficients in Generalizability Theory
ERIC Educational Resources Information Center
Atilgan, Hakan
2013-01-01
Problem Statement: Reliability, which refers to the degree to which measurement results are free from measurement errors, as well as its estimation, is an important issue in psychometrics. Several methods for estimating reliability have been suggested by various theories in the field of psychometrics. One of these theories is the generalizability…
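The G and Phi coefficients named in the title have closed forms once the variance components are estimated; this is a minimal sketch for the single-facet crossed p x i design, with hypothetical component values in the decision-study loop.

    def g_coefficients(var_p, var_i, var_pi_e, n_i):
        """G (relative) and Phi (absolute) coefficients for a
        single-facet crossed p x i design."""
        g = var_p / (var_p + var_pi_e / n_i)
        phi = var_p / (var_p + (var_i + var_pi_e) / n_i)
        return g, phi

    # Decision study: how reliability grows with the number of items
    for n_i in (1, 2, 3, 5, 10):
        print(n_i, g_coefficients(0.50, 0.10, 0.40, n_i))

Phi is never larger than G, because absolute decisions also charge the item main-effect variance against the person variance.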
Rantz, Marilyn J; Aud, Myra A; Zwygart-Stauffacher, Mary; Mehr, David R; Petroski, Gregory F; Owen, Steven V; Madsen, Richard W; Flesner, Marcia; Conn, Vicki; Maas, Meridean
2008-01-01
Field test results are reported for the Observable Indicators of Nursing Home Care Quality Instrument-Assisted Living Version (OIQ-AL), an instrument designed to measure the quality of care in assisted-living facilities after a brief 30-minute walk-through. The OIQ-AL was tested in 207 assisted-living facilities in two states using classical test theory, generalizability theory, and exploratory factor analysis. The 34-item scale has a coherent six-factor structure that conceptually describes the multidimensional concept of care quality in assisted living. The six factors can be logically clustered into process (Homelike and Caring, 21 items) and structure (Access and Choice; Lighting; Plants and Pets; Outdoor Spaces) subscales and a total quality score. Classical test theory results indicate that most subscales and the total quality score from the OIQ-AL have acceptable interrater and test-retest reliabilities and strong internal consistency. Generalizability theory analyses reveal that the dependability of scores from the instrument is strong, particularly when a second observer conducts a site visit and independently completes an instrument, or when a single observer conducts two site visits and completes an instrument during each visit. Scoring guidelines based on the total sample of observations (N = 358) help those who want to use the measure to interpret both subscale and total scores. Content validity was supported by two expert panels of people experienced in the assisted-living field, and the content validity index calculated for the first version of the scale is high (3.43 on a four-point scale). The OIQ-AL gives reliable and valid scores for researchers, and may be useful for consumers, providers, and others interested in measuring quality of care in assisted-living facilities.
ERIC Educational Resources Information Center
Uzun, N. Bilge; Aktas, Mehtap; Asiret, Semih; Yormaz, Seha
2018-01-01
The goal of this study is to determine the reliability of the performance points of dentistry students regarding communication skills and to examine the scoring reliability by generalizability theory in balanced random and fixed facet (mixed design) data, considering also the interactions of student, rater and duty. The study group of the research…
Medicine is not science: guessing the future, predicting the past.
Miller, Clifford
2014-12-01
Irregularity limits human ability to know, understand and predict. A better understanding of irregularity may improve the reliability of knowledge. Irregularity and its consequences for knowledge are considered. Reliable predictive empirical knowledge of the physical world has always been obtained by observation of regularities, without needing science or theory. Prediction from observational knowledge can remain reliable despite some theories based on it proving false. A naïve theory of irregularity is outlined. Reducing irregularity and/or increasing regularity can increase the reliability of knowledge. Beyond long experience and specialization, improvements include implementing supporting knowledge systems of libraries of appropriately classified prior cases and clinical histories and education about expertise, intuition and professional judgement. A consequence of irregularity and complexity is that classical reductionist science cannot provide reliable predictions of the behaviour of complex systems found in nature, including of the human body. Expertise, expert judgement and their exercise appear overarching. Diagnosis involves predicting the past will recur in the current patient applying expertise and intuition from knowledge and experience of previous cases and probabilistic medical theory. Treatment decisions are an educated guess about the future (prognosis). Benefits of the improvements suggested here are likely in fields where paucity of feedback for practitioners limits development of reliable expert diagnostic intuition. Further analysis, definition and classification of irregularity is appropriate. Observing and recording irregularities are initial steps in developing irregularity theory to improve the reliability and extent of knowledge, albeit some forms of irregularity present inherent difficulties. © 2014 John Wiley & Sons, Ltd.
CARES - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES
NASA Technical Reports Server (NTRS)
Nemeth, N. N.
1994-01-01
The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES calculates the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings. The program uses results from a commercial structural analysis program (MSC/NASTRAN or ANSYS) to evaluate component reliability due to inherent surface and/or volume type flaws. A multiple material capability allows the finite element model reliability to be a function of many different ceramic material statistical characterizations. The reliability analysis uses element stress, temperature, area, and volume output, which are obtained from two dimensional shell and three dimensional solid isoparametric or axisymmetric finite elements. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effects of multi-axial stress states on material strength. The shear-sensitive Batdorf model requires a user-selected flaw geometry and a mixed-mode fracture criterion. Flaws intersecting the surface and imperfections embedded in the volume can be modeled. The total strain energy release rate theory is used as a mixed mode fracture criterion for co-planar crack extension. Out-of-plane crack extension criteria are approximated by a simple equation with a semi-empirical constant that can model the maximum tangential stress theory, the minimum strain energy density criterion, the maximum strain energy release rate theory, or experimental results. For comparison, Griffith's maximum tensile stress theory, the principle of independent action, and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. A more limited program, CARES/PC (COSMIC number LEW-15248) runs on a personal computer and estimates ceramic material properties from three-point bend bar data. CARES/PC does not perform fast fracture reliability estimation. CARES is written in FORTRAN 77 and has been implemented on DEC VAX series computers under VMS and on IBM 370 series computers under VM/CMS. On a VAX, CARES requires 10Mb of main memory. Five MSC/NASTRAN example problems and two ANSYS example problems are provided. There are two versions of CARES supplied on the distribution tape, CARES1 and CARES2. CARES2 contains sub-elements and CARES1 does not. CARES is available on a 9-track 1600 BPI VAX FILES-11 format magnetic tape (standard media) or in VAX BACKUP format on a TK50 tape cartridge. The program requires a FORTRAN 77 compiler and about 12Mb memory. CARES was developed in 1990. DEC, VAX and VMS are trademarks of Digital Equipment Corporation. IBM 370 is a trademark of International Business Machines. 
MSC/NASTRAN is a trademark of MacNeal-Schwendler Corporation. ANSYS is a trademark of Swanson Analysis Systems, Inc.
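The PIA/Weibull option described above reduces, for volume-distributed flaws, to a short computation over finite element output. This is a minimal sketch of that one model, assuming element volumes and principal stresses are already available; it is not the CARES code itself, which also offers the Batdorf and normal stress averaging models.

    import numpy as np

    def pia_failure_probability(elem_volume, principal_stress, m, sigma_0):
        """Fast-fracture failure probability for volume-distributed flaws
        under the principle of independent action (PIA) with a
        two-parameter Weibull strength distribution.

        elem_volume      : (n_elems,) element volumes
        principal_stress : (n_elems, 3) principal stresses per element
        m, sigma_0       : Weibull modulus and scale parameter
        """
        tensile = np.clip(principal_stress, 0.0, None)  # compressive stresses ignored
        risk = np.sum(elem_volume * np.sum((tensile / sigma_0) ** m, axis=1))
        return 1.0 - np.exp(-risk)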
Fatigue Reliability of Gas Turbine Engine Structures
NASA Technical Reports Server (NTRS)
Cruse, Thomas A.; Mahadevan, Sankaran; Tryon, Robert G.
1997-01-01
The results of an investigation of fatigue reliability in engine structures are described. The description consists of two parts: Part 1 covers method development, and Part 2 is a specific case study. In Part 1, the essential concepts and practical approaches to damage tolerance design in the gas turbine industry are summarized. These have evolved over the years in response to flight safety certification requirements. The effect of Non-Destructive Evaluation (NDE) methods on these approaches is also reviewed. Assessment methods based on probabilistic fracture mechanics, with regard to both crack initiation and crack growth, are outlined. Limit state modeling techniques from structural reliability theory are shown to be appropriate for application to this problem, for both individual failure modes and system-level assessment. In Part 2, the results of a case study for the high pressure turbine of a turboprop engine are described. The response surface approach is used to construct a fatigue performance function. This performance function is used with the First Order Reliability Method (FORM) to determine the probability of failure and the sensitivity of the fatigue life to the engine parameters for the first stage disk rim of the two stage turbine. A hybrid combination of regression and Monte Carlo simulation is used to incorporate time-dependent random variables. System reliability analysis is used to determine the system probability of failure and the sensitivity of the system fatigue life to the engine parameters of the high pressure turbine. The variation in the primary hot gas and secondary cooling air, the uncertainty of the complex mission loading, and the scatter in the material data are considered.
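A crude Monte Carlo check of a single limit state shows the quantities involved; the distributions and parameters below are hypothetical, and the paper's actual machinery (response surface plus FORM) replaces this brute-force sampling with an analytic approximation at the most probable failure point.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    n = 1_000_000

    # Hypothetical lognormal fatigue strength R and normal stress amplitude S;
    # failure whenever the limit state g = R - S falls to zero or below.
    R = rng.lognormal(mean=np.log(900.0), sigma=0.08, size=n)
    S = rng.normal(loc=650.0, scale=60.0, size=n)

    pf = np.mean(R - S <= 0.0)   # probability of failure
    beta = -norm.ppf(pf)         # generalized reliability index
    print(pf, beta)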
Assessing transfer property and reliability of urban bus network based on complex network theory
NASA Astrophysics Data System (ADS)
Zhang, Hui; Zhuge, Cheng-Xiang; Zhao, Xiang; Song, Wen-Bo
Transfer reliability has an important impact on the urban bus network. The proportion of trips requiring zero or one transfer is a key indicator of the connectivity of a bus network. However, it is hard to calculate the transfer time between nodes because of the complicated network structure. In this paper, the topological structures of the urban bus network in Jinan are constructed in space L and space P. A method to calculate transfer times between stations has been proposed using the reachable matrix under space P. The result shows that it is efficient for calculating the transfer time between nodes in large networks. In order to test transfer reliability, a node failure process has been built according to degree, clustering coefficient, and betweenness centrality under space L and space P. The results show that the deliberate attack by betweenness centrality under space P is more effective than the other five attack modes. This research could provide a powerful tool for finding hub stations in bus networks and help traffic managers guarantee the normal operation of urban bus systems.
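The reachable-matrix idea can be sketched directly: given a space-P adjacency matrix (an edge joins two stations served by at least one common route), the smallest matrix power at which a pair becomes connected gives the ride count, and the transfer count is one fewer. A minimal sketch under that assumption; details of the paper's own formulation may differ.

    import numpy as np

    def transfer_matrix(adj_p, max_rides=4):
        """T[i, j] = minimum transfers from station i to j
        (0 = direct ride), or -1 if unreachable within max_rides;
        the diagonal is left at -1 since no trip is needed."""
        n = adj_p.shape[0]
        A = (adj_p > 0).astype(int)
        T = np.full((n, n), -1, dtype=int)
        seen = np.eye(n, dtype=bool)
        reach = np.eye(n, dtype=int)
        for k in range(1, max_rides + 1):
            reach = (reach @ A > 0).astype(int)  # walks of length exactly k
            new = (reach > 0) & ~seen
            T[new] = k - 1                       # k rides = k - 1 transfers
            seen |= new
        return T

    # Two hypothetical routes, {0, 1, 2} and {2, 3}: station 0 to 3
    # requires one transfer at station 2.
    adj = np.zeros((4, 4), dtype=int)
    for route in ({0, 1, 2}, {2, 3}):
        for i in route:
            for j in route:
                if i != j:
                    adj[i, j] = 1
    print(transfer_matrix(adj))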
Yang, Sook Ja; Chee, Yeon Kyung; An, Jisook; Park, Min Hee; Jung, Sunok
2016-05-01
The purpose of this study was to obtain an independent evaluation of the factor structure of the 12-item Health Literacy Index for Female Marriage Immigrants (HLI-FMI), the first measure for assessing health literacy for FMIs in Korea. Participants were 250 Asian women who migrated from China, Vietnam, and the Philippines to marry. The HLI-FMI was originally developed and administered in Korean, and other questionnaires were translated into participants' native languages. The HLI-FMI consisted of 2 factors: (1) Access-Understand Health Literacy (7 items) and (2) Appraise-Apply Health Literacy (5 items); Cronbach's α = .73. Confirmatory factor analysis indicated adequate fit for the 2-factor model. HLI-FMI scores were positively associated with time since immigration and Korean proficiency. Based on classical test theory and item response theory, strong support was provided for item discrimination and item difficulty. Findings suggested that the HLI-FMI is an easily administered, reliable, and valid scale. © 2016 APJPH.
Developing and Validating the Socio-Technical Model in Ontology Engineering
NASA Astrophysics Data System (ADS)
Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin
2018-03-01
This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is from the point of view of socio-technical system theory. Qualitative research synthesis is used to build the model using meta-ethnography. In order to ensure the objectivity of the measurement, an inter-rater reliability method was applied using multi-rater Fleiss' kappa. The results show the accordance of the research output with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure, and task.
Clayson, Peter E; Miller, Gregory A
2017-01-01
Failing to consider psychometric issues related to reliability and validity, differential deficits, and statistical power potentially undermines the conclusions of a study. In research using event-related brain potentials (ERPs), numerous contextual factors (population sampled, task, data recording, analysis pipeline, etc.) can impact the reliability of ERP scores. The present review considers the contextual factors that influence ERP score reliability and the downstream effects that reliability has on statistical analyses. Given the context-dependent nature of ERPs, it is recommended that ERP score reliability be formally assessed on a study-by-study basis. Recommended guidelines for ERP studies include 1) reporting the threshold of acceptable reliability and reliability estimates for observed scores, 2) specifying the approach used to estimate reliability, and 3) justifying how trial-count minima were chosen. A reliability threshold for internal consistency of at least 0.70 is recommended, and a threshold of 0.80 is preferred. The review also advocates the use of generalizability theory for estimating score dependability (the generalizability theory analog to reliability) as an improvement on classical test theory reliability estimates, suggesting that the latter is less well suited to ERP research. To facilitate the calculation and reporting of dependability estimates, an open-source Matlab program, the ERP Reliability Analysis Toolbox, is presented. Copyright © 2016 Elsevier B.V. All rights reserved.
Nuclear surface diffuseness revealed in nucleon-nucleus diffraction
NASA Astrophysics Data System (ADS)
Hatakeyama, S.; Horiuchi, W.; Kohama, A.
2018-05-01
The nuclear surface provides useful information on nuclear radius, nuclear structure, as well as properties of nuclear matter. We discuss the relationship between the nuclear surface diffuseness and elastic scattering differential cross section at the first diffraction peak of high-energy nucleon-nucleus scattering as an efficient tool in order to extract the nuclear surface information from limited experimental data involving short-lived unstable nuclei. The high-energy reaction is described by a reliable microscopic reaction theory, the Glauber model. Extending the idea of the black sphere model, we find one-to-one correspondence between the nuclear bulk structure information and proton-nucleus elastic scattering diffraction peak. This implies that we can extract both the nuclear radius and diffuseness simultaneously, using the position of the first diffraction peak and its magnitude of the elastic scattering differential cross section. We confirm the reliability of this approach by using realistic density distributions obtained by a mean-field model.
Reliability issues in active control of large flexible space structures
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.
1986-01-01
Efforts in this reporting period were centered on four research tasks: design of failure detection filters for robust performance in the presence of modeling errors, design of generalized parity relations for robust performance in the presence of modeling errors, design of failure sensitive observers using the geometric system theory of Wonham, and computational techniques for evaluation of the performance of control systems with fault tolerance and redundancy management.
NASA Astrophysics Data System (ADS)
Nguyen, Dung Xuan; Gromov, Andrey; Son, Dam Thanh
2018-05-01
We perform a detailed comparison of the Dirac composite fermion and the recently proposed bimetric theory for quantum Hall Jain states near half filling. By tuning the composite Fermi liquid to the vicinity of a nematic phase transition, we find that the two theories are equivalent to each other. We verify that the single mode approximation for the response functions and the static structure factor becomes reliable near the phase transition. We show that the dispersion relation of the nematic mode near the phase transition can be obtained from the Dirac brackets between the components of the nematic order parameter. The dispersion is quadratic at low momenta and has a magnetoroton minimum at a finite momentum, which is not related to any nearby inhomogeneous phase.
Development of a Brief Questionnaire to Assess Contraceptive Intent
Raine-Bennett, Tina R; Rocca, Corinne H
2015-01-01
Objective: We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Methods: Item response theory based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15–24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. Results: All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants’ overall scores covered the full range of the scale (0–15), and items appropriately matched the range of participants’ contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. Conclusion: The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. Practice Implications: The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. PMID:26104994
Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z
2013-11-25
The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying transmissibility values used as references for calibrating the system models. As an example application, this study used the developed theory to perform a preliminary analysis of the method for calibrating models using both vibration transmissibility and DPRFs. The results of the analysis show that the combined method can theoretically result in a unique and valid solution of the model parameters, at least for linear systems. However, the validation of the method itself does not guarantee the validation of the calibrated model, because the validation of the calibration also depends on the model structure and the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
Development of a brief questionnaire to assess contraceptive intent.
Raine-Bennett, Tina R; Rocca, Corinne H
2015-11-01
We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Item response theory based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15-24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants' overall scores covered the full range of the scale (0-15), and items appropriately matched the range of participants' contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Iwanaga, Kanako; Umucu, Emre; Wu, Jia-Rung; Yaghmaian, Rana; Lee, Hui-Ling; Fitzgerald, Sandra; Chan, Fong
2017-07-04
Self-determination theory (SDT) and self-efficacy theory (SET) can be used to conceptualize self-determined motivation to engage in mental health and vocational rehabilitation (VR) services and to predict recovery. To incorporate SDT and SET as a framework for vocational recovery, developing and validating SDT/SET measures in vocational rehabilitation is warranted. Outcome expectancy is an important SDT/SET variable affecting rehabilitation engagement and recovery. The purpose of this study was to validate the Vocational Outcome Expectancy Scale (VOES) for use within the SDT/SET vocational recovery framework. One hundred and twenty-four individuals with serious mental illness (SMI) participated in this study. Measurement structure of the VOES was evaluated using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). Both EFA and CFA results supported a two-factor structure: (a) positive outcome expectancy, and (b) negative outcome expectancy. The internal consistency reliability coefficients for both factors were acceptable. In addition, positive outcome expectancy correlated stronger than negative outcome expectancy with other SDT/SET constructs in the expected directions. The VOES is a brief, reliable and valid instrument for assessing vocational outcome expectancy in individuals with SMI that can be integrated into SDT/SET as a vocational rehabilitation engagement and recovery model in psychiatric rehabilitation.
A road map for integrating eco-evolutionary processes into biodiversity models.
Thuiller, Wilfried; Münkemüller, Tamara; Lavergne, Sébastien; Mouillot, David; Mouquet, Nicolas; Schiffers, Katja; Gravel, Dominique
2013-05-01
The demand for projections of the future distribution of biodiversity has triggered an upsurge in modelling at the crossroads between ecology and evolution. Despite the enthusiasm around these so-called biodiversity models, most approaches are still criticised for not integrating key processes known to shape species ranges and community structure. Developing an integrative modelling framework for biodiversity distribution promises to improve the reliability of predictions and to give a better understanding of the eco-evolutionary dynamics of species and communities under changing environments. In this article, we briefly review some eco-evolutionary processes and interplays among them, which are essential to provide reliable projections of species distributions and community structure. We identify gaps in theory, quantitative knowledge and data availability hampering the development of an integrated modelling framework. We argue that model development relying on a strong theoretical foundation is essential to inspire new models, manage complexity and maintain tractability. We support our argument with an example of a novel integrated model for species distribution modelling, derived from metapopulation theory, which accounts for abiotic constraints, dispersal, biotic interactions and evolution under changing environmental conditions. We hope such a perspective will motivate exciting and novel research, and challenge others to improve on our proposed approach. © 2013 John Wiley & Sons Ltd/CNRS.
Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale
Brotherton, Robert; French, Christopher C.; Pickering, Alan D.
2013-01-01
The psychology of conspiracy theory beliefs is not yet well understood, although research indicates that there are stable individual differences in conspiracist ideation – individuals' general tendency to engage with conspiracy theories. Researchers have created several short self-report measures of conspiracist ideation. These measures largely consist of items referring to an assortment of prominent conspiracy theories regarding specific real-world events. However, these instruments have not been psychometrically validated, and this assessment approach suffers from practical and theoretical limitations. Therefore, we present the Generic Conspiracist Beliefs (GCB) scale: a novel measure of individual differences in generic conspiracist ideation. The scale was developed and validated across four studies. In Study 1, exploratory factor analysis of a novel 75-item measure of non-event-based conspiracist beliefs identified five conspiracist facets. The 15-item GCB scale was developed to sample from each of these themes. Studies 2, 3, and 4 examined the structure and validity of the GCB, demonstrating internal reliability, content, criterion-related, convergent and discriminant validity, and good test-retest reliability. In sum, this research indicates that the GCB is a psychometrically sound and practically useful measure of conspiracist ideation, and the findings add to our theoretical understanding of conspiracist ideation as a monological belief system underpinned by a relatively small number of generic assumptions about the typicality of conspiratorial activity in the world. PMID:23734136
Branscum, Paul; Lora, Karina R.
2016-01-01
Public health interventions are greatly needed for obesity prevention, and planning for such strategies should include community participation. The study's purpose was to develop and validate a theory-based instrument with low-income, Hispanic mothers of preschoolers, to assess theory-based determinants of maternal monitoring of the child's consumption of fruits and vegetables and sugar-sweetened beverages (SSB). Nine focus groups with mothers were conducted to determine the nutrition-related behaviors that mothers found most obesogenic for their children. Next, behaviors were operationally defined and rated for importance and changeability. Two behaviors were selected for investigation (fruits and vegetables, and SSB). Twenty semi-structured interviews with mothers were then conducted to develop culturally appropriate items for the instrument. Afterwards, face and content validity were established using a panel of six experts. Finally, the instrument was tested with a sample of 238 mothers. Psychometric properties evaluated included construct validity (using the maximum likelihood extraction method of factor analysis) and internal consistency reliability (Cronbach's alpha). Results suggested that all scales on the instrument were valid and reliable, except for the autonomy scales. Researchers and community planners working with Hispanic families can use this instrument to measure theory-based determinants of parenting behaviors related to preschoolers' consumption of fruits and vegetables, and SSB. PMID:27271643
ERIC Educational Resources Information Center
Rutledge, Michael L.; Sadler, Kim C.
2007-01-01
The Measure of Acceptance of the Theory of Evolution (MATE) instrument was initially designed to assess high school biology teachers' acceptance of evolutionary theory. To determine if the MATE instrument is reliable with university students, it was administered to students in a non-majors biology course (n = 61) twice over a 3-week period.…
A Three-Part Theory of Critical Thinking: Dialogue, Mental Models, and Reliability
2000-08-01
A THREE-PART THEORY OF CRITICAL THINKING: DIALOGUE, MENTAL MODELS, AND RELIABILITY. Marvin S. Cohen, Ph.D., Cognitive Technologies… …in logic or decision theory? Does it require stand-alone courses? How will we persuade students to devote their time to the study of critical…
ERIC Educational Resources Information Center
Culpepper, Steven Andrew
2013-01-01
A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…
Development and validation of an instrument to assess perceived social influence on health behaviors
HOLT, CHERYL L.; CLARK, EDDIE M.; ROTH, DAVID L.; CROWTHER, MARTHA; KOHLER, CONNIE; FOUAD, MONA; FOUSHEE, RUSTY; LEE, PATRICIA A.; SOUTHWARD, PENNY L.
2012-01-01
Assessment of social influence on health behavior is often approached through a situational context. The current study adapted an existing, theory-based instrument from another content domain to assess Perceived Social Influence on Health Behavior (PSI-HB) among African Americans, using an individual difference approach. The adapted instrument was found to have high internal reliability (α = .81–.84) and acceptable test-retest reliability (r = .68–.85). A measurement model revealed a three-factor structure and supported the theoretical underpinnings. Scores were predictive of health behaviors, particularly among women. Future research using the new instrument may have applied value assessing social influence in the context of health interventions. PMID:20522506
Zhang, Y; Melnikov, A; Mandelis, A; Halliop, B; Kherani, N P; Zhu, R
2015-03-01
A theoretical one-dimensional two-layer linear photocarrier radiometry (PCR) model including the presence of effective interface carrier traps was used to evaluate the transport parameters of p-type hydrogenated amorphous silicon (a-Si:H) and n-type crystalline silicon (c-Si) passivated by an intrinsic hydrogenated amorphous silicon (i-layer) nanolayer. Several crystalline Si heterojunction structures were examined to investigate the influence of the i-layer thickness and the doping concentration of the a-Si:H layer. The experimental data of a series of heterojunction structures with intrinsic thin layers were fitted to PCR theory to gain insight into the transport properties of these devices. The quantitative multi-parameter results were studied with regard to measurement reliability (uniqueness) and precision using two independent computational best-fit programs. The considerable influence on the transport properties of the entire structure of two key parameters that can limit the performance of amorphous thin film solar cells, namely, the doping concentration of the a-Si:H layer and the i-layer thickness was demonstrated. It was shown that PCR can be applied to the non-destructive characterization of a-Si:H/c-Si heterojunction solar cells yielding reliable measurements of the key parameters.
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2018-01-01
We applied a new approach to Generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those when using dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.
Research and application on imaging technology of line structure light based on confocal microscopy
NASA Astrophysics Data System (ADS)
Han, Wenfeng; Xiao, Zexin; Wang, Xiaofen
2009-11-01
In 2005, the theory of line-structured-light confocal microscopy was first put forward in China by Xingyu Gao and Zexin Xiao at the Institute of Opt-mechatronics of Guilin University of Electronic Technology. Although the lateral resolution of line confocal microscopy can only reach or approach that of traditional point confocal microscopy, it has two advantages over the traditional approach: first, by substituting line scanning for point scanning, plane imaging requires only one-dimensional scanning, which greatly improves imaging speed and simplifies the scanning mechanism; second, light throughput is greatly improved by substituting a detection slit (hairline) for the detection pinhole, so a low-illumination CCD can be used directly to collect images instead of a photoelectric intensifier. To apply line confocal microscopy in a practical system, and building on further research on its theory, an imaging technology for line-structured light is put forward under the conditions required for confocal microscopy. Its validity and reliability are also verified by experiments.
Validation of the Spanish version of Mackey childbirth satisfaction rating scale.
Caballero, Pablo; Delgado-García, Beatriz E; Orts-Cortes, Isabel; Moncho, Joaquin; Pereyra-Zamora, Pamela; Nolasco, Andreu
2016-04-16
The "Mackey Childbirth Satisfaction Rating Scale" (MCSRS) is a complete non-validated scale which includes the most important factors associated with maternal satisfaction. Our primary purpose was to describe the internal structure of the scale and validate the reliability and validity of concept of its Spanish version MCSRS-E. The MCSRS was translated into Spanish, back-translated and adapted to the Spanish population. It was then administered following a pilot test with women who met the study participant requirements. The scale structure was obtained by performing an exploratory factorial analysis using a sample of 304 women. The structures obtained were tested by conducting a confirmatory factorial analysis using a sample of 159 women. To test the validity of concept, the structure factors were correlated with expectations prior to childbirth experiences. McDonald's omegas were calculated for each model to establish the reliability of each factor. The study was carried out at four University Hospitals; Alicante, Elche, Torrevieja and Vinalopo Salud of Elche. The inclusion criteria were women aged 18-45 years old who had just delivered a singleton live baby at 38-42 weeks through vaginal delivery. Women who had difficulty speaking and understanding Spanish were excluded. The process generated 5 different possible internal structures in a nested model more consistent with the theory than other internal structures of the MCSRS applied hitherto. All of them had good levels of validation and reliability. This nested model to explain internal structure of MCSRS-E can accommodate different clinical practice scenarios better than the other structures applied to date, and it is a flexible tool which can be used to identify the aspects that should be changed to improve maternal satisfaction and hence maternal health.
System-wide versus component-specific trust using multiple aids.
Keller, David; Rice, Stephen
2010-01-01
Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The data revealed that a system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did towards unreliable aids.
Extended Importance Sampling for Reliability Analysis under Evidence Theory
NASA Astrophysics Data System (ADS)
Yuan, X. K.; Chen, B.; Zhang, B. Q.
2018-05-01
In early engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed to handle uncertainty with limited information as an alternative to traditional probability theory. In this contribution, a simulation-based approach, called 'extended importance sampling', is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an 'equivalent' reliability problem under probability theory is obtained. Then the samples of these variables are generated by importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of the failure probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
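As a toy illustration of the belief/plausibility bounds that such a scheme estimates, here they are computed by brute force over each joint focal element rather than by the paper's importance-sampling algorithm; the limit state and evidence structure are invented:

    import itertools
    import numpy as np

    # Performance function: g(x) < 0 denotes failure (illustrative choice).
    def g(x1, x2):
        return x1 + x2 - 3.2

    # Evidence (Dempster-Shafer) structure per variable: focal intervals with masses.
    focal_x1 = [((0.0, 2.0), 0.6), ((1.5, 3.0), 0.4)]
    focal_x2 = [((0.0, 1.0), 0.5), ((2.0, 4.0), 0.5)]

    bel, pl = 0.0, 0.0
    for (ival1, w1), (ival2, w2) in itertools.product(focal_x1, focal_x2):
        mass = w1 * w2                         # joint mass of the focal cell
        # Brute-force min/max of g over the cell (a real code would optimize).
        G = g(*np.meshgrid(np.linspace(*ival1, 50), np.linspace(*ival2, 50)))
        if G.max() < 0:    # entire cell fails -> contributes to belief (lower bound)
            bel += mass
        if G.min() < 0:    # cell can fail     -> contributes to plausibility (upper bound)
            pl += mass

    print(f"failure probability bounds: [{bel:.3f}, {pl:.3f}]")  # -> [0.300, 0.800]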
Parsons, Jeffrey T; Rendina, H Jonathon; Ventuneac, Ana; Cook, Karon F; Grov, Christian; Mustanski, Brian
2013-12-01
The Hypersexual Disorder Screening Inventory (HDSI) was designed as an instrument for the screening of hypersexuality by the American Psychiatric Association's taskforce for the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders. Our study sought to conduct a psychometric analysis of the HDSI, including an investigation of its underlying structure and reliability utilizing item response theory (IRT) modeling, and an examination of its polythetic scoring criteria in comparison to a standard dimensionally based cutoff score. We examined a diverse group of 202 highly sexually active gay and bisexual men in New York City and conducted psychometric analyses of the HDSI, including both confirmatory factor analysis of its structure and IRT analysis of the item and scale reliabilities. The HDSI adequately fit a single-factor solution, although there was evidence that two of the items may measure a second factor that taps into sex as a form of coping. The scale showed evidence of strong reliability across much of the continuum of hypersexuality, and results suggested that, in addition to the proposed polythetic scoring criteria, a cutoff score of 20 on the severity index might be used for preliminary classification of HD. The HDSI was found to be highly reliable, and results suggested that a unidimensional, quantitative conception of hypersexuality with a clinically relevant cutoff score may be more appropriate than a qualitative syndrome comprised of multiple distinct clusters of problems. However, we also found preliminary evidence that three clusters of symptoms may constitute an HD syndrome as opposed to the two clusters initially proposed. Future research is needed to determine which of these issues are characteristic of the hypersexuality and HD constructs themselves and which are more likely to be methodological artifacts of the HDSI. © 2013 International Society for Sexual Medicine.
Huang, Fei-Fei; Yang, Qing; Han, Xuan Ye; Zhang, Jing-Ping; Lin, Ting
2017-08-01
The purpose of this study was to develop a Self-Efficacy Scale for Rehabilitation Management designed specifically for postoperative lung cancer patients (SESPRM-LC) and to evaluate its psychometric properties. Based on the concept of self-management of chronic disease, items were developed from a literature review and semistructured interviews of 10 lung cancer patients, and screened by expert consultation and pilot testing. Psychometric evaluation was done with 448 postoperative lung cancer patients recruited from 5 tertiary hospitals in Fuzhou, China, by incorporating classical test theory and item response theory methods. A 6-factor structure was identified by exploratory factor analysis and confirmed by confirmatory factor analysis, explaining 60.753% of the total variance. The SESPRM-LC achieved Cronbach's α of 0.694 to 0.893, 2-week test-retest reliability of 0.652 to 0.893, and marginal reliability of 0.565 to 0.934. The predictive and criterion validities were demonstrated by significant associations with theoretically supported quality-of-life variables (r = 0.211-0.392, P < .01) and the General Perceived Self-efficacy Scale (r = 0.465, P < .01), respectively. Item response theory analysis showed that the SESPRM-LC offers information about a broad range of self-efficacy measures and discriminates well between patients with high and low levels of self-efficacy. We demonstrated initial support for the reliability and validity of the 27-item SESPRM-LC as a developmentally appropriate instrument for assessing self-efficacy among lung cancer patients during postoperative rehabilitation. Copyright © 2016 John Wiley & Sons, Ltd.
Trait-specific dependence in romantic relationships.
Ellis, Bruce J; Simpson, Jeffry A; Campbell, Lorne
2002-10-01
Informed by three theoretical frameworks--trait psychology, evolutionary psychology, and interdependence theory--we report four investigations designed to develop and test the reliability and validity of a new construct and accompanying multiscale inventory, the Trait-Specific Dependence Inventory (TSDI). The TSDI assesses comparisons between present and alternative romantic partners on major dimensions of mate value. In Study 1, principal components analyses revealed that the provisional pool of theory-generated TSDI items were represented by six factors: Agreeable/Committed, Resource Accruing Potential, Physical Prowess, Emotional Stability, Surgency, and Physical Attractiveness. In Study 2, confirmatory factor analysis replicated these results on a different sample and tested how well different structural models fit the data. Study 3 provided evidence for the convergent and discriminant validity of the six TSDI scales by correlating each one with a matched personality trait scale that did not explicitly incorporate comparisons between partners. Study 4 provided further validation evidence, revealing that the six TSDI scales successfully predicted three relationship outcome measures--love, time investment, and anger/upset--above and beyond matched sets of traditional personality trait measures. These results suggest that the TSDI is a reliable, valid, and unique construct that represents a new trait-specific method of assessing dependence in romantic relationships. The construct of trait-specific dependence is introduced and linked with other theories of mate value.
Static assignment of complex stochastic tasks using stochastic majorization
NASA Technical Reports Server (NTRS)
Nicol, David; Simha, Rahul; Towsley, Don
1992-01-01
We consider the problem of statically assigning many tasks to a (smaller) system of homogeneous processors, where a task's structure is modeled as a branching process, and all tasks are assumed to have identical behavior. We show how the theory of majorization can be used to obtain a partial order among possible task assignments. Our results show that if the vector of numbers of tasks assigned to each processor under one mapping is majorized by that of another mapping, then the former mapping is better than the latter with respect to a large number of objective functions. In particular, we show how measurements of finishing time, resource utilization, and reliability are all captured by the theory. We also show how the theory may be applied to the problem of partitioning a pool of processors for distribution among parallelizable tasks.
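The partial order described in the preceding abstract is easy to check in code. A sketch of the majorization test (example vectors are illustrative): a task-count vector x that is majorized by y corresponds to a more balanced assignment, which by the paper's results is preferable across the named objective functions.

    def is_majorized_by(x, y, tol=1e-12):
        """True if x is majorized by y: equal totals, and every prefix sum of
        sorted-descending x is <= the corresponding prefix sum of y."""
        xs = sorted(x, reverse=True)
        ys = sorted(y, reverse=True)
        if len(xs) != len(ys) or abs(sum(xs) - sum(ys)) > tol:
            return False
        px = py = 0.0
        for a, b in zip(xs, ys):
            px += a
            py += b
            if px > py + tol:
                return False
        return True

    # 12 identical tasks on 4 processors: (3,3,3,3) is majorized by (6,3,2,1),
    # so the balanced mapping is the better one under this partial order.
    print(is_majorized_by([3, 3, 3, 3], [6, 3, 2, 1]))   # True
    print(is_majorized_by([6, 3, 2, 1], [3, 3, 3, 3]))   # False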
Nondestructive evaluation of composite materials - A design philosophy
NASA Technical Reports Server (NTRS)
Duke, J. C., Jr.; Henneke, E. G., II; Stinchcomb, W. W.; Reifsnider, K. L.
1984-01-01
Efficient and reliable structural design utilizing fiber-reinforced composite materials can only be accomplished if the materials used can be nondestructively evaluated. There are two major reasons for this requirement: (1) composite materials are formed at the time the structure is fabricated, and (2) at practical strain levels, damage (changes in the condition of the material) that influences the structure's mechanical performance is present. The fundamental basis of such a nondestructive evaluation capability is presented. A discussion is provided of means of nondestructively assessing the material condition, as well as a damage mechanics theory that interprets the material condition in terms of its influence on the mechanical response: stiffness, strength, and life.
Infrared spectroscopy of copper-resveratrol complexes: A joint experimental and theoretical study
NASA Astrophysics Data System (ADS)
Chiavarino, B.; Crestoni, M. E.; Fornarini, S.; Taioli, S.; Mancini, I.; Tosi, P.
2012-07-01
Infrared multiple-photon dissociation spectroscopy has been used to record vibrational spectra of charged copper-resveratrol complexes in the 3500-3700 cm-1 and 1100-1900 cm-1 regions. Minimum energy structures have been determined by density functional theory calculations using plane waves and pseudopotentials. In particular, the copper(I)-resveratrol complex presents a tetra-coordinated metal bound with two carbon atoms of the alkenyl moiety and two closest carbons of the adjoining resorcinol ring. For these geometries vibrational spectra have been calculated by using linear response theory. The good agreement between experimental and calculated IR spectra for the selected species confirms the overall reliability of the proposed geometries.
Lundman, Berit; Årestedt, Kristofer; Norberg, Astrid; Norberg, Catharina; Fischer, Regina Santamäki; Lövheim, Hugo
2015-01-01
This study tested the psychometric properties of a Swedish version of the Self-Transcendence Scale (STS). Cohen's weighted kappa, agreement, absolute reliability, relative reliability, and internal consistency were calculated, and the underlying structure of the STS was established by exploratory factor analysis. There were 2 samples available: 1 including 194 people aged 85-103 years and a convenience sample of 60 people aged 21-69 years. Weighted kappa values ranged from .40 to .89. The intraclass correlation coefficient for the original STS was .763, and the least significant change between repeated tests was 6.25 points. The revised STS was found to have satisfactory psychometric properties, and 2 of the 4 underlying dimensions in Reed's self-transcendence theory were supported.
Thibodeau, Michel A; Leonard, Rachel C; Abramowitz, Jonathan S; Riemann, Bradley C
2015-12-01
The Dimensional Obsessive-Compulsive Scale (DOCS) is a promising measure of obsessive-compulsive disorder (OCD) symptoms but has received minimal psychometric attention. We evaluated the utility and reliability of DOCS scores. The study included 832 students and 300 patients with OCD. Confirmatory factor analysis supported the originally proposed four-factor structure. DOCS total and subscale scores exhibited good to excellent internal consistency in both samples (α = .82 to α = .96). Patient DOCS total scores reduced substantially during treatment (t = 16.01, d = 1.02). DOCS total scores discriminated between students and patients (sensitivity = 0.76, 1 - specificity = 0.23). The measure did not exhibit gender-based differential item functioning as tested by Mantel-Haenszel chi-square tests. Expected response options for each item were plotted as a function of item response theory and demonstrated that DOCS scores incrementally discriminate OCD symptoms ranging from low to extremely high severity. Incremental differences in DOCS scores appear to represent unbiased and reliable differences in true OCD symptom severity. © The Author(s) 2014.
Reliability Studies of Ceramic Capacitors.
1983-07-01
…increases. This case has been found to be a good approximation for single crystals with high chemical and structural purity. Shallow traps may arise as a… theory, this sudden increase may be otherwise explained. Single crystals of ZnS have been found to exhibit this vertical increase in the current… Smith and Rose observed SCLC behavior in CdS single crystals. Branwood and Tredgold [28] and Branwood et al. [29] measured BaTiO3 single crystals and…
NASA Astrophysics Data System (ADS)
Peng, L.; Pan, H.; Ma, H.; Zhao, P.; Qin, R.; Deng, C.
2017-12-01
The irreducible water saturation (Swir) is a vital parameter for permeability prediction and original oil and gas estimation. However, the complex pore structure of the rocks makes the parameter difficult to calculate from either laboratory measurements or conventional well logging methods. In this study, an effective statistical method to predict Swir is derived directly from nuclear magnetic resonance (NMR) data based on fractal theory. The spectrum of transversal relaxation time (T2) is normally considered an indicator of pore size distribution, and the fractal dimensions of the micro- and meso-pores are calculated over two specific ranges of the T2 spectrum. Based on the analysis of the fractal characteristics of 22 core samples, drilled from four boreholes of tight lithologic oil reservoirs of the Ordos Basin in China, a positive correlation between Swir and porosity is derived. Afterwards, a predictive model for Swir based on linear regressions of the fractal dimensions is proposed. It reveals that Swir is controlled by the pore size and the roughness of the pores. The reliability of this model is tested, and an ideal consistency between predicted results and experimental data is found. This model is a reliable supplement for predicting the irreducible water saturation in cases where the T2 cutoff value cannot be accurately determined.
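The fractal step described above typically rests on the relation log10(Sv) = (3 − D)·log10(T2) + c between the cumulative saturation Sv below a relaxation time T2 and the fractal dimension D, so D follows from a log-log regression. A schematic with fabricated data, not the authors' coefficients:

    import numpy as np

    def fractal_dimension(t2, cum_sat):
        """Fit log10(S_v) = (3 - D)*log10(T2) + c and return D."""
        slope, _ = np.polyfit(np.log10(t2), np.log10(cum_sat), 1)
        return 3.0 - slope

    # Synthetic T2 spectrum segment (illustrative numbers only).
    t2 = np.logspace(-1, 1, 30)                      # 0.1 - 10 ms
    cum_sat = (t2 / t2.max()) ** (3 - 2.6)           # fabricated with D = 2.6
    print(round(fractal_dimension(t2, cum_sat), 2))  # -> 2.6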
Devine, Rory T; Hughes, Claire
2016-09-01
Recent years have seen a growth of research on the development of children's ability to reason about others' mental states (or "theory of mind") beyond the narrow confines of the preschool period. The overall aim of this study was to investigate the psychometric properties of a task battery composed of items from Happé's Strange Stories task and Devine and Hughes' Silent Film task. A sample of 460 ethnically and socially diverse children (211 boys) between 7 and 13 years of age completed the task battery at two time points separated by 1 month. The Strange Stories and Silent Film tasks were strongly correlated even when verbal ability and narrative comprehension were taken into account, and all items loaded onto a single theory-of-mind latent factor. The theory-of-mind latent factor provided reliable estimates of performance across a wide range of theory-of-mind ability and showed no evidence of differential item functioning across gender, ethnicity, or socioeconomic status. The theory-of-mind latent factor also exhibited strong 1-month test-retest reliability, and this stability did not vary as a function of child characteristics. Taken together, these findings provide evidence for the validity and reliability of the Strange Stories and Silent Film task battery as a measure of individual differences in theory of mind suitable for use across middle childhood. We consider the methodological and conceptual implications of these findings for research on theory of mind beyond the preschool years. Copyright © 2015 Elsevier Inc. All rights reserved.
Unreliability as a Threat to Understanding Psychopathology: The Cautionary Tale of Attentional Bias
Rodebaugh, Thomas L.; Scullin, Rachel B.; Langer, Julia K.; Dixon, David J.; Huppert, Jonathan D.; Bernstein, Amit; Zvielli, Ariel; Lenze, Eric J.
2016-01-01
The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically-oriented measures can only be certain if such measurements are reliable. Two pillars of NIMH’s portfolio – the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials – cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally-used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. PMID:27322741
Cane, James; Richardson, Michelle; Johnston, Marie; Ladha, Ruhina; Michie, Susan
2015-02-01
Behaviour change technique (BCT) Taxonomy v1 is a hierarchically grouped, consensus-based taxonomy of 93 BCTs for reporting intervention content. To enhance the use and understanding of BCTs, the aims of the present study were to (1) quantitatively examine the 'bottom-up' hierarchical structure of Taxonomy v1, (2) identify whether BCTs can be reliably mapped to theoretical domains using a 'top-down' theoretically driven approach, and (3) identify any overlap between the 'bottom-up' and 'top-down' groupings. The 'bottom-up' structure was examined for higher-order groupings using a dendrogram derived from hierarchical cluster analysis. For the theory-based 'top-down' structure, 18 experts sorted BCTs into 14 theoretical domains. Discriminant Content Validity was used to identify groupings, and chi-square tests and Pearson's residuals were used to examine the overlap between groupings. Behaviour change techniques relating to 'Reward and Punishment' and 'Cues and Cue Responses' were perceived as markedly different to other BCTs. Fifty-nine of the BCTs were reliably allocated to 12 of the 14 theoretical domains; 47 were significant and 12 were of borderline significance. Thirty-four of 208 'bottom-up' × 'top-down' pairings showed greater overlap than expected by chance. However, only six combinations achieved satisfactory evidence of similarity. The moderate overlap between the groupings indicates some tendency to implicitly conceptualize BCTs in terms of the same theoretical domains. Understanding the nature of the overlap will aid the conceptualization of BCTs in terms of theory and application. Further research into different methods of developing a hierarchical taxonomic structure of BCTs for international, interdisciplinary work is now required. Statement of contribution What is already known on this subject? Behaviour change interventions are effective in improving health care and health outcomes. The 'active' components of these interventions are behaviour change techniques and over 93 have been identified. Taxonomies of behaviour change techniques require structure to enable potential applications. What does this study add? This study identifies groups of BCTs to aid the recall of BCTs for intervention coding and design. It compares two methods of grouping--'bottom-up' and theory-based 'top-down'--and finds a moderate overlap. Building on identified BCT groups, it examines relationships between theoretical domains and BCTs. © 2014 The British Psychological Society.
Daher, Aqil Mohammad; Ahmad, Syed Hassan; Winn, Than; Selamat, Mohd Ikhsan
2015-01-01
Few studies have employed item response theory in examining reliability. We conducted this study to examine the effect of Rating Scale Categories (RSCs) on the reliability and fit statistics of the Malay Spiritual Well-Being Scale (SWBS), employing the Rasch model. The Malay SWBS, with the original six RSCs and with newly structured three- and four-category versions, was distributed randomly among three different samples of 50 participants each. The mean age of respondents in the three samples ranged between 36 and 39 years. The majority was female in all samples, and Islam was the most prevalent religion among the respondents. The predominant race was Malay, followed by Chinese and Indian. The original six RSCs showed the best targeting (0.99) and the smallest model error (0.24). The Infit Mnsq (mean square) and Zstd (Z standard) of the six RSCs were 1.1 and -0.1, respectively. The six RSCs achieved the highest person and item reliabilities of 0.86 and 0.85, respectively. These reliabilities yielded the highest person (2.46) and item (2.38) separation indices compared with the other RSCs. The person and item reliability and, to a lesser extent, the fit statistics were better with the six RSCs than with the four and three RSCs.
Packham, Tara; MacDermid, Joy C
2013-01-01
The Patient-Rated Wrist and Hand Evaluation (PRWHE) is a self-reported assessment of pain and disability to evaluate outcome after hand injuries. Rasch analysis is an alternative strategy for examining the psychometric properties of a measurement scale, based in item response theory rather than classical test theory. This study used Rasch analysis to examine the content, scoring, and measurement properties of the PRWHE. PRWHE scores (n = 264) from persons with a traumatic injury or reconstructive surgery to one hand were collected from an outpatient hand rehabilitation facility. Rasch analysis was conducted to assess how well the PRWHE fits the Rasch model, to confirm the scaling structure of the pain and disability subscales, and to identify any areas of bias from differential item functioning. Rasch analysis of the PRWHE supports the internal consistency of the scale (α = 0.96) and reliability (as measured by the person separation index) of 0.95. While gender, age, diagnosis, and duration since injury all systematically influenced how people scored the PRWHE, hand dominance and affected side did not. Rasch analysis supported a 3-subscale structure (pain, specific activities, and usual activities) rather than the current division into pain and disability. Initial examination of the PRWHE indicates that the psychometric properties of consistency, reliability, and responsiveness previously tested by classical methods are further supported by Rasch analysis. It also suggests the scale structure may be best considered as 3 subscales rather than simply pain and disability. Copyright © 2013 Hanley & Belfus. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui
2018-04-01
Considering that multi-source uncertainties arising from both the inherent nature of a system and the external environment are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed with specific implementation of the random items. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are finally presented to demonstrate the validity and applicability of the developed methodology.
Jochems, Eline C; Mulder, Cornelis L; Duivenvoorden, Hugo J; van der Feltz-Cornelis, Christina M; van Dam, Arno
2014-08-01
Self-determination theory is potentially useful for understanding reasons why individuals with mental illness do or do not engage in psychiatric treatment. The current study examined the psychometric properties of three questionnaires based on self-determination theory-The Treatment Entry Questionnaire (TEQ), Health Care Climate Questionnaire (HCCQ), and the Short Motivation Feedback List (SMFL)-in a sample of 348 Dutch adult outpatients with primary diagnoses of mood, anxiety, psychotic, and personality disorders. Structural equation modeling showed that the empirical factor structures of the TEQ and SMFL were adequately represented by a model with three intercorrelated factors. These were interpreted as identified, introjected, and external motivation. The reliabilities of the Dutch TEQ, HCCQ, and SMFL were found to be acceptable but can be improved on; congeneric estimates ranged from 0.66 to 0.94 depending on the measure and patient subsample. Preliminary support for the construct validities of the questionnaires was found in the form of theoretically expected associations with other scales, including therapist-rated motivation and treatment engagement and with legally mandated treatment. Additionally, the study provides insights into the relations between measures of motivation based on self-determination theory, the transtheoretical model and the integral model of treatment motivation in psychiatric outpatients with severe mental illness. © The Author(s) 2013.
Item Response Theory analysis of Fagerström Test for Cigarette Dependence.
Svicher, Andrea; Cosci, Fiammetta; Giannini, Marco; Pistelli, Francesco; Fagerström, Karl
2018-02-01
The Fagerström Test for Cigarette Dependence (FTCD) and the Heaviness of Smoking Index (HSI) are the gold standard measures to assess cigarette dependence. However, the FTCD's reliability and factor structure have been questioned, and the HSI's psychometric properties are in need of further investigation. The present study examined the psychometric properties of the FTCD and the HSI via Item Response Theory. The study was a secondary analysis of data collected in 862 Italian daily smokers. Confirmatory factor analysis was run to evaluate the dimensionality of the FTCD. A Graded Response Model was applied to the FTCD and HSI to verify the fit to the data. Both item and test functioning were analyzed, and item statistics, Test Information Function, and scale reliabilities were calculated. Mokken Scale Analysis was applied to estimate homogeneity, and Loevinger's coefficients were calculated. The FTCD showed unidimensionality and homogeneity for most of the items and for the total score. It also showed high sensitivity and good reliability from medium to high levels of cigarette dependence, although problems related to some items (i.e., items 3 and 5) were evident. The HSI had good homogeneity, adequate item functioning, and high reliability from medium to high levels of cigarette dependence. Significant Differential Item Functioning was found for items 1, 4, and 5 of the FTCD and for both items of the HSI. The HSI seems highly recommended in clinical settings that serve heavy smokers, while the FTCD would be better used in smokers with a level of cigarette dependence ranging between low and high. Copyright © 2017 Elsevier Ltd. All rights reserved.
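The Graded Response Model applied in the preceding abstract specifies, for an item with discrimination a and ordered thresholds b_k, cumulative probabilities P(X ≥ k) = 1/(1 + exp(−a(θ − b_k))); category probabilities are differences of adjacent cumulative terms. A minimal sketch with invented parameters, not the study's estimates:

    import math

    def grm_category_probs(theta, a, thresholds):
        """Category probabilities under Samejima's graded response model."""
        cum = ([1.0]
               + [1 / (1 + math.exp(-a * (theta - b))) for b in thresholds]
               + [0.0])
        return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

    # A 4-category item with invented parameters, for a smoker at theta = 0.5.
    probs = grm_category_probs(0.5, a=1.8, thresholds=[-1.0, 0.2, 1.4])
    print([round(p, 3) for p in probs])   # sums to 1.0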
De Vet, Emely; De Ridder, Denise; Stok, Marijn; Brunso, Karen; Baban, Adriana; Gaspar, Tania
2014-09-02
Applying self-regulation strategies has proven important in eating behaviors, but which strategies adolescents report using to ensure healthy eating remains under investigation, and adequate measures are lacking. Therefore, we developed and validated a self-regulation questionnaire applied to eating (TESQ-E) for adolescents. Study 1 reports a four-step approach to developing the TESQ-E questionnaire (n = 1097). Study 2 was a cross-sectional survey among adolescents from nine European countries (n = 11,392) that assessed the TESQ-E, eating-related behaviors, dietary intake, and background characteristics. In Study 3, the TESQ-E was administered twice within four weeks to evaluate test-retest reliability (n = 140). Study 4 was a cross-sectional survey (n = 93) that assessed the TESQ-E and related psychological constructs (e.g., motivation, autonomy, self-control). All participants were aged between 10 and 17 years. Study 1 resulted in a 24-item questionnaire assessing adolescent-reported use of six specific strategies for healthy eating that represent three general self-regulation approaches. Study 2 showed that the easy-to-administer, theory-based TESQ-E has a clear factor structure and good subscale reliabilities. The questionnaire was related to eating-related behaviors and dietary intake, indicating predictive validity. Study 3 showed good test-retest reliabilities for the TESQ-E. Study 4 indicated that the TESQ-E was related to, but also distinguishable from, general self-regulation and motivation measures. The TESQ-E provides a reliable and valid measure to assess six theory-based self-regulation strategies that adolescents may use to ensure their healthy eating.
NASA Astrophysics Data System (ADS)
Zhang, Ruizhi; Du, Baoli; Chen, Kan; Reece, Mike; Materials Research Institute Team
With increasing computational power and reliable databases, high-throughput screening is playing a more and more important role in the search for new thermoelectric materials. Rather than the well-established density functional theory (DFT) calculation-based methods, we propose an alternative approach to screening for new TE materials: using crystal structural features as 'descriptors'. We show that a non-distorted transition metal sulphide polyhedral network can be a good descriptor for high power factor according to crystal field theory. Using Cu/S-containing compounds as an example, 1600+ Cu/S-containing entries in the Inorganic Crystal Structure Database (ICSD) were screened, and of those, 84 phases were identified as promising thermoelectric materials. The screening results are validated by both electronic structure calculations and experimental results from the literature. We also fabricated some new compounds to test our screening results. Another advantage of using crystal structure features as descriptors is that we can easily establish structural relationships between the identified phases. Based on this, two material design approaches are discussed: 1) high-pressure synthesis of metastable phases; 2) in-situ two-phase composites with coherent interfaces. This work was supported by a Marie Curie International Incoming Fellowship of the European Community Human Potential Program.
Integrated control-system design via generalized LQG (GLQG) theory
NASA Technical Reports Server (NTRS)
Bernstein, Dennis S.; Hyland, David C.; Richter, Stephen; Haddad, Wassim M.
1989-01-01
Thirty years of control systems research has produced an enormous body of theoretical results in feedback synthesis. Yet such results see relatively little practical application, and there remains an unsettling gap between classical single-loop techniques (Nyquist, Bode, root locus, pole placement) and modern multivariable approaches (LQG and H-infinity theory). Large-scale, complex systems, such as high-performance aircraft and flexible space structures, now demand efficient, reliable design of multivariable feedback controllers which optimally trade off performance against modeling accuracy, bandwidth, sensor noise, actuator power, and control-law complexity. A methodology is described which encompasses numerous practical design constraints within a single unified formulation. The approach, which is based upon coupled systems of modified Riccati and Lyapunov equations, encompasses time-domain linear-quadratic-Gaussian theory and frequency-domain H-infinity theory, as well as classical objectives such as gain and phase margin via the Nyquist circle criterion. In addition, this approach encompasses the optimal projection approach to reduced-order controller design. The current status of the overall theory is reviewed, including both continuous-time and discrete-time (sampled-data) formulations.
Jakubikova, Elena; Bernstein, Elliot R
2007-12-27
Thermodynamics of reactions of vanadium oxide clusters with SO2 are studied at the BPW91/LANL2DZ level of theory. BPW91/LANL2DZ is insufficient to properly describe relative V-O and S-O bond strengths of vanadium and sulfur oxides. Calibration of theoretical results with experimental data is necessary to compute reliable enthalpy changes for reactions between VxOy and SO2. Theoretical results indicate SO2 to SO conversion occurs for oxygen-deficient clusters and SO2 to SO3 conversion occurs for oxygen-rich clusters. Stable intermediate structures of VOy (y = 1 - 4) clusters with SO2 are also obtained at the BPW91/TZVP level of theory. Some possible mechanisms for SO3 formation and catalyst regeneration for condensed-phase systems are suggested. These results are in agreement with, and complement, gas-phase experimental studies of neutral vanadium oxide clusters.
NASA Astrophysics Data System (ADS)
Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.
2018-02-01
Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
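A stripped-down version of the bound calculation from the preceding abstract (plain Monte Carlo instead of subset simulation, and a single interval variable instead of a full random set; all values are illustrative): each sample of the aleatory variable carries an interval for the epistemic variable, and evaluating the limit state at the interval extremes yields lower and upper failure indicators.

    import numpy as np

    rng = np.random.default_rng(42)

    def g(x, d):                       # limit state: failure when g < 0
        return 5.0 - x - d

    n = 100_000
    x = rng.normal(2.0, 0.5, size=n)   # aleatory variable with a known PDF
    d_lo, d_hi = 1.0, 2.5              # epistemic variable known only as an interval

    # g is decreasing in d, so its extremes over [d_lo, d_hi] sit at the endpoints.
    p_lower = np.mean(g(x, d_lo) < 0)  # fails even for the most favorable d
    p_upper = np.mean(g(x, d_hi) < 0)  # can fail for the least favorable d
    print(f"P_f in [{p_lower:.4f}, {p_upper:.4f}]")

For nonmonotonic limit states the endpoint trick no longer applies, which is where the optimization and subset-simulation machinery of the paper comes in.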
NASA Technical Reports Server (NTRS)
Tessler, Alexander; Gherlone, Marco; Versino, Daniele; DiSciuva, Marco
2012-01-01
This paper reviews the theoretical foundation and computational mechanics aspects of the recently developed shear-deformation theory, called the Refined Zigzag Theory (RZT). The theory is based on a multi-scale formalism in which an equivalent single-layer plate theory is refined with a robust set of zigzag local layer displacements that are free of the usual deficiencies found in common plate theories with zigzag kinematics. In the RZT, first-order shear-deformation plate theory is used as the equivalent single-layer plate theory, which represents the overall response characteristics. Local piecewise-linear zigzag displacements are used to provide corrections to these overall response characteristics that are associated with the plate heterogeneity and the relative stiffnesses of the layers. The theory does not rely on shear correction factors and is equally accurate for homogeneous, laminated composite, and sandwich beams and plates. Regardless of the number of material layers, the theory maintains only seven kinematic unknowns that describe the membrane, bending, and transverse shear plate-deformation modes. Derived from the virtual work principle, RZT is well-suited for developing computationally efficient, C0-continuous finite elements; formulations of several RZT-based elements are highlighted. The theory and its finite element approximations thus provide a unified and reliable computational platform for the analysis and design of high-performance load-bearing aerospace structures.
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools to evaluate the reliability of systems. Although the single failure mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for the purpose of quantitative analysis, importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
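The FWGM step for one failure mode can be sketched with triangular fuzzy numbers (l, m, u); the ratings, weights, and centroid defuzzification below are illustrative assumptions, not the paper's data:

    import numpy as np

    def tfn_weighted_geo_mean(ratings, weights):
        """Weighted geometric mean of triangular fuzzy numbers (l, m, u),
        computed component-wise; weights are normalized to sum to 1."""
        w = np.asarray(weights, dtype=float)
        w /= w.sum()
        r = np.asarray(ratings, dtype=float)       # shape (n_factors, 3)
        return tuple(np.prod(r ** w[:, None], axis=0))

    def centroid(tfn):
        """Defuzzify a triangular fuzzy number by its centroid."""
        return sum(tfn) / 3.0

    # Fuzzy severity / occurrence / detection ratings on a 1-10 scale (invented).
    ratings = [(6, 7, 8), (4, 5, 6), (2, 3, 4)]
    weights = [0.4, 0.35, 0.25]                    # relative risk-factor importance
    frpn = tfn_weighted_geo_mean(ratings, weights)
    print([round(v, 2) for v in frpn], round(centroid(frpn), 2))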
Using generalizability theory to develop clinical assessment protocols.
Preuss, Richard A
2013-04-01
Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
Theoretical modeling of the electronic structure and exchange interactions in Cu(II)Pc
NASA Astrophysics Data System (ADS)
Wu, Wei; Fisher, A. J.; Harrison, N. M.; Wang, Hai; Wu, Zhenlin; Gardener, Jules; Heutz, Sandrine; Jones, Tim; Aeppli, Gabriel
2012-12-01
We calculate the electronic structure and exchange interactions in a copper(II)phthalocyanine (Cu(II)Pc) crystal as a one-dimensional molecular chain using hybrid exchange density functional theory (DFT). In addition, the intermolecular exchange interactions are also calculated in a molecular dimer using Green's function perturbation theory (GFPT) to illustrate the underlying physics. We find that the exchange interactions depend strongly on the stacking angle, but weakly on the sliding angle (defined in the text). The hybrid DFT calculations also provide an insight into the electronic structure of the Cu(II)Pc molecular chain and demonstrate that on-site electron correlations have a significant effect on the nature of the ground state, the band gap and magnetic excitations. The exchange interactions predicted by our DFT calculations and GFPT calculations agree qualitatively with the recent experimental results on newly found η-Cu(II)Pc and the previous results for the α- and β-phases. This work provides a reliable theoretical basis for the further application of Cu(II)Pc to molecular spintronics and organic-based quantum information processing.
NASA Astrophysics Data System (ADS)
Wu, Wei; Fisher, A. J.; Harrison, N. M.
2011-07-01
We calculate the electronic structure and exchange interactions in a copper(II)phthalocyanine [Cu(II)Pc] crystal as a one-dimensional molecular chain using hybrid exchange density functional theory (DFT). In addition, the intermolecular exchange interactions are also calculated in a molecular dimer using Green’s function perturbation theory (GFPT) to illustrate the underlying physics. We find that the exchange interactions depend strongly on the stacking angle, but weakly on the sliding angle (defined in the text). The hybrid DFT calculations also provide an insight into the electronic structure of the Cu(II)Pc molecular chain and demonstrate that on-site electron correlations have a significant effect on the nature of the ground state, the band gap, and magnetic excitations. The exchange interactions predicted by our DFT calculations and GFPT calculations agree qualitatively with the recent experimental results on newly found η-Cu(II)Pc and the previous results for the α and β phases. This work provides a reliable theoretical basis for the further application of Cu(II)Pc to molecular spintronics and organic-based quantum information processing.
Construction of a memory battery for computerized administration, using item response theory.
Ferreira, Aristides I; Almeida, Leandro S; Prieto, Gerardo
2012-10-01
In accordance with Item Response Theory, a computerized memory battery with six tests was constructed for use in the Portuguese adult population. A factor analysis was conducted to assess the internal structure of the tests (N = 547 undergraduate students). In line with the literature, several confirmatory factor models were evaluated. Results showed a better fit for a model with two independent latent variables corresponding to verbal and non-verbal factors, reproducing the initial battery organization. Internal consistency reliabilities for the six tests ranged from alpha = .72 to .89. IRT analyses (Rasch and partial credit models) yielded good infit and outfit measures and high precision of parameter estimation. The potential utility of these memory tasks for psychological research and practice will be discussed.
Structure and Measurement of Depression in Youth: Applying Item Response Theory to Clinical Data
Cole, David A.; Cai, Li; Martin, Nina C.; Findling, Robert L; Youngstrom, Eric A.; Garber, Judy; Curry, John F.; Hyde, Janet S.; Essex, Marilyn J.; Compas, Bruce E.; Goodyer, Ian M.; Rohde, Paul; Stark, Kevin D.; Slattery, Marcia J.; Forehand, Rex
2013-01-01
Goals of the paper were to use item response theory (IRT) to assess the relation of depressive symptoms to the underlying dimension of depression and to demonstrate how IRT-based measurement strategies can yield more reliable data about depression severity than conventional symptom counts. Participants were 3403 clinic and nonclinic children and adolescents from 12 contributing samples, all of whom received the Kiddie Schedule of Affective Disorders and Schizophrenia for school-aged children. Results revealed that some symptoms reflected higher levels of depression and were more discriminating than others. Results further demonstrated that utilization of IRT-based information about symptom severity and discriminability in the measurement of depression severity can reduce measurement error and increase measurement fidelity. PMID:21534696
Modeling study of the ABS relay valve
NASA Astrophysics Data System (ADS)
Lei, Ming; Lin, Min; Guo, Bin; Luo, Zai; Xu, Weidong
2011-05-01
The ABS (anti-lock braking system) relay valve is the key component of the anti-lock braking system in most commercial vehicles such as trucks, tractor-trailers, etc. In this paper, the structure and working principle of the ABS relay valve were analyzed. A mathematical model of the ABS relay valve was then set up, divided into electronic, magnetic, pneumatic, and mechanical parts. The displacement of the spools and the pressure increase, hold, and release responses of the ABS relay valve were simulated and analyzed under conditions of 500 kPa control pressure, 600 kPa braking pressure, 100 kPa atmospheric pressure, and 310 K air temperature. This article provides a reliable theoretical basis for improving the performance and efficiency of vehicle anti-lock braking systems.
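A heavily simplified lumped-parameter sketch of the increase/hold/release behavior might look like the following. The first-order lag model and all parameter values are assumptions for illustration only, not the paper's four-part model.

```python
import numpy as np

# Toy model of chamber pressure during the increase / hold / release
# phases of a relay valve; all values are illustrative, not the
# parameters identified in the paper.
def simulate(phases, p0=100e3, p_supply=600e3, p_atm=100e3, tau=0.05, dt=1e-3):
    p, out = p0, []
    for mode, duration in phases:          # mode: 'increase' | 'hold' | 'release'
        target = {'increase': p_supply, 'hold': p, 'release': p_atm}[mode]
        for _ in range(int(duration / dt)):
            p += dt * (target - p) / tau   # first-order filling/venting lag
            out.append(p)
    return np.array(out)

trace = simulate([('increase', 0.3), ('hold', 0.2), ('release', 0.3)])
print(trace[0], trace[-1])
```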
ERIC Educational Resources Information Center
Keller-Margulis, Milena A.; Mercer, Sterett H.; Thomas, Erin L.
2016-01-01
The purpose of this study was to examine the reliability of written expression curriculum-based measurement (WE-CBM) in the context of universal screening from a generalizability theory framework. Students in second through fifth grade (n = 145) participated in the study. The sample included 54% female students, 49% White students, 23% African…
ERIC Educational Resources Information Center
Sun, Anji; Valiga, Michael J.
In this study, the reliability of the American College Testing (ACT) Program's "Survey of Academic Advising" (SAA) was examined using both univariate and multivariate generalizability theory approaches. The primary purpose of the study was to compare the results of three generalizability theory models (a random univariate model, a mixed…
Two Prophecy Formulas for Assessing the Reliability of Item Response Theory-Based Ability Estimates
ERIC Educational Resources Information Center
Raju, Nambury S.; Oshima, T.C.
2005-01-01
Two new prophecy formulas for estimating item response theory (IRT)-based reliability of a shortened or lengthened test are proposed. Some of the relationships between the two formulas, one of which is identical to the well-known Spearman-Brown prophecy formula, are examined and illustrated. The major assumptions underlying these formulas are…
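For reference, the classical Spearman-Brown prophecy formula, to which one of the two proposed formulas is identical, takes only a few lines; this is the generic classical form, not the authors' IRT-based variants.

```python
def spearman_brown(rho, k):
    """Projected reliability when test length is changed by factor k
    (classical Spearman-Brown prophecy formula)."""
    return k * rho / (1.0 + (k - 1.0) * rho)

print(spearman_brown(0.70, 2))   # doubling a test with reliability .70 -> ~.82
```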
Quantification of the Relationship between Surrogate Fuel Structure and Performance
2012-07-31
order to account for known deficiencies [18]. The frequencies are then used to calculate the zero point energy (ZPE). In the G3 theory HF/6-31G* was used ... for the ZPE and the new procedure is likely to be more reliable. Also in contrast to previous G series composite methods, the Hartree-Fock energy ... The total energy is obtained by adding the previously calculated ZPE. Durant and Rohlfing [38] reported that B3LYP density functional methods provide
Disocclusion: a variational approach using level lines.
Masnou, Simon
2002-01-01
Object recognition, robot vision, image and film restoration may require the ability to perform disocclusion. We call disocclusion the recovery of occluded areas in a digital image by interpolation from their vicinity. It is shown in this paper how disocclusion can be performed by means of the level-lines structure, which offers a reliable, complete and contrast-invariant representation of images. Level-lines based disocclusion yields a solution that may have strong discontinuities. The proposed method is compatible with Kanizsa's amodal completion theory.
Stability of measures from children's interviews: the effects of time, sample length, and topic.
Heilmann, John; DeBrock, Lindsay; Riley-Tillman, T Chris
2013-08-01
The purpose of this study was to examine the reliability of, and sources of variability in, language measures from interviews collected from young school-age children. Two 10-min interviews were collected from 20 at-risk kindergarten children by an examiner using a standardized set of questions. Test-retest reliability coefficients were calculated for 8 language measures. Generalizability theory (G-theory) analyses were completed to document the variability introduced into the measures from the child, session, sample length, and topic. Significant and strong reliability correlation coefficients were observed for most of the language sample measures. The G-theory analyses revealed that most of the variance in the language measures was attributed to the child. Session, sample length, and topic accounted for negligible amounts of variance in most of the language measures. Measures from interviews were reliable across sessions, and the sample length and topic did not have a substantial impact on the reliability of the language measures. Implications regarding the clinical feasibility of language sample analysis for assessment and progress monitoring are discussed.
Reliability approach to rotating-component design. [fatigue life and stress concentration
NASA Technical Reports Server (NTRS)
Kececioglu, D. B.; Lalli, V. R.
1975-01-01
A probabilistic methodology for designing rotating mechanical components using reliability to relate stress to strength is explained. The experimental test machines and data obtained for steel to verify this methodology are described. A sample mechanical rotating component design problem is solved by comparing a deterministic design method with the new design-by-reliability approach. The new method shows that a smaller size and weight can be obtained for a specified rotating shaft life and reliability, and it uses the statistical distortion-energy theory with statistical fatigue diagrams for optimum shaft design. Statistical methods are presented for (1) determining strength distributions for steel experimentally, (2) determining a failure theory for stress variations in a rotating shaft subjected to reversed bending and steady torque, and (3) relating strength to stress by reliability.
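The stress-strength interference idea at the heart of design-by-reliability reduces, for independent normal stress and strength, to a single coupling equation. The sketch below uses illustrative numbers and omits the report's distortion-energy and fatigue-diagram machinery.

```python
import math
from scipy.stats import norm

def stress_strength_reliability(mu_S, sd_S, mu_s, sd_s):
    """R = P(strength > stress) for independent normal strength and stress.

    This is the textbook interference model underlying design-by-
    reliability; the report's full treatment is richer.
    """
    z = (mu_S - mu_s) / math.sqrt(sd_S**2 + sd_s**2)  # coupling (reliability index)
    return norm.cdf(z)

# Illustrative strength/stress means and standard deviations (MPa)
print(stress_strength_reliability(mu_S=620.0, sd_S=40.0, mu_s=480.0, sd_s=35.0))
```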
Applying graph theory to protein structures: an atlas of coiled coils.
Heal, Jack W; Bartlett, Gail J; Wood, Christopher W; Thomson, Andrew R; Woolfson, Derek N
2018-05-02
To understand protein structure, folding and function fully and to design proteins de novo reliably, we must learn from natural protein structures that have been characterised experimentally. The number of protein structures available is large and growing exponentially, which makes this task challenging. Indeed, computational resources are becoming increasingly important for classifying and analysing this resource. Here, we use tools from graph theory to define an atlas classification scheme for automatically categorising certain protein substructures. Focusing on the α-helical coiled coils, which are ubiquitous protein-structure and protein-protein interaction motifs, we present a suite of computational resources designed for analysing these assemblies. iSOCKET enables interactive analysis of side-chain packing within proteins to identify coiled coils automatically and with considerable user control. Applying a graph theory-based atlas classification scheme to structures identified by iSOCKET gives the Atlas of Coiled Coils, a fully automated, updated overview of extant coiled coils. The utility of this approach is illustrated with the first formal classification of an emerging subclass of coiled coils called α-helical barrels. Furthermore, in the Atlas, the known coiled-coil universe is presented alongside a partial enumeration of the 'dark matter' of coiled-coil structures; i.e., those coiled-coil architectures that are theoretically possible but have not been observed to date, and thus present defined targets for protein design. iSOCKET is available as part of the open-source GitHub repository associated with this work (https://github.com/woolfson-group/isocket). This repository also contains all the data generated when classifying the protein graphs. The Atlas of Coiled Coils is available at: http://coiledcoils.chm.bris.ac.uk/atlas/app.
Universal first-order reliability concept applied to semistatic structures
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.
Universal first-order reliability concept applied to semistatic structures
NASA Astrophysics Data System (ADS)
Verderaime, V.
1994-07-01
A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.
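One way to see the analogy between a reliability criterion and a safety factor is to solve the first-order coupling equation for the central design factor that meets a target reliability. The normal assumptions and coefficients of variation below are illustrative, not the report's formulation.

```python
import math
from scipy.stats import norm

def design_factor(target_R, cv_strength, cv_stress):
    """Central design factor n = mu_strength / mu_stress that achieves a
    target reliability, assuming independent normal stress and strength
    with known coefficients of variation. A sketch of deriving a design
    factor from a reliability criterion; the report differs in detail."""
    beta = norm.ppf(target_R)                  # required reliability index
    # Solve beta = (n - 1) / sqrt(n^2 cvS^2 + cvs^2) for n (a quadratic).
    a = 1.0 - beta**2 * cv_strength**2
    c = 1.0 - beta**2 * cv_stress**2
    return (1.0 + math.sqrt(1.0 - a * c)) / a

print(design_factor(0.999, cv_strength=0.08, cv_stress=0.10))  # ~1.48
```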
Iyigun, Emine; Tastan, Sevinc; Ayhan, Hatice; Kose, Gulsah; Acikel, Cengizhan
2016-06-01
This study aimed to determine the validity and reliability levels of the Planned Behavior Theory Scale as related to a testicular self-examination. The study was carried out in a health-profession higher-education school in Ankara, Turkey, from April to June 2012. The study participants comprised 215 male students. Study data were collected by using a questionnaire, a planned behavior theory scale related to testicular self-examination, and Champion's Health Belief Model Scale (CHBMS). The sub-dimensions of the planned behavior theory scale, namely those of intention, attitude, subjective norms and self-efficacy, were found to have Cronbach's alpha values of between 0.81 and 0.89. Exploratory factor analysis showed that items of the scale had five factors that accounted for 75% of the variance. Of these, the sub-dimension of intention was found to have the highest level of contribution. A significant correlation was found between the sub-dimensions of the testicular self-examination planned behavior theory scale and those of CHBMS (p < 0.05). The findings suggest that the Turkish version of the testicular self-examination Planned Behavior Theory Scale is a valid and reliable measurement for Turkish society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sulzbach, H.M.; Schaefer, H.F. III; Klopper, W.
1996-04-10
The question whether [10]annulene prefers olefinic structures with alternate single and double bonds or aromatic structures like all other small to medium-sized uncharged (4n + 2)π electron homologs (e.g. benzene, [14]annulene) has been controversial for more than 20 years. Our new results suggest that only the high-order correlated methods will be able to correctly predict the [10]annulene potential energy surface. The UNO-CAS results and the strong oscillation of the MP series show that nondynamical electron correlation is important. Consequently, reliable results can only be expected at the highest correlated levels like the CCSD(T) method, which predicts the olefinic twist structure to be lower in energy by 3-7 kcal/mol. This prediction that the twist structure is lower in energy is supported by (a) the MP2-R12 method, which shows that large basis sets favor the olefinic structure relative to the aromatic, and (b) the fact that both structures are about equally affected by nondynamical electron correlation. We conclude that [10]annulene is a system which cannot be described adequately by either second-order Moller-Plesset perturbation theory or density functional methods. 13 refs., 3 tabs.
NASA Astrophysics Data System (ADS)
Dolzhenkova, E. V.; Iurieva, L. V.
2018-05-01
The study presents the authors' algorithm for simulating the organization of an industrial enterprise's repair service based on reliability theory, together with the results of its application. Monitoring of the repair service organization is proposed on the basis of the enterprise's state indexes for the main resources (equipment, labour, finances, repair areas), which allows quantitative evaluation of the reliability level as a summary rating of these parameters and ensures an appropriate level of operational reliability of the serviced technical objects. Under conditions of tough competition, the higher the efficiency of production and of the repair service itself, the higher the innovative attractiveness of an industrial enterprise. The calculations show that, to prevent inefficient production losses and to reduce repair costs, it is advisable to apply reliability theory. The overall reliability rating calculated with the authors' algorithm takes low values. Processing the statistical data yields reliability characteristics for the different workshops and services of an industrial enterprise, which makes it possible to define the failure rates of the various units of equipment and to establish the reliability indexes needed for subsequent mathematical simulation. The proposed simulation algorithm contributes to increasing the efficiency of the repair service organization and improving the innovative attractiveness of an industrial enterprise.
ERIC Educational Resources Information Center
Hashimoto, Masanori
An economic theory of training holds that training in technical skills and training in employment relations (namely, information reliability or the ability to quickly and reliably disseminate information among the members of the firm) reinforce each other. This theory is an organizing framework for understanding some practices at Japanese firms in…
ERIC Educational Resources Information Center
Abry, Tashia; Cash, Anne H.; Bradshaw, Catherine P.
2014-01-01
Generalizability theory (GT) offers a useful framework for estimating the reliability of a measure while accounting for multiple sources of error variance. The purpose of this study was to use GT to examine multiple sources of variance in and the reliability of school-level teacher and high school student behaviors as observed using the tool,…
ERIC Educational Resources Information Center
Özenç, Emine Gül; Dogan, M. Cihangir
2014-01-01
This study aims to perform a validity-reliability test by developing the Functional Literacy Experience Scale based upon Ecological Theory (FLESBUET) for primary education students. The study group includes 209 fifth grade students at Sabri Taskin Primary School in the Kartal District of Istanbul, Turkey during the 2010-2011 academic year.…
Solubility prediction of naphthalene in carbon dioxide from crystal microstructure
NASA Astrophysics Data System (ADS)
Sang, Jiarong; Jin, Junsu; Mi, Jianguo
2018-03-01
Crystals dissolved in solvents are ubiquitous in both natural and artificial systems. Due to the complicated structures and asymmetric interactions between the crystal and solvent, it is difficult to interpret the dissolution mechanism and predict solubility using traditional theories and models. Here we use the classical density functional theory (DFT) to describe the crystal dissolution behavior. As an example, naphthalene dissolved in carbon dioxide (CO2) is considered within the DFT framework. The unit cell dimensions and microstructure of crystalline naphthalene are determined by minimizing the free-energy of the crystal. According to the microstructure, the solubilities of naphthalene in CO2 are predicted based on the equality of naphthalene's chemical potential in crystal and solution phases, and the interfacial structures and free-energies between different crystal planes and solution are determined to investigate the dissolution mechanism at the molecular level. The theoretical predictions are in general agreement with the available experimental data, implying that the present model is quantitatively reliable in describing crystal dissolution.
Jin, Zhen; Yang, Meng; Chen, Shao-Hua; Liu, Jin-Huai; Li, Qun-Xiang; Huang, Xing-Jiu
2017-02-21
Herein, we revealed that the electrochemical behaviors in the detection of heavy metal ions (HMIs) rely largely on the exposed facets of SnO2 nanoparticles. Compared to the high-energy {221} facet, the low-energy {110} facet of SnO2 possessed better electrochemical performance. The adsorption/desorption tests, density-functional theory (DFT) calculations, and X-ray absorption fine structure (XAFS) studies showed that the lower barrier energy of surface diffusion on the {110} facet was critical for the superior electrochemical property, being favorable for ion diffusion on the electrode and further leading to the enhanced electrochemical performance. Through the combination of experiments and theoretical calculations, a reliable interpretation of the mechanism for the electroanalysis of HMIs with nanomaterials exposing different crystal facets has been provided. Furthermore, it provides a deep insight into the key factor for improving the electrochemical performance of HMI detection, so as to design high-performance electrochemical sensors.
Environmental education curriculum evaluation questionnaire: A reliability and validity study
NASA Astrophysics Data System (ADS)
Minner, Daphne Diane
The intention of this research project was to bridge the gap between social science research and application to the environmental domain through the development of a theoretically derived instrument designed to give educators a template by which to evaluate environmental education curricula. The theoretical base for instrument development was provided by several developmental theories, such as Piaget's theory of cognitive development, Developmental Systems Theory, and the Life-span Perspective, as well as curriculum research within the area of environmental education. This theoretical base fueled the generation of a list of components which were then translated into a questionnaire with specific questions relevant to the environmental education domain. The specific research question for this project is: Can a valid assessment instrument based largely on human development and education theory be developed that reliably discriminates high, moderate, and low quality in environmental education curricula? The types of analyses conducted to answer this question were interrater reliability (percent agreement, Cohen's kappa coefficient, Pearson's product-moment correlation coefficient), test-retest reliability (percent agreement, correlation), and criterion-related validity (correlation). Face validity and content validity were also assessed through thorough reviews. Overall, results indicate that 29% of the questions on the questionnaire demonstrated a high level of interrater reliability and 43% demonstrated a moderate level of interrater reliability. Seventy-one percent of the questions demonstrated high test-retest reliability and 5% a moderate level. Fifty-five percent of the questions on the questionnaire were reliable (high or moderate) both across time and across raters. Only eight questions (8%) showed neither interrater nor test-retest reliability. The global overall rating of high, medium, or low quality was reliable across both coders and time, indicating that the questionnaire can discriminate differences in the quality of environmental education curricula. Of the 35 curricula evaluated, 6 were of high quality, 14 of medium quality, and 15 of low quality. The criterion-related validity of the instrument cannot currently be established, owing to the lack of comparable measures or a concretely usable set of multidisciplinary standards. Face and content validity were sufficiently demonstrated.
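Of the interrater statistics used here, Cohen's kappa is the chance-corrected one; a minimal sketch, with hypothetical coder ratings rather than the study's data, is:

```python
import numpy as np

def cohens_kappa(r1, r2, categories):
    """Chance-corrected agreement between two raters.

    r1, r2: category labels assigned by each rater to the same items.
    """
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)                               # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (po - pe) / (1.0 - pe)

# Hypothetical quality ratings of 10 curricula by two coders
a = ['high', 'low', 'med', 'med', 'low', 'high', 'low', 'med', 'low', 'high']
b = ['high', 'low', 'med', 'low', 'low', 'high', 'med', 'med', 'low', 'high']
print(cohens_kappa(a, b, ['high', 'med', 'low']))        # ~0.70
```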
Liu, Ying-Buh; Yang, Stephen S; Hsieh, Cheng-Hsing; Lin, Chia-Da; Chang, Shang-Jen
2014-05-01
To evaluate the inter-observer, intra-observer and intra-individual reliability of uroflowmetry and post-void residual urine (PVR) tests in adult men. Healthy volunteers aged over 40 years were enrolled. Every participant underwent two sets of uroflowmetry and PVR tests with a 2-week interval between the tests. The uroflowmetry tests were interpreted by four urologists independently. Uroflowmetry curves were classified as bell-shaped, bell-shaped with tail, obstructive, restrictive, staccato, interrupted and tower-shaped and scored from 1 (highly abnormal) to 5 (absolutely normal). The agreements between the observers, interpretations and tests within individuals were analyzed using kappa statistics and intraclass correlation coefficients. Generalizability theory with decision analysis was used to determine how many observers, tests, and interpretations were needed to obtain an acceptable reliability (> 0.80). Of 108 volunteers, we randomly selected the uroflowmetry results from 25 participants for the evaluation of reliability. The mean age of the studied adults was 55.3 years. The intra-individual and intra-observer reliability on uroflowmetry tests ranged from good to very good. However, the inter-observer reliability on normalcy and specific type of flow pattern were relatively lower. In generalizability theory, three observers were needed to obtain an acceptable reliability on normalcy of uroflow pattern if the patient underwent uroflowmetry tests twice with one observation. The intra-individual and intra-observer reliability on uroflowmetry tests were good while the inter-observer reliability was relatively lower. To improve inter-observer reliability, the definition of uroflowmetry should be clarified by the International Continence Society. © 2013 Wiley Publishing Asia Pty Ltd.
Ceramics Analysis and Reliability Evaluation of Structures (CARES). Users and programmers manual
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Manderscheid, Jane M.; Gyekenyesi, John P.
1990-01-01
This manual describes how to use the Ceramics Analysis and Reliability Evaluation of Structures (CARES) computer program. The primary function of the code is to calculate the fast-fracture reliability or failure probability of macroscopically isotropic ceramic components. These components may be subjected to complex thermomechanical loadings, such as those found in heat engine applications. The program uses results from the MSC/NASTRAN or ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES utilizes the Batdorf model and the two-parameter Weibull cumulative distribution function to describe the effect of multiaxial stress states on material strength. The principle of independent action (PIA) and the Weibull normal stress averaging models are also included. Weibull material strength parameters, the Batdorf crack density coefficient, and other related statistical quantities are estimated from four-point bend bar or uniform uniaxial tensile specimen fracture strength data. Parameter estimation can be performed for single or multiple failure modes by using least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests, ninety percent confidence intervals on the Weibull parameters, and Kanofsky-Srinivasan ninety percent confidence band values are also provided. The probabilistic fast-fracture theories used in CARES, along with the input and output for CARES, are described. Example problems demonstrating various features of the program are also included. This manual describes the MSC/NASTRAN version of the CARES program.
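The fast-fracture models in CARES rest on the two-parameter Weibull distribution. A minimal sketch of the uniform-stress failure probability and the principle of independent action, with illustrative parameters rather than CARES's flaw-analysis machinery, is:

```python
import math

def weibull_pf(sigma, sigma0, m, volume=1.0):
    """Two-parameter Weibull probability of failure for a uniformly
    stressed volume: Pf = 1 - exp(-V * (sigma / sigma0)**m)."""
    return 1.0 - math.exp(-volume * (sigma / sigma0) ** m)

def pia_pf(principal_stresses, sigma0, m, volume=1.0):
    """Principle of independent action: the tensile principal stresses
    contribute independently to the risk of rupture."""
    risk = volume * sum((max(s, 0.0) / sigma0) ** m for s in principal_stresses)
    return 1.0 - math.exp(-risk)

# Illustrative stresses (MPa), scale parameter, and Weibull modulus
print(weibull_pf(sigma=300.0, sigma0=500.0, m=10))
print(pia_pf([300.0, 150.0, -50.0], sigma0=500.0, m=10))
```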
Influences on and Limitations of Classical Test Theory Reliability Estimates.
ERIC Educational Resources Information Center
Arnold, Margery E.
It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…
Plant, Katherine L; Stanton, Neville A
2013-01-01
Aeronautical decision-making is complex as there is not always a clear coupling between the decision made and decision outcome. As such, there is a call for process-orientated decision research in order to understand why a decision made sense at the time it was made. Schema theory explains how we interact with the world using stored mental representations and forms an integral part of the perceptual cycle model (PCM); proposed here as a way to understand the decision-making process. This paper qualitatively analyses data from the critical decision method (CDM) based on the principles of the PCM. It is demonstrated that the approach can be used to understand a decision-making process and highlights how influential schemata can be at informing decision-making. The reliability of this approach is established, the general applicability is discussed and directions for future work are considered. This paper introduces the PCM, and the associated schema theory, as a framework to structure and explain data collected from the CDM. The reliability of both the method and coding scheme is addressed.
Long-term lunar stations: Some ecological considerations
NASA Technical Reports Server (NTRS)
Maguire, Bassett, Jr.; Scott, Kelly W.
1992-01-01
A major factor for the long-term success of a lunar station is the ability to keep an agroecosystem functioning at a desirable, stable steady state with ecological stability and reliability. Design for a long-lived extraterrestrial manned station must take into account interactions among its subsystems to ensure that overall functionality is enhanced (or at least not compromised). Physical isolation of feed production, human living areas, recycling, and other systems may be straightforward; microbiological isolation, however, will be very difficult. While it is possible to eliminate plant-associated microbiological communities by growing the plants aseptically, it is not practical to keep plants germ-free on a large scale if humans are working with them. Ecological theory strongly suggests that some kinds of communities or organisms effectively increase the stability of ecosystems and will protect the plants from potential pathogens. A carefully designed and maintained (lunar-derived) soil can provide a variety of habitats for effective microbial buffers while adding structure to the agroecosystem. A soil can also increase ecosystem reliability by buffering otherwise large fluctuations of elements and compounds (nutrients, wastes, etc.) as well as buffering temperature and atmosphere composition. We are doing experiments in ecological dynamics and attempting to extend the relevant theories.
Robust penalty method for structural synthesis
NASA Technical Reports Server (NTRS)
Kamat, M. P.
1983-01-01
The Sequential Unconstrained Minimization Technique (SUMT) offers an easy way of solving nonlinearly constrained problems. However, this algorithm frequently suffers from the need to minimize an ill-conditioned penalty function. An ill-conditioned minimization problem can be solved very effectively by posing the problem as one of integrating a system of stiff differential equations utilizing concepts from singular perturbation theory. This paper evaluates the robustness and the reliability of such a singular perturbation based SUMT algorithm on two different problems of structural optimization of widely separated scales. The report concludes that whereas conventional SUMT can be bogged down by frequent ill-conditioning, especially in large scale problems, the singular perturbation SUMT has no such difficulty in converging to very accurate solutions.
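A bare-bones SUMT loop makes the ill-conditioning easy to see: as the penalty parameter grows, the unconstrained subproblems become progressively harder to minimize. The toy objective and constraint below are illustrative, not the paper's structural optimization problems.

```python
import numpy as np
from scipy.optimize import minimize

# Minimize f subject to g(x) <= 0 via a sequence of unconstrained
# problems with a growing quadratic exterior penalty.
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2
g = lambda x: x[0] + x[1] - 2.0            # feasible when g <= 0

def sumt(x0, r0=1.0, growth=10.0, iters=6):
    x = np.asarray(x0, dtype=float)
    r = r0
    for _ in range(iters):
        penalized = lambda x, r=r: f(x) + r * max(g(x), 0.0) ** 2
        x = minimize(penalized, x).x       # increasingly ill-conditioned as r grows
        r *= growth
    return x

print(sumt([0.0, 0.0]))                    # tends toward the constrained optimum (1.5, 0.5)
```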
Comparing the Fit of Item Response Theory and Factor Analysis Models
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Cai, Li; Hernandez, Adolfo
2011-01-01
Linear factor analysis (FA) models can be reliably tested using test statistics based on residual covariances. We show that the same statistics can be used to reliably test the fit of item response theory (IRT) models for ordinal data (under some conditions). Hence, the fit of an FA model and of an IRT model to the same data set can now be…
The speed of metacognition: taking time to get to know one's structural knowledge.
Mealor, Andy D; Dienes, Zoltan
2013-03-01
The time course of different metacognitive experiences of knowledge was investigated using artificial grammar learning. Experiment 1 revealed that when participants are aware of the basis of their judgments (conscious structural knowledge) decisions are made most rapidly, followed by decisions made with conscious judgment but without conscious knowledge of underlying structure (unconscious structural knowledge), and guess responses (unconscious judgment knowledge) were made most slowly, even when controlling for differences in confidence and accuracy. In experiment 2, short response deadlines decreased the accuracy of unconscious but not conscious structural knowledge. Conversely, the deadline decreased the proportion of conscious structural knowledge in favour of guessing. Unconscious structural knowledge can be applied rapidly but becomes more reliable with additional metacognitive processing time whereas conscious structural knowledge is an all-or-nothing response that cannot always be applied rapidly. These dissociations corroborate quite separate theories of recognition (dual-process) and metacognition (higher order thought and cross-order integration). Copyright © 2012 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Helms, LuAnn Sherbeck
This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…
Anticipating the emergence of infectious diseases
Drake, John M.; Rohani, Pejman
2017-01-01
In spite of medical breakthroughs, the emergence of pathogens continues to pose threats to both human and animal populations. We present candidate approaches for anticipating disease emergence prior to large-scale outbreaks. Through use of ideas from the theories of dynamical systems and stochastic processes we develop approaches which are not specific to a particular disease system or model, but instead have general applicability. The indicators of disease emergence detailed in this paper can be classified into two parallel approaches: a set of early-warning signals based around the theory of critical slowing down and a likelihood-based approach. To test the reliability of these two approaches we contrast theoretical predictions with simulated data. We find good support for our methods across a range of different model structures and parameter values. PMID:28679666
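A minimal sketch of the first approach, tracking rolling variance and lag-1 autocorrelation as early-warning indicators of critical slowing down, on an illustrative AR(1) series (all parameters assumed for demonstration):

```python
import numpy as np

def early_warning(series, window):
    """Rolling variance and lag-1 autocorrelation; sustained increases in
    both are the classic signatures of critical slowing down."""
    series = np.asarray(series, dtype=float)
    var, ac1 = [], []
    for i in range(window, len(series) + 1):
        w = series[i - window:i]
        var.append(w.var())
        ac1.append(np.corrcoef(w[:-1], w[1:])[0, 1])
    return np.array(var), np.array(ac1)

# Illustrative AR(1) series whose autocorrelation drifts toward 1
rng = np.random.default_rng(0)
phi = np.linspace(0.2, 0.95, 500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = phi[t] * x[t - 1] + rng.normal()
var, ac1 = early_warning(x, window=100)
print(var[0], var[-1], ac1[0], ac1[-1])    # both indicators rise toward the end
```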
Jiang, Zhehan; Skorupski, William
2017-12-12
In many behavioral research areas, multivariate generalizability theory (mG theory) has typically been used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely using frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
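For contrast with the Bayesian treatment the article presents in BUGS, the frequentist baseline for a single-facet crossed design reduces to ANOVA expected-mean-square estimates. The persons-by-items design and simulated scores below are illustrative assumptions.

```python
import numpy as np

def g_study_pxi(X):
    """ANOVA estimates of variance components for a crossed p x i design
    (persons by items). X: persons-by-items score matrix."""
    n_p, n_i = X.shape
    gm = X.mean()
    ms_p = n_i * ((X.mean(axis=1) - gm) ** 2).sum() / (n_p - 1)
    ms_i = n_p * ((X.mean(axis=0) - gm) ** 2).sum() / (n_i - 1)
    resid = X - X.mean(axis=1, keepdims=True) - X.mean(axis=0) + gm
    ms_e = (resid ** 2).sum() / ((n_p - 1) * (n_i - 1))
    var_p = max((ms_p - ms_e) / n_i, 0.0)    # person (universe-score) variance
    var_i = max((ms_i - ms_e) / n_p, 0.0)    # item variance
    return var_p, var_i, ms_e                # ms_e: residual variance

# Simulated scores with unit person, item, and residual components
rng = np.random.default_rng(1)
scores = rng.normal(size=(50, 1)) + rng.normal(size=(1, 8)) + rng.normal(size=(50, 8))
print(g_study_pxi(scores))                   # roughly (1, 1, 1)
```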
Bravini, Elisabetta; Franchignoni, Franco; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano; Foti, Calogero
2015-01-01
To perform a comprehensive analysis of the psychometric properties and dimensionality of the Upper Limb Functional Index (ULFI) using both classical test theory and Rasch analysis (RA). Prospective, single-group observational design. Freestanding rehabilitation center. Convenience sample of Italian-speaking subjects with upper limb musculoskeletal disorders (N=174). Not applicable. The Italian version of the ULFI. Data were analyzed using parallel analysis, exploratory factor analysis, and RA for evaluating dimensionality, functioning of rating scale categories, item fit, hierarchy of item difficulties, and reliability indices. Parallel analysis revealed 2 factors explaining 32.5% and 10.7% of the response variance. RA confirmed the failure of the unidimensionality assumption, and 6 items out of the 25 misfitted the Rasch model. When the analysis was rerun excluding the misfitting items, the scale showed acceptable fit values, loading meaningfully to a single factor. Item separation reliability and person separation reliability were .98 and .89, respectively. Cronbach alpha was .92. RA revealed weakness of the scale concerning dimensionality and internal construct validity. However, a set of 19 ULFI items defined through the statistical process demonstrated a unidimensional structure, good psychometric properties, and clinical meaningfulness. These findings represent a useful starting point for further analyses of the tool (based on modern psychometric approaches and confirmatory factor analysis) in larger samples, including different patient populations and nationalities. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Spanish validation of the Person-centered Care Assessment Tool (P-CAT).
Martínez, Teresa; Suárez-Álvarez, Javier; Yanguas, Javier; Muñiz, José
2016-01-01
Person-centered Care (PCC) is an innovative approach which seeks to improve the quality of care services given to the care-dependent elderly. At present there are no Spanish language instruments for the evaluation of PCC delivered by elderly care services. The aim of this work is the adaptation and validation of the Person-centered Care Assessment Tool (P-CAT) for a Spanish population. The P-CAT was translated and adapted into Spanish, then given to a sample of 1339 front-line care professionals from 56 residential elderly care homes. The reliability and validity of the P-CAT were analyzed, within the frameworks of Classical Test Theory and Item Response Theory models. The Spanish P-CAT demonstrated good reliability, with an alpha coefficient of .88 and a test-retest reliability coefficient of .79. The P-CAT information function indicates that the test measures with good precision for the majority of levels of the measured variables (θ values between -2 and +1). The factorial structure of the test is essentially one-dimensional and the item discrimination indices are high, with values between .26 and .61. In terms of predictive validity, the correlations which stand out are between the P-CAT and organizational climate (r = .689), and the burnout factors; personal accomplishment (r = .382), and emotional exhaustion (r = - .510). The Spanish version of the P-CAT demonstrates good psychometric properties for its use in the evaluation of elderly care homes both professionally and in research.
NASA Astrophysics Data System (ADS)
Martowicz, Adam; Uhl, Tadeusz
2012-10-01
The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are commonly applied systems, especially in the automotive industry, taking advantage of utilizing both the mechanical structure and electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices, which is based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with the meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function, simultaneously satisfying the constraint on material strength. The restriction of the maximum equivalent stresses was introduced with the conditionally formulated objective function with a penalty component. The yielded results were successfully verified with a global uniform search through the input design domain.
[An instrument in Spanish to evaluate the performance of clinical teachers by students].
Bitran, Marcela; Mena, Beltrán; Riquelme, Arnoldo; Padilla, Oslando; Sánchez, Ignacio; Moreno, Rodrigo
2010-06-01
The modernization of clinical teaching has called for the creation of faculty development programs, and the design of suitable instruments to evaluate clinical teachers' performance. To report the development and validation of an instrument in Spanish designed to measure the students' perceptions of their clinical teachers' performance and to provide them with feedback to improve their teaching practices. In a process that included the active participation of authorities, professors in charge of courses and internships, clinical teachers, students and medical education experts, we developed a 30-item questionnaire called MEDUC30 to evaluate the performance of clinical teachers by their students. The internal validity was assessed by factor analysis of 5214 evaluations of 265 teachers, gathered from 2004 to 2007. The reliability was measured with the Cronbach's alpha coefficient and the generalizability coefficient (g). MEDUC30 had good content and construct validity. Its internal structure was compatible with four factors: patient-centered teaching, teaching skills, assessment skills and learning climate, and it proved to be consistent with the structure anticipated by the theory. The scores were highly reliable (Cronbach's alpha: 0.97); five evaluations per teacher were sufficient to reach a reliability coefficient (g) of 0.8. MEDUC30 is a valid, reliable and useful instrument to evaluate the performance of clinical teachers. To our knowledge, this is the first instrument in Spanish for which solid validity and reliability evidences have been reported. We hope that MEDUC30 will be used to improve medical education in Spanish-speaking medical schools, providing teachers a specific feedback upon which to improve their pedagogical practice, and authorities with valuable information for the assessment of their faculty.
Cooley, Richard L.
1982-01-01
Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
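The flavor of the second scale of prior information can be sketched by treating the best available parameter estimates as extra observations weighted by their assumed reliability. This generic stacked least-squares formulation is an illustration, not Cooley's exact estimator.

```python
import numpy as np

def regression_with_prior(X, y, prior_mean, prior_sd):
    """Least squares with prior parameter estimates appended as
    pseudo-observations weighted by 1/prior_sd."""
    W = np.diag(1.0 / np.asarray(prior_sd, dtype=float))
    X_aug = np.vstack([X, W])                 # one pseudo-row per parameter
    y_aug = np.concatenate([y, W @ prior_mean])
    beta, *_ = np.linalg.lstsq(X_aug, y_aug, rcond=None)
    return beta

# Illustrative two-parameter problem with informative priors
rng = np.random.default_rng(2)
X = rng.normal(size=(30, 2))
y = X @ np.array([1.5, -0.7]) + 0.5 * rng.normal(size=30)
print(regression_with_prior(X, y, prior_mean=np.array([1.0, -1.0]), prior_sd=[0.5, 0.5]))
```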
ERIC Educational Resources Information Center
Keller, Lisa A.; Clauser, Brian E.; Swanson, David B.
2010-01-01
In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates…
Generalizability Theory and Classical Test Theory
ERIC Educational Resources Information Center
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
On the magnetic circular dichroism of benzene. A density-functional study
NASA Astrophysics Data System (ADS)
Kaminský, Jakub; Kříž, Jan; Bouř, Petr
2017-04-01
Spectroscopy of magnetic circular dichroism (MCD) provides enhanced information on molecular structure and a more reliable assignment of spectral bands than absorption alone. Theoretical modeling can significantly enhance the information obtained from experimental spectra. In the present study, the time dependent density functional theory is employed to model the lowest-energy benzene transitions, in particular to investigate the role of the Rydberg states and vibrational interference in spectral intensities. The effect of solvent is explored on model benzene-methane clusters. For the lowest-energy excitation, the vibrational sub-structure of absorption and MCD spectra is modeled within the harmonic approximation, providing a very good agreement with the experiment. The simulations demonstrate that the Rydberg states have a much stronger effect on the MCD intensities than on the absorption, and a very diffuse basis set must be used to obtain reliable results. The modeling also indicates that the Rydberg-like states and associated transitions may persist in solutions. Continuum-like solvent models are thus not suitable for their modeling; solvent-solute clusters appear to be more appropriate, providing they are large enough.
Laser-Based Surface Modification of Microstructure for Carbon Fiber-Reinforced Plastics
NASA Astrophysics Data System (ADS)
Yang, Wenfeng; Sun, Ting; Cao, Yu; Li, Shaolong; Liu, Chang; Tang, Qingru
2018-05-01
Bonding repair is a powerful feature of carbon fiber-reinforced plastics (CFRP). Based on the theory of interface bonding, the interface adhesion strength and reliability of the CFRP structure will be directly affected by the microscopic features of the CFRP surface, including its microstructural, physical, and chemical characteristics. In this paper, laser-based surface modification was compared with peel-ply, grinding, and polishing to evaluate the resulting surface microstructure of CFRP. The surface microstructure, morphology, fiber damage, and height and spacing parameters were investigated by scanning electron microscopy (SEM) and laser confocal microscopy (LCM). Relative to the conventional grinding process, laser modification of the CFRP surface can result in more uniform resin removal and better processing control and repeatability. This decreases the adverse impact of surface fiber fractures and secondary damage. The surface properties were significantly optimized, reflected in clear improvements in surface roughness, microstructure uniformity, and effective surface area. The improved surface microstructure produced by laser modification is more conducive to interface bonding in CFRP structure repair, which can enhance the interfacial adhesion strength and reliability of the repair.
ERIC Educational Resources Information Center
Kim, Sooyeon; Livingston, Samuel A.
2017-01-01
The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
ICAN: A versatile code for predicting composite properties
NASA Technical Reports Server (NTRS)
Ginty, C. A.; Chamis, C. C.
1986-01-01
The Integrated Composites ANalyzer (ICAN), a stand-alone computer code, incorporates micromechanics equations and laminate theory to analyze/design multilayered fiber composite structures. Procedures for both the implementation of new data in ICAN and the selection of appropriate measured data are summarized for: (1) composite systems subject to severe thermal environments; (2) woven fabric/cloth composites; and (3) the selection of new composite systems including those made from high strain-to-fracture fibers. The comparisons demonstrate the versatility of ICAN as a reliable method for determining composite properties suitable for preliminary design.
Development of the Computer-Adaptive Version of the Late-Life Function and Disability Instrument
Tian, Feng; Kopits, Ilona M.; Moed, Richard; Pardasaney, Poonam K.; Jette, Alan M.
2012-01-01
Background. Having psychometrically strong disability measures that minimize response burden is important in the assessment of older adults. Methods. Using the original 48 items from the Late-Life Function and Disability Instrument and newly developed items, a 158-item Activity Limitation and a 62-item Participation Restriction item pool were developed. The item pools were administered to a convenience sample of 520 community-dwelling adults 60 years or older. Confirmatory factor analysis and item response theory were employed to identify content structure, calibrate items, and build the computer-adaptive tests (CATs). We evaluated real-data simulations of 10-item CAT subscales. We collected data from 102 older adults to validate the 10-item CATs against the Veteran's Short Form-36 and assessed test–retest reliability in a subsample of 57 subjects. Results. Confirmatory factor analysis revealed a bifactor structure, and multi-dimensional item response theory was used to calibrate an overall Activity Limitation Scale (141 items) and an overall Participation Restriction Scale (55 items). Fit statistics were acceptable (Activity Limitation: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.03; Participation Restriction: comparative fit index = 0.95, Tucker Lewis Index = 0.95, root mean square error of approximation = 0.05). Correlations of the 10-item CATs with the full item banks were substantial (Activity Limitation: r = .90; Participation Restriction: r = .95). Test–retest reliability estimates were high (Activity Limitation: r = .85; Participation Restriction: r = .80). The strength and pattern of correlations with Veteran's Short Form-36 subscales were as hypothesized. Each CAT, on average, took 3.56 minutes to administer. Conclusions. The Late-Life Function and Disability Instrument CATs demonstrated strong reliability, validity, accuracy, and precision. The Late-Life Function and Disability Instrument CAT can achieve psychometrically sound disability assessment in older persons while reducing respondent burden. Further research is needed to assess their ability to measure change in older adults. PMID:22546960
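The step that lets a 10-item CAT track a 141-item bank is maximum-information item selection. A minimal two-parameter logistic (2PL) sketch with hypothetical item parameters, rather than the instrument's calibrated bank, is:

```python
import numpy as np

def info_2pl(theta, a, b):
    """Fisher information of 2PL items at ability theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

def next_item(theta_hat, a, b, administered):
    """Pick the unused item with maximum information at the current
    ability estimate -- the core step of a CAT."""
    info = info_2pl(theta_hat, a, b)
    info[list(administered)] = -np.inf       # exclude items already given
    return int(np.argmax(info))

# Hypothetical 100-item bank
rng = np.random.default_rng(3)
a = rng.uniform(0.8, 2.0, size=100)          # discriminations
b = rng.normal(size=100)                     # difficulties
print(next_item(theta_hat=0.3, a=a, b=b, administered={5, 17}))
```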
Improving Predictions of Multiple Binary Models in ILP
2014-01-01
Despite the success of ILP systems in learning first-order rules from small numbers of examples and complexly structured data in various domains, they struggle to deal with multiclass problems. In most cases they boil a multiclass problem down into multiple black-box binary problems, following the one-versus-one or one-versus-rest binarisation techniques, and learn a theory for each one. When evaluating the learned theories of multiclass problems, particularly in the one-versus-rest paradigm, there is a bias caused by the default rule toward the negative classes, leading to unrealistically high performance, besides the lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when it comes to evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule-list or rule-set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657
An Investigation of the Impact of Guessing on Coefficient α and Reliability
2014-01-01
Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions where item difficulty, item discrimination, and guessing levels actually vary across items with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also focused on examining interaction effects between guessing and other variables entered in the simulation to be more realistic. The simulation of the more realistic conditions and calculations of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
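The mechanism can be reproduced in a few lines: simulate 3PL responses with and without a guessing floor and compare coefficient alpha. This is a simulation-based sketch rather than the paper's analytic IRT-to-CTT expressions, and all item parameters below are illustrative.

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """3PL response probability: guessing floor c plus a 2PL component."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * np.subtract.outer(theta, b)))

def coefficient_alpha(X):
    """Cronbach's alpha from an examinees-by-items score matrix."""
    k = X.shape[1]
    return k / (k - 1) * (1.0 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))

rng = np.random.default_rng(4)
theta = rng.normal(size=2000)                # abilities
a, b = rng.uniform(0.8, 2.0, 40), rng.normal(size=40)
for c in (0.0, 0.25):                        # no guessing vs. 4-option guessing
    X = (rng.random((2000, 40)) < p_3pl(theta, a, b, c)).astype(float)
    print(c, round(coefficient_alpha(X), 3)) # alpha drops when c > 0
```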
On the use of LiF:Mg,Ti thermoluminescence dosemeters in space--a critical review.
Horowitz, Y S; Satinger, D; Fuks, E; Oster, L; Podpalov, L
2003-01-01
The use of LiF:Mg,Ti thermoluminescence dosemeters (TLDs) in space radiation fields is reviewed. It is demonstrated in the context of modified track structure theory and microdosimetric track structure theory that there is no unique correlation between the relative thermoluminescence (TL) efficiency of heavy charged particles, neutrons of all energies and linear energy transfer (LET). Many experimental measurements dating back more than two decades also demonstrate the multivalued, non-universal, relationship between relative TL efficiency and LET. It is further demonstrated that the relative intensities of the dosimetric peaks and especially the high-temperature structure are dependent on a large number of variables, some controllable, some not. It is concluded that TL techniques employing the concept of LET (e.g. measurement of total dose, the high-temperature ratio (HTR) methods and other combinations of the relative TL efficiency of the various peaks used to estimate average Q or simulate Q-LET relationships) should be regarded as lacking a sound theoretical basis, highly prone to error and, as well, lack of reproducibility/universality due to the absence of a standardised experimental protocol essential to reliable experimental methodology.
Theory of the n = 2 levels in muonic helium-3 ions
NASA Astrophysics Data System (ADS)
Franke, Beatrice; Krauth, Julian J.; Antognini, Aldo; Diepold, Marc; Kottmann, Franz; Pohl, Randolf
2017-12-01
The present knowledge of Lamb shift, fine-, and hyperfine structure of the 2S and 2P states in muonic helium-3 ions is reviewed in anticipation of the results of a first measurement of several 2S → 2P transition frequencies in the muonic helium-3 ion, μ3He+. This ion is the bound state of a single negative muon μ- and a bare helium-3 nucleus (helion), 3He++. A term-by-term comparison of all available sources, including new, updated, and so far unpublished calculations, reveals reliable values and uncertainties of the QED and nuclear structure-dependent contributions to the Lamb shift and the hyperfine splitting. These values are essential for the determination of the helion rms charge radius and the nuclear structure effects to the hyperfine splitting in μ3He+. With this review we continue our series of theory summaries in light muonic atoms [see A. Antognini et al., Ann. Phys. 331, 127 (2013); J.J. Krauth et al., Ann. Phys. 366, 168 (2016); and M. Diepold et al.
Advances in Micromechanics Modeling of Composites Structures for Structural Health Monitoring
NASA Astrophysics Data System (ADS)
Moncada, Albert
Although high-performance, lightweight composites are increasingly used in applications ranging from aircraft, rotorcraft, and weapon systems to ground vehicles, the assurance of structural reliability remains a critical issue. In composites, damage is absorbed through various fracture processes, including fiber failure, matrix cracking, and delamination. An important element in achieving reliable composite systems is a strong capability for assessing and inspecting physical damage of critical structural components. Installation of a robust Structural Health Monitoring (SHM) system would be very valuable in detecting the onset of composite failure. A number of major issues still require serious attention in connection with the research and development of sensor-integrated, reliable SHM systems for composite structures. In particular, the sensitivity of currently available sensor systems does not allow detection of micro-level damage, which limits the capability of data-driven SHM systems. As a fundamental layer in SHM, modeling can provide in-depth information on material and structural behavior for sensing and detection, as well as data for learning algorithms. This dissertation focuses on the development of a multiscale analysis framework, which is used to detect various forms of damage in complex composite structures. A generalized method of cells based micromechanics analysis, as implemented in NASA's MAC/GMC code, is used for the micro-level analysis. First, a baseline study of MAC/GMC is performed to determine the governing failure theories that best capture the damage progression. The deficiencies associated with various layups and loading conditions are addressed. In most micromechanics analyses, a representative unit cell (RUC) with a common fiber packing arrangement is used. The effect of variation in this arrangement within the RUC has been studied, and the results indicate that this variation influences the macro-scale effective material properties and failure stresses. The developed model has been used to simulate impact damage in a composite beam and an airfoil structure, and the model data were verified through active interrogation using piezoelectric sensors. The multiscale model was further extended to develop a coupled damage and wave attenuation model, which was used to study damage states such as fiber-matrix debonding in composite structures with surface-bonded piezoelectric sensors.
Daniels, Vijay J; Bordage, Georges; Gierl, Mark J; Yudkowsky, Rachel
2014-10-01
Objective structured clinical examinations (OSCEs) are used worldwide for summative examinations but often lack acceptable reliability. Research has shown that reliability of scores increases if OSCE checklists for medical students include only clinically relevant items. Also, checklists often omit evidence-based items that high-achieving learners are more likely to use. The purpose of this study was to determine whether limiting checklist items to clinically discriminating items and/or adding missing evidence-based items improved score reliability in an Internal Medicine residency OSCE. Six internists reviewed the traditional checklists of four OSCE stations, classifying items as clinically discriminating or non-discriminating. Two independent reviewers augmented the checklists with missing evidence-based items. We used generalizability theory to calculate the overall reliability of faculty observer checklist scores from 45 first- and second-year residents and to predict how many 10-item stations would be required to reach a Phi coefficient of 0.8. Removing clinically non-discriminating items from the traditional checklist did not affect the number of stations (15) required to reach a Phi of 0.8 with 10 items. Focusing the checklist on only evidence-based clinically discriminating items increased test score reliability, needing 11 stations instead of 15 to reach 0.8; adding missing evidence-based clinically discriminating items to the traditional checklist modestly improved reliability (needing 14 instead of 15 stations). Checklists composed of evidence-based clinically discriminating items improved the reliability of checklist scores and reduced the number of stations needed for acceptable reliability. Educators should give preference to evidence-based items over non-evidence-based items when developing OSCE checklists.
NASA Astrophysics Data System (ADS)
Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.
2016-05-01
We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ˜1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
NASA Technical Reports Server (NTRS)
Lee, Timothy J.; Head-Gordon, Martin; Rendell, Alistair P.; Langhoff, Stephen R. (Technical Monitor)
1995-01-01
A diagnostic for perturbation theory calculations, S2, is defined and numerical results are compared to the established T1 diagnostic from coupled-cluster theory. S2 is the lowest-order non-zero contribution to a perturbation expansion of T1. S2 is a reasonable estimate of the importance of non-dynamical electron correlation, although not as reliable as T1. S2 values less than or equal to 0.012 suggest that low orders of perturbation theory should yield reasonable results; S2 values between 0.012 and 0.015 suggest that caution is required in interpreting results from low orders of perturbation theory; S2 values greater than or equal to 0.015 indicate that low orders of perturbation theory are not reliable for accurate results. Although not required mathematically, S2 is always less than T1 for the examples studied here.
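The quoted bands translate directly into a small decision helper (the band edges 0.012 and 0.015 come from the abstract; everything else is illustrative):

```python
def s2_interpretation(s2):
    """Map an S2 diagnostic value onto the qualitative bands quoted above."""
    if s2 <= 0.012:
        return "low orders of perturbation theory should yield reasonable results"
    elif s2 < 0.015:
        return "caution required in interpreting low-order results"
    else:
        return "low orders of perturbation theory are not reliable"

for value in (0.010, 0.013, 0.020):
    print(f"S2 = {value:.3f}: {s2_interpretation(value)}")
```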
Fleig, Timo; Knecht, Stefan; Hättig, Christof
2007-06-28
We study the ground-state structures and singlet- and triplet-excited states of the nucleic acid bases by applying the coupled cluster model CC2 in combination with a resolution-of-the-identity approximation for electron interaction integrals. Both basis set effects and the influence of dynamic electron correlation on the molecular structures are elucidated; the latter by comparing CC2 with Hartree-Fock and Møller-Plesset perturbation theory to second order. Furthermore, we investigate basis set and electron correlation effects on the vertical excitation energies and compare our highest-level results with experiment and other theoretical approaches. It is shown that small basis sets are insufficient for obtaining accurate results for excited states of these molecules and that the CC2 approach to dynamic electron correlation is a reliable and efficient tool for electronic structure calculations on medium-sized molecules.
NASA Technical Reports Server (NTRS)
1970-01-01
Reliability Abstracts and Technical Reviews is an abstract and critical analysis service covering published and report literature on reliability. The service is designed to provide information on theory and practice of reliability as applied to aerospace and an objective appraisal of the quality, significance, and applicability of the literature abstracted.
Keller, Lisa A; Clauser, Brian E; Swanson, David B
2010-12-01
In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring this stratification in the reliability analysis results in an underestimate of "parallel forms" reliability, and an overestimate of the person-by-task component. This research explores the effect of representing and misrepresenting the stratification appropriately in estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that the proper specification of the analytic design is essential in yielding the proper information both about the generalizability of the assessment and the standard error of measurement. Further, illustrative D studies present the effect under a variety of situations and test designs. Additional benefits of multivariate generalizability theory in test design and evaluation are also discussed.
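For orientation, a minimal univariate D study for a fully crossed person x task design looks like the sketch below; the variance components are invented, and the simple formulas shown deliberately ignore the stratification whose effects the paper analyzes.

```python
def d_study(var_p, var_t, var_pt, n_tasks):
    """Project the G (relative) and Phi (absolute) coefficients for a crossed
    person x task design with n_tasks tasks (standard G-theory formulas)."""
    rel_err = var_pt / n_tasks               # relative error: person-by-task residual
    abs_err = (var_t + var_pt) / n_tasks     # absolute error adds the task main effect
    return var_p / (var_p + rel_err), var_p / (var_p + abs_err)

var_p, var_t, var_pt = 0.30, 0.10, 0.60      # illustrative variance components
for n in (1, 5, 10, 15):
    g, phi = d_study(var_p, var_t, var_pt, n)
    print(f"n_tasks = {n:2d}: G = {g:.2f}, Phi = {phi:.2f}")
```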
NASA Astrophysics Data System (ADS)
Zahedifar, Maedeh; Kratzer, Peter
2018-01-01
Various ab initio approaches to the band structure of ANiSn and ACoSb half-Heusler compounds (A = Ti, Zr, Hf) are compared and their consequences for the prediction of thermoelectric properties are explored. Density functional theory with the generalized-gradient approximation (GGA), as well as the hybrid density functional HSE06 and ab initio many-body perturbation theory in the form of the GW0 approach, are employed. The GW0 calculations confirm the trend of a smaller band gap (0.75 to 1.05 eV) in ANiSn compared to the ACoSb compounds (1.13 to 1.44 eV) already expected from the GGA calculations. While in ANiSn materials the GW0 band gap is 20% to 50% larger than in HSE06, the fundamental gap of ACoSb materials is smaller in GW0 than in HSE06. This is because GW0, similar to PBE, locates the valence band maximum at the L point of the Brillouin zone, whereas it is at the Γ point in the HSE06 calculations. The differences are attributed to the observation that the relative positions of the d levels of the transition metal atoms vary among the different methods. Using the calculated band structures and scattering rates taking into account the band effective masses at the extrema, the Seebeck coefficients, thermoelectric power factors, and figures of merit ZT are predicted for all six half-Heusler compounds. Comparable performance is predicted for the n-type ANiSn materials, whereas clear differences are found for the p-type ACoSb materials. Using the most reliable GW0 electronic structure, ZrCoSb is predicted to be the most efficient material, with a power factor of up to 0.07 W/(K² m) at a temperature of 600 K. We find strong variations among the different ab initio methods not only in the prediction of the maximum power factor and ZT value of a given material, but also in comparing different materials to each other, in particular for the p-type thermoelectric materials. Thus we conclude that the most elaborate, but also most costly, GW0 method is required to perform a reliable computational search for the optimum material.
ERIC Educational Resources Information Center
Anderson, Daniel; Park, Bitnara Jasmine; Lai, Cheng-Fei; Alonzo, Julie; Tindal, Gerald
2012-01-01
This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due…
Composite Beam Theory with Material Nonlinearities and Progressive Damage
NASA Astrophysics Data System (ADS)
Jiang, Fang
Beam structures have historically found broad application. Nowadays, many engineering constructions still rely on this type of structure, which may be made of anisotropic and heterogeneous materials. These applications motivate the development of beam theories in which the impact of material nonlinearities and damage on the global constitutive behavior has been a focus in recent years. Reliable predictions of these nonlinear beam responses depend not only on the quality of the material description but also on a comprehensively generalized multiscale methodology that fills the theoretical gaps between the scales in an efficient yet high-fidelity manner. Conventional beam modeling methodologies built upon ad hoc assumptions lack the reliability needed. Therefore, the focus of this dissertation is to create a reliable yet efficient method, and the corresponding tool, for composite beam modeling. A nonlinear beam theory is developed based on the Mechanics of Structure Genome (MSG) using the variational asymptotic method (VAM). The three-dimensional (3D) nonlinear continuum problem is rigorously reduced to a one-dimensional (1D) beam model and a two-dimensional (2D) cross-sectional analysis featuring both geometric and material nonlinearities by exploiting the small geometric parameter that is an inherent characteristic of the beam. The 2D nonlinear cross-sectional analysis utilizes the 3D material models to homogenize the beam cross-sectional constitutive responses, considering nonlinear elasticity and progressive damage. The results of this homogenization enter as constitutive laws into the global nonlinear 1D beam analysis. The theoretical foundation is formulated without unnecessary kinematic assumptions. Curvilinear coordinates and vector calculus are used to build the 3D deformation gradient tensor, whose components are formulated in terms of cross-sectional coordinates, generalized beam strains, unknown warping functions, and the 3D spatial gradients of these warping functions. Asymptotic analysis of the extended Hamilton's principle suggests dropping the terms involving axial gradients of the warping functions. As a result, the solid mechanics problem posed on a 3D continuum is dimensionally reduced to one of solving for the warping functions on a 2D cross-sectional field while minimizing the information loss. The present theory is implemented using the finite element method (FEM) in Variational Asymptotic Beam Sectional Analysis (VABS), a general-purpose cross-sectional analysis tool. An iterative method is applied to solve the finite warping field for the classical-type model in the form of the Euler-Bernoulli beam theory. The deformation gradient tensor is used directly to enable the handling of finite deformation, various strain definitions, and several types of material constitutive laws covering nonlinear elasticity and progressive damage. Analytical and numerical examples are given for various problems, including the trapeze effect, Poynting effect, Brazier effect, extension-bending coupling effect, and free-edge damage. By comparison with predictions from 3D finite element analyses (FEA), 2D FEA based on plane-stress assumptions, and experimental data, the structural and material responses are shown to be rigorously captured by the present theory while the computational cost is significantly reduced.
Owing to the semi-analytical nature of the code developed, VABS avoids the unrealistic numerical issues widely seen in conventional FEA with strain-softening material behaviors. In light of these intrinsic features, nonlinear elastic and inelastic 3D material models can be economically calibrated by matching VABS predictions directly against experimental measurements from slender coupons. Furthermore, the global behavior of slender composite structures spanning meters can also be characterized effectively by VABS without unnecessary loss of important information about the local laminae at the micrometer scale.
IRT-Estimated Reliability for Tests Containing Mixed Item Formats
ERIC Educational Resources Information Center
Shu, Lianghua; Schwarz, Richard D.
2014-01-01
As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's a, Feldt-Raju, stratified a, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…
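As a concrete reference point for one of the four coefficients, here is a classical sample-based computation of stratified α for a test mixing multiple-choice and constructed-response parts; the data are synthetic and the formula is the standard CTT one, not the IRT-parameter derivation the report presents.

```python
import numpy as np

def cronbach_alpha(x):
    """Coefficient alpha for an items-in-columns score matrix."""
    k = x.shape[1]
    return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

def stratified_alpha(strata):
    """Stratified alpha: 1 - sum_i var(X_i)(1 - alpha_i) / var(X), where X_i is
    the observed score on stratum i and X is the total (classical formula)."""
    totals = [s.sum(axis=1) for s in strata]
    grand = np.sum(totals, axis=0)
    penalty = sum(t.var(ddof=1) * (1 - cronbach_alpha(s))
                  for t, s in zip(totals, strata))
    return 1 - penalty / grand.var(ddof=1)

rng = np.random.default_rng(1)
ability = rng.normal(size=(1000, 1))
mcq = (ability + rng.normal(size=(1000, 30)) > 0).astype(float)  # dichotomous items
cr = ability + rng.normal(scale=1.5, size=(1000, 4))             # open-ended scores
print(f"stratified alpha = {stratified_alpha([mcq, cr]):.3f}")
```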
Structural reliability assessment capability in NESSUS
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.
1992-01-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
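NESSUS itself is not reproduced here, but the headline quantity it computes can be sketched with plain Monte Carlo for a toy limit state g = R - S (failure when the load effect exceeds the resistance); the distributions and parameters are invented, and for two independent normals the closed-form reliability index provides a sanity check.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 1_000_000
mu_r, sd_r = 300.0, 30.0   # resistance, e.g. MPa
mu_s, sd_s = 200.0, 40.0   # load effect
r = rng.normal(mu_r, sd_r, n)
s = rng.normal(mu_s, sd_s, n)
pf_mc = np.mean(r - s < 0.0)               # Monte Carlo failure probability

# For independent normals the reliability index has a closed form.
beta = (mu_r - mu_s) / np.hypot(sd_r, sd_s)
print(f"Monte Carlo P_f = {pf_mc:.2e}, reliability index beta = {beta:.2f}")
```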
Structural reliability assessment capability in NESSUS
NASA Astrophysics Data System (ADS)
Millwater, H.; Wu, Y.-T.
1992-07-01
The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.
Using Perturbation Theory to Reduce Noise in Diffusion Tensor Fields
Bansal, Ravi; Staib, Lawrence H.; Xu, Dongrong; Laine, Andrew F.; Liu, Jun; Peterson, Bradley S.
2009-01-01
We propose the use of Perturbation theory to reduce noise in Diffusion Tensor (DT) fields. Diffusion Tensor Imaging (DTI) encodes the diffusion of water molecules along different spatial directions in a positive-definite, 3 × 3 symmetric tensor. Eigenvectors and eigenvalues of DTs allow the in vivo visualization and quantitative analysis of white matter fiber bundles across the brain. The validity and reliability of these analyses are limited, however, by the low spatial resolution and low Signal-to-Noise Ratio (SNR) in DTI datasets. Our procedures can be applied to improve the validity and reliability of these quantitative analyses by reducing noise in the tensor fields. We model a tensor field as a three-dimensional Markov Random Field and then compute the likelihood and the prior terms of this model using Perturbation theory. The prior term constrains the tensor field to be smooth, whereas the likelihood term constrains the smoothed tensor field to be similar to the original field. Thus, the proposed method generates a smoothed field that is close in structure to the original tensor field. We evaluate the performance of our method both visually and quantitatively using synthetic and real-world datasets. We quantitatively assess the performance of our method by computing the SNR for eigenvalues and the coherence measures for eigenvectors of DTs across tensor fields. In addition, we quantitatively compare the performance of our procedures with the performance of one method that uses a Riemannian distance to compute the similarity between two tensors, and with another method that reduces noise in tensor fields by anisotropically filtering the diffusion weighted images that are used to estimate diffusion tensors. These experiments demonstrate that our method significantly increases the coherence of the eigenvectors and the SNR of the eigenvalues, while simultaneously preserving the fine structure and boundaries between homogeneous regions, in the smoothed tensor field. PMID:19540791
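The eigen-decomposition underlying these analyses is easy to sketch; the tensor field below is synthetic, and the mean pairwise alignment of principal eigenvectors is only a simple stand-in for the coherence measures the paper computes.

```python
import numpy as np

rng = np.random.default_rng(7)

def noisy_dt(base, noise=1e-4):
    """A noisy symmetric diffusion tensor near a prolate base tensor."""
    e = rng.normal(scale=noise, size=(3, 3))
    return base + 0.5 * (e + e.T)            # symmetrize the perturbation

base = np.diag([1.7e-3, 0.3e-3, 0.3e-3])    # prolate: one dominant direction
field = [noisy_dt(base) for _ in range(100)]

principal = []
for d in field:
    w, v = np.linalg.eigh(d)                 # eigenvalues in ascending order
    principal.append(v[:, -1])               # dominant diffusion direction
v_mat = np.array(principal)                  # (100, 3) unit vectors
align = np.abs(v_mat @ v_mat.T)              # |cos| between every pair of directions
print(f"mean alignment of principal eigenvectors: {align.mean():.3f}")
```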
Integrated Safety Risk Reduction Approach to Enhancing Human-Rated Spaceflight Safety
NASA Astrophysics Data System (ADS)
Mikula, J. F. Kip
2005-12-01
This paper explores and defines the currently accepted concept and philosophy of safety improvement based on reliability enhancement (called here Reliability Enhancement Based Safety Theory [REBST]). In this theory a reliability calculation is used as a measure of the safety achieved on the program. This calculation may be based on a math model or a Fault Tree Analysis (FTA) of the system, or on an Event Tree Analysis (ETA) of the system's operational mission sequence. In each case, the numbers used in the calculation are hardware failure rates gleaned from past similar programs. A fictional but representative case study is provided that helps to illustrate the problems and inaccuracies of this approach to safety determination. Then a safety determination and enhancement approach based on hazard analysis, worst-case analysis, and safety risk determination (called here Worst Case Based Safety Theory [WCBST]) is defined and detailed using the same example case study. In the end it is concluded that an approach combining the two theories works best to reduce safety risk.
Zheng, Y.
2013-01-01
Temporal sound cues are essential for sound recognition, pitch, rhythm, and timbre perception, yet how auditory neurons encode such cues is the subject of ongoing debate. Rate coding theories propose that temporal sound features are represented by rate-tuned modulation filters. However, overwhelming evidence also suggests that precise spike timing is an essential attribute of the neural code. Here we demonstrate that single neurons in the auditory midbrain employ a proportional code in which spike-timing precision and firing reliability covary with the sound envelope cues to provide an efficient representation of the stimulus. Spike-timing precision varied systematically with the timescale and shape of the sound envelope and yet was largely independent of the sound modulation frequency, a prominent cue for pitch. In contrast, spike-count reliability was strongly affected by the modulation frequency. Spike-timing precision extends from sub-millisecond for brief transient sounds up to tens of milliseconds for sounds with slowly varying envelopes. Information-theoretic analysis further confirms that spike-timing precision depends strongly on the sound envelope shape, while firing reliability is strongly affected by the sound modulation frequency. Both the information efficiency and the total information were limited by the firing reliability and spike-timing precision in a manner that reflected the sound structure. This result supports a temporal coding strategy in the auditory midbrain where proportional changes in spike-timing precision and firing reliability can efficiently signal shape and periodicity temporal cues. PMID:23636724
Exposing Students to the Idea that Theories Can Change
ERIC Educational Resources Information Center
Hoellwarth, Chance; Moelter, Matthew J.
2011-01-01
The scientific method is arguably the most reliable way to understand the physical world, yet this aspect of science is rarely addressed in introductory science courses. Students typically learn about the theory in its final, refined form, and seldom experience the experiment-to-theory cycle that goes into producing the theory. One exception to…
Covariate-free and Covariate-dependent Reliability.
Bentler, Peter M
2016-12-01
Classical test theory reliability coefficients are said to be population specific. Reliability generalization, a meta-analysis method, is the main procedure for evaluating the stability of reliability coefficients across populations. A new approach is developed to evaluate the degree of invariance of reliability coefficients to population characteristics. Factor or common variance of a reliability measure is partitioned into parts that are, and are not, influenced by control variables, resulting in a partition of reliability into a covariate-dependent and a covariate-free part. The approach can be implemented in a single sample and can be applied to a variety of reliability coefficients.
NASA Astrophysics Data System (ADS)
Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng
2017-12-01
Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of cracking due to rebar corrosion expansion in concrete was established that is able to account for the internal forces of the structure. Then, by means of the complex variable function theory and series expansion techniques established by Muskhelishvili, the corresponding stress component functions of the concrete around the reinforcement were obtained. A comparative analysis was also conducted between a numerical simulation model and the present model. The results show that the calculation results of the two methods are consistent with each other, with a numerical deviation of less than 10%, indicating that the analytical model established in this paper is reliable.
Focusing the view on nature's water-splitting catalyst.
Zein, Samir; Kulik, Leonid V; Yano, Junko; Kern, Jan; Pushkar, Yulia; Zouni, Athina; Yachandra, Vittal K; Lubitz, Wolfgang; Neese, Frank; Messinger, Johannes
2008-03-27
Nature invented a catalyst about 3 Gyr ago, which splits water with high efficiency into molecular oxygen and hydrogen equivalents (protons and electrons). This reaction is energetically driven by sunlight and the active centre contains relatively cheap and abundant metals: manganese and calcium. This biological system therefore forms the paradigm for all man-made attempts for direct solar fuel production, and several studies are underway to determine the electronic and geometric structures of this catalyst. In this report we briefly summarize the problems and the current status of these efforts and propose a density functional theory-based strategy for obtaining a reliable high-resolution structure of this unique catalyst that includes both the inorganic core and the first ligand sphere.
Focusing the view on nature's water-splitting catalyst
Zein, Samir; Kulik, Leonid V; Yano, Junko; Kern, Jan; Pushkar, Yulia; Zouni, Athina; Yachandra, Vittal K; Lubitz, Wolfgang; Neese, Frank; Messinger, Johannes
2007-01-01
Nature invented a catalyst about 3 Gyr ago, which splits water with high efficiency into molecular oxygen and hydrogen equivalents (protons and electrons). This reaction is energetically driven by sunlight and the active centre contains relatively cheap and abundant metals: manganese and calcium. This biological system therefore forms the paradigm for all man-made attempts for direct solar fuel production, and several studies are underway to determine the electronic and geometric structures of this catalyst. In this report we briefly summarize the problems and the current status of these efforts and propose a density functional theory-based strategy for obtaining a reliable high-resolution structure of this unique catalyst that includes both the inorganic core and the first ligand sphere. PMID:17989003
Density functional theory calculations of 95Mo NMR parameters in solid-state compounds.
Cuny, Jérôme; Furet, Eric; Gautier, Régis; Le Pollès, Laurent; Pickard, Chris J; d'Espinose de Lacaillerie, Jean-Baptiste
2009-12-21
The application of periodic density functional theory-based methods to the calculation of (95)Mo electric field gradient (EFG) and chemical shift (CS) tensors in solid-state molybdenum compounds is presented. Calculations of EFG tensors are performed using the projector augmented-wave (PAW) method. Comparison of the results with those obtained using the augmented plane wave + local orbitals (APW+lo) method and with available experimental values shows the reliability of the approach for (95)Mo EFG tensor calculation. CS tensors are calculated using the recently developed gauge-including projector augmented-wave (GIPAW) method. This work is the first application of the GIPAW method to a 4d transition-metal nucleus. The effects of ultra-soft pseudo-potential parameters, exchange-correlation functionals and structural parameters are precisely examined. Comparison with experimental results allows the validation of this computational formalism.
ERIC Educational Resources Information Center
Kim, Seonghoon; Feldt, Leonard S.
2010-01-01
The primary purpose of this study is to investigate the mathematical characteristics of the test reliability coefficient rho[subscript XX'] as a function of item response theory (IRT) parameters and present the lower and upper bounds of the coefficient. Another purpose is to examine relative performances of the IRT reliability statistics and two…
Fransson, Thomas; Burdakova, Daria; Norman, Patrick
2016-05-21
X-ray absorption spectra of carbon, silicon, germanium, and sulfur compounds have been investigated by means of damped four-component density functional response theory. It is demonstrated that a reliable description of relativistic effects is obtained at both K- and L-edges. Notably, an excellent agreement with experimental results is obtained for L2,3-spectra, with spin-orbit effects well accounted for, also in cases when the experimental intensity ratio deviates from the statistical one of 2:1. The theoretical results are consistent with calculations using standard response theory as well as recently reported real-time propagation methods in time-dependent density functional theory, and the virtues of the different approaches are discussed. As compared to silane and silicon tetrachloride, an anomalous error in the absolute energy is reported for the L2,3-spectrum of silicon tetrafluoride, amounting to an additional spectral shift of ∼1 eV. This anomaly is also observed for other exchange-correlation functionals, but it is seen neither at other silicon edges nor at the carbon K-edge of fluorine derivatives of ethene. Considering the series of molecules SiH4-xFx with x = 1, 2, 3, 4, a gradual divergence from interpolated experimental ionization potentials is observed at the level of Kohn-Sham density functional theory (DFT), and to a smaller extent with the use of Hartree-Fock. This anomalous error is thus attributed partly to difficulties in correctly emulating the electronic structure effects imposed by the very electronegative fluorines, and partly to inconsistencies in the spurious electron self-repulsion in DFT. Substitution with one, or possibly two, fluorine atoms is estimated to yield small enough errors to allow for reliable interpretations and predictions of L2,3-spectra of more complex and extended silicon-based systems.
Applying Gradient Descent in Convolutional Neural Networks
NASA Astrophysics Data System (ADS)
Cui, Nan
2018-04-01
With the development of integrated circuits and computer science, people care more about solving practical issues via information technologies. Along with that, a field called Artificial Intelligence (AI) has emerged. One popular research interest in AI is recognition algorithms. In this paper, one of the most common algorithms, the Convolutional Neural Network (CNN), will be introduced for image recognition. Understanding its theory and structure is of great significance for every scholar who is interested in this field. A convolutional neural network is an artificial neural network that combines the mathematical operation of convolution with a neural network. The hierarchical structure of the CNN provides it with reliable computational speed and a reasonable error rate. The most significant characteristics of CNNs are feature extraction, weight sharing, and dimension reduction. Meanwhile, by combining the back propagation (BP) mechanism with the gradient descent (GD) method, CNNs have the ability to self-train and perform in-depth learning. Basically, BP provides backward feedback for enhancing reliability, and GD is used for the self-training process. This paper mainly discusses the CNN and the related BP and GD algorithms, including the basic structure and function of the CNN, details of each layer, the principles and features of BP and GD, and some examples in practice, with a summary at the end.
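The gradient descent update at the heart of the training loop can be shown in a few lines on a toy least-squares problem; this is the same rule that backpropagation applies layer by layer in a CNN, although the example below is deliberately not a CNN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: recover weights w by gradient descent on a mean-squared-error loss.
x = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = x @ w_true + rng.normal(scale=0.1, size=200)

w = np.zeros(3)
lr = 0.1                                   # learning rate (step size)
for _ in range(500):
    grad = 2 / len(y) * x.T @ (x @ w - y)  # gradient of the MSE w.r.t. w
    w -= lr * grad                         # step along the negative gradient
print(f"recovered weights: {np.round(w, 3)}")  # close to w_true
```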
A safety-based decision making architecture for autonomous systems
NASA Technical Reports Server (NTRS)
Musto, Joseph C.; Lauderbaugh, L. K.
1991-01-01
Engineering systems designed specifically for space applications often exhibit a high level of autonomy in the control and decision-making architecture. As the level of autonomy increases, more emphasis must be placed on assimilating the safety functions normally executed at the hardware level or by human supervisors into the control architecture of the system. The development of a decision-making structure which utilizes information on system safety is detailed. A quantitative measure of system safety, called the safety self-information, is defined. This measure is analogous to the reliability self-information defined by McInroy and Saridis, but includes weighting of task constraints to provide a measure of both reliability and cost. An example is presented in which the safety self-information is used as a decision criterion in a mobile robot controller. The safety self-information is shown to be consistent with the entropy-based Theory of Intelligent Machines defined by Saridis.
Durability evaluation of ceramic components using CARES/LIFE
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.
1994-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens which exhibit SCG when exposed to water.
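The two-parameter Weibull strength model at the core of this methodology has a simple closed form; the characteristic strength and Weibull modulus below are illustrative placeholders, not values fitted to the alumina data.

```python
import numpy as np

def weibull_pf(stress, sigma_0, m):
    """Two-parameter Weibull probability of failure at a given stress:
    P_f = 1 - exp(-(stress / sigma_0)**m)."""
    return 1.0 - np.exp(-(stress / sigma_0) ** m)

sigma_0, m = 400.0, 10.0   # characteristic strength (MPa) and Weibull modulus
for stress in (250.0, 300.0, 350.0, 400.0):
    print(f"stress = {stress:5.1f} MPa -> P_f = {weibull_pf(stress, sigma_0, m):.3f}")
```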
Durability evaluation of ceramic components using CARES/LIFE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nemeth, N.N.; Janosik, L.A.; Gyekenyesi, J.P.
1996-01-01
The computer program CARES/LIFE calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. This program is an extension of the CARES (Ceramics Analysis and Reliability Evaluation of Structures) computer program. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker equation. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. Application of this design methodology is demonstrated using experimental data from alumina bar and disk flexure specimens, which exhibit SCG when exposed to water.
Bond, Frank W; Hayes, Steven C; Baer, Ruth A; Carpenter, Kenneth M; Guenole, Nigel; Orcutt, Holly K; Waltz, Tom; Zettle, Robert D
2011-12-01
The present research describes the development and psychometric evaluation of a second version of the Acceptance and Action Questionnaire (AAQ-II), which assesses the construct referred to variously as acceptance, experiential avoidance, and psychological inflexibility. Results from 2,816 participants across six samples indicate the satisfactory structure, reliability, and validity of this measure. For example, the mean alpha coefficient is .84 (.78-.88), and the 3- and 12-month test-retest reliabilities are .81 and .79, respectively. Results indicate that AAQ-II scores concurrently, longitudinally, and incrementally predict a range of outcomes, from mental health to work absence rates, consistent with its underlying theory. The AAQ-II also demonstrates appropriate discriminant validity. The AAQ-II appears to measure the same concept as the AAQ-I (r=.97) but with better psychometric consistency. Copyright © 2011. Published by Elsevier Ltd.
Life Prediction Issues in Thermal/Environmental Barrier Coatings in Ceramic Matrix Composites
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Brewer, David N.; Murthy, Pappu L. N.
2001-01-01
Issues and design requirements for environmental barrier coating (EBC)/thermal barrier coating (TBC) life, both general and specific to the NASA Ultra-Efficient Engine Technology (UEET) development program, are described. The current state and trends of research, the methods in vogue for failure analysis, and the long-term behavior and life prediction of EBC/TBC systems are reported. Also, the perceived failure mechanisms, variables, and related uncertainties governing EBC/TBC system life are summarized. A combined heat transfer and structural analysis approach, based on oxidation kinetics using the Arrhenius theory, is proposed to develop a life prediction model for EBC/TBC systems. A stochastic process-based reliability approach that includes physical variables such as gas pressure, temperature, velocity, moisture content, crack density, oxygen content, etc., is suggested. Benefits of the reliability-based approach are also discussed in the report.
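The Arrhenius scaling invoked for the oxidation kinetics is worth making concrete; the activation energy and temperatures below are assumed values for illustration only.

```python
import numpy as np

R = 8.314  # J/(mol K), universal gas constant

def arrhenius_rate(k0, e_a, temperature):
    """Arrhenius rate law k = k0 * exp(-E_a / (R T)), used here to scale
    oxidation kinetics with temperature; k0 and E_a are placeholders."""
    return k0 * np.exp(-e_a / (R * temperature))

e_a = 250e3                      # J/mol, assumed activation energy
t_hot, t_cool = 1600.0, 1400.0   # K, two service temperatures
ratio = arrhenius_rate(1.0, e_a, t_hot) / arrhenius_rate(1.0, e_a, t_cool)
print(f"oxidation rate speed-up from {t_cool:.0f} K to {t_hot:.0f} K: {ratio:.0f}x")
```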
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Knight, Norman F., Jr.
2002-01-01
A lightweight energy-absorbing keel-beam concept was developed and retrofitted in a general-aviation-type aircraft to improve crashworthiness performance. The energy-absorbing beam consisted of a foam-filled cellular structure with glass-fiber and hybrid glass/Kevlar cell walls. Design, analysis, fabrication, and testing of the keel beams prior to installation and subsequent full-scale crash testing of the aircraft are described. Factors that influenced the course of the design process, such as material and fabrication constraints, damage tolerance, crush stress/strain response, seat-rail loading, and post-crush integrity, are also presented. A theory similar to the one often used for ductile metal box structures was employed, with appropriate modifications, to estimate the sustained crush loads for the beams. This analytical tool, coupled with dynamic finite element simulation using MSC.Dytran, served as the prime design and analysis tool. The validity of the theory as a reliable design tool was examined against test data from static crush tests of beam sections, while the overall performance of the energy-absorbing subfloor was assessed through dynamic testing of 24 in. long subfloor assemblies.
Halogen Bonding versus Hydrogen Bonding: A Molecular Orbital Perspective
Wolters, Lando P; Bickelhaupt, F Matthias
2012-01-01
We have carried out extensive computational analyses of the structure and bonding mechanism in trihalides DX⋅⋅⋅A− and the analogous hydrogen-bonded complexes DH⋅⋅⋅A− (D, X, A=F, Cl, Br, I) using relativistic density functional theory (DFT) at zeroth-order regular approximation ZORA-BP86/TZ2P. One purpose was to obtain a set of consistent data from which reliable trends in structure and stability can be inferred over a large range of systems. The main objective was to achieve a detailed understanding of the nature of halogen bonds, how they resemble, and also how they differ from, the better understood hydrogen bonds. Thus, we present an accurate physical model of the halogen bond based on quantitative Kohn–Sham molecular orbital (MO) theory, energy decomposition analyses (EDA) and Voronoi deformation density (VDD) analyses of the charge distribution. It appears that the halogen bond in DX⋅⋅⋅A− arises not only from classical electrostatic attraction but also receives substantial stabilization from HOMO–LUMO interactions between the lone pair of A− and the σ* orbital of D–X. PMID:24551497
ERIC Educational Resources Information Center
Anderson, Daniel; Lai, Cheng-Fei; Park, Bitnara Jasmine; Alonzo, Julie; Tindal, Gerald
2012-01-01
This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest. Due to…
ERIC Educational Resources Information Center
Lai, Cheng-Fei; Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Tindal, Gerald
2012-01-01
This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…
ERIC Educational Resources Information Center
Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald
2012-01-01
This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…
1990-01-10
reason for the fairly low reliability of the fourth and fifth MEOCS factors), issues of sexism and more subtle forms of racism have come to the fore...psychological climate (for which the individual is the unit for theory). One approach, described by Glick, would use the intraclass correlation from a...and outcome measures are forced to remain obscure. A major flaw in the measurement of organizational climate is the lack of theory which would serve
Kunicki, Zachary J; Schick, Melissa R; Spillane, Nichea S; Harlow, Lisa L
2018-06-01
Those who binge drink are at increased risk for alcohol-related consequences compared with non-binge drinkers. Research shows individuals may face barriers to reducing their drinking behavior, but few measures exist to assess these barriers. This study created and validated the Barriers to Alcohol Reduction (BAR) scale. Participants were college students (n = 230) who endorsed at least one instance of past-month binge drinking (4+ drinks for women or 5+ drinks for men). Using classical test theory, exploratory structural equation modeling found a two-factor structure of personal/psychosocial barriers and perceived program barriers. The sub-factors and the full scale had reasonable internal consistency (coefficient omega = 0.78 for personal/psychosocial barriers, 0.82 for program barriers, and 0.83 for the full measure). The BAR also showed evidence of convergent validity with the Brief Young Adult Alcohol Consequences Questionnaire (r = 0.39, p < .001) and discriminant validity with the Barriers to Physical Activity scale (r = -0.02, p = .81). Item response theory (IRT) analysis showed that the two factors separately met the unidimensionality assumption and provided further evidence for the severity ordering of the items on the two factors. Results suggest that the BAR measure is reliable and valid for use in an undergraduate student population of binge drinkers. Future studies may want to re-examine this measure in a more diverse sample.
Calculating the optical properties of defects and surfaces in wide band gap materials
NASA Astrophysics Data System (ADS)
Deák, Peter
2018-04-01
The optical properties of a material depend critically on its defects, and understanding them requires substantial and accurate input from theory. This paper describes recent developments in the electronic structure theory of defects in wide band gap materials, where the standard local or semi-local approximations of density functional theory fail. The success of the HSE06 screened hybrid functional is analyzed for the case of Group-IV semiconductors and TiO2, and it is shown to be the consequence of error compensation between semi-local and non-local exchange, resulting in a proper derivative discontinuity (reproduction of the band gap) and a total energy that is a linear function of the fractional occupation numbers (removing most of the electron self-interaction). This allows the calculation of electronic transitions with accuracy unseen before, as demonstrated on the single-photon emitter NV(-) center in diamond and on polaronic states in TiO2. Having a reliable tool for electronic structure calculations, theory can contribute to the understanding of complicated cases of light-matter interaction. Two examples are considered here: surface termination effects on the blinking and bleaching of the light emission of the NV(-) center in diamond, and on the efficiency of photocatalytic water splitting by TiO2. Finally, an outlook is presented for the application of hybrid functionals to other materials, such as ZnO, Ga2O3, or CuGaS2.
Verhoef, J; Toussaint, P J; Putter, H; Zwetsloot-Schonk, J H M; Vliet Vlieland, T P M
2005-10-01
Coordinated teams with multidisciplinary team conferences are generally seen as a solution to the management of complex health conditions. However, problems in the process of communication during team conferences have been reported, such as the absence of a common language or viewpoint and the exchange of irrelevant or repeated information. To determine the outcome of interventions aimed at improving communication during team conferences, a reliable and valid assessment method is needed. The objective was to investigate the feasibility of a theory-based measurement instrument for assessing the process of communication during multidisciplinary team conferences in rheumatology. An observation instrument was developed based on communication theory. The instrument distinguishes three types of communication: (I) grounding activities, (II) coordination of non-team activities, and (III) coordination of team activities. To assess the process of communication during team conferences in a rheumatology clinic with inpatient and day patient facilities, team conferences were videotaped. To determine the inter-rater reliability, the instrument was applied independently by two investigators to 20 conferences concerning 10 patients with rheumatoid arthritis admitted to the inpatient unit. Content validity was determined by analysing and comparing the results of initial and follow-up team conferences of 25 consecutive patients with rheumatoid arthritis admitted to the day patient unit (Wilcoxon signed rank test). The inter-rater reliability was excellent, with intra-class correlation coefficients >0.98 for both type I and type III communication in 10 initial and 10 follow-up conferences (type II was not observed). An analysis of an additional 25 initial and 86 follow-up team conferences showed that time spent on grounding (type I) made up the greater part of the communication (87% S.D. 14 and 60% S.D. 29 in initial and follow-up conferences, respectively), significantly more than the time spent on coordination (p<0.001 and 0.02 for categories II and III, respectively). Moreover, significantly less time was spent on grounding in follow-up than in initial team conferences, whereas the time spent on coordination (type III) increased (both p-values<0.001). This theory-based measurement instrument for describing and evaluating the communication process during team conferences proved to be reliable and valid in this pilot study. Its usefulness for detecting changes in the communication process, e.g. after implementing systems for restructuring team conferences mediated by ICT applications, should be examined further.
Bai, Yeon K; Dinour, Lauren M
2017-11-01
A proper assessment of the multidimensional needs of breastfeeding mothers in various settings is crucial to facilitate and support breastfeeding and its exclusivity. The theory of planned behavior (TPB) has been used frequently to measure factors associated with breastfeeding. Full utility of the TPB requires accurate measurement of theory constructs. Research aim: This study aimed to develop and confirm the psychometric properties of an instrument, Milk Expression on Campus, based on the TPB, and to establish the reliability and validity of the instrument. In spring 2015, 218 employees and students at one university campus in northern New Jersey who were breastfeeding currently or in the recent past completed the online questionnaire containing demographic and theory-based items. Internal consistency (α) and split-half reliability (r) tests and factor analyses established and confirmed the reliability and construct validity of this instrument. Milk Expression on Campus showed strong and significant reliabilities as a full scale (α = .78, r = .74, p < .001) and as theory construct subscales. Validity was confirmed as the psychometric properties corresponded to the factors extracted from the scale. Four factors extracted from the direct construct subscales accounted for 79.49% of the total variability. Four distinct factors from the indirect construct subscales accounted for 73.68% of the total variability. Milk Expression on Campus can serve as a model TPB-based instrument to examine factors associated with women's milk expression behavior. The utility of this instrument extends to designing effective promotion programs to foster breastfeeding and milk expression behaviors in diverse settings.
The application of the statistical theory of extreme values to gust-load problems
NASA Technical Reports Server (NTRS)
Press, Harry
1950-01-01
An analysis is presented which indicates that the statistical theory of extreme values is applicable to the problems of predicting the frequency of encountering the larger gust loads and gust velocities for both specific test conditions as well as commercial transport operations. The extreme-value theory provides an analytic form for the distributions of maximum values of gust load and velocity. Methods of fitting the distribution are given along with a method of estimating the reliability of the predictions. The theory of extreme values is applied to available load data from commercial transport operations. The results indicate that the estimates of the frequency of encountering the larger loads are more consistent with the data and more reliable than those obtained in previous analyses. (author)
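A modern rendering of the fitting step might look like the following; the block maxima are synthetic, scipy's gumbel_r stands in for the Type I extreme-value form, and the threshold is arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic block maxima: the largest gust load factor recorded on each of
# 200 flights, drawn from a Gumbel (Type I extreme-value) law.
maxima = stats.gumbel_r.rvs(loc=1.2, scale=0.25, size=200, random_state=rng)

loc, scale = stats.gumbel_r.fit(maxima)       # fit the extreme-value distribution
threshold = 2.0                               # load factor of interest
p_exceed = stats.gumbel_r.sf(threshold, loc, scale)
print(f"fitted loc = {loc:.2f}, scale = {scale:.2f}")
print(f"per-flight chance of exceeding {threshold}: {p_exceed:.4f} "
      f"(about 1 flight in {1 / p_exceed:.0f})")
```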
ERIC Educational Resources Information Center
Gage, Nicholas A.; Prykanowski, Debra; Hirn, Regina
2014-01-01
Reliability of direct observation outcomes ensures the results are consistent, dependable, and trustworthy. Typically, reliability of direct observation measurement approaches is assessed using interobserver agreement (IOA) and the calculation of observer agreement (e.g., percentage of agreement). However, IOA does not address intraobserver…
ERIC Educational Resources Information Center
Arce-Ferrer, Alvaro J.; Castillo, Irene Borges
2007-01-01
The use of face-to-face interviews is controversial for college admissions decisions in light of the lack of availability of validity and reliability evidence for most college admission processes. This study investigated reliability and incremental predictive validity of a face-to-face postgraduate college admission interview with a sample of…
The Information Function for the One-Parameter Logistic Model: Is it Reliability?
ERIC Educational Resources Information Center
Doran, Harold C.
2005-01-01
The information function is an important statistic in item response theory (IRT) applications. Although the information function is often described as the IRT version of reliability, it differs from the classical notion of reliability from a critical perspective: replication. This article first explores the information function for the…
Smith, Laramie R.; Earnshaw, Valerie A.; Copenhaver, Michael M.; Cunningham, Chinazo O.
2016-01-01
Background: Substance use disorders consistently rank among the most stigmatized conditions worldwide. Thus, substance use stigma fosters health inequities among persons with substance use disorders and remains a key barrier to successful screening and treatment efforts. Current efforts to measure substance use stigma are limited. This study aims to advance measurement efforts by drawing on stigma theory to develop and evaluate the Substance Use Stigma Mechanisms Scale (SU-SMS). The SU-SMS was designed to capture enacted, anticipated, and internalized substance use stigma mechanisms among persons with current and past substance use disorders, and to distinguish between the key stigma sources most likely to impact this target population. Methods: This study was a cross-sectional evaluation of the validity, reliability, and generalizability of the SU-SMS across two independent samples with diverse substance use and treatment histories. Results: Findings support the structural and construct validity of the SU-SMS, suggesting the scale was able to capture enacted, anticipated, and internalized stigma as distinct stigma experiences. It also further differentiated between two distinct stigma sources (family and healthcare providers). Analysis of these mechanisms and psychosocial metrics suggests that the scale is also associated with other health-related outcomes. Furthermore, the SU-SMS demonstrated high levels of internal reliability and generalizability across two independent samples of persons with diverse substance use disorders and treatment histories. Conclusion: The SU-SMS may serve as a valuable tool for better understanding the processes through which substance use stigma serves to undermine key health behaviors and outcomes among persons with substance use disorders. PMID:26972790
Validation of the Community Integration Questionnaire in the adult burn injury population.
Gerrard, Paul; Kazis, Lewis E; Ryan, Colleen M; Shie, Vivian L; Holavanahalli, Radha; Lee, Austin; Jette, Alan; Fauerbach, James A; Esselman, Peter; Herndon, David; Schneider, Jeffrey C
2015-11-01
With improved survival, long-term effects of burn injuries on quality of life, particularly community integration, are important outcomes. This study aims to assess the Community Integration Questionnaire's psychometric properties in the adult burn population. Data were obtained from a multicenter longitudinal data set of burn survivors. The psychometric properties of the Community Integration Questionnaire (n = 492) were examined. The questionnaire items were evaluated for clinical and substantive relevance; validation procedures were conducted on different samples of the population; construct validity was assessed using exploratory factor analysis; internal consistency reliability was examined using Cronbach's α statistics; and item response theory was applied to the final models. The CIQ-15 was reduced by two questions to form the CIQ-13, with a two-factor structure, interpreted as self/family care and social integration. Item response theory testing suggests that Factor 2 captures a wider range of community integration levels. Cronbach's α was 0.80 for Factor 1, 0.77 for Factor 2, and 0.79 for the test as a whole. The CIQ-13 demonstrates validity and reliability in the adult burn survivor population addressing issues of self/family care and social integration. This instrument is useful in future research of community reintegration outcomes in the burn population.
A Foreign Object Damage Event Detector Data Fusion System for Turbofan Engines
NASA Technical Reports Server (NTRS)
Turso, James A.; Litt, Jonathan S.
2004-01-01
A Data Fusion System designed to provide a reliable assessment of the occurrence of Foreign Object Damage (FOD) in a turbofan engine is presented. The FOD-event feature level fusion scheme combines knowledge of shifts in engine gas path performance obtained using a Kalman filter, with bearing accelerometer signal features extracted via wavelet analysis, to positively identify a FOD event. A fuzzy inference system provides basic probability assignments (bpa) based on features extracted from the gas path analysis and bearing accelerometers to a fusion algorithm based on the Dempster-Shafer-Yager Theory of Evidence. Details are provided on the wavelet transforms used to extract the foreign object strike features from the noisy data and on the Kalman filter-based gas path analysis. The system is demonstrated using a turbofan engine combined-effects model (CEM), providing both gas path and rotor dynamic structural response, and is suitable for rapid-prototyping of control and diagnostic systems. The fusion of the disparate data can provide significantly more reliable detection of a FOD event than the use of either method alone. The use of fuzzy inference techniques combined with Dempster-Shafer-Yager Theory of Evidence provides a theoretical justification for drawing conclusions based on imprecise or incomplete data.
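The gas-path side of such a scheme can be illustrated compactly: a Kalman filter tracks a slowly drifting engine health parameter, and an abnormally large innovation flags a candidate FOD event. This is a generic, hypothetical sketch, not the paper's filter or engine model; the threshold, noise levels, and function name are all invented:

```python
import numpy as np

# Minimal scalar Kalman filter tracking a gas-path health parameter;
# a FOD event is flagged when the innovation is abnormally large.
def detect_shift(z, q=1e-4, r=0.05**2, gate=4.0):
    x, p = z[0], 1.0            # state estimate and its variance
    events = []
    for k, zk in enumerate(z[1:], start=1):
        p += q                  # predict (random-walk process model)
        innov = zk - x          # innovation
        s = p + r               # innovation variance
        if innov**2 / s > gate**2:
            events.append(k)    # large innovation -> possible FOD strike
        kgain = p / s
        x += kgain * innov      # update
        p *= (1 - kgain)
    return events

rng = np.random.default_rng(1)
eff = 0.98 + rng.normal(scale=0.05, size=200)   # noisy efficiency signal
eff[120:] -= 0.4                                # abrupt shift at sample 120
print(detect_shift(eff))                        # expect events near index 120
```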
Validation of a condition-specific measure for women having an abnormal screening mammography.
Brodersen, John; Thorsen, Hanne; Kreiner, Svend
2007-01-01
The aim of this study is to assess the validity of a new condition-specific instrument measuring the psychosocial consequences of abnormal screening mammography (PCQ-DK33). The draft version of the PCQ-DK33 was completed on two occasions by 184 women who had received an abnormal screening mammography and on one occasion by 240 women who had received a normal screening result. Item response theory and classical test theory were used to analyze the data. Construct validity, concurrent validity, known-groups validity, objectivity, and reliability were established by item analysis examining the fit between item responses and Rasch models. Six dimensions covering anxiety, behavioral impact, sense of dejection, impact on sleep, breast examination, and sexuality were identified. One item belonging to the dejection dimension had uniform differential item functioning. Two items not fitting the Rasch models were retained because of high face validity. A sick-leave item added useful information when measuring side effects and socioeconomic consequences of breast cancer screening. Five "poor" items were identified and should be deleted from the final instrument. Preliminary evidence for a valid and reliable condition-specific measure for women having an abnormal screening mammography was established. The measure includes 27 "good" items measuring different attributes of the same overall latent structure: the psychosocial consequences of abnormal screening mammography.
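For reference, the Rasch model underlying the item analysis gives the probability of endorsing a dichotomous item as a logistic function of the difference between person ability θ and item difficulty b, P(X=1) = exp(θ - b) / (1 + exp(θ - b)). A minimal sketch (all values illustrative):

```python
import numpy as np

def rasch_prob(theta: float, b: float) -> float:
    """P(X=1 | theta, b) under the dichotomous Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# A person of average ability facing an easy, a matched, and a hard item.
for b in (-1.0, 0.0, 1.0):
    print(f"difficulty {b:+.1f}: P(endorse) = {rasch_prob(0.0, b):.2f}")
```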
Harradine, Paul; Gates, Lucy; Bowen, Catherine
2018-03-01
The use of subtalar joint neutral (STJN) in the assessment and treatment of foot-related musculoskeletal symptomology is common in daily practice and still widely taught. The main pioneer of this theory was Dr Merton L. Root, and it has been labeled with a variety of names: "the foot morphology theory," "the subtalar joint neutral theory," or simply "Rootian theory" or "Root model." The theory's core concepts still underpin a common approach to musculoskeletal assessment of the foot, as well as the consequent design of foot orthoses. The available literature continues to point to Dr Root's theory as the most prevalently utilized. Concurrently, the worth of this theory has been challenged due to its poor reliability and limited external validity. This Viewpoint reviews the main clinical areas of the STJN theory, and concludes with a possible explanation and concerns for its ongoing use. To support our view, we will discuss (1) historical inaccuracies, (2) challenges with reliability, and (3) concerns with validity. J Orthop Sports Phys Ther 2018;48(3):130-132. doi:10.2519/jospt.2018.0604.
Suen, Yi-Nam; Cerin, Ester; Barnett, Anthony; Huang, Wendy Y J; Mellecker, Robin R
2017-09-01
Valid instruments of parenting practices related to children's physical activity (PA) are essential to understand how parents affect preschoolers' PA. This study developed and validated a questionnaire of PA-related parenting practices for Chinese-speaking parents of preschoolers in Hong Kong. Parents (n = 394) completed a questionnaire developed using findings from formative qualitative research and literature searches. Test-retest reliability was determined on a subsample (n = 61). Factorial validity was assessed using confirmatory factor analysis. Subscale internal consistency was determined. The scale of parenting practices encouraging PA comprised 2 latent factors: Modeling, structure and participatory engagement in PA (23 items), and Provision of appropriate places for child's PA (4 items). The scale of parenting practices discouraging PA encompassed 4 latent factors: Safety concern/overprotection (6 items), Psychological/behavioral control (5 items), Promoting inactivity (4 items), and Promoting screen time (2 items). Test-retest reliabilities were moderate to excellent (0.58 to 0.82), and internal subscale reliabilities were acceptable (0.63 to 0.89). We developed a theory-based questionnaire for assessing PA-related parenting practices among Chinese-speaking parents of Hong Kong preschoolers. While some items were context and culture specific, many were similar to those previously found in other populations, indicating a degree of construct generalizability across cultures.
Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong
2014-06-04
Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet disease-specific instruments are scarce and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-CHD was developed using programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, pre-testing, and quantitative statistical procedures. Data were provided by 146 inpatients with CHD whose QOL was measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and the G studies and D studies of generalizability theory. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and the test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall score and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G coefficients and indexes of dependability (Φ coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability and moderate responsiveness, and can be used as a quality of life instrument for patients with CHD. However, to obtain better reliability, the number of items in the social domain should be increased or the quality, not the quantity, of its items improved.
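For a crossed persons-by-items (p × i) design of the kind analysed in such G and D studies, the G coefficient and the index of dependability Φ follow directly from the estimated variance components: G = σ²p / (σ²p + σ²pi,e/n) and Φ = σ²p / (σ²p + (σ²i + σ²pi,e)/n). A schematic sketch with invented variance components, not the QLICD-CHD estimates:

```python
# Variance components for a persons x items (p x i) G study
# (numbers are illustrative, not the QLICD-CHD estimates).
var_p, var_i, var_pi_e = 0.50, 0.10, 0.40

def g_and_phi(n_items: int):
    rel_err = var_pi_e / n_items               # relative error variance
    abs_err = (var_i + var_pi_e) / n_items     # absolute error variance
    g = var_p / (var_p + rel_err)              # generalizability coefficient
    phi = var_p / (var_p + abs_err)            # index of dependability
    return g, phi

for n in (5, 10, 20):                          # D study: vary the item count
    g, phi = g_and_phi(n)
    print(f"n_items={n:2d}  G={g:.2f}  Phi={phi:.2f}")
```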
Exploring plant defense theory in tall goldenrod, Solidago altissima.
Heath, Jeremy J; Kessler, André; Woebbe, Eric; Cipollini, Don; Stireman, John O
2014-06-01
Understanding the evolutionary reasons for patterns of chemical defense in plants is an ongoing theoretical and empirical challenge. The goal is to develop a model that can reliably predict how defenses are distributed within the plant over space and time. This is difficult given that evolutionary, ecological, and physiological processes and tradeoffs can operate over different spatial and temporal scales. We evaluated the major predictions of two leading defense theories, the growth-differentiation balance hypothesis (GDBH) and optimal defense theory (ODT). To achieve this, enemies, fitness components, terpenoids, and protease inhibitors were measured in Solidago altissima and used to construct conventional univariate and structural equation models (SEMs). Leaf-tissue value indices extracted from an SEM revealed a strong correlation between tissue value and terpenoid defense that supports ODT. A tradeoff between serine protease inhibition and growth as well as an indirect tradeoff between growth and terpenoids manifested through galling insects supported the GDBH. Interestingly, there was a strong direct effect of terpenoids on rhizome mass, suggesting service to both storage and defense. The results support established theories but unknown genotypic traits explained much of the variation in defense, confirming the need to integrate emerging theories such as pollination constraints, defense syndromes, tolerance, mutualisms, and facilitation. © 2014 The Authors. New Phytologist © 2014 New Phytologist Trust.
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and properly assessed when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure of the series, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, since spectral analysis can characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties contributed by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, we describe how these uncertainties are transported and aggregated during these processes.
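The uncertainty measure at the core of such an analysis is the Shannon entropy of the discretized flow distribution, H = -Σ p_i ln p_i. A minimal sketch scoring a synthetic streamflow series (the data and bin count are illustrative assumptions):

```python
import numpy as np

def shannon_entropy(series: np.ndarray, bins: int = 20) -> float:
    """Entropy (nats) of a series after discretizing into equal-width bins."""
    counts, _ = np.histogram(series, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                        # 0 * log(0) = 0 by convention
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(42)
t = np.arange(365)
seasonal = 100 + 40 * np.sin(2 * np.pi * t / 365)    # periodic component
flow = seasonal + rng.normal(scale=10, size=t.size)  # noisy daily flow
print(f"H(flow)     = {shannon_entropy(flow):.2f} nats")
print(f"H(seasonal) = {shannon_entropy(seasonal):.2f} nats")
```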
Autonomous Energy Grids | Grid Modernization | NREL
Autonomous energy grids control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy systems fall short of this vision; the research therefore draws on optimization theory, control theory, big data analytics, and complex system theory and modeling to realize it.
ERIC Educational Resources Information Center
Park, Bitnara Jasmine; Anderson, Daniel; Alonzo, Julie; Lai, Cheng-Fei; Tindal, Gerald
2012-01-01
This technical report is one in a series of five describing the reliability (test/retest and alternate form) and G-Theory/D-Study research on the easyCBM reading measures, grades 1-5. Data were gathered in the spring of 2011 from a convenience sample of students nested within classrooms at a medium-sized school district in the Pacific Northwest.…
NASA Technical Reports Server (NTRS)
Zender, George W
1956-01-01
The experimental deflections and stresses of six plastic multicell-wing models of unswept, delta, and swept plan form are presented and compared with previously published theoretical results obtained by the electrical analog method. The comparisons indicate that the theory is reliable except for the evaluation of stresses in the vicinity of the leading edge of delta wings and the leading and trailing edges of swept wings. The stresses in these regions are questionable, apparently because of simplifications employed in idealizing the actual structure for theoretical purposes and because of local effects of concentrated loads.
NASA Technical Reports Server (NTRS)
1990-01-01
Structural Reliability Consultants' computer program creates graphic plots showing the statistical parameters of glue laminated timbers, or 'glulam.' The company president, Dr. Joseph Murphy, read in NASA Tech Briefs about work related to analysis of Space Shuttle surface tile strength performed for Johnson Space Center by Rockwell International Corporation. Analysis led to a theory of 'consistent tolerance bounds' for statistical distributions, applicable in industrial testing where statistical analysis can influence product development and use. Dr. Murphy then obtained the Tech Support Package that covers the subject in greater detail. The TSP became the basis for Dr. Murphy's computer program PC-DATA, which he is marketing commercially.
Reliability and risk assessment of structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1991-01-01
Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
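The common computational core of these program elements is propagating uncertainty in primitive variables through a limit-state function g to a failure probability, P_f = P(g(X) < 0). A bare-bones Monte Carlo sketch, with an invented resistance-minus-load limit state rather than any NASA model:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Primitive variables: resistance R (lognormal), load S (normal).
resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)  # kN
load = rng.normal(loc=200.0, scale=30.0, size=n)                    # kN

g = resistance - load            # limit state: failure when g < 0
pf = np.mean(g < 0.0)
se = np.sqrt(pf * (1.0 - pf) / n)
print(f"P_f = {pf:.2e} (MC standard error {se:.1e})")
```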
Elapsed decision time affects the weighting of prior probability in a perceptual decision task
Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.
2012-01-01
Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274
Elapsed decision time affects the weighting of prior probability in a perceptual decision task.
Hanks, Timothy D; Mazurek, Mark E; Kiani, Roozbeh; Hopp, Elisabeth; Shadlen, Michael N
2011-04-27
Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (1) decisions that linger tend to arise from less reliable evidence, and (2) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal area (LIP) of rhesus monkeys performing this task.
ERIC Educational Resources Information Center
Gugiu, Mihaiela R.; Gugiu, Paul C.; Baldus, Robert
2012-01-01
Background: Educational researchers have long espoused the virtues of writing with regard to student cognitive skills. However, research on the reliability of the grades assigned to written papers reveals a high degree of contradiction, with some researchers concluding that the grades assigned are very reliable and others suggesting that they…
Conceptualizing Essay Tests' Reliability and Validity: From Research to Theory
ERIC Educational Resources Information Center
Badjadi, Nour El Imane
2013-01-01
The current paper on writing assessment surveys the literature on the reliability and validity of essay tests. The paper aims to examine the two concepts in relation to essay testing and to provide a snapshot of current understandings of the reliability and validity of essay tests as drawn from recent research studies. Bearing in…
ERIC Educational Resources Information Center
Kim, Won J.
2012-01-01
Reliable measurements for effective teaching are lacking. In contrast, some theories of leadership (particularly transformational leadership) have been tested and found to have efficacy in a variety of organizational settings. In this study, the full-range leadership theory, which includes transformational leadership, was applied to the…
Applying Learning Theories and Instructional Design Models for Effective Instruction
ERIC Educational Resources Information Center
Khalil, Mohammed K.; Elkhider, Ihsan A.
2016-01-01
Faculty members in higher education are involved in many instructional design activities without formal training in learning theories and the science of instruction. Learning theories provide the foundation for the selection of instructional strategies and allow for reliable prediction of their effectiveness. To achieve effective learning…
Structural vulnerability assessment using reliability of slabs in avalanche area
NASA Astrophysics Data System (ADS)
Favier, Philomène; Bertrand, David; Eckert, Nicolas; Naaim, Mohamed
2013-04-01
Improving risk assessment or hazard zoning requires a better understanding of the physical vulnerability of structures. For a natural hazard such as snow avalanches, once the flow is characterized, elucidating the mechanical behaviour of the impacted structure is a decisive step, and a challenging approach is to quantify the physical vulnerability of impacted structures under various avalanche loadings. The main objective of this presentation is to introduce the methodology and outcomes of assessing the vulnerability of reinforced concrete buildings using reliability methods. Reinforced concrete was chosen because it is one of the usual materials for structures exposed to potential avalanche loadings. In avalanche blue zones, structures have to resist pressures of up to 30 kPa. Thus, by providing systematic fragility relations linked to the global failure of the structure, this method may serve avalanche risk assessment. To this end, a slab representing the avalanche-facing wall of a house was numerically designed. Different configurations of the element at stake were treated to quantify numerical aspects of the problem, such as the boundary conditions and the mechanical behaviour of the structure. The structure is analysed according to four limit states; semi-local and global failures are considered to describe the slab behaviour. The first state is attained when cracks appear in the tensile zone; the next two states are described consistently with the Eurocode; and the final state, the total collapse of the structure, is characterized by yield line theory. Failure probability is estimated within the reliability framework, and Monte Carlo simulations were conducted to quantify the fragility under different loadings. The sensitivity of the models to the input distributions was assessed with statistical tools such as confidence intervals and Sobol indexes. The conclusion and discussion establish the contributions, limits, and future needs and developments of this research. First, the study provides a spectrum of fragility curves for reinforced concrete structures that could be used to improve risk assessment. Second, the influence of the failure criterion chosen in this survey is discussed. Then, the weight of the choice of statistical distribution is analysed. Finally, the boundary between vulnerability and fragility relations is set out to delimit the scope of use of our approach.
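A fragility relation of the kind described, failure probability as a function of avalanche pressure, can be sketched by repeating a Monte Carlo reliability run over a grid of loadings. The capacity distribution below is an invented placeholder, not the slab model of the study:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Placeholder slab capacity in kPa (lognormal, illustrative parameters).
capacity = rng.lognormal(mean=np.log(25.0), sigma=0.20, size=n)

print("pressure [kPa]   P(failure)")
for pressure in (5, 10, 15, 20, 25, 30):      # avalanche loading grid
    pf = np.mean(capacity < pressure)         # failure: capacity exceeded
    print(f"{pressure:14d}   {pf:.3f}")
```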
Theory of Work Adjustment Personality Constructs.
ERIC Educational Resources Information Center
Lawson, Loralie
1993-01-01
To measure Theory of Work Adjustment personality and adjustment style dimensions, content-based scales were analyzed for homogeneity and successively reanalyzed for reliability improvement. Three sound scales were developed: inflexibility, activeness, and reactiveness. (SK)
Simplified DFT methods for consistent structures and energies of large systems
NASA Astrophysics Data System (ADS)
Caldeweyher, Eike; Gerit Brandenburg, Jan
2018-05-01
Kohn–Sham density functional theory (DFT) is routinely used for the fast electronic structure computation of large systems and will most likely continue to be the method of choice for the generation of reliable geometries in the foreseeable future. Here, we present a hierarchy of simplified DFT methods designed for consistent structures and non-covalent interactions of large systems, with a particular focus on molecular crystals. The methods covered are a minimal-basis-set Hartree–Fock method (HF-3c), a small-basis-set screened-exchange hybrid functional (HSE-3c), and a generalized-gradient-approximation functional evaluated in a medium-sized basis set (B97-3c), all augmented with semi-classical correction potentials. We give an overview of the methods' design, present a comprehensive evaluation on established benchmark sets for geometries and lattice energies of molecular crystals, and highlight some realistic applications to large organic crystals with several hundred atoms in the primitive unit cell.
A predictive framework for evaluating models of semantic organization in free recall
Morton, Neal W; Polyn, Sean M.
2016-01-01
Research in free recall has demonstrated that semantic associations reliably influence the organization of search through episodic memory. However, the specific structure of these associations and the mechanisms by which they influence memory search remain unclear. We introduce a likelihood-based model-comparison technique, which embeds a model of semantic structure within the context maintenance and retrieval (CMR) model of human memory search. Within this framework, model variants are evaluated in terms of their ability to predict the specific sequence in which items are recalled. We compare three models of semantic structure, latent semantic analysis (LSA), global vectors (GloVe), and word association spaces (WAS), and find that models using WAS have the greatest predictive power. Furthermore, we find evidence that semantic and temporal organization is driven by distinct item and context cues, rather than a single context cue. This finding provides an important constraint for theories of memory search. PMID:28331243
Theoretical Studies on Structures and Relative Stability for Polynitrohexaazaadamantanes
NASA Astrophysics Data System (ADS)
Xu, Xiao-juan; Xiao, He-ming; Wang, Gui-xiang; Ju, Xue-hai
2006-10-01
Density functional theory at the B3LYP/6-31G* level was employed to study the structures of polynitrohexaazaadamantanes (PNHAAs), potential candidates for high energy density compounds (HEDCs), including their total energies (EZPE), geometries, oxygen balances (OB100), and dipole moments. Structural parameters of the PNHAAs, such as the maximum N—NO2 bond length (LBmax), the smallest N—N Mulliken population (BN—N), the least negative charge on the nitro group (QNO2), and OB100, were used to predict their relative stability or sensitivity (the ease of initiating a detonation; high sensitivity means low stability). All four parameters led to the same conclusion: as the number of nitro groups increases, the stability of these compounds decreases. OB100 failed to discriminate among the isomers, but the EZPE energies and the dipole moments were considered to give more reliable results for the isomers.
Nuclear structure and dynamics with density functional theory
NASA Astrophysics Data System (ADS)
Stetcu, Ionel
2015-10-01
Even in the absence of ab initio methods capable of tackling heavy nuclei without restrictions, one can obtain an ab initio description of ground-state properties by means of density functional theory (DFT) and its extension to superfluid systems in its local variant, the superfluid local density approximation (SLDA). Information about the properties of excited states can be obtained in the same framework by an extension to time-dependent (TD) phenomena. Unlike other approaches, in which nuclear structure information is used as a separate input to reaction models, the TD approach treats nuclear structure and dynamics on the same footing and is well suited to provide a more reliable description of a large number of processes involving heavy nuclei, from the nuclear response to electroweak probes to nuclear reactions such as neutron-induced reactions, nuclear fusion, and fission. Such processes, sometimes part of integrated nuclear systems, have important applications in astrophysics, energy production, global security, etc. In this talk, I will present the simulation of a simple reaction, the Coulomb excitation of a 238U nucleus, and discuss the application of the TD-DFT formalism to the description of induced fission. I gratefully acknowledge partial support of the U.S. Department of Energy through an Early Career Award of the LANL/LDRD Program.
Castillo, Isabel; Tomás, Inés; Ntoumanis, Nikos; Bartholomew, Kimberley; Duda, Joan L; Balaguer, Isabel
2014-01-01
The purpose of this research was to translate into Spanish and examine the psychometric properties of the Spanish version of the Controlling Coach Behaviors Scale (CCBS) in male soccer players. The CCBS is a questionnaire designed to assess athletes' perceptions of sports coaches' controlling interpersonal style from the perspective of the self-determination theory. Study 1 tested the factorial structure of the translated scale using confirmatory factor analysis (CFA) and provided evidence of discriminant validity. Studies 2 and 3 examined the invariance across time and across competitive level via multi-sample CFA. Reliability analyses were also conducted. The CFA results revealed that a four-factor model was acceptable, indicating that a controlling interpersonal style is a multidimensional construct represented by four separate and related controlling coaching strategies. Further, results supported the invariance of the CCBS factor structure across time and competitive level and provided support for the internal consistency of the scale. Overall, the CCBS demonstrated adequate internal consistency, as well as good factorial validity. The Spanish version of the CCBS represents a valid and reliable adaptation of the instrument, which can be confidently used to measure soccer players' perceptions of their coaches' controlling interpersonal style.
Castanelli, D J; Smith, N A
2017-05-01
The learning environment describes the context and culture in which trainees learn. In order to establish the feasibility and reliability of measuring the anaesthetic learning environment in individual departments we implemented a previously developed instrument in hospitals across New South Wales. We distributed the instrument to trainees from 25 anaesthesia departments and supplied summarized results to individual departments. Exploratory and confirmatory factor analyses were performed to assess internal structure validity and generalizability theory was used to calculate reliability. The number of trainees required for acceptable precision in results was determined using the standard error of measurement. We received 172 responses (59% response rate). Suitable internal structure validity was confirmed. Measured reliability was acceptable (G-coefficient 0.69) with nine trainees per department. Eight trainees were required for a 95% confidence interval of plus or minus 0.25 in the mean total score. Eight trainees as assessors also allow a 95% confidence interval of approximately plus or minus 0.3 in the subscale mean scores. Results for individual departments varied, with scores below the expected level recorded on individual subscales, particularly the 'teaching' subscale. Our results confirm that, using this instrument, individual departments can obtain acceptable precision in results with achievable trainee numbers. Additionally, with the exception of departments with few trainees, implementation proved feasible across a training region. Repeated use would allow departments or accrediting bodies to monitor their individual learning environment and the impact of changes such as the introduction of new curricular elements, or local initiatives to improve trainee experience. © The Author 2017. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com
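The quoted trainee numbers follow from the usual relation between score variability and confidence-interval half-width: with n trainees, the 95% half-width of the mean is about 1.96·SD/√n, so n ≈ (1.96·SD/E)². A small sketch of that back-of-envelope calculation (the spread value of 0.35 is an invented illustration, not the study's estimate):

```python
import math

def trainees_needed(spread: float, half_width: float, z: float = 1.96) -> int:
    """Trainees needed so the 95% CI half-width of a mean score <= half_width."""
    return math.ceil((z * spread / half_width) ** 2)

# Illustrative between-trainee spread of 0.35 scale points.
for e in (0.25, 0.30):
    print(f"half-width {e}: n = {trainees_needed(0.35, e)}")
```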
Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua
2014-10-01
This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed based on programmed decision procedures with multiple nominal and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided data measuring QOL three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and the G studies and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation, ICC) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at the two measurements was higher than 0.70, except for the social domain (0.55 and 0.67, respectively). The overall score and the scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indexes of dependability (Φ coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness, and can be used as a quality of life instrument for patients with IBS.
Adapting the academic motivation scale for use in pre-tertiary mathematics classrooms
NASA Astrophysics Data System (ADS)
Lim, Siew Yee; Chapman, Elaine
2015-09-01
The Academic Motivation Scale (AMS) is a comprehensive and widely used instrument for assessing motivation based on the self-determination theory. Currently, no such comprehensive instrument exists to assess the different domains of motivation (stipulated by the self-determination theory) in mathematics education at the pre-tertiary level (grades 11 and 12) in Asia. This study adapted the AMS for this use and assessed the properties of the adapted instrument with 1610 students from Singapore. Exploratory and confirmatory factor analyses indicated a five-factor structure for the modified instrument (the three original AMS intrinsic subscales collapsed into a single factor). Additionally, the modified instrument exhibited good internal consistency (mean α = .88), and satisfactory test-retest reliability over a 1-month interval (mean rxx = .73). The validity of the modified AMS was further demonstrated through correlational analyses among scores on its subscales, and with scores on other instruments measuring mathematics attitudes, anxiety and achievement.
NASA Astrophysics Data System (ADS)
Wang, RuLin; Zheng, Xiao; Kwok, YanHo; Xie, Hang; Chen, GuanHua; Yam, ChiYung
2015-04-01
Understanding electronic dynamics on material surfaces is fundamentally important for applications including nanoelectronics, inhomogeneous catalysis, and photovoltaics. Practical approaches based on time-dependent density functional theory for open systems have been developed to characterize the dissipative dynamics of electrons in bulk materials. The accuracy and reliability of such approaches depend critically on how the electronic structure and memory effects of surrounding material environment are accounted for. In this work, we develop a novel squared-Lorentzian decomposition scheme, which preserves the positive semi-definiteness of the environment spectral matrix. The resulting electronic dynamics is guaranteed to be both accurate and convergent even in the long-time limit. The long-time stability of electronic dynamics simulation is thus greatly improved within the current decomposition scheme. The validity and usefulness of our new approach are exemplified via two prototypical model systems: quasi-one-dimensional atomic chains and two-dimensional bilayer graphene.
On some methods of discrete systems behaviour simulation
NASA Astrophysics Data System (ADS)
Sytnik, Alexander A.; Posohina, Natalia I.
1998-07-01
The project addresses one of the fundamental problems of mathematical cybernetics and discrete mathematics: the synthesis and analysis of control systems, based on the study of their functional capabilities and reliable behaviour. This work deals with the case of restoring finite-state machine behaviour when structural redundancy is not available and direct updating of the current behaviour is impossible. The method described below uses number theory to build a special model of a finite-state machine: it simulates the transitions between the states of the machine using specially defined functions of exponential type. With the help of several methods of number theory and algebra, it is easy to determine whether the behaviour can be restored by this method in a given case, and also to derive the class of finite-state machines admitting such restoration.
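For orientation, the object being modeled, a finite-state machine and its state-transition behaviour, can be simulated directly from an explicit transition table; the paper's number-theoretic functions of exponential type stand in for exactly this kind of lookup. A generic sketch with an arbitrary three-state machine:

```python
# A finite-state machine as an explicit transition table:
# (state, input symbol) -> next state.
TRANSITIONS = {
    ("s0", "a"): "s1",
    ("s0", "b"): "s0",
    ("s1", "a"): "s2",
    ("s1", "b"): "s0",
    ("s2", "a"): "s2",
    ("s2", "b"): "s1",
}

def run(machine: dict, start: str, word: str) -> list:
    """Return the sequence of states visited while reading `word`."""
    state, trace = start, [start]
    for symbol in word:
        state = machine[(state, symbol)]
        trace.append(state)
    return trace

print(run(TRANSITIONS, "s0", "aabba"))  # ['s0','s1','s2','s1','s0','s1']
```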
Sliding Mode Thermal Control System for Space Station Furnace Facility
NASA Technical Reports Server (NTRS)
Jackson, Mark E.; Shtessel, Yuri B.
1998-01-01
The decoupled control of the nonlinear, multi-input multi-output, and highly coupled Space Station Furnace Facility (SSFF) thermal control system is addressed. Sliding mode control theory, a subset of variable-structure control theory, is employed to increase the performance, robustness, and reliability of the SSFF's currently designed control system. This paper presents the nonlinear thermal control system description and develops the sliding mode controllers that cause the interconnected subsystems to operate in their local sliding modes, resulting in control system invariance to plant uncertainties and to external and interaction disturbances. The desired decoupled flow-rate tracking is achieved by optimization of the local linear sliding mode equations. The controllers are implemented digitally, and extensive simulation results are presented to show the flow-rate tracking robustness and invariance to plant uncertainties, nonlinearities, external disturbances, and variations of the system pressure supplied to the controlled subsystems.
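The essence of a sliding mode controller is easy to show on a toy plant: define a sliding surface s = ė + λe and apply a switched control that drives s to zero in finite time, after which the closed loop is insensitive to bounded matched disturbances. A minimal single-input sketch on a double-integrator plant; the gains and disturbance are invented, and this is not the SSFF model:

```python
import numpy as np

lam, K = 2.0, 5.0           # surface slope and switching gain (illustrative)
dt, x, v = 0.001, 1.0, 0.0  # time step; initial position error and velocity
for step in range(int(5.0 / dt)):
    e, edot = x, v                      # regulate the origin: error = state
    s = edot + lam * e                  # sliding surface
    u = -lam * edot - K * np.sign(s)    # equivalent + switching control
    d = 0.5 * np.sin(0.01 * step)       # bounded matched disturbance
    a = u + d                           # double integrator: x'' = u + d
    v += a * dt
    x += v * dt
print(f"final error {x:+.4f}, velocity {v:+.4f}")  # near zero despite d
```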
NASA Astrophysics Data System (ADS)
Sanson, A.; Pokrovski, G. S.; Giarola, M.; Mariotto, G.
2015-01-01
The vibrational dynamics of germanium dioxide in the rutile structure has been investigated using polarized micro-Raman scattering spectroscopy coupled with first-principles calculations. Raman spectra were recorded in backscattering geometry at room temperature from micro-crystalline samples, either unoriented or oriented by means of a micromanipulator, which enabled the successful detection and identification of all the Raman-active modes expected on the basis of group theory. In particular, the Eg mode, previously misassigned or undetected in the literature, has been definitively observed and unambiguously identified at 525 cm⁻¹ under excitation by certain laser lines, revealing an unusual resonance phenomenon. First-principles calculations within the framework of density functional theory quantify both the wavenumbers and the intensities of the Raman vibrational spectra. The excellent agreement between calculated and experimental data corroborates the reliability of our findings.
Zhang, Yu; Mukamel, Shaul; Khalil, Munira; Govind, Niranjan
2015-12-08
Valence-to-core (VtC) X-ray emission spectroscopy (XES) has emerged as a powerful technique for the structural characterization of complex organometallic compounds in realistic environments. Since the spectrum represents electronic transitions from the ligand molecular orbitals to the core holes of the metal centers, the approach is more chemically sensitive to the metal-ligand bonding character compared with conventional X-ray absorption techniques. In this paper we study how linear-response time-dependent density functional theory (LR-TDDFT) can be harnessed to simulate K-edge VtC X-ray emission spectra reliably. LR-TDDFT allows one to go beyond the single-particle picture that has been extensively used to simulate VtC-XES. We consider seven low- and high-spin model complexes involving chromium, manganese, and iron transition metal centers. Our results are in good agreement with experiment.
Organizational readiness for implementing change: a psychometric assessment of a new measure.
Shea, Christopher M; Jacobs, Sara R; Esserman, Denise A; Bruce, Kerry; Weiner, Bryan J
2014-01-10
Organizational readiness for change in healthcare settings is an important factor in successful implementation of new policies, programs, and practices. However, research on the topic is hindered by the absence of a brief, reliable, and valid measure. Until such a measure is developed, we cannot advance scientific knowledge about readiness or provide evidence-based guidance to organizational leaders about how to increase readiness. This article presents results of a psychometric assessment of a new measure called Organizational Readiness for Implementing Change (ORIC), which we developed based on Weiner's theory of organizational readiness for change. We conducted four studies to assess the psychometric properties of ORIC. In study one, we assessed the content adequacy of the new measure using quantitative methods. In study two, we examined the measure's factor structure and reliability in a laboratory simulation. In study three, we assessed the reliability and validity of an organization-level measure of readiness based on aggregated individual-level data from study two. In study four, we conducted a small field study utilizing the same analytic methods as in study three. Content adequacy assessment indicated that the items developed to measure change commitment and change efficacy reflected the theoretical content of these two facets of organizational readiness and distinguished the facets from hypothesized determinants of readiness. Exploratory and confirmatory factor analysis in the lab and field studies revealed two correlated factors, as expected, with good model fit and high item loadings. Reliability analysis in the lab and field studies showed high inter-item consistency for the resulting individual-level scales for change commitment and change efficacy. Inter-rater reliability and inter-rater agreement statistics supported the aggregation of individual level readiness perceptions to the organizational level of analysis. This article provides evidence in support of the ORIC measure. We believe this measure will enable testing of theories about determinants and consequences of organizational readiness and, ultimately, assist healthcare leaders to reduce the number of health organization change efforts that do not achieve desired benefits. Although ORIC shows promise, further assessment is needed to test for convergent, discriminant, and predictive validity.
Organizational readiness for implementing change: a psychometric assessment of a new measure
2014-01-01
Background Organizational readiness for change in healthcare settings is an important factor in successful implementation of new policies, programs, and practices. However, research on the topic is hindered by the absence of a brief, reliable, and valid measure. Until such a measure is developed, we cannot advance scientific knowledge about readiness or provide evidence-based guidance to organizational leaders about how to increase readiness. This article presents results of a psychometric assessment of a new measure called Organizational Readiness for Implementing Change (ORIC), which we developed based on Weiner’s theory of organizational readiness for change. Methods We conducted four studies to assess the psychometric properties of ORIC. In study one, we assessed the content adequacy of the new measure using quantitative methods. In study two, we examined the measure’s factor structure and reliability in a laboratory simulation. In study three, we assessed the reliability and validity of an organization-level measure of readiness based on aggregated individual-level data from study two. In study four, we conducted a small field study utilizing the same analytic methods as in study three. Results Content adequacy assessment indicated that the items developed to measure change commitment and change efficacy reflected the theoretical content of these two facets of organizational readiness and distinguished the facets from hypothesized determinants of readiness. Exploratory and confirmatory factor analysis in the lab and field studies revealed two correlated factors, as expected, with good model fit and high item loadings. Reliability analysis in the lab and field studies showed high inter-item consistency for the resulting individual-level scales for change commitment and change efficacy. Inter-rater reliability and inter-rater agreement statistics supported the aggregation of individual level readiness perceptions to the organizational level of analysis. Conclusions This article provides evidence in support of the ORIC measure. We believe this measure will enable testing of theories about determinants and consequences of organizational readiness and, ultimately, assist healthcare leaders to reduce the number of health organization change efforts that do not achieve desired benefits. Although ORIC shows promise, further assessment is needed to test for convergent, discriminant, and predictive validity. PMID:24410955
Reliable and valid assessment of point-of-care ultrasonography.
Todsen, Tobias; Tolsgaard, Martin Grønnebæk; Olsen, Beth Härstedt; Henriksen, Birthe Merete; Hillingsø, Jens Georg; Konge, Lars; Jensen, Morten Lind; Ringsted, Charlotte
2015-02-01
To explore the reliability and validity of the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale for point-of-care ultrasonography (POC US) performance. POC US is increasingly used by clinicians and is an essential part of the management of acute surgical conditions. However, the quality of performance is highly operator-dependent. Therefore, reliable and valid assessment of trainees' ultrasonography competence is needed to ensure patient safety. Twenty-four physicians, representing novices, intermediates, and experts in POC US, scanned 4 different surgical patient cases in a controlled set-up. All ultrasound examinations were video-recorded and assessed by 2 blinded radiologists using OSAUS. Reliability was examined using generalizability theory. Construct validity was examined by comparing performance scores between the groups and by correlating physicians' OSAUS scores with diagnostic accuracy. The generalizability coefficient was high (0.81) and a D-study demonstrated that 1 assessor and 5 cases would result in similar reliability. The construct validity of the OSAUS scale was supported by a significant difference in the mean scores between the novice group (17.0; SD 8.4) and the intermediate group (30.0; SD 10.1), P = 0.007, as well as between the intermediate group and the expert group (72.9; SD 4.4), P = 0.04, and by a high correlation between OSAUS scores and diagnostic accuracy (Spearman ρ correlation coefficient = 0.76; P < 0.001). This study demonstrates high reliability as well as evidence of construct validity of the OSAUS scale for assessment of POC US competence. Hence, the OSAUS scale may be suitable for both in-training as well as end-of-training assessment.
NASA Astrophysics Data System (ADS)
Aleksandrov, V. A.; Vladimirov, V. V.; Dmitriev, R. D.; Osipov, S. O.
This book takes into consideration domestic and foreign developments related to launch vehicles. General information concerning launch vehicle systems is presented, taking into account details of rocket structure, basic design considerations, and a number of specific Soviet and American launch vehicles. The basic theory of reaction propulsion is discussed, giving attention to physical foundations, the various types of forces acting on a rocket in flight, basic parameters characterizing rocket motion, the effectiveness of various approaches to obtain the desired velocity, and rocket propellants. Basic questions concerning the classification of launch vehicles are considered along with construction and design considerations, aspects of vehicle control, reliability, construction technology, and details of structural design. Attention is also given to details of rocket motor design, the basic systems of the carrier rocket, and questions of carrier rocket development.
Biology doesn't waste energy: that's really smart
NASA Astrophysics Data System (ADS)
Vincent, Julian F. V.; Bogatyreva, Olga; Bogatyrev, Nikolaj
2006-03-01
Biology presents us with answers to design problems that we suspect would be very useful if only we could implement them successfully. We use the Russian theory of problem solving - TRIZ - in a novel way to provide a system for analysis and technology transfer. The analysis shows that whereas technology uses energy as the main means of solving technical problems, biology uses information and structure. Biology is also strongly hierarchical. The suggestion is that smart technology in hierarchical structures can help us to design much more efficient technology. TRIZ also suggests that biological design is autonomous and can be defined by the prefix "self-" with any function. This autonomy extends to the control system, so that the sensor is commonly also the actuator, resulting in simpler systems and greater reliability.
NASA Astrophysics Data System (ADS)
Guo, Feng-Kun; Hanhart, Christoph; Meißner, Ulf-G.; Wang, Qian; Zhao, Qiang; Zou, Bing-Song
2018-01-01
A large number of experimental discoveries especially in the heavy quarkonium sector that did not meet the expectations of the until then very successful quark model led to a renaissance of hadron spectroscopy. Among various explanations of the internal structure of these excitations, hadronic molecules, being analogs of light nuclei, play a unique role since for those predictions can be made with controlled uncertainty. Experimental evidence of various candidates of hadronic molecules and methods of identifying such structures are reviewed. Nonrelativistic effective field theories are the suitable framework for studying hadronic molecules and are discussed in both the continuum and finite volumes. Also pertinent lattice QCD results are presented. Further, the production mechanisms and decays of hadronic molecules are discussed and comments are given on the reliability of certain assertions often made in the literature.
NASA Astrophysics Data System (ADS)
McPhee, J.; William, Y. W.
2005-12-01
This work presents a methodology for pumping test design based on the reliability requirements of a groundwater model. The reliability requirements take into consideration the application of the model results in groundwater management, expressed in this case as a multiobjective management model. The pumping test design is formulated as a mixed-integer nonlinear programming (MINLP) problem and solved using a combination of a genetic algorithm (GA) and gradient-based optimization. Bayesian decision theory provides a formal framework for assessing the influence of parameter uncertainty on the reliability of the proposed pumping test. The proposed methodology is useful for selecting a robust design that will outperform all other candidate designs under most potential 'true' states of the system.
NASA Astrophysics Data System (ADS)
Lombardo, Giuseppe M.; Portalone, Gustavo; Colapietro, Marcello; Rescifina, Antonio; Punzo, Francesco
2011-05-01
The ability of caffeic acid to act as an antioxidant against hyperoxo radicals, as well as its recently discovered therapeutic properties in the treatment of hepatocarcinoma, still makes this compound an object of study more than 20 years after the refinement of its crystal structure. It belongs to the vast family of humic substances, which play a key role in biodegradation processes and readily form complexes with ions widely diffused in the environment. This class of compounds is therefore interesting for potential environmental chemistry applications concerning the complexation of heavy metals. Our study focused on the characterization of caffeic acid as a necessary first step, to be followed by the application of our findings to the interaction of the caffeate anion with heavy metal ions. To reach this goal, we applied a low-cost approach, in terms of computational time and resources, aimed at obtaining a high-resolution, robust, and trustworthy structure, using the single-crystal X-ray data, recollected at higher resolution, as the touchstone for a detailed check. We compared calculations carried out with density functional theory (DFT), the Hartree-Fock (HF) method, and the post-SCF second-order Møller-Plesset perturbation method (MP2), all at the 6-31G** level of theory, with molecular mechanics (MM) and molecular dynamics (MD). As a consequence, we explain on the one hand the possible reasons for the pitfalls of the DFT approach and, on the other, the benefits of using a good, robust force field developed for condensed phases, such as AMBER, with MM and MD. The reliability of the latter, highlighted by an overall agreement that extends to the anisotropic displacement parameters calculated by MD and those gathered from the X-ray measurements, makes it very promising for the above-mentioned goals.
Frandsen, Benjamin A; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J; Staunton, Julie B; Billinge, Simon J L
2016-05-13
We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ∼1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
A Protection Motivation Theory-Based Scale for Tobacco Research among Chinese Youth
MacDonell, Karen; Chen, Xinguang; Yan, Yaqiong; Li, Fang; Gong, Jie; Sun, Huiling; Li, Xiaoming; Stanton, Bonita
2014-01-01
Rates of tobacco use among adolescents in China and other lower and middle-income countries remain high despite notable prevention and intervention programs. One reason for this may be the lack of theory-based research in tobacco use prevention in these countries. In the current study, a culturally appropriate 21-item measurement scale for cigarette smoking was developed based on the core constructs of Protection Motivation Theory (PMT). The scale was assessed among a sample of 553 Chinese vocational high school students. Results from correlational and measurement modeling analysis indicated adequate measurement reliability for the proposed PMT scale structure. The two PMT Pathways and the seven PMT constructs were significantly correlated with adolescent intention to smoke and actual smoking behavior. This study is the first to evaluate a PMT scale for cigarette smoking among Chinese adolescents. The scale provides a potential tool for assessing social cognitive processes underlying tobacco use. This is essential for understanding smoking behavior among Chinese youth and to support more effective tobacco use prevention efforts. Additional studies are needed to assess its utility for use with Chinese youth in other settings. PMID:24478933
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benjamin A. Frandsen; Brunelli, Michela; Page, Katharine; ...
2016-05-11
Here, we present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ~1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis
Jiang, Wen; Xie, Chunhe; Zhuang, Miaoyan; Shou, Yehang; Tang, Yongchuan
2016-01-01
Sensor data fusion technology is widely employed in fault diagnosis. The information in a sensor data fusion system is characterized by not only fuzziness, but also partial reliability. Uncertain information of sensors, including randomness, fuzziness, etc., has been extensively studied recently. However, the reliability of a sensor is often overlooked or cannot be analyzed adequately. A Z-number, Z = (A, B), can represent the fuzziness and the reliability of information simultaneously, where the first component A represents a fuzzy restriction on the values of uncertain variables and the second component B is a measure of the reliability of A. In order to model and process the uncertainties in a sensor data fusion system reasonably, in this paper, a novel method combining the Z-number and Dempster–Shafer (D-S) evidence theory is proposed, where the Z-number is used to model the fuzziness and reliability of the sensor data and the D-S evidence theory is used to fuse the uncertain information of Z-numbers. The main advantages of the proposed method are that it provides a more robust measure of reliability to the sensor data, and the complementary information of multi-sensors reduces the uncertainty of the fault recognition, thus enhancing the reliability of fault detection. PMID:27649193
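The fusion step described above can be sketched in a few lines: each sensor's basic probability assignment is discounted by its reliability (the B component of the Z-number) and the discounted masses are combined with Dempster's rule. This is a minimal illustration, not the paper's full method; the mass values, reliabilities, and fault labels below are invented, and the actual handling of Z-numbers is more involved.

```python
from itertools import product

def discount(mass, alpha):
    """Shafer discounting: scale each focal element by reliability alpha
    and move the lost mass to the whole frame (total ignorance)."""
    frame = frozenset().union(*mass)
    out = {A: alpha * m for A, m in mass.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule of combination for two mass functions over frozensets."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + a * b
        else:
            conflict += a * b
    k = 1.0 - conflict  # normalization constant; zero means total conflict
    return {A: v / k for A, v in combined.items()}

# Invented two-sensor fault evidence over faults F1, F2:
m_sensor1 = {frozenset({"F1"}): 0.8, frozenset({"F2"}): 0.2}
m_sensor2 = {frozenset({"F1"}): 0.6, frozenset({"F1", "F2"}): 0.4}

# Reliabilities 0.9 and 0.7 play the role of the Z-numbers' B components:
fused = dempster(discount(m_sensor1, 0.9), discount(m_sensor2, 0.7))
print(fused)
```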
ERIC Educational Resources Information Center
Fife, Dustin A.; Mendoza, Jorge L.; Terry, Robert
2012-01-01
Though much research and attention has been directed at assessing the correlation coefficient under range restriction, the assessment of reliability under range restriction has been largely ignored. This article uses item response theory to simulate dichotomous item-level data to assess the robustness of KR-20 (α), ω, and test-retest…
Electronic Structure and Transport in Solids from First Principles
NASA Astrophysics Data System (ADS)
Mustafa, Jamal Ibrahim
The focus of this dissertation is the determination of the electronic structure and transport properties of solids. We first review some of the theory and computational methodology used in the calculation of electronic structure and materials properties. Throughout the dissertation, we make extensive use of state-of-the-art software packages that implement density functional theory, density functional perturbation theory, and the GW approximation, in addition to specialized methods for interpolating matrix elements for extremely accurate results. The first application of the computational framework introduced is the determination of band offsets in semiconductor heterojunctions using a theory of quantum dipoles at the interface. This method is applied to the case of a heterojunction formed between a new metastable phase of silicon, with a rhombohedral structure, and cubic silicon. Next, we introduce a novel method for the construction of localized Wannier functions, which we have named the optimized projection functions method (OPFM). We illustrate the method on a variety of systems and find that it can reliably construct localized Wannier functions with minimal user intervention. We further develop the OPFM to investigate a class of materials called topological insulators, which are insulating in the bulk but have conductive surface states. These properties are a result of a nontrivial topology in their band structure, which has interesting effects on the character of the Wannier functions. In the last sections of the main text, the noble metals are studied in great detail, including their electronic properties and carrier dynamics. In particular, we investigate the Fermi surface properties of the noble metals, specifically electron-phonon scattering lifetimes, and subsequently the transport properties determined by carriers on the Fermi surface. To achieve this, a novel sampling technique is developed, with wide applicability to transport calculations. Additionally, the generation and transport of hot carriers is studied extensively. The distribution of hot carriers generated from the decay of plasmons is explored over a range of energies, and the transport properties, particularly the lifetimes and mean free paths, of the hot carriers are determined. Lastly, appendices detailing the implementation of the algorithms developed in the work are presented, along with a useful derivation of the electron-plasmon matrix elements.
Detecting Nonadditivity in Single-Facet Generalizability Theory Applications: Tukey's Test
ERIC Educational Resources Information Center
Lin, Chih-Kai; Zhang, Jinming
2018-01-01
Under the generalizability-theory (G-theory) framework, the estimation precision of variance components (VCs) is of significant importance in that they serve as the foundation of estimating reliability. Zhang and Lin advanced the discussion of nonadditivity in data from a theoretical perspective and showed the adverse effects of nonadditivity on…
Methodological Consequences of Situation Specificity: Biases in Assessments
Patry, Jean-Luc
2011-01-01
Social research is plagued by many biases. Most of them are due to the situation specificity of social behavior and can be explained using a theory of situation specificity. The historical background of situation specificity in personality and social psychology research is briefly sketched; a theory of situation specificity is then presented in detail, with, as its centerpiece, the relationship between a behavior and its outcome, which can be described as either "the more, the better" or "not too much and not too little." This theory is applied to the reliability and validity of assessments in social research. The distinction between "maximum performance" and "typical performance" is shown to correspond to the two behavior-outcome relations. For maximum performance, issues of reliability and validity are much easier to solve, whereas typical performance is sensitive to biases, as predicted by the theory. Finally, it is suggested that biases in social research are not just systematic error but represent relevant features to be explained like any other behavior, and that the respective theories should be integrated into a theory system. PMID:21713072
EFFECTIVE HYPERFINE-STRUCTURE FUNCTIONS OF AMMONIA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustovičová, L.; Soldán, P.; Špirko, V., E-mail: spirko@marge.uochb.cas.cz
The hyperfine structure of the rotation-inversion (v₂ = 0⁺, 0⁻, 1⁺, 1⁻) states of the ¹⁴NH₃ and ¹⁵NH₃ ammonia isotopomers is rationalized in terms of effective (ro-inversional) hyperfine-structure (hfs) functions. These are determined by fitting to available experimental data using Hougen's effective hyperfine-structure Hamiltonian within the framework of the non-rigid inverter theory. Involving only a moderate number of mass-independent fitting parameters, the fitted hfs functions provide a fairly close reproduction of the large majority of available experimental data, evidencing the adequacy of these functions for reliable prediction. In future experiments, this may help derive spectroscopic constants of observed inversion and rotation-inversion transitions deperturbed from hyperfine effects. The deperturbed band centers of ammonia come to the forefront of fundamental physics, especially as probes of a variable proton-to-electron mass ratio.
Young's modulus measurement of aluminum thin film with cantilever structure
NASA Astrophysics Data System (ADS)
Lee, ByoungChan; Lee, SangHun; Lee, Hwasu; Shin, Hyungjae
2001-09-01
Micromachined cantilever structures are commonly used for measuring the mechanical properties of thin-film materials in MEMS. Applying conventional cantilever theory to such experiments raises a severe problem: deformation of the supporting post and flange, produced by the applied electrostatic force, leads to measured values lower than the real Young's modulus of the thin-film material. In order to determine the Young's modulus of an aluminum thin film robustly and reproducibly, a modified cantilever structure is proposed. Two measurement methods, cantilever tip deflection measurement and resonant frequency measurement, are used to confirm the reliability of the proposed cantilever structure. The measured results indicate that the proposed scheme provides useful and credible Young's modulus values for thin-film materials of sub-micron thickness. The validated scheme also ensures that, in addition to the Young's modulus of aluminum thin films, that of other thin-film materials (aluminum alloys, other metals, and so forth) can be extracted easily and clearly.
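The resonant-frequency route mentioned above rests on the Euler-Bernoulli first-mode relation for a rectangular cantilever; a sketch inverting it for Young's modulus follows. The relation is standard beam theory rather than this paper's modified analysis, and the film dimensions, density, and frequency below are invented.

```python
import math

def youngs_modulus_from_resonance(f1, L, t, rho, lam1=1.875104):
    """Invert the Euler-Bernoulli first-mode relation for a rectangular
    cantilever:  f1 = (lam1^2 / (2*pi)) * sqrt(E*I / (rho*A*L^4)),
    using I/A = t^2/12 for a rectangular section of thickness t."""
    omega1 = 2.0 * math.pi * f1
    return 12.0 * rho * (omega1 * L**2 / lam1**2) ** 2 / t**2

# Hypothetical aluminum thin-film beam: 100 um long, 1 um thick, f1 = 80 kHz
E = youngs_modulus_from_resonance(f1=8.0e4, L=100e-6, t=1e-6, rho=2700.0)
print(f"E = {E / 1e9:.1f} GPa")  # ~66 GPa, near bulk aluminum
```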
NASA Technical Reports Server (NTRS)
Hopkins, Dale A.
1992-01-01
The presentation gives a partial overview of research and development underway in the Structures Division of LeRC, which collectively is referred to as the Computational Structures Technology Program. The activities in the program are diverse and encompass four major categories: (1) composite materials and structures; (2) probabilistic analysis and reliability; (3) design optimization and expert systems; and (4) computational methods and simulation. The approach of the program is comprehensive and entails: exploration of fundamental theories of structural mechanics to accurately represent the complex physics governing engine structural performance; formulation and implementation of computational techniques and integrated simulation strategies to provide accurate and efficient solutions of the governing theoretical models by exploiting emerging advances in computer technology; and validation and verification through numerical and experimental tests to establish confidence and define the qualities and limitations of the resulting theoretical models and computational solutions. The program comprises both in-house and sponsored research activities. The remainder of the presentation provides a sample of activities to illustrate the breadth and depth of the program and to demonstrate the accomplishments and benefits that have resulted.
Rozman, Marko
2005-10-01
The most stable charge-solvated (CS) and zwitterionic (ZW) structures of sodiated and cesiated leucine and isoleucine were studied by density functional theory methods. According to the Boltzmann distribution in the gas phase, both forms of LeuNa+ and IleNa+ exist, but in LeuCs+ and IleCs+, the ZW forms are dominant. Results for the sodiated compounds are consistent with the relationship found between the decrease in relative stability of the CS versus ZW form and aliphatic amino acid side chain length. The observed degeneracy in energy for IleNa+ conformers is at odds with kinetic method results. Additional calculations showed that kinetic method structural determinations for IleNa+ do not reflect the relative order of populations in the lowest energy conformers. Since complexation of cationized amino acids into ion-bound dimers disfavors the ZW structure by approximately 8 kJ mol(-1), it is suggested that for energetically close conformers of sodium-cationized amino acids, the kinetic method may not be reliable for structural determinations. Copyright (c) 2005 John Wiley & Sons, Ltd.
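The gas-phase populations referred to above follow from the Boltzmann distribution over conformer energies. A minimal sketch, with invented relative energies standing in for the paper's computed values:

```python
import math

def boltzmann_populations(rel_energies_kj, T=298.15):
    """Boltzmann populations from relative energies in kJ/mol."""
    RT = 8.314462618e-3 * T  # gas constant in kJ/(mol K) times T
    weights = [math.exp(-e / RT) for e in rel_energies_kj]
    Z = sum(weights)
    return [w / Z for w in weights]

# Hypothetical CS vs ZW relative energies (kJ/mol) for a sodiated amino acid:
for form, p in zip(["CS", "ZW"], boltzmann_populations([0.0, 1.5])):
    print(f"{form}: {p:.1%}")  # small energy gaps leave both forms populated
```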
Evaluation of Quantitative Environmental Stress Screening (ESS) Methods. Volume 1
1991-11-01
…required information on screening strength from the curve-fitting parameters. The underlying theory and approach taken are discussed in Appendix A. [OCR residue of a worked example tabulating incoming and outgoing defects per system (DEF/SYS) at factory and field stress; only fragments are legible.] K.W. Fertig and V.K. Murthy, "Models for Reliability Growth During Burn-in: Theory and Applications," Proceedings 1978 Annual Reliability and…
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence highly conflicts, it may produce a counterintuitive result. To address the issue, a new method is proposed in this paper. Both the static sensor reliability and the dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted-averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods. PMID:26797611
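One ingredient of the dynamic-reliability weighting named above, the belief entropy, is commonly taken in this literature to be the Deng entropy, which rewards mass spread over larger focal elements. A sketch under that assumption; the paper's full weighting also uses an evidence distance, which is omitted here, and the example masses are invented.

```python
import math

def deng_entropy(mass):
    """Deng (belief) entropy of a mass function over frozensets:
    Ed(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) ).
    Reduces to Shannon entropy when all focal elements are singletons."""
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in mass.items() if m > 0)

# A report spreading mass over a larger focal set carries more
# "information volume" and hence more weight in such schemes:
m1 = {frozenset("a"): 0.7, frozenset("ab"): 0.3}  # partly non-specific
m2 = {frozenset("a"): 0.7, frozenset("b"): 0.3}   # fully specific
print(deng_entropy(m1), deng_entropy(m2))          # m1 > m2
```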
NASA Technical Reports Server (NTRS)
Montoya, R. J. (Compiler); Howell, W. E. (Compiler); Bundick, W. T. (Compiler); Ostroff, A. J. (Compiler); Hueschen, R. M. (Compiler); Belcastro, C. M. (Compiler)
1983-01-01
Restructurable control system theory, robust reconfiguration for high reliability and survivability for advanced aircraft, restructurable controls problem definition and research, experimentation, system identification methods applied to aircraft, a self-repairing digital flight control system, and state-of-the-art theory application are addressed.
Damschroder, Laura J; Goodrich, David E; Kim, Hyungjin Myra; Holleman, Robert; Gillon, Leah; Kirsh, Susan; Richardson, Caroline R; Lutes, Lesley D
2016-09-01
Practical and valid instruments are needed to assess fidelity of coaching for weight loss. The purpose of this study was to develop and validate the ASPIRE Coaching Fidelity Checklist (ACFC). Classical test theory guided ACFC development. Principal component analyses were used to determine item groupings. Psychometric properties, internal consistency, and inter-rater reliability were evaluated for each subscale. Criterion validity was tested by predicting weight loss as a function of coaching fidelity. The final 19-item ACFC consists of two domains (session process and session structure) and five subscales (sets goals and monitors progress, assesses and personalizes self-regulatory content, manages the session, creates a supportive and empathetic climate, and stays on track). Four of five subscales showed high internal consistency (Cronbach's alphas > 0.70) for group-based coaching; only two of five subscales had high internal reliability for phone-based coaching. All five subscales were positively and significantly associated with weight loss for group-based but not for phone-based coaching. The ACFC is a reliable and valid instrument that can be used to assess fidelity and guide skill-building for weight management interventionists.
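The internal-consistency criterion applied above (alpha > 0.70) is the classical Cronbach's alpha. A self-contained computation on invented item-by-session scores:

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(totals)).
    `items` is a list of k lists, one per item, aligned across observations."""
    k, n = len(items), len(items[0])

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Invented scores: 4 checklist items rated across 6 coaching sessions
items = [[4, 5, 4, 3, 5, 4],
         [4, 4, 5, 3, 5, 4],
         [3, 5, 4, 2, 5, 4],
         [4, 4, 4, 3, 4, 5]]
print(f"alpha = {cronbach_alpha(items):.2f}")  # -> alpha = 0.86
```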
Validation of the Resilience Scale for Adolescents in Norwegian adolescents 13-18 years.
Moksnes, Unni K; Haugan, Gørill
2018-03-01
Resilience is seen as a vital resource for coping and mental health in adolescents. However, there is no universally accepted theory or definition of resilience, leading to considerable challenges regarding how to operationalise and measure this construct. The study aimed at providing further knowledge of the psychometric properties (dimensionality, construct validity and internal consistency) of the 28-item version of the Resilience Scale for Adolescents (READ) in N = 1183 Norwegian adolescents, 13-18 years old. Dimensionality of READ was tested using confirmatory factor analysis (CFA). Convergent validity and reliability were tested using Pearson's correlation analysis, Cronbach's alpha and composite reliability. The CFA supported a modified, 20-item, five-factor structure with high reliability, supporting the dimensionality and internal consistency of the instrument. Convergent validity was confirmed where all factors correlated in expected directions with measures of sense of coherence, self-esteem, stress and depression. The psychometric properties of the READ need to be further evaluated in adolescents; however, the results indicate that a modified 20-item version of READ is adequate for assessing resilience in the present sample of Norwegian adolescents. © 2017 Nordic College of Caring Science.
Motivation in later life: theory and assessment.
Vallerand, R J; O'Connor, B P; Hamel, M
1995-01-01
A framework that has been found useful in research on young adults, Deci and Ryan's self-determination theory [1, 2], is suggested as a promising direction for research on motivation in later life. The theory proposes the existence of four types of motivation (intrinsic, self-determined extrinsic, nonself-determined extrinsic, and amotivation) which are assumed to have varying consequences for adaptation and well-being. A previously published French measure of motivational styles which is known to be reliable and valid was translated into English and was tested on seventy-seven nursing home residents (aged 60 to 98 years). It was found that the four motivational styles can be reliably measured; that the intercorrelations between the motivational styles are consistent with theoretical predictions; and that the four types of motivation are related to other important aspects of the lives of elderly people in a theoretically meaningful manner. Suggestions are made for further research using self-determination theory and the present scales.
The quest for a general theory of aging and longevity.
Gavrilov, Leonid A; Gavrilova, Natalia S
2003-07-16
Extensive studies of phenomena related to aging have produced many diverse findings, which require a general theoretical framework to be organized into a comprehensive body of knowledge. As demonstrated by the success of evolutionary theories of aging, quite general theoretical considerations can be very useful when applied to research on aging. In this theoretical study, we attempt to gain insight into aging by applying a general theory of systems failure known as reliability theory. Considerations of this theory lead to the following conclusions: (i) Redundancy is a concept of crucial importance for understanding aging, particularly the systemic nature of aging. Systems that are redundant in numbers of irreplaceable elements deteriorate (that is, age) over time, even if they are built of elements that do not themselves age. (ii) An apparent aging rate or expression of aging is higher for systems that have higher levels of redundancy. (iii) Redundancy exhaustion over the life course explains a number of observations about mortality, including mortality convergence at later life (when death rates are becoming relatively similar at advanced ages for different populations of the same species) as well as late-life mortality deceleration, leveling off, and mortality plateaus. (iv) Living organisms apparently contain a high load of initial damage from the early stages of development, and therefore their life span and aging patterns may be sensitive to early-life conditions that determine this initial damage load. Thus, the reliability theory provides a parsimonious explanation for many important aging-related phenomena and suggests a number of interesting testable predictions. We therefore suggest adding the reliability theory to the arsenal of methodological approaches applied to research on aging.
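Conclusion (ii) and the mortality plateau in (iii) can be reproduced with the simplest reliability-theory model: a parallel block of n non-aging elements, each with constant hazard λ. The system failure rate rises with age and then levels off at λ. The formula below is standard reliability theory, and the parameter values are purely illustrative.

```python
import math

def system_hazard(t, n, lam=1.0):
    """Failure rate of a parallel block of n non-aging elements with
    constant hazard lam: h(t) = f(t) / S(t), S(t) = 1 - (1 - e^{-lam t})^n."""
    q = 1.0 - math.exp(-lam * t)                   # P(one element dead by t)
    S = 1.0 - q ** n                               # system survival
    f = n * lam * math.exp(-lam * t) * q ** (n - 1)  # system failure density
    return f / S

# Hazard rises steeply with age, then plateaus near lam (= 1 here):
for t in (0.1, 0.5, 1, 2, 5, 10):
    print(t, round(system_hazard(t, n=5), 4))
```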
The value of SPaCE in delivering patient feedback.
Clapham, Laura; Allan, Laura; Stirling, Kevin
2016-02-01
The use of simulated patients (SPs) within undergraduate medical curricula is an established and valued learning opportunity. Within the context of simulation, it is imperative to capture feedback from all participants within the simulation activity. The Simulated Patient Candidate Evaluation (SPaCE) tool was developed to deliver SP feedback following a simulation activity. SpaCE is a closed feedback tool that allows SPs to rate a student's performance, using a five-point Likert scale, in three domains: attitude; interaction skills; and management. This research study examined the value of the SPaCE tool and how it contributes to the overall feedback that a student receives. Classical test theory was used to determine the reliability of the SPaCE tool. An evaluation of all SP responses was conducted to observe trends in scoring patterns for each question. Qualitative data were collected via a free-text questionnaire and subsequent focus group discussion. It is imperative to capture feedback from all participants within the simulation activity Classical test theory determined that the SPaCE tool had a reliability co-efficient of 0.89. A total of 13 SPs replied to the questionnaire. A thematic analysis of all questionnaire data identified that the SPaCE tool provides a structure that allows patient feedback to be given effectively following a simulation activity. These themes were discussed further with six SPs who attended the subsequent focus group session. The SPaCE tool has been shown to be a reliable closed feedback tool that allows SPs to discriminate between students, based on their performance. The next stage in the development of the SPaCE tool is to test the wider applicability of this feedback tool. © 2015 John Wiley & Sons Ltd.
Design of Oil-Lubricated Machine for Life and Reliability
NASA Technical Reports Server (NTRS)
Zaretsky, Erwin V.
2007-01-01
In the post-World War II era, the major technology drivers for improving the life, reliability, and performance of rolling-element bearings and gears have been the jet engine and the helicopter. By the late 1950s, most of the materials used for bearings and gears in the aerospace industry had been introduced into use. By the early 1960s, the life of most steels was increased over that experienced in the early 1940s, primarily by the introduction of vacuum degassing and vacuum melting processes in the late 1950s. The development of elastohydrodynamic (EHD) theory showed that most rolling bearings and gears have a thin film separating the contacting bodies during motion and it is that film which affects their lives. Computer programs modeling bearing and gear dynamics that incorporate probabilistic life prediction methods and EHD theory enable optimization of rotating machinery based on life and reliability. With improved manufacturing and processing, the potential improvement in bearing and gear life can be as much as 80 times that attainable in the early 1950s. The work presented summarizes the use of laboratory fatigue data for bearings and gears coupled with probabilistic life prediction and EHD theories to predict the life and reliability of a commercial turboprop gearbox. The resulting predictions are compared with field data.
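The probabilistic life predictions described above rest on the Weibull life-reliability relation used in rolling-element bearing analysis. A sketch converting a rated L10 life (90% survival) to the life at another reliability level; the Weibull slope and the L10 value are assumed for illustration, not taken from the paper.

```python
import math

def life_at_reliability(L10, S, e=1.5):
    """Weibull life-reliability relation for bearings:
    L_S = L10 * (ln(1/S) / ln(1/0.9))**(1/e),
    where e is the Weibull slope (~1.1-1.5 for bearing steels; assumed)."""
    return L10 * (math.log(1.0 / S) / math.log(1.0 / 0.9)) ** (1.0 / e)

# Hypothetical bearing rated L10 = 10,000 h: life at 99% reliability
print(f"L1 = {life_at_reliability(10_000, 0.99):,.0f} h")  # ~0.21 * L10
```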
Safety assessment of a shallow foundation using the random finite element method
NASA Astrophysics Data System (ADS)
Zaskórski, Łukasz; Puła, Wojciech
2015-04-01
The complex structure of soil and its random character make soil modeling a cumbersome task. Heterogeneity of soil has to be considered even within a homogeneous layer, so estimating the shear strength parameters of soil for a geotechnical analysis causes many problems. The applicable standard (Eurocode 7) presents no explicit method for evaluating characteristic values of soil parameters; only general guidelines on how these values should be estimated can be found. Hence many approaches to assessing characteristic values of soil parameters are presented in the literature and can be applied in practice. In this paper, the reliability assessment of a shallow strip footing was conducted using a reliability index β, and several approaches to estimating characteristic values of soil properties were compared by evaluating the values of β each of them achieves: the method of Orr and Breysse, Duncan's method, Schneider's method, Schneider's method accounting for the influence of fluctuation scales, and the method included in Eurocode 7. Design values of the bearing capacity based on these approaches were referred to the stochastic bearing capacity estimated by the random finite element method (RFEM). Design values of the bearing capacity were computed for various widths and depths of the foundation in conjunction with the design approaches (DA) defined in Eurocode. RFEM, introduced by Griffiths and Fenton (1993), combines the deterministic finite element method, random field theory, and Monte Carlo simulation. Random field theory makes it possible to capture the random character of soil parameters within a homogeneous layer: a soil property is treated as a separate random variable in every element of the finite element mesh, with a proper correlation structure between points of a given area. RFEM was applied to establish which theoretical probability distribution best fits the empirical distribution of bearing capacity, based on 3000 realizations. The fitted distribution was then used to compute design values of the bearing capacity and the related reliability indices β. The analyses were carried out for a cohesive soil; hence the friction angle and the cohesion were defined as random parameters characterized by two-dimensional random fields. The friction angle was described by a bounded distribution, as it varies within a limited range, while a lognormal distribution was applied to the cohesion. The other properties (Young's modulus, Poisson's ratio, and unit weight) were assumed deterministic because they have negligible influence on the stochastic bearing capacity. Griffiths D. V., & Fenton G. A. (1993). Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6), 577-587.
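The link between a Monte Carlo failure probability of the kind RFEM produces and the reliability index is β = −Φ⁻¹(pf). A toy sketch with an invented lognormal capacity against a fixed load, standing in for the strip-footing realizations:

```python
import random
from statistics import NormalDist

random.seed(1)

def g():
    """Toy limit state g = R - E: lognormal 'bearing capacity' R against a
    fixed load effect E. Parameters are invented stand-ins for RFEM output."""
    R = random.lognormvariate(mu=1.0, sigma=0.25)  # capacity
    return R - 1.8                                  # load effect E = 1.8

n = 100_000
pf = sum(g() < 0 for _ in range(n)) / n   # Monte Carlo failure probability
beta = -NormalDist().inv_cdf(pf)          # reliability index
print(f"pf = {pf:.4f}, beta = {beta:.2f}")  # ~0.05 and ~1.65 here
```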
2016-10-01
Reports an error in "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias" by Thomas L. Rodebaugh, Rachel B. Scullin, Julia K. Langer, David J. Dixon, Jonathan D. Huppert, Amit Bernstein, Ariel Zvielli and Eric J. Lenze (Journal of Abnormal Psychology, 2016[Aug], Vol 125[6], 840-851). There was an error in the Author Note concerning the support of the MacBrain Face Stimulus Set. The correct statement is provided. (The following abstract of the original article appeared in record 2016-30117-001.) The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically oriented measures can only be certain if such measurements are reliable. Two pillars of the National Institute of Mental Health's portfolio, the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials, cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
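Why difference-based bias scores tend to be unreliable can be seen from the classical-test-theory formula for the reliability of D = X − Y. This is a textbook formula offered as background, not taken from the article; the numbers are illustrative.

```python
def difference_score_reliability(rxx, ryy, rxy, sx=1.0, sy=1.0):
    """Classical-test-theory reliability of the difference D = X - Y:
    r_D = (sx^2*rxx + sy^2*ryy - 2*rxy*sx*sy) / (sx^2 + sy^2 - 2*rxy*sx*sy)."""
    num = sx**2 * rxx + sy**2 * ryy - 2 * rxy * sx * sy
    den = sx**2 + sy**2 - 2 * rxy * sx * sy
    return num / den

# Two reaction-time conditions, each individually reliable (0.85) but
# highly correlated (0.80): their difference is far less reliable.
print(difference_score_reliability(0.85, 0.85, 0.80))  # -> 0.25
```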
System reliability approaches for advanced propulsion system structures
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Mahadevan, S.
1991-01-01
This paper identifies significant issues that pertain to the estimation and use of system reliability in the design of advanced propulsion system structures. Linkages between the reliabilities of individual components and their effect on system design issues such as performance, cost, availability, and certification are examined. The need for system reliability computation to address the continuum nature of propulsion system structures and synergistic progressive damage modes has been highlighted. Available system reliability models are observed to apply only to discrete systems. Therefore a sequential structural reanalysis procedure is formulated to rigorously compute the conditional dependencies between various failure modes. The method is developed in a manner that supports both top-down and bottom-up analyses in system reliability.
NASA Technical Reports Server (NTRS)
Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.
1992-01-01
Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.
Mokkink, Lidwine Brigitta; Galindo-Garre, Francisca; Uitdehaag, Bernard Mj
2016-12-01
The Multiple Sclerosis Walking Scale-12 (MSWS-12) measures walking ability from the patients' perspective. We examined the quality of the MSWS-12 using an item response theory model, the graded response model (GRM). A total of 625 unique Dutch multiple sclerosis (MS) patients were included. After testing for unidimensionality, monotonicity, and absence of local dependence, a GRM was fit and item characteristics were assessed. Differential item functioning (DIF) for the variables gender, age, duration of MS, type of MS and severity of MS, reliability, total test information, and standard error of the trait level (θ) were investigated. Confirmatory factor analysis showed a unidimensional structure of the 12 items of the scale, explaining 88% of the variance. Item 2 did not fit into the GRM model. Reliability was 0.93. Items 8 and 9 (of the 11 and 12 item version respectively) showed DIF on the variable severity, based on the Expanded Disability Status Scale (EDSS). However, the EDSS is strongly related to the content of both items. Our results confirm the good quality of the MSWS-12. The trait level (θ) scores and item parameters of both the 12- and 11-item versions were highly comparable, although we do not suggest to change the content of the MSWS-12. © The Author(s), 2016.
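The graded response model fitted above specifies cumulative category probabilities as 2PL logistic curves at each threshold; category probabilities are the differences of adjacent cumulative curves. A sketch with invented item parameters:

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Samejima's graded response model: P(X >= k | theta) is a 2PL logistic
    at each increasing threshold b_k; category probabilities are differences
    of adjacent cumulative probabilities."""
    cum = ([1.0]
           + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
           + [0.0])
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

# Invented item: discrimination a = 2.0, four thresholds -> 5 categories
probs = grm_category_probs(theta=0.5, a=2.0, thresholds=[-1.5, -0.5, 0.4, 1.2])
print([round(p, 3) for p in probs], "sum =", round(sum(probs), 6))
```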
Ownsworth, Tamara; Little, Trudi; Turner, Ben; Hawkes, Anna; Shum, David
2008-10-01
To investigate the clinical potential of the Depression, Anxiety and Stress Scales (DASS 42) and its shorter version (DASS 21) for assessing emotional status following acquired brain injury. Participants included 23 individuals with traumatic brain injury (TBI), 25 individuals with brain tumour and 29 non-clinical controls. Investigations of internal consistency, test-retest reliability, theory-consistent differences, sensitivity to change and concurrent validity were conducted. Internal consistency of the DASS was generally acceptable (r > 0.70), with the exception of the anxiety scale for the TBI sample. Test-retest reliability (1-3 weeks) was sound for the depression scale (r > 0.75) and significant but comparatively lower for the other scales (r = 0.60-0.73, p < 0.01). Theory-consistent differences were only evident between the brain tumour sample and the non-clinical control sample on the anxiety scale (p < 0.01). Sensitivity to change of the DASS in the context of hospital discharge was demonstrated for depression and stress (p < 0.01), but not for anxiety (p > 0.05). Concurrent validity with the Hospital Anxiety and Depression Scale was significant for all scales of the DASS (p < 0.05). While the results generally support the clinical application of the DASS following ABI, further research examining the factor structure of existing and modified versions of the DASS is recommended.
Alonso-Tapia, Jesús; Huertas, Juan A; Ruiz, Miguel A
2010-05-01
In a historical revision of the achievement goal construct, Elliot (2005) recognized that there is little consensus on whether the term "goal" in "achievement goal orientations" (GO) is best represented as an "aim", as an overarching orientation encompassing several "aims", or as a combination of aims and other processes (self-regulation, etc.). Elliot also pointed out that goal theory research provides evidence for different models of GO. As there was no consensus on these issues, we set out to gather evidence about the nature and structure of GO, about the role of gender differences in the configuration of that structure, and about the relations between GO, expectancies, volitional processes and achievement. A total of 382 university students from different faculties of two public universities in Madrid (Spain), who voluntarily agreed to fill in a questionnaire assessing different goals, expectancies and self-regulatory processes, participated in the study. Scale reliability, confirmatory factor analyses, multiple-group analyses, and correlation and regression analyses were carried out. The results support the trichotomous model of GO and the view of GO as a combination of aims and other psychological processes, show some gender differences, and favour the adoption of a multiple-goal perspective for explaining students' motivation.
Numerical Simulation of the Layer-by-Layer Destruction of Cylindrical Shells Under Explosive Loading
NASA Astrophysics Data System (ADS)
Abrosimov, N. A.; Novoseltseva, N. A.
2015-09-01
A technique is elaborated for the numerical analysis of the influence of reinforcement structure on the character of the dynamic response and the process of layer-by-layer destruction of layered fiberglass cylindrical shells under an axisymmetric internal explosive loading. The kinematic model of deformation of the laminate package is based on a nonclassical theory of shells. The geometric dependences are based on simple quadratic relations of the nonlinear theory of elasticity. The relationship between the stress and strain tensors is established using Hooke's law for orthotropic bodies, accounting for the degradation of the stiffness characteristics of the multilayer composite due to local destruction of some of its elementary layers. An energetically consistent system of dynamic equations for composite cylindrical shells is obtained by minimizing the functional of total energy of the shell as a three-dimensional body. The numerical method for solving the formulated initial boundary-value problem is based on an explicit variational-difference scheme. Results confirming the reliability of the method and analyzing the influence of reinforcement structure on the character of destruction and the bearing capacity of pulse-loaded cylindrical shells are presented.
Constructing a Grounded Theory of E-Learning Assessment
ERIC Educational Resources Information Center
Alonso-Díaz, Laura; Yuste-Tosina, Rocío
2015-01-01
This study traces the development of a grounded theory of assessment in e-learning environments, a field in need of research to establish the parameters of an assessment that is both reliable and worthy of higher learning accreditation. Using grounded theory as a research method, we studied an e-assessment model that does not require physical…
Boeschen Hospers, J Mirjam; Smits, Niels; Smits, Cas; Stam, Mariska; Terwee, Caroline B; Kramer, Sophia E
2016-04-01
We reevaluated the psychometric properties of the Amsterdam Inventory for Auditory Disability and Handicap (AIADH; Kramer, Kapteyn, Festen, & Tobi, 1995) using item response theory. Item response theory describes item functioning along an ability continuum. Cross-sectional data from 2,352 adults with and without hearing impairment, ages 18-70 years, were analyzed. They completed the AIADH in the web-based prospective cohort study "Netherlands Longitudinal Study on Hearing." A graded response model was fitted to the AIADH data. Category response curves, item information curves, and the standard error as a function of self-reported hearing ability were plotted. The graded response model showed a good fit. Item information curves were most reliable for adults who reported having hearing disability and less reliable for adults with normal hearing. The standard error plot showed that self-reported hearing ability is most reliably measured for adults reporting mild up to moderate hearing disability. This is one of the few item response theory studies on audiological self-reports. All AIADH items could be hierarchically placed on the self-reported hearing ability continuum, meaning they measure the same construct. This provides a promising basis for developing a clinically useful computerized adaptive test, where item selection adapts to the hearing ability of individuals, resulting in efficient assessment of hearing disability.
Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.
Hayward, Elizabeth O; Homer, Bruce D
2017-09-01
Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks. © 2017 The British Psychological Society.
Recent advances in computational structural reliability analysis methods
NASA Astrophysics Data System (ADS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
Recent advances in computational structural reliability analysis methods
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-01-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
On fatigue crack growth under random loading
NASA Astrophysics Data System (ADS)
Zhu, W. Q.; Lin, Y. K.; Lei, Y.
1992-09-01
A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
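The deterministic core that the paper randomizes is the Paris-Erdogan law, da/dN = C(ΔK)^m with ΔK = Δσ√(πa) for a center crack (geometry factor taken as 1). A sketch integrating it from an initial to a critical crack length; the material constants and loading below are invented, and the paper's treatment additionally makes C and the stress history random.

```python
import math

def cycles_to_grow(a0, ac, dsigma, C=1e-11, m=3.0, da=1e-5):
    """Integrate the Paris-Erdogan law da/dN = C * (dK)^m with
    dK = dsigma * sqrt(pi * a); crack length a in m, dsigma in MPa."""
    a, N = a0, 0.0
    while a < ac:
        dK = dsigma * math.sqrt(math.pi * a)  # stress intensity range
        N += da / (C * dK**m)                 # cycles spent on increment da
        a += da
    return N

# Invented case: 1 mm initial crack grown to 10 mm at a 100 MPa stress range
print(f"N = {cycles_to_grow(1e-3, 1e-2, 100.0):,.0f} cycles")  # ~8e5
```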
An adaptive interpolation scheme for molecular potential energy surfaces
NASA Astrophysics Data System (ADS)
Kowalewski, Markus; Larsson, Elisabeth; Heryudono, Alfa
2016-08-01
The calculation of potential energy surfaces for quantum dynamics can be a time-consuming task, especially when a high level of theory is required for the electronic structure calculation. We propose an adaptive interpolation algorithm based on polyharmonic splines combined with a partition of unity approach. The adaptive node refinement greatly reduces the number of sample points by employing a local error estimate. The algorithm and its scaling behavior are evaluated for a model function in 2, 3, and 4 dimensions. The developed algorithm allows for a more rapid and reliable interpolation of a potential energy surface within a given accuracy compared to the non-adaptive version.
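The basic, non-adaptive building block (a polyharmonic-spline fit to scattered samples) can be reproduced with scipy's RBFInterpolator. The adaptive node refinement and partition-of-unity localization of the paper are not reproduced here, and the 2-D model surface below is an invented stand-in for an ab initio potential.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def f(x):
    """Morse-like 2-D model 'potential' as a stand-in surface."""
    r = np.linalg.norm(x, axis=-1)
    return (1.0 - np.exp(-(r - 1.0))) ** 2

rng = np.random.default_rng(0)
nodes = rng.uniform(0.5, 3.0, size=(200, 2))       # scattered sample points
rbf = RBFInterpolator(nodes, f(nodes),
                      kernel="thin_plate_spline")   # polyharmonic r^2 log r

test = rng.uniform(0.6, 2.9, size=(500, 2))
print(f"max |error| = {np.max(np.abs(rbf(test) - f(test))):.2e}")
```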
Density functional calculations of the Mössbauer parameters in hexagonal ferrite SrFe12O19
NASA Astrophysics Data System (ADS)
Ikeno, Hidekazu
2018-03-01
Mössbauer parameters in a magnetoplumbite-type hexagonal ferrite, SrFe12O19, are computed using the all-electron band structure calculation based on the density functional theory. The theoretical isomer shift and quadrupole splitting are consistent with experimentally obtained values. The absolute values of hyperfine splitting parameters are found to be underestimated, but the relative scale can be reproduced. The present results validate the site-dependence of Mössbauer parameters obtained by analyzing experimental spectra of hexagonal ferrites. The results also show the usefulness of theoretical calculations for increasing the reliability of interpretation of the Mössbauer spectra.
Application Research of Fault Tree Analysis in Grid Communication System Corrective Maintenance
NASA Astrophysics Data System (ADS)
Wang, Jian; Yang, Zhenwei; Kang, Mei
2018-01-01
This paper applies the fault tree analysis method to corrective maintenance of grid communication systems. A fault tree model of a typical system is established and, combined with engineering experience, analyzed using fault tree theory, covering the structural function, probability importance, and related measures. The results show that fault tree analysis enables fast fault localization and effective repair of the system. The method is also found to offer guidance for researching and upgrading the reliability of the system.
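The quantitative step can be sketched as follows: the top-event probability is computed from minimal cut sets by inclusion-exclusion, and a Birnbaum (probability) importance by finite difference. The cut sets, event names, and probabilities below are hypothetical, not taken from the paper.

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Top-event probability via inclusion-exclusion over minimal cut sets
    (a cut set fails when all its basic events fail). Assumes independent
    basic events; exact, so only suitable for small trees."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

# Hypothetical grid-communication fault tree: two minimal cut sets
cuts = [{"power_A", "power_B"}, {"fiber", "microwave_backup"}]
p = {"power_A": 0.01, "power_B": 0.02, "fiber": 0.005, "microwave_backup": 0.1}
base = top_event_prob(cuts, p)

# Birnbaum importance of "fiber": dP(top)/dp_fiber by finite difference
birnbaum = (top_event_prob(cuts, dict(p, fiber=1.0))
            - top_event_prob(cuts, dict(p, fiber=0.0)))
print(f"P(top) = {base:.6f}, Birnbaum(fiber) = {birnbaum:.4f}")
```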
Theoretical model for plasmonic photothermal response of gold nanostructures solutions
NASA Astrophysics Data System (ADS)
Phan, Anh D.; Nga, Do T.; Viet, Nguyen A.
2018-03-01
Photothermal effects of gold core-shell nanoparticles and nanorods dispersed in water are theoretically investigated using the transient bioheat equation and the extended Mie theory. Properly calculating the absorption cross section is a crucial step in determining the elevation of the solution temperature. The nanostructures are assumed to be randomly and uniformly distributed in the solution. Compared to previous experiments, our theoretical temperature increase during laser illumination shows reasonable qualitative and quantitative agreement across various systems. This approach can be a highly reliable tool for predicting photothermal effects in experimentally unexplored structures. We also validate our approach and discuss its limitations.
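A lumped-capacitance reduction of the transient heat balance conveys the mechanism: absorbed laser power heats the solution while Newtonian losses drive it toward a steady state. This is a simplification of the paper's bioheat treatment, and every parameter value below is invented.

```python
def temperature_rise(t_end, dt=0.01,
                     power_in=0.05,   # W absorbed: N_particles * sigma_abs * I
                     h_A=0.002,       # W/K effective heat-loss coefficient
                     heat_cap=4.18):  # J/K, roughly 1 mL of water
    """Lumped heating of a nanoparticle solution under CW illumination:
    heat_cap * d(dT)/dt = power_in - h_A * dT, solved by forward Euler."""
    dT = 0.0
    for _ in range(int(t_end / dt)):
        dT += dt * (power_in - h_A * dT) / heat_cap
    return dT

# Approaches the steady state power_in / h_A = 25 K with a time constant
# heat_cap / h_A ~ 35 min (all values invented):
for t in (60, 600, 3600):
    print(t, "s:", round(temperature_rise(t), 2), "K")
```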
Blouin, Danielle; Day, Andrew G.; Pavlov, Andrey
2011-01-01
Background Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. Methods In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Results Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. Conclusions A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program. PMID:23205201
Blouin, Danielle; Day, Andrew G; Pavlov, Andrey
2011-12-01
Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains established as essential to succeed in this residency program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuwahara, Tomotaka, E-mail: tomotaka.phys@gmail.com; WPI, Advanced Institute for Materials Research, Tohoku University, Sendai 980-8577; Mori, Takashi
2016-04-15
This work explores a fundamental dynamical structure for a wide range of many-body quantum systems under periodic driving. Generically, in the thermodynamic limit, such systems are known to heat up to infinite-temperature states in the long-time limit irrespective of dynamical details, which kills all the specific properties of the system. In the present study, instead of considering an infinitely long time scale, we aim to provide a general framework to understand the long but finite time behavior, namely the transient dynamics. In our analysis, we focus on the Floquet–Magnus (FM) expansion, which gives a formal expression of the effective Hamiltonian on the system. Although in general the full series expansion is not convergent in the thermodynamic limit, we give a clear relationship between the FM expansion and the transient dynamics. More precisely, we rigorously show that a truncated version of the FM expansion accurately describes the exact dynamics for a certain time scale. Our theory reveals an experimental time scale for which non-trivial dynamical phenomena can be reliably observed. We discuss several dynamical phenomena, such as the effect of small integrability breaking, efficient numerical simulation of periodically driven systems, dynamical localization, and thermalization. In particular, for thermalization we discuss a generic scenario for the prethermalization phenomenon in periodically driven systems. Highlights: • A general framework to describe transient dynamics for periodically driven systems. • The theory is applicable to generic quantum many-body systems, including long-range interacting systems. • The physical meaning of the truncation of the Floquet–Magnus expansion is rigorously established. • A new mechanism of prethermalization is proposed. • An experimental time scale for which non-trivial dynamical phenomena can be reliably observed is revealed.
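For reference, in one common convention (ħ = 1) the first orders of the FM expansion of the effective Floquet Hamiltonian for a T-periodic H(t) read:

```latex
% Floquet-Magnus expansion for H(t) = H(t+T), one common convention:
\begin{align}
  H_F &= \sum_{n=0}^{\infty} \Omega_n, \\
  \Omega_0 &= \frac{1}{T}\int_0^T \! dt\, H(t), \\
  \Omega_1 &= \frac{1}{2iT}\int_0^T \! dt_1 \int_0^{t_1} \! dt_2\,
              \bigl[H(t_1),\, H(t_2)\bigr].
\end{align}
```

The abstract's point is that although this series generically diverges in the thermodynamic limit, truncating it at a suitable order still captures the exact dynamics up to a long but finite time scale.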
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz, Peter A.
For the purposes of making reliable first-principles predictions of defect energies in semiconductors, it is crucial to distinguish between effective-mass-like defects, which cannot be treated accurately with existing supercell methods, and deep defects, for which density functional theory calculations can yield reliable predictions of defect energy levels. The gallium antisite defect Ga_As is often associated with the 78/203 meV shallow double acceptor in Ga-rich gallium arsenide. Within a conceptual framework of level patterns, analyses of structure and spin stabilization can be used within a supercell approach to distinguish localized deep defect states from shallow acceptors such as B_As. This systematic approach determines that the gallium antisite supercell results have signatures inconsistent with an effective-mass state, so the antisite cannot be the 78/203 meV shallow double acceptor. Finally, the properties of the Ga antisite in GaAs are described: total energy calculations that explicitly map onto asymptotic discrete localized bulk states predict that the Ga antisite is a deep double acceptor and has at least one deep donor state.
Student engagement and its relationship with early high school dropout.
Archambault, Isabelle; Janosz, Michel; Fallu, Jean-Sébastien; Pagani, Linda S
2009-06-01
Although the concept of school engagement figures prominently in most school dropout theories, there has been little empirical research conducted on its nature and course and, more importantly, the association with dropout. Information on the natural development of school engagement would greatly benefit those interested in preventing student alienation during adolescence. Using a longitudinal sample of 11,827 French-Canadian high school students, we tested behavioral, affective, and cognitive indices of engagement both separately and as a global construct. We then assessed their contribution as prospective predictors of school dropout using factor analysis and structural equation modeling. Global engagement reliably predicted school dropout. Among its three specific dimensions, only behavioral engagement made a significant contribution in the prediction equation. Our findings confirm the robustness of the overall multidimensional construct of school engagement, which reflects both cognitive and psychosocial characteristics, and underscore the importance attributed to basic participation and compliance issues in reliably estimating risk of not completing basic schooling during adolescence.
Abou Samra, Haifa; McGrath, Jacqueline M; Estes, Tracy
2013-06-01
No instrument exists that measures student perceptions of the faculty role. Such a measure is necessary to evaluate the efficacy of interventions aimed at attracting students to the faculty career path. We developed the Nurse Educator Scale (NES). The initial scale items were generated using the social cognitive career theory (SCCT) constructs and were reviewed by an expert panel to ensure content validity. Exploratory factor analysis was used. The optimized 25-item, 7-point Likert scale has a Cronbach's alpha reliability coefficient of 0.85 and accounts for 42% of the total variance. The underlying factor structure supported three defining characteristics congruent with SCCT: outcome expectations (alpha = 0.79), relevant knowledge (alpha = 0.67), and social influence (alpha = 0.80). A stand-alone item measuring goal setting was also supported. The NES provides a valid and reliable measure of students' intentions and motivations to pursue a future career as a nurse educator or scientist. Copyright 2013, SLACK Incorporated.
Paap, Muirne C S; Braeken, Johan; Pedersen, Geir; Urnes, Øyvind; Karterud, Sigmund; Wilberg, Theresa; Hummelen, Benjamin
2017-12-01
This study aims at evaluating the psychometric properties of the antisocial personality disorder (ASPD) criteria in a large sample of patients, most of whom had one or more personality disorders (PD). PD diagnoses were assessed by experienced clinicians using the Structured Clinical Interview for Diagnostic and Statistical Manual of Mental Disorders, 4th edition, Axis II PDs. Analyses were performed within an item response theory framework. Results of the analyses indicated that ASPD is a unidimensional construct that can be measured reliably at the upper range of the latent trait scale. Differential item functioning across gender was restricted to two criteria and had little impact on the latent ASPD trait level. Patients fulfilling both the adult ASPD criteria and the conduct disorder criteria had similar latent trait distributions as patients fulfilling only the adult ASPD criteria. Overall, the ASPD items fit the purpose of a diagnostic instrument well, that is, distinguishing patients with moderate from those with high antisocial personality scores.
Tamuz, Michal; Harrison, Michael I
2006-01-01
Objective To identify the distinctive contributions of high-reliability theory (HRT) and normal accident theory (NAT) as frameworks for examining five patient safety practices. Data Sources/Study Setting We reviewed and drew examples from studies of organization theory and health services research. Study Design After highlighting key differences between HRT and NAT, we applied the frames to five popular safety practices: double-checking medications, crew resource management (CRM), computerized physician order entry (CPOE), incident reporting, and root cause analysis (RCA). Principal Findings HRT highlights how double checking, which is designed to prevent errors, can undermine mindfulness of risk. NAT emphasizes that social redundancy can diffuse and reduce responsibility for locating mistakes. CRM promotes high reliability organizations by fostering deference to expertise, rather than rank. However, HRT also suggests that effective CRM depends on fundamental changes in organizational culture. NAT directs attention to an underinvestigated feature of CPOE: it tightens the coupling of the medication ordering process, and tight coupling increases the chances of a rapid and hard-to-contain spread of infrequent, but harmful errors. Conclusions Each frame can make a valuable contribution to improving patient safety. By applying the HRT and NAT frames, health care researchers and administrators can identify health care settings in which new and existing patient safety interventions are likely to be effective. Furthermore, they can learn how to improve patient safety, not only from analyzing mishaps, but also by studying the organizational consequences of implementing safety measures. PMID:16898984
Evaluating and enhancing quantum capacitance in graphene-based electrodes from first principles
NASA Astrophysics Data System (ADS)
Ogitsu, Tadashi; Otani, Minoru; Lee, Jonathan; Bagge-Hansen, Michael; Biener, Juergen; Wood, Brandon
2013-03-01
Graphene derivatives are attractive as supercapacitor electrodes because they are lightweight, chemically inert, have high surface area and conductivity, and are stable in electrolyte solutions. Nevertheless, devising reliable strategies for improving energy density relies on an understanding of the specific factors that control electrode performance. We use density-functional theory calculations of pristine and defective graphene to extract quantum capacitance, as well as to identify specific limiting factors. The effect of structural point defects and strain-related morphological changes on the density of states is also evaluated. The results are combined with predicted and measured in situ X-ray absorption spectra in order to give insight into the structural and chemical features present in synthesized carbon aerogel samples. Performed under the auspices of the U.S. DOE by LLNL under Contract DE-AC52-07NA27344.
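To make the central quantity concrete, here is a minimal sketch (not the authors' DFT pipeline) that evaluates the textbook quantum-capacitance relation C_Q(V) = e^2 * integral of D(E) (-df/dE at E - eV) dE for the analytic density of states of ideal graphene; the parameter values are illustrative assumptions, and a DFT-derived DOS for defective graphene would simply replace dos_graphene below.

```python
# Minimal sketch: quantum capacitance of ideal (defect-free) graphene from its
# analytic density of states, using the textbook relation
#   C_Q(V) = e^2 * integral D(E) * (-df/dE)|_(E - eV) dE.
# Illustrates the DOS -> C_Q step only; the paper's DFT-derived DOS for
# defective graphene would replace dos_graphene below.
import numpy as np

E_CHARGE = 1.602176634e-19      # C
KB_T = 0.025852 * E_CHARGE      # J, thermal energy at ~300 K
HBAR_VF = 1.0546e-34 * 1.0e6    # J*m, hbar times Fermi velocity (~1e6 m/s)

def dos_graphene(E):
    """Ideal graphene DOS per unit area (states / (J * m^2))."""
    return 2.0 * np.abs(E) / (np.pi * HBAR_VF**2)

def thermal_broadening(E):
    """-df/dE for the Fermi function: a peaked window of width ~kT."""
    x = E / (2.0 * KB_T)
    return 1.0 / (4.0 * KB_T * np.cosh(x)**2)

def quantum_capacitance(V, n_grid=20001, e_span=1.0):
    """C_Q in F/m^2 at local electrostatic potential V (volts)."""
    E = np.linspace(-e_span, e_span, n_grid) * E_CHARGE  # energy grid, J
    integrand = dos_graphene(E) * thermal_broadening(E - E_CHARGE * V)
    return E_CHARGE**2 * np.trapz(integrand, E)

for v in (0.0, 0.1, 0.3):
    # report in the conventional unit uF/cm^2 (1 F/m^2 = 100 uF/cm^2)
    print(f"V = {v:4.1f} V  ->  C_Q ~ {quantum_capacitance(v) * 100:6.2f} uF/cm^2")
```

The vanishing DOS at the Dirac point is what limits C_Q near V = 0 (roughly 0.8 uF/cm^2 at room temperature here), which is why defect- and strain-induced changes to the DOS matter for the electrode performance discussed above.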
NASA Astrophysics Data System (ADS)
Iskandar, Ismed; Satria Gondokaryono, Yudi
2016-02-01
In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described simply as functioning or failed, whereas in many real situations failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we investigate the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and estimators and investigating the performance of the estimators for varying sample sizes. The simulated data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that changing the true value of one parameter relative to another shifts the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates outperform their maximum likelihood counterparts. The sensitivity analyses show some sensitivity to shifts of the prior location, and they also show the robustness of the Bayesian analysis within the range between the true value and the maximum likelihood estimate.
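To make the simulation design concrete, the following is a minimal sketch, under strong simplifying assumptions (a single Weibull failure cause, known shape parameter, flat prior on a grid), of one cell of such a study: it compares the closed-form maximum likelihood estimate of the Weibull scale with a grid-based Bayesian posterior mean. The paper's full competing-risk analysis would run one such model per independent cause and take the minimum lifetime.

```python
# Minimal sketch of one simulation cell: a single Weibull failure cause with
# known shape k, comparing the MLE of the scale lam with a Bayesian posterior
# mean under a flat prior (computed on a grid). All parameters are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
k_true, lam_true, n = 1.5, 100.0, 30          # shape, scale, sample size
x = lam_true * rng.weibull(k_true, size=n)    # simulated failure times

# --- MLE of the scale for known shape k (closed form) ---
k = k_true
lam_mle = np.mean(x**k) ** (1.0 / k)

# --- Grid-based Bayesian posterior for lam under a flat prior ---
lam_grid = np.linspace(40.0, 200.0, 2000)
# Weibull log-likelihood: n log k - n k log lam + (k-1) sum log x - sum (x/lam)^k
loglik = (n * np.log(k) - n * k * np.log(lam_grid)
          + (k - 1) * np.sum(np.log(x))
          - np.sum((x[:, None] / lam_grid) ** k, axis=0))
post = np.exp(loglik - loglik.max())          # unnormalized posterior
post /= np.trapz(post, lam_grid)              # normalize on the grid
lam_bayes = np.trapz(lam_grid * post, lam_grid)

print(f"true scale {lam_true:.1f}  MLE {lam_mle:.1f}  posterior mean {lam_bayes:.1f}")
```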
Huang, Xuan-Yi; Yen, Wen-Jiuan; Liu, Shwu-Jiuan; Lin, Chouh-Jiuan
2008-03-01
The aim was to develop a practice theory that can be used to guide the direction of community nursing practice to help clients with schizophrenia and those who care for them. Substantive grounded theory was developed through use of the grounded theory method of Strauss and Corbin. Two groups of participants in Taiwan were selected using theoretical sampling: one group consisted of community mental health nurses, and the other of clients with schizophrenia and those who cared for them. The number of participants in each group was determined by theoretical saturation. Semi-structured one-to-one in-depth interviews and unstructured non-participant observation were utilized for data collection. Data analysis involved three stages: open, axial and selective coding. During the process of coding and analysis, both inductive and deductive thinking were utilized, and the constant comparative analysis process continued until data saturation occurred. To establish trustworthiness, the four criteria of credibility, transferability, dependability and confirmability were followed, along with field trial, audit trail, member checking and peer debriefing for reliability and validity. A substantive grounded theory, the role of community mental health nurses caring for people with schizophrenia in Taiwan, was thereby developed. In this paper, results and discussion focus on causal conditions, context, intervening conditions, consequences and the phenomenon. The theory is the first to contribute knowledge about the field of mental health home visiting services in Taiwan, providing guidance for the delivery of quality care to assist people in the community with schizophrenia and their carers.
Reliability Analysis of Sealing Structure of Electromechanical System Based on Kriging Model
NASA Astrophysics Data System (ADS)
Zhang, F.; Wang, Y. M.; Chen, R. W.; Deng, W. W.; Gao, Y.
2018-05-01
The sealing performance of aircraft electromechanical systems has a great influence on flight safety, so the reliability of typical seal structures merits analysis. In this paper, we take a reciprocating seal structure as the research object for structural reliability study. Based on the finite element numerical simulation method, the contact stress between the rubber sealing ring and the cylinder wall is calculated, the relationship between the contact stress and the pressure of the hydraulic medium is established, and the friction forces under different working conditions are compared. Through co-simulation, an adaptive Kriging model obtained by the expected feasibility function (EFF) learning mechanism is used to describe the failure probability of the seal ring and thereby evaluate the reliability of the sealing structure. This article proposes a new numerical approach to the reliability analysis of sealing structures and also provides a theoretical basis for their optimal design.
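Below is a minimal sketch of the adaptive-Kriging loop named above, with a cheap analytic limit-state function standing in for the paper's finite element seal model; the EFF criterion follows the standard formulation of Bichon et al. (2008), and scikit-learn's Gaussian process plays the role of the Kriging surrogate. All numbers are illustrative.

```python
# Minimal sketch of the adaptive Kriging / EFF loop, assuming a cheap analytic
# performance function g(x) in place of the paper's finite element seal model
# (failure when g < 0).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

def g(x):  # toy limit state standing in for the FE contact-stress model
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

X_mc = rng.normal(size=(20000, 2))   # Monte Carlo population of random inputs
X_train = rng.normal(size=(8, 2))    # small initial design of experiments
y_train = g(X_train)

def eff(mu, sigma, z=0.0):
    """Expected feasibility function around the limit state g = z."""
    sigma = np.maximum(sigma, 1e-12)
    eps = 2.0 * sigma
    d = (z - mu) / sigma
    dm = (z - eps - mu) / sigma
    dp = (z + eps - mu) / sigma
    return ((mu - z) * (2 * norm.cdf(d) - norm.cdf(dm) - norm.cdf(dp))
            - sigma * (2 * norm.pdf(d) - norm.pdf(dm) - norm.pdf(dp))
            + eps * (norm.cdf(dp) - norm.cdf(dm)))

for _ in range(20):                  # adaptive enrichment of the surrogate
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    gp.fit(X_train, y_train)
    mu, sigma = gp.predict(X_mc, return_std=True)
    scores = eff(mu, sigma)
    best = int(np.argmax(scores))
    if scores[best] < 1e-3:          # learning has converged near g = 0
        break
    X_train = np.vstack([X_train, X_mc[best]])
    y_train = np.append(y_train, g(X_mc[best:best + 1]))

pf = np.mean(mu < 0.0)               # failure probability on the surrogate
print(f"estimated Pf ~ {pf:.4f} using {len(y_train)} performance-function calls")
```

The appeal of this scheme is that the expensive model is called only where the EFF says the sign of g is still uncertain, which is exactly the economy the abstract claims for the finite-element seal analysis.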
Correcting Fallacies in Validity, Reliability, and Classification
ERIC Educational Resources Information Center
Sijtsma, Klaas
2009-01-01
This article reviews three topics from test theory that continue to raise discussion and controversy and capture test theorists' and constructors' interest. The first topic concerns the discussion of the methodology of investigating and establishing construct validity; the second topic concerns reliability and its misuse, alternative definitions…
NASA Astrophysics Data System (ADS)
Radgolchin, Moeen; Moeenfard, Hamid
2018-02-01
The construction of self-powered micro-electro-mechanical units by converting the mechanical energy of the systems into electrical power has attracted much attention in recent years. While power harvesting from deterministic external excitations is state of the art, it has been much more difficult to derive mathematical models for scavenging electrical energy from ambient random vibrations, due to the stochastic nature of the excitations. The current research concerns analytical modeling of micro-bridge energy harvesters based on random vibration theory. Since classical elasticity fails to accurately predict the mechanical behavior of micro-structures, strain gradient theory is employed as a powerful tool to increase the accuracy of the random vibration modeling of the micro-harvester. Equations of motion of the system in the time domain are derived using the Lagrange approach. These are then utilized to determine the frequency and impulse responses of the structure. Assuming the energy harvester to be subjected to a combination of broadband and limited-band random support motion and transverse loading, closed-form expressions for mean, mean square, correlation and spectral density of the output power are derived. The suggested formulation is further exploited to investigate the effect of the different design parameters, including the geometric properties of the structure as well as the properties of the electrical circuit on the resulting power. Furthermore, the effect of length scale parameters on the harvested energy is investigated in detail. It is observed that the predictions of classical and even simple size-dependent theories (such as couple stress) appreciably differ from the findings of strain gradient theory on the basis of random vibration. This study presents a first-time modeling of micro-scale harvesters under stochastic excitations using a size-dependent approach and can be considered as a reliable foundation for future research in the field of micro/nano harvesters subjected to non-deterministic loads.
ERIC Educational Resources Information Center
Pearl, Lisa; Ho, Timothy; Detrano, Zephyr
2017-01-01
It has long been recognized that there is a natural dependence between theories of knowledge representation and theories of knowledge acquisition, with the idea that the right knowledge representation enables acquisition to happen as reliably as it does. Given this, a reasonable criterion for a theory of knowledge representation is that it be…
ERIC Educational Resources Information Center
Turner, Brandon M.; Betz, Nancy E.; Edwards, Michael C.; Borgen, Fred H.
2010-01-01
The psychometric properties of measures of self-efficacy for the six themes of Holland's theory were examined using item response theory. Item and scale quality were compared across levels of the trait continuum; all the scales were highly reliable but differentiated better at some levels of the continuum than others. Applications for adaptive…
ERIC Educational Resources Information Center
Wagler, Amy; Wagler, Ron
2013-01-01
The Measure of Acceptance of the Theory of Evolution (MATE) was constructed to be a single-factor instrument that assesses an individual's overall acceptance of evolutionary theory. The MATE was validated and the scores resulting from the MATE were found to be reliable for the population of inservice high school biology teachers. However, many…
Lee, Kangjoo; Lina, Jean-Marc; Gotman, Jean; Grova, Christophe
2016-07-01
Functional hubs are defined as specific brain regions with dense connections to other regions in a functional brain network. Among them, connector hubs are of great interest, as they are assumed to promote global and hierarchical communication between functionally specialized networks. Damage to connector hubs may have a more crucial effect on the system than damage to other hubs. Hubs in graph theory are often identified from a correlation matrix and classified as connector hubs when they are more connected to regions in other networks than within the networks to which they belong. However, the identification of hubs from functional data is more complex than from structural data, notably because of the inherent problem of multicollinearity between temporal dynamics within a functional network. In this context, we developed and validated a method to reliably identify connectors and the corresponding overlapping network structure from resting-state fMRI. This new method directly handles the multicollinearity issue, since it does not rely on counting the number of connections from a thresholded correlation matrix. The novelty of the proposed method is that, besides counting the number of networks involved in each voxel, it identifies which networks are actually involved in each voxel, using a data-driven sparse general linear model to find brain regions involved in more than one network. Moreover, we added a bootstrap resampling strategy to statistically assess the reproducibility of our results at the single-subject level. The unified framework is called SPARK, i.e. SParsity-based Analysis of Reliable k-hubness, where k-hubness denotes the number of networks overlapping in each voxel. The accuracy and robustness of SPARK were evaluated using two-dimensional box simulations and realistic simulations that examined detection of artificial hubs generated on real data. Then, test/retest reliability of the method was assessed using the 1000 Functional Connectome Project database, which includes data obtained from 25 healthy subjects on three occasions with long and short intervals between sessions. We demonstrated that SPARK provides an accurate and reliable estimation of k-hubness, suggesting a promising tool for understanding hub organization in resting-state fMRI. Copyright © 2016 Elsevier Inc. All rights reserved.
Illustrated structural application of universal first-order reliability method
NASA Technical Reports Server (NTRS)
Verderaime, V.
1994-01-01
The general application of the proposed first-order reliability method was achieved through the universal normalization of engineering probability distribution data. The method superimposes prevailing deterministic techniques and practices on the first-order reliability method to surmount deficiencies of the deterministic method and provide the benefits of reliability techniques and predictions. A reliability design factor is derived from the reliability criterion to satisfy a specified reliability and is analogous to the deterministic safety factor. Its application is numerically illustrated on several practical structural design and verification cases, with interesting results and insights. Two concepts of reliability selection criteria are suggested. Though the method was developed to support affordable structures for access to space, it should also be applicable to most high-performance air and surface transportation systems.
NASA Astrophysics Data System (ADS)
Ksenofontov, Alexander A.; Guseva, Galina B.; Antina, Elena V.
2016-10-01
Density functional theory (DFT) and time-dependent density functional theory (TD-DFT) computations have been used to reveal the structural, molecular, electronic and spectral-luminescent parameters and features of several homoleptic transition metal bis(dipyrrine) complexes. The influence of the complexing agent and ligand nature on the regularities in geometric and spectral-luminescent properties, and on kinetic and thermal stability changes in the [M2L2] complex series, was studied. Special attention is paid to the influence of the solvating medium (PCM/TD-B3LYP/Def2-SVP) on the spectral-luminescent properties of d-metal bis(dipyrrinate)s. An interpretation of the dependence between the spectral-luminescent properties of the complexes and the width of the HOMO-LUMO (highest occupied molecular orbital to lowest unoccupied molecular orbital) energy gap is given. It is shown that the regularities in the helicates' quantum yield as a function of the complexing agent, ligand and solvent properties, obtained from quantum-chemical calculations, are in agreement with our previously obtained experimental data. Thus, the structural and spectral-luminescent characteristics of new [M2L2] luminophores can be evaluated with high reliability, and good forecasts for their use as fluorescent dyes for optical devices can be made on the basis of these theoretical studies (B3LYP/Def2-SVP and TD-B3LYP/Def2-SVP).
Measuring theory of mind in children. Psychometric properties of the ToM Storybooks.
Blijd-Hoogewys, E M A; van Geert, P L C; Serra, M; Minderaa, R B
2008-11-01
Although research on Theory-of-Mind (ToM) is often based on single task measurements, more comprehensive instruments result in a better understanding of ToM development. The ToM Storybooks is a new instrument measuring basic ToM-functioning and associated aspects. There are 34 tasks, tapping various emotions, beliefs, desires and mental-physical distinctions. Four studies on the validity and reliability of the test are presented, in typically developing children (n = 324, 3-12 years) and children with PDD-NOS (n = 30). The ToM Storybooks have good psychometric qualities. A component analysis reveals five components corresponding with the underlying theoretical constructs. The internal consistency, test-retest reliability, inter-rater reliability, construct validity and convergent validity are good. The ToM Storybooks can be used in research as well as in clinical settings.
Hallgren, Kevin A.; Greenfield, Brenna L.; Ladd, Benjamin O.
2016-01-01
Background Behavioral economic theories of drinking posit that the reinforcing value of engaging in activities with versus without alcohol influences drinking behavior. Measures of the reinforcement value of drugs and alcohol have been used in previous research, but little work has examined the psychometric properties of these measures. Objectives The present study aims to evaluate the factor structure, test-retest reliability, and concurrent validity of an alcohol-only version of the Adolescent Reinforcement Survey Schedule (ARSS-AUV). Methods A sample of 157 college student drinkers completed the ARSS-AUV at two time points 2–3 days apart. Test-retest reliability, hierarchical factor analysis, and correlations with other drinking measures were examined. Results Single, unidimensional general factors accounted for a majority of the variance in alcohol and alcohol-free reinforcement items. Residual factors emerged that typically represented alcohol or alcohol-free reinforcement while doing activities with friends, romantic or sexual partners, and family members. Individual ARSS-AUV items had fair-to-good test-retest reliability, while general and residual factors had excellent test-retest reliability. General alcohol reinforcement and alcohol reinforcement from friends and romantic partners were positively correlated with past-year alcohol consumption, heaviest drinking episode, and alcohol-related negative consequences. Alcohol-free reinforcement indices were unrelated to alcohol use or consequences. Conclusions/Importance The ARSS-AUV appears to demonstrate good reliability and mixed concurrent validity among college student drinkers. The instrument may provide useful information about alcohol reinforcement from various activities and people and could provide clinically-relevant information for prevention and treatment programs. PMID:27096713
Hallgren, Kevin A; Greenfield, Brenna L; Ladd, Benjamin O
2016-06-06
Behavioral economic theories of drinking posit that the reinforcing value of engaging in activities with versus without alcohol influences drinking behavior. Measures of the reinforcement value of drugs and alcohol have been used in previous research, but little work has examined the psychometric properties of these measures. The present study aims to evaluate the factor structure, test-retest reliability, and concurrent validity of an alcohol-only version of the Adolescent Reinforcement Survey Schedule (ARSS-AUV). A sample of 157 college student drinkers completed the ARSS-AUV at two time points 2-3 days apart. Test-retest reliability, hierarchical factor analysis, and correlations with other drinking measures were examined. Single, unidimensional general factors accounted for a majority of the variance in alcohol and alcohol-free reinforcement items. Residual factors emerged that typically represented alcohol or alcohol-free reinforcement while doing activities with friends, romantic or sexual partners, and family members. Individual ARSS-AUV items had fair-to-good test-retest reliability, while general and residual factors had excellent test-retest reliability. General alcohol reinforcement and alcohol reinforcement from friends and romantic partners were positively correlated with past-year alcohol consumption, heaviest drinking episode, and alcohol-related negative consequences. Alcohol-free reinforcement indices were unrelated to alcohol use or consequences. The ARSS-AUV appears to demonstrate good reliability and mixed concurrent validity among college student drinkers. The instrument may provide useful information about alcohol reinforcement from various activities and people and could provide clinically-relevant information for prevention and treatment programs.
Validation Evidence of the Motivation for Teaching Scale in Secondary Education.
Abós, Ángel; Sevil, Javier; Martín-Albo, José; Aibar, Alberto; García-González, Luis
2018-04-10
Grounded in self-determination theory, the aim of this study was to develop a scale with adequate psychometric properties to assess motivation for teaching and to explain some outcomes of secondary education teachers at work. The sample comprised 584 secondary education teachers. Analyses supported the five-factor model (intrinsic motivation, identified regulation, introjected regulation, external regulation and amotivation) and indicated the presence of a continuum of self-determination. Evidence of reliability was provided by Cronbach's alpha, composite reliability and average variance extracted. Multigroup confirmatory factor analyses supported the partial invariance (configural and metric) of the scale in different sub-samples, in terms of gender and type of school. Concurrent validity was analyzed by a structural equation modeling that explained 71% of the work dedication variance and 69% of the boredom at work variance. Work dedication was positively predicted by intrinsic motivation (β = .56, p < .001) and external regulation (β = .29, p < .001) and negatively predicted by introjected regulation (β = -.22, p < .001) and amotivation (β = -.49, p < .001). Boredom at work was negatively predicted by intrinsic motivation (β = -.28, p < .005) and positively predicted by amotivation (β = .68, p < .001). The Motivation for Teaching Scale in Secondary Education (Spanish acronym EME-ES, Escala de Motivación por la Enseñanza en Educación Secundaria) is discussed as a valid and reliable instrument. This is the first specific scale in the work context of secondary teachers that has integrated the five-factor structure together with their dedication and boredom at work.
ERIC Educational Resources Information Center
Han, Chao
2016-01-01
As a property of test scores, reliability/dependability constitutes an important psychometric consideration, and it underpins the validity of measurement results. A review of interpreter certification performance tests (ICPTs) reveals that (a) although reliability/dependability checking has been recognized as an important concern, its theoretical…
Monte Carlo Approach for Reliability Estimations in Generalizability Studies.
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
A Monte Carlo approach is proposed, using the Statistical Analysis System (SAS) programming language, for estimating reliability coefficients in generalizability theory studies. Test scores are generated by a probabilistic model that considers the probability for a person with a given ability score to answer an item with a given difficulty…
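Since the original work used SAS, the following is a hedged Python analogue of the same idea: scores are generated from a crossed person-by-item model, variance components are recovered from ANOVA expected mean squares, and the G (relative) and Phi (absolute) coefficients of generalizability theory are formed. The true variance components are illustrative assumptions.

```python
# Hedged Python analogue (the original used SAS) of the Monte Carlo idea:
# simulate scores from a crossed person x item design, recover variance
# components from ANOVA mean squares, and form G-theory coefficients.
import numpy as np

rng = np.random.default_rng(42)
n_p, n_i = 200, 20                        # persons, items
var_p, var_i, var_pi = 0.50, 0.10, 0.40   # assumed true variance components

scores = (rng.normal(0, np.sqrt(var_p), (n_p, 1))        # person effects
          + rng.normal(0, np.sqrt(var_i), (1, n_i))      # item effects
          + rng.normal(0, np.sqrt(var_pi), (n_p, n_i)))  # residual / p x i

grand = scores.mean()
ss_p = n_i * np.sum((scores.mean(axis=1) - grand) ** 2)
ss_i = n_p * np.sum((scores.mean(axis=0) - grand) ** 2)
ss_res = np.sum((scores - grand) ** 2) - ss_p - ss_i
ms_p = ss_p / (n_p - 1)
ms_i = ss_i / (n_i - 1)
ms_res = ss_res / ((n_p - 1) * (n_i - 1))

est_pi = ms_res                        # residual component (p x i + error)
est_p = (ms_p - ms_res) / n_i          # person (universe-score) component
est_i = (ms_i - ms_res) / n_p          # item component

g_coef = est_p / (est_p + est_pi / n_i)              # relative decisions
phi_coef = est_p / (est_p + (est_i + est_pi) / n_i)  # absolute decisions
print(f"G = {g_coef:.3f}, Phi = {phi_coef:.3f}")
```

Repeating this over many replications, as the Monte Carlo approach proposes, traces out the sampling distribution of the estimated coefficients rather than a single point value.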
Surfing a spike wave down the ventral stream.
VanRullen, Rufin; Thorpe, Simon J
2002-10-01
Numerous theories of neural processing, often motivated by experimental observations, have explored the computational properties of neural codes based on the absolute or relative timing of spikes in spike trains. Spiking neuron models and theories, however, as well as their experimental counterparts, have generally been limited to the simulation or observation of isolated neurons, isolated spike trains, or reduced neural populations. Such theories would therefore seem inappropriate to capture the properties of a neural code relying on temporal spike patterns distributed across large neuronal populations. Here we report a range of computer simulations and theoretical considerations that were designed to explore the possibilities of one such code and its relevance for visual processing. In a unified framework where the relation between stimulus saliency and spike relative timing plays the central role, we describe how the ventral stream of the visual system could process natural input scenes and extract meaningful information, both rapidly and reliably. The first wave of spikes generated in the retina in response to a visual stimulation carries information explicitly in its spatio-temporal structure: the most salient information is represented by the first spikes over the population. This spike wave, propagating through a hierarchy of visual areas, is regenerated at each processing stage, where its temporal structure can be modified by (i) the selectivity of the cortical neurons, (ii) lateral interactions, and (iii) top-down attentional influences from higher order cortical areas. The resulting model could account for the remarkable efficiency and rapidity of processing observed in the primate visual system.
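As a toy illustration of the latency code described above (not the authors' simulations), the sketch below encodes stimulus saliency in single-spike latencies and reads it back from spike order alone; the reciprocal latency function is an illustrative assumption.

```python
# Toy illustration of the latency code sketched above: each "retinal" unit
# fires one spike, earlier for more salient input, so the first spikes across
# the population carry the most salient contrast. The reciprocal latency
# function is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(7)
saliency = rng.random(10)                 # local contrast at 10 retinal units
latency = 10.0 / (saliency + 0.1)         # ms; more salient -> earlier spike

order = np.argsort(latency)               # the spike wave's temporal structure
print("spike order (unit indices):", order)
print("saliency read back from rank order:", np.round(saliency[order], 2))
# A downstream neuron weighting inputs by arrival rank decodes saliency
# without needing firing rates: the information sits in the wave's structure.
```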
Reis, H; Rasulev, B; Papadopoulos, M G; Leszczynski, J
2015-01-01
Fullerene and its derivatives are currently among the most intensively investigated species in the areas of nanomedicine and nanochemistry. Various unique properties of fullerenes are responsible for their wide range of applications in industry, biology and medicine. A large pool of functionalized C60 and C70 fullerenes is investigated theoretically at different levels of quantum-mechanical theory. The semiempirical PM6 method, density functional theory with the B3LYP functional, and the correlated ab initio MP2 method are employed to compute the optimized structures and an array of properties for the considered species. In addition to the calculations for isolated molecules, results of solution calculations are also reported at the DFT level, using the polarizable continuum model (PCM). Ionization potentials (IPs) and electron affinities (EAs) are computed by means of Koopmans' theorem as well as with the more accurate but computationally expensive ΔSCF method. Both procedures yield comparable values, while comparison of IPs and EAs computed with different quantum-mechanical methods shows surprisingly large differences. Harmonic vibrational frequencies are computed at the PM6 and B3LYP levels of theory and compared with each other. A possible application of the frequencies as 3D descriptors in the EVA (EigenVAlues) method is shown. All the computed data are made available and may be used to replace experimental data in routine applications where large amounts of data are required, e.g. in structure-activity relationship studies of the toxicity of fullerene derivatives.
An Analytical Methodology for Predicting Repair Time Distributions of Advanced Technology Aircraft.
1985-12-01
1984. 3. Barlow, Richard E. "Mathematical Theory of Reliability: A Historical Perspective." IEEE Transactions on Reliability, 33: 16-19 (April 1984)... Technology (AU), Wright-Patterson AFB OH, March 1971. 11. Coppola, Anthony. "Reliability Engineering of Electronic Equipment," IEEE Transactions on... 1982. 64. Woodruff, Brian W. et al. "Modified Goodness-of-Fit Tests for Gamma Distributions with Unknown Location and Scale Parameters," IEEE
Guerin, Rebecca J; Toland, Michael D; Okun, Andrea H; Rojas-Guyler, Liliana; Bernard, Amy L
2018-03-31
Work, a defining feature of adolescence in the United States, has many benefits. Work also has risks, as adolescents experience a higher rate of serious job-related injuries compared to adults. Talking Safety, a free curriculum from the National Institute for Occupational Safety and Health, is one tool educators may adopt to provide teens with essential workplace safety and health education. Adolescents (N = 2503; female, 50.1%; Hispanic, 50.0%) in a large urban school district received Talking Safety from their eighth-grade science teachers. This study used a modified theory of planned behavior (which included a knowledge construct), to examine students' pre- and post-intervention scores on workplace safety and health knowledge, attitude, self-efficacy, and behavioral intention to enact job safety skills. The results from confirmatory factor analyses indicate three unique dimensions reflecting the theory, with a separate knowledge factor. Reliability estimates are ω ≥ .83. The findings from the structural equation models demonstrate that all paths, except pre- to posttest behavioral intention, are statistically significant. Self-efficacy is the largest contributor to the total effect of these associations. As hypothesized, knowledge has indirect effects on behavioral intention. Hispanic students scored lower at posttest on all but the behavioral intention measure, possibly suggesting the need for tailored materials to reach some teens. Overall the findings support the use of a modified theory of planned behavior to evaluate the effectiveness of a foundational workplace safety and health curriculum. This study may inform future efforts to ensure that safe and healthy work becomes integral to the adolescent experience.
NASA Astrophysics Data System (ADS)
Jauhariyah, M. N. R.; Zulfa, I.; Harizah, Z.; Setyarsih, W.
2018-04-01
In the learning process, students still hold misconceptions even after a concept has been taught by the teacher. One source of misconceptions is the students themselves. The teacher often does not know that students hold misconceptions because the assessments given do not probe the causes of the misconceptions that occur. Students hold misconceptions especially about abstract concepts such as those in the chapter on the Kinetic Theory of Gases. For this reason, this study developed a three-tier diagnostic test to diagnose students' misconceptions in this chapter. An analysis was also carried out to establish the quality of the instrument. According to the trials, the average internal validity is about 84.6%, which means the instrument is fit for use, and its reliability is 0.752, meaning it is reliable to use.
Extension of nanoconfined DNA: Quantitative comparison between experiment and theory
NASA Astrophysics Data System (ADS)
Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.
2015-12-01
The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.
Kulke, Louisa; von Duhn, Britta; Schneider, Dana; Rakoczy, Hannes
2018-06-01
Recently, theory-of-mind research has been revolutionized by findings from novel implicit tasks suggesting that at least some aspects of false-belief reasoning develop earlier in ontogeny than previously assumed and operate automatically throughout adulthood. Although these findings are the empirical basis for far-reaching theories, systematic replications are still missing. This article reports a preregistered large-scale attempt to replicate four influential anticipatory-looking implicit theory-of-mind tasks using original stimuli and procedures. Results showed that only one of the four paradigms was reliably replicated. A second set of studies revealed, further, that this one paradigm was no longer replicated once confounds were removed, which calls its validity into question. There were also no correlations between paradigms, and thus, no evidence for their convergent validity. In conclusion, findings from anticipatory-looking false-belief paradigms seem less reliable and valid than previously assumed, thus limiting the conclusions that can be drawn from them.
Bourion-Bédès, Stéphanie; Schwan, Raymund; Epstein, Jonathan; Laprevote, Vincent; Bédès, Alex; Bonnet, Jean-Louis; Baumann, Cédric
2015-02-01
The study aimed to examine the construct validity and reliability of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) according to both classical test and item response theories. The psychometric properties of the French version of this instrument were investigated in a cross-sectional, multicenter study. A total of 124 outpatients with a substance dependence diagnosis participated in the study. Psychometric evaluation included descriptive analysis, internal consistency, test-retest reliability, and validity. The dimensionality of the instrument was explored using a combination of classical test analysis, confirmatory factor analysis (CFA), and an item response theory analysis, the Person Separation Index (PSI), in a complementary manner. The results revealed that the questionnaire was easy to administer and its acceptability was good. The internal consistency and the test-retest reliability were 0.9 and 0.88, respectively. All items were significantly correlated with the total score and with the SF-12 used in the study. The fit of the one-factor CFA model was good, and for the unidimensional construct the PSI was found to be 0.902. The French version of the Q-LES-Q-SF yielded valid and reliable clinical assessments of quality of life for future research and clinical practice involving French substance abusers. In response to recent questioning regarding the unidimensionality or bidimensionality of the instrument, and in line with the underlying unidimensional theoretical construct used for its development, this study supports the Q-LES-Q-SF as a one-dimensional questionnaire for French QoL studies.
NASA Astrophysics Data System (ADS)
Dar, M. A.; Sheikh, M. W.; Malla, M. S.; Varshney, Dinesh
2016-05-01
The composites of (1-x) La0.67Ba0.33MnO3 (LBMO) + xBaTiO3 (BTO) (x = 0, 0.25 and 1.0) were synthesized by the conventional solid-state reaction method. Rietveld refinement was employed to extract the structural information of the prepared ceramics. The Rietveld refinement of X-ray powder diffraction shows that La0.67Ba0.33MnO3 and BaTiO3 crystallize in the rhombohedral (R3c) and tetragonal (P4mm) structures, respectively. The structural parameters and the reliability factors for the LBMO-BTO composite ceramics were successfully determined by the Rietveld refinement. At room temperature, the Raman-active phonon modes predicted by group theory were observed only in BaTiO3 and the composite sample; pure LBMO does not show any Raman-active phonon mode at room temperature.
Valence electronic structure of Ni in Ni-Si alloys from relative K X-ray intensity studies
NASA Astrophysics Data System (ADS)
Kalayci, Y.; Aydinuraz, A.; Tugluoglu, B.; Mutlu, R. H.
2007-02-01
The Kβ-to-Kα X-ray intensity ratio of Ni in Ni3Si, Ni2Si and NiSi has been determined by the energy dispersive X-ray fluorescence technique. It is found that the intensity ratio of Ni decreases from pure Ni to Ni2Si and then increases from Ni2Si to NiSi, in good agreement with the electronic structure calculations cited in the literature. We have also performed band structure calculations for pure Ni in various atomic configurations by means of the linear muffin-tin orbital method and used these data, together with the normalized theoretical intensity ratios cited in the literature, to estimate the 3d-occupation numbers of Ni in Ni-Si alloys. It is emphasized that investigation of alloying effects in terms of X-ray intensity ratios should be carried out for the stoichiometric alloys in order to make reliable and quantitative comparisons between theory and experiment in transition metal alloys.
Numerical and Experimental Study on Hydrodynamic Performance of A Novel Semi-Submersible Concept
NASA Astrophysics Data System (ADS)
Gao, Song; Tao, Long-bin; Kou, Yu-feng; Lu, Chao; Sun, Jiang-long
2018-04-01
Multiple Column Platform (MCP) semi-submersible is a newly proposed concept that differs from conventional semi-submersibles in featuring a centre column and a middle pontoon. It is paramount to ensure its structural reliability and safe operation at sea, so a rigorous investigation is conducted to examine the hydrodynamic and structural performance of the novel structural concept. In this paper, numerical and experimental studies on the hydrodynamic performance of the MCP are performed. Numerical simulations are conducted in both the frequency and time domains based on 3D potential theory. The numerical models are validated by experimental measurements obtained from extensive sets of model tests under both regular and irregular wave conditions. Moreover, a comparative study of the MCP and two conventional semi-submersibles is carried out using numerical simulation. Specifically, the hydrodynamic characteristics, including hydrodynamic coefficients, natural periods, motion response amplitude operators (RAOs) and mooring line tension, are fully examined. The present study proves the feasibility of the novel MCP and demonstrates its potential for optimization in future work.
A novel series of thiosemicarbazone drugs: From synthesis to structure
NASA Astrophysics Data System (ADS)
Ebrahimi, Hossein Pasha; Hadi, Jabbar S.; Alsalim, Tahseen A.; Ghali, Thaer S.; Bolandnazar, Zeinab
2015-02-01
A new series of thiosemicarbazones (TSCs) and their 1,3,4-thiadiazolines (TDZs) containing an acetamide group have been synthesized from thiosemicarbazide compounds by the reaction of TSCs with cyclic ketones as well as aromatic aldehydes. The structures of the newly synthesized 1,3,4-thiadiazole derivatives, obtained by heterocyclization of the TSCs with acetic anhydride, were characterized experimentally by IR, 1H NMR, 13C NMR and mass spectroscopic methods. Furthermore, the structural, thermodynamic and electronic properties of the studied compounds were investigated theoretically by density functional theory (DFT) to provide results reliable against the experimental values. The molecular geometry, the highest occupied molecular orbital (HOMO), the lowest unoccupied molecular orbital (LUMO) and the Mulliken atomic charges of the studied compounds were calculated with the B3LYP method and the standard 6-31+G(d,p) basis set, starting from the optimized geometry. The theoretical 13C chemical shifts were also calculated using the gauge independent atomic orbital (GIAO) approach, and their respective linear correlations with experiment were obtained.
Forward modelling requires intention recognition and non-impoverished predictions.
de Ruiter, Jan P; Cummins, Chris
2013-08-01
We encourage Pickering & Garrod (P&G) to implement this promising theory in a computational model. The proposed theory crucially relies on having an efficient and reliable mechanism for early intention recognition. Furthermore, the generation of impoverished predictions is incompatible with a number of key phenomena that motivated P&G's theory. Explaining these phenomena requires fully specified perceptual predictions in both comprehension and production.
ERIC Educational Resources Information Center
Hutchins, Tiffany L.; Bonazinga, Laura A.; Prelock, Patricia A.; Taylor, Rebecca S.
2008-01-01
The Perceptions of Children's Theory of Mind Measure (Experimental version; PCToMM-E) is an informant measure designed to tap children's theory of mind competence. Study one evaluated the measure when completed by primary caregivers of children with autism spectrum disorder. Scores demonstrated high test-retest reliability and correlated with…
NASA Astrophysics Data System (ADS)
Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille
2017-04-01
In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, sometimes with dramatic consequences. Risk is classically considered as a combination of hazard, the combination of the intensity and frequency of the phenomenon, and vulnerability, which corresponds to the consequences of the phenomenon for exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information, which comes from more or less reliable sources ranging from historical data to expert assessments, numerical simulations etc. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling, and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments. The first relates to uncertainty and imprecision propagation in numerical modeling, using both the classical Monte Carlo probabilistic approach and a so-called hybrid approach using possibility theory. The second deals with new multi-criteria decision-making methods that consider information imperfection, source reliability, importance and conflict, using fuzzy sets as well as possibility and belief function theories. The implemented methods handle information imperfection propagation and information fusion in total aggregation methods such as AHP (Saaty, 1980), in partial aggregation methods such as the Electre outranking method (see Soft Electre Tri), and in decisions in certain but also risky or uncertain contexts (see the new COWA-ER and FOWA-ER: Cautious and Fuzzy Ordered Weighted Averaging with Evidential Reasoning). For example, the ER-MCDA methodology considers expert assessment as a multi-criteria decision process based on imperfect information provided by more or less heterogeneous, reliable and conflicting sources: it mixes AHP, fuzzy sets theory, possibility theory and belief function theory using the DSmT (Dezert-Smarandache Theory) framework, which provides powerful fusion rules.
A game theory-based trust measurement model for social networks.
Wang, Yingjie; Cai, Zhipeng; Yin, Guisheng; Gao, Yang; Tong, Xiangrong; Han, Qilong
2016-01-01
In social networks, trust is a complex social relationship. Participants in online social networks want to share information and experiences with as many reliable users as possible. However, the modeling of trust is complicated and application dependent; it needs to consider interaction history, recommendations, user behaviors and so on. Therefore, modeling trust is an important focus for online social networks. We propose a game theory-based trust measurement model for social networks, in which the trust degree is calculated from three aspects (service reliability, feedback effectiveness and recommendation credibility) to obtain a more accurate result. In addition, to alleviate the free-riding problem, we propose a game theory-based punishment mechanism for specific trust and global trust, respectively. We prove that the proposed trust measurement model is effective and that the free-riding problem can be resolved through the proposed punishment mechanism.
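A minimal sketch of the aggregation and punishment steps described above follows; the linear weights and the multiplicative penalty are illustrative assumptions, not the authors' calibrated model.

```python
# Minimal sketch of the trust-combination step. The three aspect scores and
# the linear weights are illustrative assumptions; the punishment step cuts a
# free-rider's trust multiplicatively.
def trust_degree(service_reliability, feedback_effectiveness,
                 recommendation_credibility, weights=(0.5, 0.3, 0.2)):
    """Weighted aggregate of the three trust aspects, each in [0, 1]."""
    aspects = (service_reliability, feedback_effectiveness,
               recommendation_credibility)
    return sum(w * a for w, a in zip(weights, aspects))

def punish(trust, free_riding_detected, penalty=0.5):
    """Game-theoretic deterrent: defection makes trust costly to rebuild."""
    return trust * penalty if free_riding_detected else trust

t = trust_degree(0.9, 0.7, 0.8)
print(f"trust = {t:.2f}, after punishment = {punish(t, True):.2f}")
```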
Self-consistent linear response for the spin-orbit interaction related properties
NASA Astrophysics Data System (ADS)
Solovyev, I. V.
2014-07-01
In many cases, the relativistic spin-orbit (SO) interaction can be regarded as a small perturbation to the electronic structure of solids and treated using regular perturbation theory. The major obstacle on this route comes from the fact that the SO interaction can also polarize the electron system and produce some additional contributions to the perturbation theory expansion, which arise from the electron-electron interactions in the same order of the SO coupling. In electronic structure calculations, it may even lead to the necessity of abandoning the perturbation theory and returning to the original self-consistent solution of Kohn-Sham-like equations with the effective potential v̂, incorporating simultaneously the effects of the electron-electron interactions and the SO coupling, even though the latter is small. In this work, we present the theory of self-consistent linear response (SCLR), which allows us to get rid of numerical self-consistency and formulate the last step fully analytically in the first order of the SO coupling. This strategy is applied to the unrestricted Hartree-Fock solution of an effective Hubbard-type model, derived from the first-principles electronic structure calculations in the basis of Wannier functions for the magnetically active states. We show that by using v̂, obtained in SCLR, one can successfully reproduce results of ordinary self-consistent calculations for the orbital magnetization and other properties, which emerge in the first order of the SO coupling. Particularly, SCLR appears to be an extremely useful approach for calculations of antisymmetric Dzyaloshinskii-Moriya (DM) interactions based on the magnetic force theorem, where only by using the total perturbation one can make a reliable estimate for the DM parameters. Furthermore, due to the powerful 2n+1 theorem, the SCLR theory allows us to obtain the total energy change up to the third order of the SO coupling, which can be used in calculations of magnetic anisotropy of compounds with low crystal symmetry. The fruitfulness of this approach for the analysis of complex magnetic structures is illustrated in a number of examples, including the quantitative description of the spin canting in YTiO3 and LaMnO3, formation of the spin-spiral order in BiFeO3, and the magnetic inversion symmetry breaking in BiMnO3, which gives rise to both ferroelectric activity and DM interactions, responsible for the ferromagnetism. In all these cases, the use of SCLR tremendously reduces the computational efforts related to the search for noncollinear magnetic structures in the ground state.
Bastien, Olivier; Maréchal, Eric
2008-08-07
Confidence in pairwise alignments of biological sequences, obtained by various methods such as Blast or Smith-Waterman, is critical for automatic analyses of genomic data. Two statistical models have been proposed. In the asymptotic limit of long sequences, the Karlin-Altschul model is based on the computation of a P-value, assuming that the number of high-scoring matching regions above a threshold is Poisson distributed. Alternatively, the Lipman-Pearson model is based on the computation of a Z-value from a random score distribution obtained by a Monte Carlo simulation. Z-values allow the deduction of an upper bound on the P-value (1/Z-value^2) following the TULIP theorem. Simulated Z-value distributions are known to fit a Gumbel law. This remarkable property was not demonstrated and had no obvious biological support. We built a model of evolution of sequences based on aging, as meant in Reliability Theory, using the fact that the amount of information shared between an initial sequence and the sequences in its lineage (i.e., mutual information in Information Theory) is a decreasing function of time. This quantity is simply measured by a sequence alignment score. In systems aging, the failure rate is related to the system's longevity. The system can be a machine with structured components, or a living entity or population. "Reliability" refers to the ability to operate properly according to a standard. Here, the "reliability" of a sequence refers to the ability to conserve a sufficient functional level at the folded and maturated protein level (positive selection pressure). Homologous sequences were considered as systems (1) having a high redundancy of information, reflected by the magnitude of their alignment scores, and (2) whose components are the amino acids, which can independently be damaged by random DNA mutations. From these assumptions, we deduced that the information shared at each amino acid position evolves at a constant rate, corresponding to the information hazard rate, and that pairwise sequence alignment scores should follow a Gumbel distribution, whose parameters could find some theoretical rationale. In particular, one parameter corresponds to the information hazard rate. The extreme value distribution of alignment scores, assessed from high-scoring segment pairs following the Karlin-Altschul model, can also be deduced from Reliability Theory applied to molecular sequences. It reflects the redundancy of information between homologous sequences under functional conservative pressure. This model also provides a link between concepts of biological sequence analysis and of systems biology.
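To make the Z-value machinery concrete, here is a minimal sketch of the Lipman-Pearson Monte Carlo procedure and the TULIP bound quoted above (P-value <= 1/Z-value^2); a toy ungapped match count stands in for a real Smith-Waterman scorer.

```python
# Minimal sketch of the Lipman-Pearson Z-value and the TULIP bound. A toy
# ungapped match count stands in for a real Smith-Waterman scorer; the Monte
# Carlo step shuffles one sequence to build the random score distribution.
import random

def score(a, b):
    """Best ungapped overlap score: max matches over all relative offsets."""
    best = 0
    for off in range(-len(b) + 1, len(a)):
        m = sum(1 for i, ca in enumerate(a)
                if 0 <= i - off < len(b) and ca == b[i - off])
        best = max(best, m)
    return best

def z_value(a, b, n_shuffles=200, seed=0):
    rng = random.Random(seed)
    s_obs = score(a, b)
    rand_scores = []
    for _ in range(n_shuffles):
        perm = list(b)
        rng.shuffle(perm)                 # preserve composition, break order
        rand_scores.append(score(a, "".join(perm)))
    mu = sum(rand_scores) / n_shuffles
    sd = (sum((s - mu) ** 2 for s in rand_scores) / n_shuffles) ** 0.5
    return (s_obs - mu) / sd

z = z_value("MKVLATTPLGRWFFE", "MKVLSTTPAGRWFYE")
print(f"Z-value = {z:.2f}, TULIP bound: P <= {1.0 / z**2:.3g}")
```

The paper's point is that the Gumbel shape of the shuffled-score distribution sampled here is not an accident but follows from treating homologous sequences as aging systems with a constant information hazard rate.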
Space shuttle propellant constitutive law verification tests
NASA Technical Reports Server (NTRS)
Thompson, James R.
1995-01-01
As part of the Propellants Task (Task 2.0) on the Solid Propulsion Integrity Program (SPIP), a database of material properties was generated for the Space Shuttle Redesigned Solid Rocket Motor (RSRM) PBAN-based propellant. A parallel effort on the Propellants Task was the generation of an improved constitutive theory for the PBAN propellant suitable for use in a finite element analysis (FEA) of the RSRM. The outcome of an analysis with the improved constitutive theory would be more reliable prediction of structural margins of safety. The work described in this report was performed by Materials Laboratory personnel at Thiokol Corporation/Huntsville Division under NASA contract NAS8-39619, Mod. 3. The report documents the test procedures for the refinement and verification tests for the improved Space Shuttle RSRM propellant material model, and summarizes the resulting test data. TP-H1148 propellant obtained from mix E660411 (manufactured February 1989) which had experienced ambient igloo storage in Huntsville, Alabama since January 1990, was used for these tests.
The Development and Validation of the Online Shopping Addiction Scale.
Zhao, Haiyan; Tian, Wei; Xin, Tao
2017-01-01
We report the development and validation of a scale to measure online shopping addiction. Inspired by previous theories and research on behavioral addiction, the Griffiths's widely accepted six-factor component model was referred to and an 18-item scale was constructed, with each component measured by three items. The results of exploratory factor analysis, based on Sample 1 (999 college students) and confirmatory factor analysis, based on Sample 2 (854 college students) showed the Griffiths's substantive six-factor structure underlay the online shopping addiction scale. Cronbach's alpha suggested that the resulting scale was highly reliable. Concurrent validity, based on Sample 3 (328 college students), was also satisfactory as indicated by correlations between the scale and measures of similar constructs. Finally, self-perceived online shopping addiction can be predicted to a relatively high degree. The present 18-item scale is a solid theory-based instrument to empirically measure online shopping addiction and can be used for understanding the phenomena among young adults.
The Development and Validation of the Online Shopping Addiction Scale
Zhao, Haiyan; Tian, Wei; Xin, Tao
2017-01-01
We report the development and validation of a scale to measure online shopping addiction. Inspired by previous theories and research on behavioral addiction, the Griffiths's widely accepted six-factor component model was referred to and an 18-item scale was constructed, with each component measured by three items. The results of exploratory factor analysis, based on Sample 1 (999 college students) and confirmatory factor analysis, based on Sample 2 (854 college students) showed the Griffiths's substantive six-factor structure underlay the online shopping addiction scale. Cronbach's alpha suggested that the resulting scale was highly reliable. Concurrent validity, based on Sample 3 (328 college students), was also satisfactory as indicated by correlations between the scale and measures of similar constructs. Finally, self-perceived online shopping addiction can be predicted to a relatively high degree. The present 18-item scale is a solid theory-based instrument to empirically measure online shopping addiction and can be used for understanding the phenomena among young adults. PMID:28559864
Genet, Martin; Houmard, Manuel; Eslava, Salvador; Saiz, Eduardo; Tomsia, Antoni P.
2012-01-01
This paper introduces our approach to modeling the mechanical behavior of cellular ceramics, through the example of calcium phosphate scaffolds made by robocasting for bone-tissue engineering. The Weibull theory is used to describe the statistical failure of the scaffolds' constitutive rods, and the Sanchez-Palencia theory of periodic homogenization is used to link the rod and scaffold scales. Uniaxial compression of scaffolds and three-point bending of rods were performed to calibrate and validate the model. Although calibration based on rod-scale data leads to over-conservative predictions of scaffold properties (because the rods' successive failures are not taken into account), we show that, for a given rod diameter, calibration based on scaffold-scale data leads to very satisfactory predictions for a wide range of rod spacing, i.e. of scaffold porosity, as well as for different loading conditions. This work establishes the proposed model as a reliable tool for understanding and optimizing the mechanical properties of cellular ceramics. PMID:23439936
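As a rough illustration of the Weibull strength statistics invoked here, the sketch below evaluates the two-parameter Weibull failure probability for a single rod; the modulus m and characteristic strength sigma0 are placeholder values, not the paper's calibrated parameters.

    import math

    def weibull_failure_probability(sigma, sigma0, m):
        """Probability that a rod fails at or below stress sigma (two-parameter Weibull)."""
        return 1.0 - math.exp(-((sigma / sigma0) ** m))

    # Illustrative values only: Weibull modulus m and characteristic strength sigma0 (MPa)
    m, sigma0 = 5.0, 100.0
    for sigma in (50.0, 100.0, 150.0):
        print(f"P_f({sigma:6.1f} MPa) = {weibull_failure_probability(sigma, sigma0, m):.3f}")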
Liao, Peilin; Carter, Emily A
2011-09-07
Quantitative characterization of low-lying excited electronic states in materials is critical for the development of solar energy conversion materials. The many-body Green's function method known as the GW approximation (GWA) directly probes states corresponding to photoemission and inverse photoemission experiments, thereby determining the associated band structure. Several versions of the GW approximation with different levels of self-consistency exist in the field. While the GWA based on density functional theory (DFT) works well for conventional semiconductors, less is known about its reliability for strongly correlated semiconducting materials. Here we present a systematic study of the GWA using hematite (α-Fe(2)O(3)) as the benchmark material. We analyze its performance in terms of the calculated photoemission/inverse photoemission band gaps, densities of states, and dielectric functions. Overall, a non-self-consistent G(0)W(0) using input from DFT+U theory produces physical observables in best agreement with experiments. This journal is © the Owner Societies 2011
Maximum likelihood techniques applied to quasi-elastic light scattering
NASA Technical Reports Server (NTRS)
Edwards, Robert V.
1992-01-01
An automatic procedure is needed for reliably estimating the quality of particle-size measurements from QELS (quasi-elastic light scattering). Obtaining the measurement itself, before any error estimates can be made, is a problem because it is derived very indirectly from a signal generated by the motion of particles in the system and requires the solution of an inverse problem. The eigenvalue structure of the transform that generates the signal is such that an arbitrarily small amount of noise can obliterate parts of any practical inversion spectrum. This project uses maximum likelihood estimation (MLE) as a framework to generate a theory and a functioning set of software to oversee the measurement process and extract the particle-size information, while at the same time providing error estimates for those measurements. The theory involved verifying a correct form of the covariance matrix for the noise on the measurement and then estimating particle-size parameters using a modified histogram approach.
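For a flavor of the inverse problem, the sketch below fits the simplest (monodisperse) QELS model, in which the field correlation decays as exp(-D q^2 tau); with i.i.d. Gaussian noise, maximum likelihood reduces to nonlinear least squares. All numbers are synthetic, and the full covariance treatment described in the report is not reproduced.

    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic monodisperse QELS data: g1(tau) = exp(-Gamma*tau) with Gamma = D*q^2.
    # q and D_true are illustrative values only.
    q = 2.3e7                                  # scattering vector magnitude (1/m)
    D_true = 4.0e-12                           # diffusion coefficient (m^2/s)
    rng = np.random.default_rng(0)
    tau = np.linspace(1e-6, 5e-3, 200)
    g1 = np.exp(-D_true * q**2 * tau) + rng.normal(0.0, 0.01, tau.size)

    # Under i.i.d. Gaussian noise, MLE for Gamma reduces to nonlinear least squares.
    popt, pcov = curve_fit(lambda t, gamma: np.exp(-gamma * t), tau, g1, p0=[1e3])
    D_hat, D_err = popt[0] / q**2, np.sqrt(pcov[0, 0]) / q**2
    print(f"D = {D_hat:.2e} +/- {D_err:.1e} m^2/s")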
Kohut's Psychology of the Self: Theory and Measures of Counseling Outcome.
ERIC Educational Resources Information Center
Patton, Michael J.; And Others
1982-01-01
Introduces Heinz Kohut's psychology of the self and its counseling implications and reports on the development of 10 eight-point rating scales of counseling outcome derived from his theory. Reports data on interrater reliability and agreement for the 10 scales. (Author)
Reliability analysis of composite structures
NASA Technical Reports Server (NTRS)
Kan, Han-Pin
1992-01-01
A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed-form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters is then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, and fabrication and assembly processes. The influence of structural geometry and the mode of failure are also considered in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
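The numerical integration step can be illustrated with the classic load-strength interference integral; the distributions below are assumed stand-ins, not those characterized in the report.

    from scipy import stats
    from scipy.integrate import quad

    # Assumed stand-in distributions: lognormal strength R, normal load S (MPa).
    R = stats.lognorm(s=0.1, scale=500.0)
    S = stats.norm(loc=300.0, scale=40.0)

    # P(failure) = P(R < S) = integral of F_R(s) * f_S(s) over the load range.
    pf, _ = quad(lambda s: R.cdf(s) * S.pdf(s), S.ppf(1e-9), S.ppf(1 - 1e-9))
    print(f"failure probability = {pf:.2e}, reliability = {1 - pf:.8f}")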
Wafer level reliability for high-performance VLSI design
NASA Technical Reports Server (NTRS)
Root, Bryan J.; Seefeldt, James D.
1987-01-01
As very large scale integration architecture requires higher package density, the reliability of these devices has approached a critical level. Previous processing techniques allowed a large window for varying reliability. However, as scaling and higher current densities push reliability to its limit, tighter control and instant feedback become critical. Several test structures developed to monitor reliability at the wafer level are described. For example, a test structure was developed to monitor metal integrity in seconds, as opposed to weeks or months for conventional testing. Another structure monitors mobile ion contamination at critical steps in the process. Thus the reliability jeopardy can be assessed during fabrication, preventing defective devices from ever being placed in the field. Most importantly, the reliability can be assessed on each wafer, as opposed to an occasional sample.
NASA Astrophysics Data System (ADS)
Lin, Chien-Liang
2018-02-01
This study sought to develop a self-report instrument to be used in the assessment of the project competences of college students engaged in online project-based learning. The three scales of the KIPSSE instrument developed for this study, namely the knowledge integration, project skills, and self-efficacy scales, were based on related theories and the analysis results of three project advisor interviews. The items of the knowledge integration and project skills scales focused on the integration of different disciplines and on technological skills, respectively. Two samples of data were collected from information technology-related courses taught with an online project-based learning strategy over different semesters at a college in southern Taiwan. The validity and reliability of the KIPSSE instrument were confirmed through item analysis and confirmatory factor analysis using structural equation modeling of the two samples of students' online response sets separately. The Cronbach's alpha reliability coefficient for the entire instrument was 0.931; for each scale, the alpha ranged from 0.832 to 0.907. There was also a significant correlation (r = 0.55, p < 0.01) between the KIPSSE instrument results and the students' product evaluation scores. The findings of this study confirmed the validity and reliability of the KIPSSE instrument. The confirmation process and related implications are also discussed.
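Cronbach's alpha, reported above, follows directly from an items-by-respondents score matrix; the sketch below computes it on randomly generated stand-in data, not the KIPSSE responses.

    import numpy as np

    def cronbach_alpha(scores):
        """scores: 2-D array, rows = respondents, columns = items."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars / total_var)

    # Random stand-in data: 200 respondents x 10 items sharing a latent factor
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    items = latent + rng.normal(scale=1.0, size=(200, 10))
    print(f"alpha = {cronbach_alpha(np.round(items)):.2f}")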
Leon, Jaime; Medina-Garrido, Elena; Núñez, Juan L.
2017-01-01
Math achievement and engagement decline in secondary education; therefore, educators are faced with the challenge of engaging students to avoid school failure. Within self-determination theory, we address the need to comprehensively assess student perceptions of the aspects of teaching quality that predict engagement and achievement. In study one we tested, in a sample of 548 high school students, a preliminary version of a scale to assess nine factors: teaching for relevance, acknowledging negative feelings, participation encouragement, controlling language, optimal challenge, focus on the process, class structure, positive feedback, and caring. In the second study, we analyzed the scale's reliability and validity in a sample of 1555 high school students. The scale showed evidence of reliability, and with regard to criterion validity, at the classroom level, teaching quality was a predictor of behavioral engagement, and higher grades were observed in classes where students, as a whole, displayed more behavioral engagement. At the within level, behavioral engagement was associated with achievement. We provide not only a reliable and valid method to assess teaching quality, but also a basis for designing interventions: these could be built on the scale items to encourage students to persist and display more engagement in school duties, which in turn bolsters student achievement. PMID:28701964
Setyonugroho, Winny; Kropmans, Thomas; Murphy, Ruth; Hayes, Peter; van Dalen, Jan; Kennedy, Kieran M
2018-01-01
Comparing outcomes of clinical skills assessment is challenging. This study proposes a reliable and valid comparison of communication skills (CS) assessment as practiced in Objective Structured Clinical Examinations (OSCEs). The aim of the present study is to compare CS assessment, as standardized according to the MAAS-Global, between stations in a single undergraduate medical year. An OSCE delivered in an Irish undergraduate curriculum was studied. We chose the MAAS-Global as an internationally recognized and validated instrument to calibrate the OSCE station items. The MAAS-Global proportion is the percentage of station checklist items that can be considered 'true' CS items. The reliability of the OSCE was calculated with G-theory analysis, and nested ANOVA was used to compare mean scores across years. MAAS-Global scores in psychiatry stations were significantly higher than those in other disciplines (p<0.03) and above the initial pass mark of 50%. The higher student scores in psychiatry stations were related to higher MAAS-Global proportions when compared to the general practice stations. Comparison of outcome measurements between interdisciplinary station checklists, using the MAAS-Global as a standardization instrument, was valid and reliable. The MAAS-Global was used as a single validated instrument and is suggested as a gold standard. Copyright © 2017. Published by Elsevier B.V.
Judging nursing information on the WWW: a theoretical understanding.
Cader, Raffik; Campbell, Steve; Watson, Don
2009-09-01
This paper is a report of a study of the judgement processes nurses use when evaluating World Wide Web information related to nursing practice. The World Wide Web has increased the global accessibility of online health information. However, the variable nature of the quality of World Wide Web information and its perceived level of reliability may lead to misinformation. This makes demands on healthcare professionals, and on nurses in particular, to ensure that health information of reliable quality is selected for use in practice. A grounded theory approach was adopted. Semi-structured interviews and focus groups were used to collect data, between 2004 and 2005, from 20 nurses undertaking a postqualification graduate course at a university and 13 nurses from a local hospital in the United Kingdom. A theoretical framework emerged that gave insight into the judgement process nurses use when evaluating World Wide Web information. Participants broke the judgement process down into specific tasks. In addition, they used tacit, process and propositional knowledge and intuition, quasi-rational cognition and analysis to undertake these tasks. World Wide Web information cues, time available and nurses' critical skills were influencing factors in their judgement process. Addressing the issue of quality and reliability associated with World Wide Web information is a global challenge. This theoretical framework could contribute towards meeting this challenge.
Stirling Convertor Fasteners Reliability Quantification
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.
2006-01-01
Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensuring reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the behavior of the fastener and joined-part materials, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.
An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments
Guthrie, Michael A.
2013-01-01
A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and the environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
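For independent normal capacity and demand with a linear limit state g = R - S, the Hasofer-Lind index has a closed form; the values below are placeholders, not the paper's strain-energy statistics.

    import math
    from statistics import NormalDist

    # Closed-form Hasofer-Lind index for g(R, S) = R - S with independent normal
    # capacity R (e.g., strain energy at failure) and demand S (e.g., peak strain energy).
    mu_R, sd_R = 12.0, 1.5    # placeholder capacity statistics
    mu_S, sd_S = 7.0, 2.0     # placeholder demand statistics

    beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)
    pf = NormalDist().cdf(-beta)   # notional failure probability
    print(f"beta = {beta:.2f}, P_f = {pf:.2e}")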
Bellili, A; Linguerri, R; Hochlaf, M; Puzzarini, C
2015-11-14
In an effort to provide an accurate structural and spectroscopic characterization of acetyl cyanide, its two enolic isomers, and the corresponding cationic species, state-of-the-art computational methods and approaches have been employed. The coupled-cluster theory including single and double excitations together with a perturbative treatment of triples has been used as the starting point in composite schemes accounting for extrapolation to the complete basis-set limit as well as core-valence correlation effects to determine highly accurate molecular structures, fundamental vibrational frequencies, and rotational parameters. The available experimental data for acetyl cyanide allowed us to assess the reliability of our computations: structural, energetic, and spectroscopic properties have been obtained with an overall accuracy of about, or better than, 0.001 Å, 2 kcal/mol, 1-10 MHz, and 11 cm(-1) for bond distances, adiabatic ionization potentials, rotational constants, and fundamental vibrational frequencies, respectively. We are therefore confident that the highly accurate spectroscopic data provided herein can be useful for guiding future experimental investigations and/or astronomical observations.
NASA Astrophysics Data System (ADS)
Roy, A.; Staino, A.; Ghosh, A. D.; Basu, B.; Chatterjee, S.
2016-09-01
Elevated water tanks (EWTs), being top-heavy structures, are highly vulnerable to earthquake forces, and several have experienced damage or failure in past seismic events. Because these are critical facilities whose continued performance in the post-earthquake scenario is of vital concern, it is important to investigate their seismic vibration control using reliable and cost-effective passive dampers such as the Tuned Liquid Damper (TLD). Here, this aspect is studied for flexible EWT structures, such as those with annular shaft supports. The criterion of tuning the sloshing frequency of the TLD to the structural frequency necessitates TLD dimensions larger than those hitherto examined in the literature. Hence the nonlinear model of the TLD based on established shallow water wave theory is verified for large container sizes by employing Real-Time Hybrid Testing (RTHT). Simulation studies are further carried out on a realistic example of a flexible EWT structure with TLDs. Results indicate that the TLD can be applied very effectively for the seismic vibration mitigation of EWTs.
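The tuning criterion can be illustrated with the standard linear formula for the fundamental sloshing frequency of a rectangular tank; the tank lengths and depth ratio below are assumptions for illustration, not the containers tested by RTHT.

    import math

    def sloshing_frequency(L, h, g=9.81):
        """Fundamental linear sloshing frequency (Hz) of a rectangular tank.
        L: tank length in the direction of motion (m); h: still-water depth (m)."""
        return math.sqrt((math.pi * g / L) * math.tanh(math.pi * h / L)) / (2 * math.pi)

    # Illustrative sizing sweep at an assumed shallow-water depth ratio h/L = 0.1
    for L in (3.0, 5.0, 8.0):
        h = 0.1 * L
        print(f"L = {L:.1f} m, h = {h:.2f} m -> f = {sloshing_frequency(L, h):.3f} Hz")

Larger tanks slosh more slowly, which is why tuning to a very flexible (low-frequency) structure drives the container dimensions upward, as the abstract notes.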
Fluid dynamics simulation for design on sludge drying equipment
NASA Astrophysics Data System (ADS)
Li, Shuiping; Liang, Wang; Kai, Zhang
2017-10-01
Sludge drying equipment is a key component in sludge drying disposal; the structure of the drying equipment directly affects the drying of the sludge, so it is necessary to analyse the performance of drying equipment with different structures. Fluent software makes it convenient to obtain the distributions of the flow field and temperature field inside the drying equipment, which reflect the performance of the structure. In this paper, the outlet position of the sludge and the shape of the sludge inlet are designed. The geometrical model of the drying equipment is established using the pre-processing software Gambit, and the model is meshed. The Eulerian model is used to simulate the flow of each phase and the interactions between them, and the realizable turbulence model is used to simulate the turbulence of each phase. Finally, the simulation results of the schemes are compared, the optimal structural scheme is obtained, and the operational requirements are proposed. CFD theory provides a reliable basis for drying equipment research and reduces the time and cost of the research.
NASA Astrophysics Data System (ADS)
Lin, Jingwu; Wang, Lei; Hu, Zhi; Li, Xiao; Yan, Hong
2017-02-01
The structural, thermodynamic, mechanical and electronic properties of the cubic Al2Sm intermetallic compound are investigated by the first-principles method on the basis of density functional theory. In light of the strong on-site Coulomb repulsion between the highly localized 4f electrons of Sm atoms, the local spin density approximation approach paired with additional Hubbard terms is employed to achieve appropriate results. Moreover, to examine the reliability of this study, the experimental value of the lattice parameter is procured from the analysis of the TEM image and diffraction pattern of the Al2Sm phase in the AZ31 alloy to verify the results originating from the computational method. The value of the cohesive energy reveals Al2Sm to be stable at absolute zero. According to the stability criteria, the compound is mechanically stable. Elastic moduli are then deduced by performing the Voigt-Reuss-Hill approximation. Furthermore, elastic anisotropy and the anisotropy of sound velocity are discussed. Finally, the calculation of the electronic density of states is carried out to explore the underlying mechanism of structural stability.
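For a cubic crystal, the Voigt-Reuss-Hill step reduces to simple closed forms in the elastic constants C11, C12 and C44; the constants below are placeholders, not the computed Al2Sm values.

    def vrh_cubic(c11, c12, c44):
        """Voigt-Reuss-Hill elastic moduli for a cubic crystal (all in GPa)."""
        B = (c11 + 2 * c12) / 3                      # bulk modulus (Voigt = Reuss for cubic)
        Gv = (c11 - c12 + 3 * c44) / 5               # Voigt shear modulus
        Gr = 5 * c44 * (c11 - c12) / (4 * c44 + 3 * (c11 - c12))  # Reuss shear modulus
        G = (Gv + Gr) / 2                            # Hill average
        E = 9 * B * G / (3 * B + G)                  # Young's modulus
        nu = (3 * B - 2 * G) / (2 * (3 * B + G))     # Poisson's ratio
        return B, G, E, nu

    # Placeholder elastic constants in GPa
    B, G, E, nu = vrh_cubic(c11=110.0, c12=35.0, c44=40.0)
    print(f"B = {B:.1f} GPa, G = {G:.1f} GPa, E = {E:.1f} GPa, nu = {nu:.3f}")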
ERIC Educational Resources Information Center
Schweig, Jonathan
2013-01-01
Measuring school and classroom environments has become central in a nation-wide effort to develop comprehensive programs that measure teacher quality and teacher effectiveness. Formulating successful programs necessitates accurate and reliable methods for measuring these environmental variables. This paper uses a generalizability theory framework…
ERIC Educational Resources Information Center
Meyer, J. Patrick; Liu, Xiang; Mashburn, Andrew J.
2014-01-01
Researchers often use generalizability theory to estimate relative error variance and reliability in teaching observation measures. They also use it to plan future studies and design the best possible measurement procedures. However, designing the best possible measurement procedure comes at a cost, and researchers must stay within their budget…
Reliability, Validity and Utility of a Multiple Intelligences Assessment for Career Planning.
ERIC Educational Resources Information Center
Shearer, C. Branton
"The Multiple Intelligences Developmental Assessment Scales" (MIDAS) is a self- (or other-) completed instrument which is based upon the theory of multiple intelligences. The validity, reliability, and utility data regarding the MIDAS are reported here. The measure consists of 7 main scales and 24 subscales which summarize a person's intellectual…
Kelly, Natasha B.; Alonzo, Suzanne H.
2009-01-01
Existing theory predicts that male signalling can be an unreliable indicator of paternal care, but assumes that males with high levels of mating success can have high current reproductive success, without providing any parental care. As a result, this theory does not hold for the many species where offspring survival depends on male parental care. We modelled male allocation of resources between advertisement and care for species with male care where males vary in quality, and the effect of care and advertisement on male fitness is multiplicative rather than additive. Our model predicts that males will allocate proportionally more of their resources to whichever trait (advertisement or paternal care) is more fitness limiting. In contrast to previous theory, we find that male advertisement is always a reliable indicator of paternal care and male phenotypic quality (e.g. males with higher levels of advertisement never allocate less to care than males with lower levels of advertisement). Our model shows that the predicted pattern of male allocation and the reliability of male signalling depend very strongly on whether paternal care is assumed to be necessary for offspring survival and how male care affects offspring survival and male fitness. PMID:19520802
Kelly, Natasha B; Alonzo, Suzanne H
2009-09-07
Existing theory predicts that male signalling can be an unreliable indicator of paternal care, but assumes that males with high levels of mating success can have high current reproductive success, without providing any parental care. As a result, this theory does not hold for the many species where offspring survival depends on male parental care. We modelled male allocation of resources between advertisement and care for species with male care where males vary in quality, and the effect of care and advertisement on male fitness is multiplicative rather than additive. Our model predicts that males will allocate proportionally more of their resources to whichever trait (advertisement or paternal care) is more fitness limiting. In contrast to previous theory, we find that male advertisement is always a reliable indicator of paternal care and male phenotypic quality (e.g. males with higher levels of advertisement never allocate less to care than males with lower levels of advertisement). Our model shows that the predicted pattern of male allocation and the reliability of male signalling depend very strongly on whether paternal care is assumed to be necessary for offspring survival and how male care affects offspring survival and male fitness.
Lang, Jonas W B
2014-07-01
The measurement of implicit or unconscious motives using the picture story exercise (PSE) has long been a target of debate in the psychological literature. Most debates have centered on the apparent paradox that PSE measures of implicit motives typically show low internal consistency reliability on common indices like Cronbach's alpha but nevertheless predict behavioral outcomes. I describe a dynamic Thurstonian item response theory (IRT) model that builds on dynamic system theories of motivation, theorizing on the PSE response process, and recent advancements in Thurstonian IRT modeling of choice data. To assess the model's capability to explain the internal consistency paradox, I first fitted the model to archival data (Gurin, Veroff, & Feld, 1957) and then simulated data based on bias-corrected model estimates from the real data. Simulation results revealed that the average squared correlation reliability for the motives in the Thurstonian IRT model was .74 and that Cronbach's alpha values were similar to the real data (<.35). These findings suggest that PSE motive measures have long been reliable, and they increase the scientific value of extant evidence from motivational research using PSE motive measures. (c) 2014 APA, all rights reserved.
Measurement error: Implications for diagnosis and discrepancy models of developmental dyslexia.
Cotton, Sue M; Crewther, David P; Crewther, Sheila G
2005-08-01
The diagnosis of developmental dyslexia (DD) is reliant on a discrepancy between intellectual functioning and reading achievement. Discrepancy-based formulae have frequently been employed to establish the significance of the difference between 'intelligence' and 'actual' reading achievement. These formulae, however, often fail to take into consideration test reliability and the error associated with a single test score. This paper provides an illustration of the potential effects that test reliability and measurement error can have on the diagnosis of dyslexia, with particular reference to discrepancy models. The roles of reliability and standard error of measurement (SEM) in classic test theory are also briefly reviewed. This is followed by illustrations of how SEM and test reliability can aid with the interpretation of a simple discrepancy-based formula of DD. It is proposed that a lack of consideration of test theory in the use of discrepancy-based models of DD can lead to misdiagnosis (both false positives and false negatives). Further, misdiagnosis in research samples affects reproducibility and generalizability of findings. This in turn, may explain current inconsistencies in research on the perceptual, sensory, and motor correlates of dyslexia.
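A minimal sketch of the point being made: under classical test theory, SEM = SD * sqrt(1 - reliability), and a discrepancy between two imperfectly reliable scores carries a correspondingly wide confidence band. The scores and reliabilities below are illustrative, not values from the paper.

    import math

    def sem(sd, reliability):
        """Standard error of measurement under classical test theory."""
        return sd * math.sqrt(1.0 - reliability)

    # Illustrative standard scores (mean 100, SD 15) and test reliabilities
    sd = 15.0
    r_iq, r_read = 0.95, 0.90
    iq, reading = 110.0, 85.0

    # SEM of the difference, treating the two error terms as independent
    sem_diff = math.sqrt(sem(sd, r_iq) ** 2 + sem(sd, r_read) ** 2)
    diff = iq - reading
    low, high = diff - 1.96 * sem_diff, diff + 1.96 * sem_diff
    print(f"observed discrepancy = {diff:.0f}, 95% CI = [{low:.1f}, {high:.1f}]")

An observed 25-point discrepancy here comes with a confidence interval roughly 11 points wide in each direction, which is why cut-score-based discrepancy diagnoses near the threshold are fragile.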
The Development of an Instrument for Measuring Healing
Meza, James Peter; Fahoome, Gail F.
2008-01-01
PURPOSE Our lack of ability to measure healing attributes impairs our ability to research the topic. The specific aim of this project is to describe the psychological and social construct of healing and to create a valid and reliable measurement scale for attributes of healing. METHODS A content expert conducted a domain analysis examining the existing literature of midrange theories of healing. Theme saturation of content sampling was ensured by brainstorming more than 220 potential items. Selection of items was sequential: pile sorting and data reduction, with factor analysis of a mailed 54-item questionnaire. Criterion validity (convergent and divergent) and temporal reliability were established using a second mailing of the development version of the instrument. Construct validity was judged with structural equation modeling for goodness of fit. RESULTS Cronbach’s α of the original questionnaire was .869 and the final scale was .862. The test-retest reliability was .849. Eigenvalues for the 2 factors were 8 and 4, respectively. Divergent and convergent validity using the Spann-Fischer Codependency Scale and SF-36 mental health and emotional subscales were consistent with predictions. The root mean square error of approximation was 0.066 and Bentler’s Comparative Fit Index was 0.871. Root mean square residual was 0.102. CONCLUSIONS We developed a valid and reliable measurement scale for attributes of healing, which we named the Self-Integration Scale v 2.1. By creating a new variable, new areas of research in humanistic health care are possible. PMID:18626036
Progress in GaN devices performances and reliability
NASA Astrophysics Data System (ADS)
Saunier, P.; Lee, C.; Jimenez, J.; Balistreri, A.; Dumka, D.; Tserng, H. Q.; Kao, M. Y.; Chowdhury, U.; Chao, P. C.; Chu, K.; Souzis, A.; Eliashevich, I.; Guo, S.; del Alamo, J.; Joh, J.; Shur, M.
2008-02-01
With the DARPA Wide Bandgap Semiconductor Technology RF Thrust Contract, TriQuint Semiconductor and its partners, BAE Systems, Lockheed Martin, IQE-RF, II-VI, Nitronex, M.I.T., and R.P.I., are achieving great progress towards the overall goal of making gallium nitride a revolutionary RF technology ready to be inserted in defense and commercial applications. Performance and reliability are two critical components of success (along with cost and manufacturability). In this paper we discuss these two aspects. Our emphasis is now operation at 40 V bias voltage (we had been working at 28 V). 1250 µm devices have power densities in the 6 to 9 W/mm range, with associated efficiencies in the low- to mid-60% range and associated gain of 12 to 12.5 dB at 10 GHz. We are using a dual field-plate structure to optimize these performances. Very good performance has also been achieved at 18 GHz with 400 µm devices. Excellent progress has been made in reliability. Our preliminary DC and RF reliability tests at 40 V indicate an MTTF of 1E6 hours with a 1.3 eV activation energy at a 150 °C channel temperature. Jesus del Alamo at MIT has greatly refined our initial findings, leading to a strain-related theory of degradation that is driven by electric fields. Degradation can occur on the drain edge of the gate due to excessive strain produced by the inverse piezoelectric effect.
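The reported MTTF can be extrapolated to other channel temperatures with the usual Arrhenius model; only the 1E6 hours, 1.3 eV, and 150 °C figures come from the abstract, and the 125 °C use condition below is an assumed example.

    import math

    K_B = 8.617e-5  # Boltzmann constant, eV/K

    def mttf_at(T_use_K, T_test_K, mttf_test_h, ea_eV):
        """Arrhenius extrapolation of median time to failure."""
        return mttf_test_h * math.exp((ea_eV / K_B) * (1.0 / T_use_K - 1.0 / T_test_K))

    # From the abstract: MTTF ~ 1e6 h at 150 C channel temperature, Ea = 1.3 eV.
    T_test = 150.0 + 273.15
    T_use = 125.0 + 273.15   # assumed use condition, not from the paper
    print(f"MTTF at 125 C ~ {mttf_at(T_use, T_test, 1e6, 1.3):.2e} h")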
A neural-network potential through charge equilibration for WS2: From clusters to sheets
NASA Astrophysics Data System (ADS)
Hafizi, Roohollah; Ghasemi, S. Alireza; Hashemifar, S. Javad; Akbarzadeh, Hadi
2017-12-01
In the present work, we use a machine learning method to construct a high-dimensional potential for tungsten disulfide using a charge equilibration neural-network technique. A training set of stoichiometric WS2 clusters is prepared in the framework of density functional theory. After training the neural-network potential, the reliability and transferability of the potential are verified by performing a crystal structure search on bulk phases of WS2 and by plotting energy-area curves of two different monolayers. Then, we use the potential to investigate various triangular nano-clusters and nanotubes of WS2. In the case of nano-structures, we argue that 2H atomic configurations with sulfur-rich edges are thermodynamically more stable than the other investigated configurations. We also studied a number of WS2 nanotubes, which revealed that 1T tubes with armchair chirality exhibit lower bending stiffness.
The coefficient of bond thermal expansion measured by extended x-ray absorption fine structure.
Fornasini, P; Grisenti, R
2014-10-28
The bond thermal expansion is in principle different from the lattice expansion and can be measured by correlation-sensitive probes such as extended x-ray absorption fine structure (EXAFS) and diffuse scattering. The temperature dependence of the coefficient α(bond)(T) of bond thermal expansion has been obtained from EXAFS for CdTe and for Cu. A coefficient α(tens)(T) of negative expansion due to tension effects has been calculated from the comparison of bond and lattice expansions. Negative lattice expansion is present in temperature intervals where α(tens) prevails over α(bond); this real-space approach is complementary but not equivalent to the Grüneisen theory. The relevance of taking into account the asymmetry of the nearest-neighbour distribution of distances in order to get reliable bond expansion values, and the physical meaning of the third cumulant, are thoroughly discussed.
Efficient first-principles prediction of solid stability: Towards chemical accuracy
NASA Astrophysics Data System (ADS)
Zhang, Yubo; Kitchaev, Daniil A.; Yang, Julia; Chen, Tina; Dacek, Stephen T.; Sarmiento-Pérez, Rafael A.; Marques, Miguel A. L.; Peng, Haowei; Ceder, Gerbrand; Perdew, John P.; Sun, Jianwei
2018-03-01
The question of material stability is of fundamental importance to any analysis of system properties in condensed matter physics and materials science. The ability to evaluate chemical stability, i.e., whether a stoichiometry will persist in some chemical environment, and structure selection, i.e., what crystal structure a stoichiometry will adopt, is critical to the prediction of materials synthesis, reactivity and properties. Here, we demonstrate that density functional theory, with the recently developed strongly constrained and appropriately normed (SCAN) functional, has advanced to a point where both facets of the stability problem can be reliably and efficiently predicted for main group compounds, while transition metal compounds are improved but remain a challenge. SCAN therefore offers a robust model for a significant portion of the periodic table, presenting an opportunity for the development of novel materials and the study of fine phase transformations even in largely unexplored systems with little to no experimental data.
NASA Technical Reports Server (NTRS)
Hanagud, S.; Uppaluri, B.
1975-01-01
This paper describes a methodology for making cost-effective fatigue design decisions. The methodology is based on a probabilistic model for the stochastic process of fatigue crack growth with time. The development of a particular model for the stochastic process is also discussed in the paper. The model is based on the assumption of continuous time and a discrete space of crack lengths. Statistical decision theory and the developed probabilistic model are used to derive a procedure for making fatigue design decisions on the basis of a minimum expected cost or risk function and reliability bounds. The selection of the initial flaw size distribution, NDT methods, repair threshold crack lengths, and inspection intervals is discussed.
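In the spirit of the expected-cost decision procedure, a toy trade-off between inspection frequency and failure risk might look like the sketch below; the cost figures and the failure-probability model are invented for illustration and are not the paper's model.

    import math

    # Toy risk model (illustrative only): over a service life T hours, inspecting
    # every t hours costs C_insp per inspection; the chance that an undetected
    # crack grows to failure is modeled as p_fail(t) = 1 - exp(-(t/t0)**2).
    T, C_insp, C_fail, t0 = 40_000.0, 5e3, 5e6, 60_000.0

    def expected_cost(t):
        n_inspections = T / t
        p_fail = 1.0 - math.exp(-((t / t0) ** 2))
        return n_inspections * C_insp + C_fail * p_fail

    candidates = [2_000, 4_000, 8_000, 16_000, 32_000]
    for t in candidates:
        print(f"interval {t:>6} h -> expected cost {expected_cost(t):,.0f}")
    print(f"minimum-risk interval among candidates: {min(candidates, key=expected_cost)} h")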
Long-term changes after brief dynamic psychotherapy: symptomatic versus dynamic assessments.
Høglend, P; Sørlie, T; Sørbye, O; Heyerdahl, O; Amlo, S
1992-08-01
Dynamic change in psychotherapy, as measured by theory-related or mode-specific instruments, has been criticized for being too highly intercorrelated with symptomatic change measures. In this study, long-term changes after brief dynamic psychotherapy were studied in 45 moderately disturbed neurotic patients using a reliable outcome battery. The factor structure of all the change variables suggested that they tapped two distinct and stable sources of variance: dynamic and symptomatic change. The categories of overall dynamic change were different from categories of change on the Global Assessment Scale. A small systematic difference was also found between the categories of overall dynamic change and the categories of target complaints change, owing to false solutions of dynamic conflicts.
An adaptive interpolation scheme for molecular potential energy surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kowalewski, Markus, E-mail: mkowalew@uci.edu; Larsson, Elisabeth; Heryudono, Alfa
The calculation of potential energy surfaces for quantum dynamics can be a time-consuming task, especially when a high level of theory for the electronic structure calculation is required. We propose an adaptive interpolation algorithm based on polyharmonic splines combined with a partition of unity approach. The adaptive node refinement allows the number of sample points to be greatly reduced by employing a local error estimate. The algorithm and its scaling behavior are evaluated for a model function in 2, 3, and 4 dimensions. The developed algorithm allows for a more rapid and reliable interpolation of a potential energy surface within a given accuracy compared to the non-adaptive version.
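A polyharmonic spline interpolant (r^3 kernel with a low-order polynomial tail) is available off the shelf in SciPy; the sketch below applies it to a model surface. The adaptive, error-driven node refinement and partition-of-unity machinery of the paper are not reproduced here.

    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # Model "potential surface": an illustrative smooth 2-D function
    f = lambda x: np.sin(3 * x[:, 0]) * np.cos(2 * x[:, 1]) + 0.5 * x[:, 0] ** 2

    rng = np.random.default_rng(1)
    nodes = rng.uniform(-1, 1, size=(200, 2))                   # sample points
    interp = RBFInterpolator(nodes, f(nodes), kernel="cubic")   # polyharmonic r^3 spline

    test = rng.uniform(-1, 1, size=(1000, 2))
    err = np.abs(interp(test) - f(test))
    print(f"max abs error on 1000 test points: {err.max():.2e}")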
Simulation of pipeline in the area of the underwater crossing
NASA Astrophysics Data System (ADS)
Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.
2014-08-01
The article studies the stress-strain behavior of the Alexandrovskoye-Anzhero-Sudzhensk main oil pipeline section using the Ansys software system. This method of examining and assessing the technical condition of pipeline transport objects studies the objects and the processes that affect their technical condition, including research based on computer simulation. This approach makes it possible to develop the theory, calculation methods, and design of pipeline transport objects and of machine units and parts, regardless of industry and purpose, with a view to improving existing constructions and creating new structures and machines of high performance, durability, reliability, and maintainability, with low material consumption and cost, that are competitive on the world market.
NASA Astrophysics Data System (ADS)
Kubicka, Katarzyna; Radoń, Urszula; Szaniec, Waldemar; Pawlak, Urszula
2017-10-01
The paper concerns the reliability analysis of steel structures subjected to the high temperatures of fire gases. Two types of spatial structures were analysed, namely with pinned and with rigid nodes. The fire analysis was carried out according to the prescriptions of the Eurocode. The static-strength analysis was conducted using the finite element method (FEM). The MES3D program, developed by Szaniec (Kielce University of Technology, Poland), was used for this purpose. The results received from MES3D made it possible to carry out the reliability analysis using the Numpress Explore program, developed at the Institute of Fundamental Technological Research of the Polish Academy of Sciences [9]. The measure of structural reliability is the Hasofer-Lind reliability index (β). The reliability analysis was carried out using approximation (FORM, SORM) and simulation (Importance Sampling, Monte Carlo) methods. As the fire progresses, the value of the reliability index decreases. The analysis conducted for the study made it possible to evaluate the impact of node type on those changes. In real structures, it is often difficult to correctly define the types of nodes, so some simplifications are made. The presented analysis contributes to the recognition of the consequences of such assumptions for the safety of structures subjected to fire.
Design optimization for cost and quality: The robust design approach
NASA Technical Reports Server (NTRS)
Unal, Resit
1990-01-01
Designing reliable, low-cost, and operable space systems has become the key to future space operations. Designing high-quality space systems at low cost is an economic and technological challenge to the designer. A systematic and efficient way to meet this challenge is a new method of design optimization for performance, quality, and cost, called Robust Design. Robust Design is an approach to design optimization. It consists of: making system performance insensitive to material and subsystem variation, thus allowing the use of less costly materials and components; making designs less sensitive to variations in the operating environment, thus improving reliability and reducing operating costs; and using a new structured development process so that engineering time is used most productively. The objective in Robust Design is to select the best combination of controllable design parameters so that the system is most robust to uncontrollable noise factors. The Robust Design methodology uses a mathematical tool called an orthogonal array, from design-of-experiments theory, to study a large number of decision variables with a significantly small number of experiments. Robust Design also uses a statistical measure of performance, called a signal-to-noise ratio, from electrical control theory, to evaluate the level of performance and the effect of noise factors. The purpose is to investigate the Robust Design methodology for improving quality and cost, demonstrate its application by the use of an example, and suggest its use as an integral part of the space system design process.
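The signal-to-noise ratios referred to here have standard Taguchi forms; the sketch below computes them for made-up replicate measurements from one orthogonal-array run.

    import numpy as np

    def sn_larger_is_better(y):
        return -10 * np.log10(np.mean(1.0 / np.asarray(y, float) ** 2))

    def sn_smaller_is_better(y):
        return -10 * np.log10(np.mean(np.asarray(y, float) ** 2))

    def sn_nominal_is_best(y):
        y = np.asarray(y, float)
        return 10 * np.log10(y.mean() ** 2 / y.var(ddof=1))

    # Made-up replicate measurements for one orthogonal-array run
    y = [9.8, 10.1, 10.0, 9.9]
    print(f"nominal-is-best S/N = {sn_nominal_is_best(y):.1f} dB")

A higher S/N ratio indicates a parameter setting whose performance is less disturbed by noise factors, which is the selection criterion across the orthogonal-array runs.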
Evaluation, Use, and Refinement of Knowledge Representations through Acquisition Modeling
ERIC Educational Resources Information Center
Pearl, Lisa
2017-01-01
Generative approaches to language have long recognized the natural link between theories of knowledge representation and theories of knowledge acquisition. The basic idea is that the knowledge representations provided by Universal Grammar enable children to acquire language as reliably as they do because these representations highlight the…
Allometric scaling theory applied to FIA biomass estimation
David C. Chojnacky
2002-01-01
Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...
Using Rasch Analysis to Inform Rating Scale Development
ERIC Educational Resources Information Center
Van Zile-Tamsen, Carol
2017-01-01
The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…
Validation of the Mindful Coping Scale
ERIC Educational Resources Information Center
Tharaldsen, Kjersti B.; Bru, Edvin
2011-01-01
The aim of this research is to develop and validate a self-report measure of mindfulness and coping, the mindful coping scale (MCS). Dimensions of mindful coping were theoretically deduced from mindfulness theory and coping theory. The MCS was empirically evaluated by use of factor analyses, reliability testing and nomological network validation.…
Cognitive Diagnostic Attribute-Level Discrimination Indices
ERIC Educational Resources Information Center
Henson, Robert; Roussos, Louis; Douglas, Jeff; He, Xuming
2008-01-01
Cognitive diagnostic models (CDMs) model the probability of correctly answering an item as a function of an examinee's attribute mastery pattern. Because estimation of the mastery pattern involves more than a continuous measure of ability, reliability concepts introduced by classical test theory and item response theory do not apply. The cognitive…
The Theory of Planned Behavior and Helmet Use among College Students
ERIC Educational Resources Information Center
Ross, Lisa Thomson; Ross, Thomas P.; Farber, Sarah; Davidson, Caroline; Trevino, Meredith; Hawkins, Ashley
2011-01-01
Objectives: To assess undergraduate helmet use attitudes and behaviors in accordance with the theory of planned behavior (TPB). We predicted helmet wearers and nonwearers would differ on our subscales. Methods: Participants (N = 414, 69% female, 84% white) completed a survey. Results: Principal component analysis and reliability analysis guided…
Naive Theories of Social Groups
ERIC Educational Resources Information Center
Rhodes, Marjorie
2012-01-01
Four studies examined children's (ages 3-10, Total N = 235) naive theories of social groups, in particular, their expectations about how group memberships constrain social interactions. After introduction to novel groups of people, preschoolers (ages 3-5) reliably expected agents from one group to harm members of the other group (rather than…
Assessing the Dependability of Drinking Motives via Generalizability Theory
ERIC Educational Resources Information Center
Arterberry, Brooke J.; Martens, Matthew P.; Cadigan, Jennifer M.; Smith, Ashley E.
2012-01-01
This study assessed the score reliability of the Drinking Motives Questionnaire-Revised (DMQ-R) via generalizability theory. Participants (n = 367 college students) completed the DMQ-R at three time points. Across subscale scores, persons, persons x occasions, and persons x items interactions accounted for meaningful variance. Findings illustrate…
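Once variance components are estimated for a persons x items x occasions design like this one, the G (relative) and Phi (absolute) coefficients follow by arithmetic; the components and D-study sizes below are placeholders, not the DMQ-R estimates.

    # Placeholder variance components from a persons (p) x items (i) x occasions (o) G study
    var = {"p": 0.50, "i": 0.10, "o": 0.05, "pi": 0.20, "po": 0.15, "io": 0.02, "pio": 0.30}
    n_i, n_o = 5, 3   # items and occasions in the D-study design (assumed)

    rel_err = var["pi"] / n_i + var["po"] / n_o + var["pio"] / (n_i * n_o)
    abs_err = rel_err + var["i"] / n_i + var["o"] / n_o + var["io"] / (n_i * n_o)

    G = var["p"] / (var["p"] + rel_err)       # generalizability (relative) coefficient
    Phi = var["p"] / (var["p"] + abs_err)     # dependability (absolute) coefficient
    print(f"G = {G:.2f}, Phi = {Phi:.2f}")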
Coordination of Knowledge in Judging Animated Motion
ERIC Educational Resources Information Center
Thaden-Koch, Thomas C.; Dufresne, Robert J.; Mestre, Jose P.
2006-01-01
Coordination class theory is used to explain college students' judgments about animated depictions of moving objects. diSessa's coordination class theory models a "concept" as a complex knowledge system that can reliably determine a particular type of information in widely varying situations. In the experiment described here, fifty individually…
A Study of Sustainable Assessment Theory in Higher Education Tutorials
ERIC Educational Resources Information Center
Beck, Robert J.; Skinner, William F.; Schwabrow, Lynsey A.
2013-01-01
A study of sustainable assessment theory in nine tutorial courses at four colleges demonstrated that three long-term learning outcomes improved: Independence, Intellectual Maturity and Creativity. Eight of 10 traits associated with these outcomes were validated through internal reliability, faculty and student rubrics, and faculty case studies…
[Multiple mini interviews before the occupation of main training posts in paediatrics].
Hertel, Niels Thomas; Bjerager, Mia; Boas, Malene; Boisen, Kirsten A; Børch, Klaus; Frederiksen, Marianne Sjølin; Holm, Kirsten; Grum-Nymann, Anette; Johnsen, Martin M; Whitehouse, Stine; Balslev, Thomas
2013-09-09
Interviews are mandatory in Denmark when selecting doctors for training positions. We used multiple mini interviews (MMI) at four recruitment rounds for the main training posts in paediatrics. In total, 125 candidates were evaluated and assessed by CV and MMI (4-5 stations). Reliability for individual stations in MMI assessed by Cronbach's alpha was adequate (0.63-0.92). The overall reliability assessed by G-theory was lower, suggesting that different skills were tested. The acceptability was high. Our experiences with MMI suggest good feasibility and reliability. An increasing number of stations may improve the overall reliability.
Soil variability in engineering applications
NASA Astrophysics Data System (ADS)
Vessia, Giovanna
2014-05-01
Natural geomaterials, such as soils and rocks, show spatial variability and heterogeneity of physical and mechanical properties, which can be measured by in-field and laboratory testing. Heterogeneity concerns different values of litho-technical parameters pertaining to similar lithological units placed close to each other. The variability, by contrast, is inherent to the formation and evolution processes experienced by each geological unit (homogeneous geomaterials on average) and is captured as a spatial structure of fluctuation of physical property values about their mean trend, e.g. the unit weight, the hydraulic permeability, the friction angle, and the cohesion, among others. These spatial variations must be managed by engineering models to accomplish reliable design of structures and infrastructures. Matheron (1962) introduced geostatistics as the most comprehensive tool to manage the spatial correlation of parameter measures used in a wide range of earth science applications. In the field of engineering geology, Vanmarcke (1977) developed the first pioneering attempts to describe and manage the inherent variability in geomaterials, although Terzaghi (1943) had already highlighted that spatial fluctuations of the physical and mechanical parameters used in geotechnical design cannot be neglected. A few years later, Mandelbrot (1983) and Turcotte (1986) interpreted the internal arrangement of geomaterials according to fractal theory. In the same years, Vanmarcke (1983) proposed the random field theory, providing mathematical tools to deal with the inherent variability of each geological unit or stratigraphic succession that can be treated as one material. In this approach, measurement fluctuations of physical parameters are interpreted through the spatial variability structure, consisting of the correlation function and the scale of fluctuation. Fenton and Griffiths (1992) combined random field simulation with the finite element method to produce the Random Finite Element Method (RFEM). This method has been used to investigate the random behavior of soils in the context of a variety of classical geotechnical problems. Later studies collected worldwide variability values of many technical parameters of soils (Phoon and Kulhawy 1999a) and their spatial correlation functions (Phoon and Kulhawy 1999b). In Italy, Cherubini et al. (2007) calculated the spatial variability structure of sandy and clayey soils from standard cone penetration test readings. The large extent of the worldwide measured spatial variability of soils and rocks heavily affects the reliability of geotechnical design, as do other uncertainties introduced by testing devices and engineering models. So far, several methods have been provided to deal with the preceding sources of uncertainty in engineering design models (e.g. First Order Reliability Method, Second Order Reliability Method, Response Surface Method, High Dimensional Model Representation, etc.). Nowadays, efforts in this field focus on (1) measuring the spatial variability of different rocks and soils and (2) developing numerical models that take the spatial variability into account as an additional physical variable.
References
Cherubini C., Vessia G. and Pula W. 2007. Statistical soil characterization of Italian sites for reliability analyses. Proc. 2nd Int. Workshop on Characterization and Engineering Properties of Natural Soils, 3-4: 2681-2706.
Griffiths D.V. and Fenton G.A. 1993. Seepage beneath water retaining structures founded on spatially random soil. Géotechnique, 43(6): 577-587.
Mandelbrot B.B. 1983. The Fractal Geometry of Nature. San Francisco: W.H. Freeman.
Matheron G. 1962. Traité de Géostatistique appliquée. Tome 1, Editions Technip, Paris, 334 p.
Phoon K.K. and Kulhawy F.H. 1999a. Characterization of geotechnical variability. Can Geotech J, 36(4): 612-624.
Phoon K.K. and Kulhawy F.H. 1999b. Evaluation of geotechnical property variability. Can Geotech J, 36(4): 625-639.
Terzaghi K. 1943. Theoretical Soil Mechanics. New York: John Wiley and Sons.
Turcotte D.L. 1986. Fractals and fragmentation. J Geophys Res, 91: 1921-1926.
Vanmarcke E.H. 1977. Probabilistic modeling of soil profiles. J Geotech Eng Div, ASCE, 103: 1227-1246.
Vanmarcke E.H. 1983. Random Fields: Analysis and Synthesis. MIT Press, Cambridge.
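The random-field description can be made concrete by simulating a 1-D Gaussian field of a soil property with exponential autocorrelation, for which the scale of fluctuation is twice the correlation length; all parameters below are illustrative.

    import numpy as np

    # 1-D Gaussian random field of a soil property along depth, with exponential
    # autocorrelation rho(h) = exp(-|h| / lc). Illustrative parameters only.
    depth = np.linspace(0.0, 20.0, 201)       # m
    mean, sd, lc = 30.0, 5.0, 2.0             # mean, std dev, correlation length (m)

    H = np.abs(depth[:, None] - depth[None, :])
    C = sd**2 * np.exp(-H / lc)                              # covariance matrix
    L = np.linalg.cholesky(C + 1e-10 * np.eye(depth.size))   # jitter for stability

    rng = np.random.default_rng(42)
    field = mean + L @ rng.standard_normal(depth.size)
    print(f"scale of fluctuation (exponential model) = {2 * lc:.1f} m")
    print(f"simulated range: {field.min():.1f} to {field.max():.1f}")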
Evaluation methodologies for an advanced information processing system
NASA Technical Reports Server (NTRS)
Schabowsky, R. S., Jr.; Gai, E.; Walker, B. K.; Lala, J. H.; Motyka, P.
1984-01-01
The system concept and requirements for an Advanced Information Processing System (AIPS) are briefly described, but the emphasis of this paper is on the evaluation methodologies being developed and utilized in the AIPS program. The evaluation tasks include hardware reliability, maintainability and availability, software reliability, performance, and performability. Hardware RMA and software reliability are addressed with Markov modeling techniques. The performance analysis for AIPS is based on queueing theory. Performability is a measure of merit which combines system reliability and performance measures. The probability laws of the performance measures are obtained from the Markov reliability models. Scalar functions of this law such as the mean and variance provide measures of merit in the AIPS performability evaluations.
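The Markov building block behind such reliability and availability models is a repairable unit with constant failure and repair rates; the rates below are assumed for illustration, not AIPS figures.

    import numpy as np
    from scipy.linalg import expm

    lam, mu = 1e-4, 1e-2       # failure and repair rates (per hour), illustrative
    Q = np.array([[-lam, lam],
                  [mu, -mu]])  # generator: state 0 = up, state 1 = down

    # Transient availability after t hours, starting in the up state
    t = 100.0
    p = np.array([1.0, 0.0]) @ expm(Q * t)
    print(f"availability at t = {t:.0f} h: {p[0]:.6f}")
    print(f"steady-state availability: {mu / (lam + mu):.6f}")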
Development of a probabilistic analysis methodology for structural reliability estimation
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.
1991-01-01
A novel probabilistic analysis method for the assessment of structural reliability is presented, combining fast convolution with an efficient structural reliability analysis. After identifying the most important point of a limit state, the method establishes a quadratic performance function, transforms the quadratic function into a linear one, and applies fast convolution. The method is applicable to problems requiring computer-intensive structural analysis. Five illustrative examples of the method's application are given.
Application of Advanced Fracture Mechanics Technology to Ensure Structural Reliability in Critical …
Westinghouse Research and Development Center, Pittsburgh, Pennsylvania 15235
1982-11-22
Higher-Order Theory: Structural/MicroAnalysis Code (HOTSMAC) Developed
NASA Technical Reports Server (NTRS)
Arnold, Steven M.
2002-01-01
The full utilization of advanced materials (be they composite or functionally graded materials) in lightweight aerospace components requires the availability of accurate analysis, design, and life-prediction tools that enable the assessment of component and material performance and reliability. Recently, a new commercially available software product called HOTSMAC (Higher-Order Theory--Structural/MicroAnalysis Code) was jointly developed by Collier Research Corporation, Engineered Materials Concepts LLC, and the NASA Glenn Research Center under funding provided by Glenn's Commercial Technology Office. The analytical framework for HOTSMAC is based on almost a decade of research into the coupled micro-macrostructural analysis of heterogeneous materials. Consequently, HOTSMAC offers a comprehensive approach for analyzing and designing the response of components with various microstructural details, including certain advantages not always available in standard displacement-based finite element analysis techniques. The capabilities of HOTSMAC include combined thermal and mechanical analysis, time-independent and time-dependent material behavior, and internal boundary cells (e.g., cells that can be used to represent internal cooling passages), to name a few. In HOTSMAC problems, materials can be randomly distributed and/or functionally graded (e.g., with inclusions distributed linearly), or broken down by strata, such as in the case of thermal barrier coatings or composite laminates.
Theoretical and experimental NMR studies on muscimol from fly agaric mushroom (Amanita muscaria)
NASA Astrophysics Data System (ADS)
Kupka, Teobald; Wieczorek, Piotr P.
2016-01-01
In this article we report the results of combined theoretical and experimental NMR studies on muscimol, the bioactive alkaloid from the fly agaric mushroom (Amanita muscaria). The assignment of the 1H and 13C NMR spectra of muscimol in DMSO-d6 was supported by additional two-dimensional heteronuclear correlated spectra (2D NMR) and gauge-independent atomic orbital (GIAO) NMR calculations using density functional theory (DFT). The effect of solvent in the theoretical calculations was included via the polarized continuum model (PCM), and the hybrid three-parameter B3LYP density functional in combination with the 6-311++G(3df,2pd) basis set enabled the calculation of reliable structures of the non-ionized (neutral) molecule and its NH and zwitterionic forms in the gas phase, chloroform, DMSO and water. GIAO NMR calculations, using equilibrium and rovibrationally averaged geometries, at the B3LYP/6-31G* and B3LYP/aug-cc-pVTZ-J levels of theory provided muscimol nuclear magnetic shieldings. The theoretical proton and carbon chemical shifts were critically compared with experimental NMR spectra measured in DMSO. Our results provide useful information on the structure of muscimol in solution. We believe that such data could improve the understanding of the basic features of muscimol at the atomistic level and provide another tool in studies related to GABA analogs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yubo; Zhang, Jiawei; Wang, Youwei
Diamond-like Cu-based multinary semiconductors are a rich family of materials that hold promise in a wide range of applications. Unfortunately, accurate theoretical understanding of the electronic properties of these materials is hindered by the involvement of Cu d electrons. Density functional theory (DFT) based calculations using the local density approximation or generalized gradient approximation often give qualitatively wrong electronic properties of these materials, especially for narrow-gap systems. The modified Becke-Johnson (mBJ) method has been shown to be a promising alternative to more elaborate theory such as the GW approximation for fast materials screening and predictions. However, straightforward applications of the mBJ method to these materials still encounter significant difficulties because of the insufficient treatment of the localized d electrons. We show that combining the promise of the mBJ potential and the spirit of the well-established DFT + U method leads to a much improved description of the electronic structures, including the most challenging narrow-gap systems. A survey of the band gaps of about 20 Cu-based semiconductors calculated using the mBJ + U method shows that the results agree with reliable values to within ±0.2 eV.
Peterson, Elizabeth R; Mohal, Jatender; Waldie, Karen E; Reese, Elaine; Atatoa Carr, Polly E; Grant, Cameron C; Morton, Susan M B
2017-01-01
The Infant Behavior Questionnaire-Revised Very Short Form (IBQ-R VSF; Putnam, Helbig, Gartstein, Rothbart, & Leerkes, 2014) is a newly published measure of infant temperament with a 3-factor structure. Recently Peterson et al. (2017) suggested that a 5-factor structure (Positive Affectivity/Surgency, Negative Emotionality, Orienting Capacity, Affiliation/Regulation, and Fear) was more parsimonious and showed promising reliability and predictive validity in a large, diverse sample. However, little is known about the 5-factor model's precision across the range of the temperament dimensions and whether it discriminates equally well across ethnicities. A total of 5,567 mothers responded to the IBQ-R VSF in relation to their infants (N = 5,639) between 23 and 52 weeks old. Using item response theory, we conducted a series of 2-parameter logistic item response models and found that the 5 IBQ-R VSF temperament dimensions showed a good distribution of estimates across each latent trait range, and these estimates centered close to the population mean. The IBQ-R VSF was also similarly precise across 4 ethnic groups (European, Māori, Pacific peoples, and Asian), suggesting that it can be used as a comparable measure of infant temperament in a diversity of ethnic groups.
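As an aside for readers unfamiliar with the machinery: the 2-parameter logistic model referenced above has a compact closed form. The sketch below (Python, with invented item parameters rather than values from this study) computes the item characteristic curve and its Fisher information, whose peak marks the trait range where an item measures most precisely.

```python
import numpy as np

def p_endorse_2pl(theta, a, b):
    """2PL model: probability that a respondent at latent trait level
    theta endorses an item with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at theta: a^2 * p * (1 - p).
    Precision is highest where this curve peaks (theta = b)."""
    p = p_endorse_2pl(theta, a, b)
    return a**2 * p * (1.0 - p)

thetas = np.linspace(-3, 3, 7)
# Hypothetical parameters for three temperament items.
for a, b in [(1.8, -0.5), (1.2, 0.0), (0.7, 1.0)]:
    print(f"a={a}, b={b}:", np.round(item_information(thetas, a, b), 3))
```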
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.
2014-01-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C
2015-02-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
Mraity, Hussien A A B; England, Andrew; Cassidy, Simon; Eachus, Peter; Dominguez, Alejandro; Hogg, Peter
2016-01-01
The aim of this article was to apply psychometric theory to develop and validate a visual grading scale for assessing the visual perception of digital image quality of the anteroposterior (AP) pelvis. Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. 151 volunteers scored phantom images, and 184 volunteers scored cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability. A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good interitem correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha (reliability) revealed that the scale has acceptable levels of internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested that the scale is multidimensional (assessing multiple quality themes). This study represents the first full development and validation of a visual image quality scale using psychometric theory. It is likely that this scale will have clinical, training and research applications. This article presents data to create and validate visual grading scales for radiographic examinations. The visual grading scale, for AP pelvis examinations, can act as a validated tool for future research, teaching and clinical evaluations of image quality.
England, Andrew; Cassidy, Simon; Eachus, Peter; Dominguez, Alejandro; Hogg, Peter
2016-01-01
Objective: The aim of this article was to apply psychometric theory to develop and validate a visual grading scale for assessing the visual perception of digital image quality of the anteroposterior (AP) pelvis. Methods: Psychometric theory was used to guide scale development. Seven phantom and seven cadaver images of visually and objectively predetermined quality were used to help assess scale reliability and validity. 151 volunteers scored phantom images, and 184 volunteers scored cadaver images. Factor analysis and Cronbach's alpha were used to assess scale validity and reliability. Results: A 24-item scale was produced. Aggregated mean volunteer scores for each image correlated with the rank order of the visually and objectively predetermined image qualities. Scale items had good interitem correlation (≥0.2) and high factor loadings (≥0.3). Cronbach's alpha (reliability) revealed that the scale has acceptable levels of internal reliability for both phantom and cadaver images (α = 0.8 and 0.9, respectively). Factor analysis suggested that the scale is multidimensional (assessing multiple quality themes). Conclusion: This study represents the first full development and validation of a visual image quality scale using psychometric theory. It is likely that this scale will have clinical, training and research applications. Advances in knowledge: This article presents data to create and validate visual grading scales for radiographic examinations. The visual grading scale, for AP pelvis examinations, can act as a validated tool for future research, teaching and clinical evaluations of image quality. PMID:26943836
Petrillo, Jennifer; Bressler, Neil M; Lamoureux, Ecosse; Ferreira, Alberto; Cano, Stefan
2017-08-14
The NEI VFQ-25 has undergone psychometric evaluation in patients with varying ocular conditions and the general population. However, important limitations which may affect the interpretation of clinical trial results have been previously identified, such as concerns with reliability and validity. The purpose of this study was to evaluate the National Eye Institute Visual Functioning Questionnaire (NEI VFQ-25) and make recommendations for a revised scoring structure, with a view to improving its psychometric performance and interpretability. Rasch Measurement Theory analyses were conducted in two stages using pooled baseline NEI VFQ-25 data for 2487 participants with retinal diseases enrolled in six clinical trials. In stage 1, we examined: scale-to-sample targeting; thresholds for item response options; item fit statistics; stability; local dependence; and reliability. In stage 2, a post-hoc revision of the scoring structure (NEI VFQ-28-R) was created and psychometrically re-evaluated. In stage 1, we found that the NEI VFQ-25 was mis-targeted to the sample, and had disordered response thresholds (15/25 items) and mis-fitting items (8/25 items). However, items appeared to be stable (differential item functioning for three items), with minimal item dependency (one pair of items) and good reliability (person-separation index, 0.93). In stage 2, the modified Rasch-scored NEI VFQ-28-R was assessed. It comprised two broad domains: Activity Limitation (19 items) and Socio-Emotional Functioning (nine items). The NEI VFQ-28-R demonstrated improved performance with fewer disordered response thresholds (no items), less item misfit (three items) and improved population targeting (reduced ceiling effect) compared with the NEI VFQ-25. Compared with the original version, the proposed NEI VFQ-28-R, with Rasch-based scoring and a two-domain structure, appears to offer improved psychometric performance and interpretability of the vision-related quality-of-life scale for the population analysed.
The Trunk Impairment Scale - modified to ordinal scales in the Norwegian version.
Gjelsvik, Bente; Breivik, Kyrre; Verheyden, Geert; Smedal, Tori; Hofstad, Håkon; Strand, Liv Inger
2012-01-01
To translate the Trunk Impairment Scale (TIS), a measure of trunk control in patients after stroke, into Norwegian (TIS-NV), and to explore its construct validity, internal consistency, and intertester and test-retest reliability. The TIS was translated according to international guidelines. The validity study was performed on data from 201 patients with acute stroke. Fifty patients with stroke and acquired brain injury were recruited to examine intertester and test-retest reliability. Construct validity was analyzed with exploratory and confirmatory factor analysis and item response theory, internal consistency with Cronbach's alpha test, and intertester and test-retest reliability with kappa and intraclass correlation coefficient tests. The back-translated version of the TIS-NV was validated by the original developer. The subscale Static sitting balance was removed. By combining items from the subscales Dynamic sitting balance and Coordination, six ordinal superitems (testlets) were constructed. The TIS-NV was renamed the modified TIS-NV (TIS-modNV). After modifications the TIS-modNV fitted well to a locally dependent unidimensional item response theory model. It demonstrated good construct validity, excellent internal consistency, and high intertester and test-retest reliability for the total score. This study supports the TIS-modNV as a valid and reliable scale for use in clinical practice and research.
The Basics: What's Essential about Theory for Community Development Practice?
ERIC Educational Resources Information Center
Hustedde, Ronald J.; Ganowicz, Jacek
2002-01-01
Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…
Developing and Validating a Science Notebook Rubric for Fifth-Grade Non-Mainstream Students
NASA Astrophysics Data System (ADS)
Huerta, Margarita; Lara-Alecio, Rafael; Tong, Fuhui; Irby, Beverly J.
2014-07-01
We present the development and validation of a science notebook rubric intended to measure the academic language and conceptual understanding of non-mainstream students, specifically fifth-grade male and female economically disadvantaged Hispanic English language learner (ELL) and African-American or Hispanic native English-speaking students. The science notebook rubric is based on two main constructs: academic language and conceptual understanding. The constructs are grounded in second-language acquisition theory and theories of writing and conceptual understanding. We established content validity and calculated reliability measures using G theory and percent agreement (for comparison) with a sample of approximately 144 unique science notebook entries and 432 data points. Results reveal sufficient reliability estimates, indicating that the instrument is promising for use in future research studies including science notebooks in classrooms with populations of economically disadvantaged Hispanic ELL and African-American or Hispanic native English-speaking students.
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes; uses a quantitative method for the cost index and integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes; and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
Ponterotto, Joseph G; Ruckdeschel, Daniel E
2007-12-01
The present article addresses issues in reliability assessment that are often neglected in psychological research, such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (α), and considerations for interpreting alpha within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge the adequacy of internal consistency coefficients for research measures. Guidelines and cautions in applying the matrix are provided.
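Coefficient alpha itself is straightforward to compute once item scores sit in a respondents-by-items matrix; what the article stresses is its interpretation. A minimal sketch with toy data (not from the article) follows; note how alpha depends jointly on interitem covariance and the number of items.

```python
import numpy as np

def cronbach_alpha(scores):
    """Coefficient alpha for an (n_respondents, n_items) matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Toy data: 6 respondents x 4 items on a 5-point scale.
data = [[4, 5, 4, 4], [2, 3, 3, 2], [5, 5, 4, 5],
        [3, 3, 2, 3], [4, 4, 5, 4], [1, 2, 2, 1]]
print(round(cronbach_alpha(data), 3))
```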
Ferreiro, Diego U.; Komives, Elizabeth A.; Wolynes, Peter G.
2014-01-01
Biomolecules are the prime information-processing elements of living matter. Most of these inanimate systems are polymers that compute their own structures and dynamics using as input the seemingly random character strings of their sequence, following which they coalesce and perform integrated cellular functions. In large computational systems with finite interaction codes, the appearance of conflicting goals is inevitable. Simple conflicting forces can lead to quite complex structures and behaviors, leading to the concept of frustration in condensed matter. We present here some basic ideas about frustration in biomolecules and how the frustration concept leads to a better appreciation of many aspects of the architecture of biomolecules, and how biomolecular structure connects to function. These ideas are simultaneously both seductively simple and perilously subtle to grasp completely. The energy landscape theory of protein folding provides a framework for quantifying frustration in large systems and has been implemented at many levels of description. We first review the notion of frustration from the areas of abstract logic and its uses in simple condensed matter systems. We then discuss how the frustration concept applies specifically to heteropolymers, testing folding landscape theory in computer simulations of protein models and in experimentally accessible systems. Studying the aspects of frustration averaged over many proteins provides ways to infer energy functions useful for reliable structure prediction. We discuss how frustration affects folding mechanisms. We review here how a large part of the biological functions of proteins are related to subtle local physical frustration effects and how frustration influences the appearance of metastable states, the nature of binding processes, catalysis and allosteric transitions. We hope to illustrate how frustration is a fundamental concept in relating function to structural biology. PMID:25225856
High-Performance First-Principles Molecular Dynamics for Predictive Theory and Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gygi, Francois; Galli, Giulia; Schwegler, Eric
This project focused on developing high-performance software tools for First-Principles Molecular Dynamics (FPMD) simulations, and applying them in investigations of materials relevant to energy conversion processes. FPMD is an atomistic simulation method that combines a quantum-mechanical description of electronic structure with the statistical description provided by molecular dynamics (MD) simulations. This reliance on fundamental principles allows FPMD simulations to provide a consistent description of structural, dynamical and electronic properties of a material. This is particularly useful in systems for which reliable empirical models are lacking. FPMD simulations are increasingly used as a predictive tool for applications such as batteries, solar energy conversion, light-emitting devices, electro-chemical energy conversion devices and other materials. During the course of the project, several new features were developed and added to the open-source Qbox FPMD code. The code was further optimized for scalable operation on large-scale, Leadership-Class DOE computers. When combined with Many-Body Perturbation Theory (MBPT) calculations, this infrastructure was used to investigate structural and electronic properties of liquid water, ice, aqueous solutions, nanoparticles and solid-liquid interfaces. Computing both ionic trajectories and electronic structure in a consistent manner enabled the simulation of several spectroscopic properties, such as Raman spectra, infrared spectra, and sum-frequency generation spectra. The accuracy of the approximations used allowed for direct comparisons of results with experimental data such as optical spectra, X-ray and neutron diffraction spectra. The software infrastructure developed in this project, as applied to various investigations of solids, liquids and interfaces, demonstrates that FPMD simulations can provide a detailed, atomic-scale picture of structural, vibrational and electronic properties of complex systems relevant to energy conversion devices.
Deployable antenna kinematics using tensegrity structure design
NASA Astrophysics Data System (ADS)
Knight, Byron Franklin
With vast changes in spacecraft development over the last decade, a new, cheaper approach was needed for deployable kinematic systems such as parabolic antenna reflectors. Historically, these mesh-surface reflectors have resembled folded umbrellas, with incremental redesigns utilized to save packaging size. These systems are typically over-constrained designs, the assumption being that the high reliability necessary for space operations requires this level of conservatism. But with the rapid commercialization of space, smaller launch platforms and satellite buses have demanded much higher efficiency from all space equipment than can be achieved through this incremental approach. This work applies an approach called tensegrity to deployable antenna development. Kenneth Snelson, a student of R. Buckminster Fuller, invented tensegrity structures in 1948. Such structures use a minimum number of compression members (struts); stability is maintained using tension members (ties). The novelty introduced in this work is that the ties are elastic, allowing the struts to extend or contract, and in this way changing the surface of the antenna. Previously, the University of Florida developed an approach to quantify the stability and motion of parallel manipulators. This approach was applied to deployable tensegrity antenna structures. Based on the kinematic analyses for the 3-3 (octahedron) and 4-4 (square anti-prism) structures, the 6-6 (hexagonal anti-prism) analysis was completed, which establishes usable structural parameters. The primary objective for this work was to prove the stability of this class of deployable structures and their potential application to space structures. The secondary objective was to define special motions for tensegrity antennas to meet subsystem design requirements, such as addressing multiple antenna-feed locations. This work combines the historical experiences of the artist (Snelson), the mathematician (Ball), and the space systems engineer (Wertz) to develop a new, practical design approach. This kinematic analysis of tensegrity structures blends these differences to provide the design community with a new approach to lightweight, robust, adaptive structures with the high reliability that space demands. Additionally, by applying screw theory, a tensegrity structure antenna can be commanded to move along a screw axis, thereby meeting the requirement to address multiple feed locations.
NASA Astrophysics Data System (ADS)
Yu, Bo; Ning, Chao-lie; Li, Bing
2017-03-01
A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
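To make the reliability calculation concrete, here is a minimal first-order sketch for the simplest possible case: a linear limit state g = R - S with independent normal resistance and load. The moments are illustrative, not values from the study; the paper's full framework adds time dependence, the Nataf transformation for non-normal variables, and analytic gradients of the limit state.

```python
from math import sqrt
from scipy.stats import norm

# Illustrative moments for a linearized durability limit state g = R - S:
# R = critical chloride content at the rebar, S = chloride content reached
# at the end of the design life (both taken as independent normals here).
mu_R, sig_R = 0.9, 0.15
mu_S, sig_S = 0.6, 0.20

# First-order reliability index and failure probability; exact for a
# linear limit state in independent normal variables.
beta = (mu_R - mu_S) / sqrt(sig_R**2 + sig_S**2)
pf = norm.cdf(-beta)
print(f"beta = {beta:.3f}, Pf = {pf:.3e}")
```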
A reliability analysis of the revised competitiveness index.
Harris, Paul B; Houston, John M
2010-06-01
This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, interitem reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high interitem reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.
Reliability and validity of the Nurse Practitioners' Roles and Competencies Scale.
Lin, Li-Chun; Lee, Sheuan; Ueng, Steve Wen-Neng; Tang, Woung-Ru
2016-01-01
The objective of this study was to test the reliability and construct validity of the Nurse Practitioners' Roles and Competencies Scale. The role of nurse practitioners has attracted international attention. The advanced nursing role played by nurse practitioners varies with national conditions and medical environments. To date, no suitable measurement tool has been available for assessing the roles and competencies of nurse practitioners in Asian countries. Secondary analysis of data from three studies related to nurse practitioners' role competencies. We analysed data from 563 valid questionnaires completed in three studies to identify the factor structure of the Nurse Practitioners' Roles and Competencies Scale. To this end, we performed exploratory factor analysis using principal component analysis extraction with varimax orthogonal rotation. The internal consistency reliabilities of the overall scale and its subscales were examined using Cronbach's alpha coefficient. The scale had six factors: professionalism, direct care, clinical research, practical guidance, medical assistance, as well as leadership and reform. These factors explained 67·5% of the total variance in nurse practitioners' role competencies. Cronbach's alpha coefficient for the overall scale was 0·98, and those of its subscales ranged from 0·83-0·97. The internal consistency reliability and construct validity of the Nurse Practitioners' Roles and Competencies Scale were good. The high internal consistency reliabilities suggest item redundancy, which should be minimised by using item response theory to enhance the applicability of this questionnaire for future academic and clinical studies. The Nurse Practitioners' Roles and Competencies Scale can be used as a tool for assessing the roles and competencies of nurse practitioners in Taiwan. Our findings can also serve as a reference for other Asian countries to develop the nurse practitioner role. © 2015 John Wiley & Sons Ltd.
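For readers wanting to reproduce this style of analysis, the sketch below runs a factor analysis with varimax rotation on synthetic questionnaire data (using scikit-learn's FactorAnalysis, which supports varimax rotation from version 0.24; the study itself used principal component extraction, so this is analogous rather than identical).

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in for questionnaire data: 200 respondents x 12 items
# driven by two underlying competencies plus item-level noise.
latent = rng.normal(size=(200, 2))
loadings = rng.uniform(0.4, 0.9, size=(2, 12))
X = latent @ loadings + rng.normal(scale=0.5, size=(200, 12))

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
print(np.round(fa.components_.T, 2))  # item loadings on the two factors
```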
Kuhn, T; Gullett, J M; Nguyen, P; Boutzoukas, A E; Ford, A; Colon-Perez, L M; Triplett, W; Carney, P R; Mareci, T H; Price, C C; Bauer, R M
2016-06-01
This study examined the reliability of high angular resolution diffusion tensor imaging (HARDI) data collected on a single individual across several sessions using the same scanner. HARDI data were acquired for one healthy adult male at the same time of day on ten separate days across a one-month period. Environmental factors (e.g. temperature) were controlled across scanning sessions. Tract-Based Spatial Statistics (TBSS) was used to assess session-to-session variability in measures of diffusion, fractional anisotropy (FA) and mean diffusivity (MD). To address reliability within specific structures of the medial temporal lobe (MTL; the focus of an ongoing investigation), probabilistic tractography segmented the entorhinal cortex (ERc) based on connections with the hippocampus (HC), perirhinal (PRc) and parahippocampal (PHc) cortices. Streamline tractography generated edge weight (EW) metrics for the aforementioned ERc connections and, as comparison regions, connections between left and right rostral and caudal anterior cingulate cortex (ACC). Coefficients of variation (CoV) were derived for the surface area and volumes of these ERc connectivity-defined regions (CDR) and for EW across all ten scans, with the expectation that scan-to-scan reliability would yield low CoVs. TBSS revealed no significant variation in FA or MD across scanning sessions. Probabilistic tractography successfully reproduced histologically verified adjacent medial temporal lobe circuits. Tractography-derived metrics displayed larger ranges of scan-to-scan variability. Connections involving the HC displayed greater variability than metrics of connections between the other investigated regions. By confirming the test-retest reliability of HARDI data acquisition, support for the validity of significant results derived from diffusion data can be obtained.
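The scan-to-scan coefficient of variation used in such studies is simply the ratio of the standard deviation to the mean across sessions. A minimal sketch with hypothetical edge-weight values (not data from the study):

```python
import numpy as np

def coefficient_of_variation(values):
    """Scan-to-scan CoV: sample SD divided by the mean of a metric
    measured repeatedly across sessions."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean()

# Hypothetical edge-weight values for one connection across ten scans.
ew = [0.82, 0.79, 0.85, 0.81, 0.80, 0.84, 0.78, 0.83, 0.80, 0.82]
print(f"CoV = {coefficient_of_variation(ew):.3f}")  # low CoV -> stable metric
```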
MEMS Reliability Assurance Activities at JPL
NASA Technical Reports Server (NTRS)
Kayali, S.; Lawton, R.; Stark, B.
2000-01-01
An overview of Microelectromechanical Systems (MEMS) reliability assurance and qualification activities at JPL is presented, along with a discussion of the characterization of MEMS structures implemented on single-crystal silicon, polycrystalline silicon, CMOS, and LIGA processes. Additionally, common failure modes and mechanisms affecting MEMS structures, including radiation effects, are discussed. Common reliability and qualification practices contained in the MEMS Reliability Assurance Guideline are also presented.
Effects of Differential Item Functioning on Examinees' Test Performance and Reliability of Test
ERIC Educational Resources Information Center
Lee, Yi-Hsuan; Zhang, Jinming
2017-01-01
Simulations were conducted to examine the effect of differential item functioning (DIF) on measurement consequences such as total scores, item response theory (IRT) ability estimates, and test reliability in terms of the ratio of true-score variance to observed-score variance and the standard error of estimation for the IRT ability parameter. The…
Interpreting Variance Components as Evidence for Reliability and Validity.
ERIC Educational Resources Information Center
Kane, Michael T.
The reliability and validity of measurement is analyzed by a sampling model based on generalizability theory. A model for the relationship between a measurement procedure and an attribute is developed from an analysis of how measurements are used and interpreted in science. The model provides a basis for analyzing the concept of an error of…
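As a concrete illustration of how variance components yield reliability evidence, the sketch below converts components from a persons-by-items generalizability study into G (norm-referenced) and Phi (criterion-referenced) coefficients; the component values are invented for illustration.

```python
# Variance components for a persons x items G study (illustrative values).
var_person = 0.50      # universe-score variance (the "signal")
var_item = 0.20        # item main effect (affects absolute decisions only)
var_residual = 0.30    # person x item interaction confounded with error
n_items = 10

rel_error = var_residual / n_items                 # relative error variance
abs_error = (var_item + var_residual) / n_items    # absolute error variance

g_coef = var_person / (var_person + rel_error)     # norm-referenced (G)
phi_coef = var_person / (var_person + abs_error)   # criterion-referenced (Phi)
print(f"G = {g_coef:.2f}, Phi = {phi_coef:.2f}")
```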
ERIC Educational Resources Information Center
Lam, Ling Chi Tenny
2010-01-01
In writing assessment, there are quite a number of factors influencing the marking stability and the reliability of the assessment such as the attitude towards marking and consistency of markers, the physical environment, the design of the items, and marking rubrics. Even the methods to train markers have effects on the reliability of the…
ERIC Educational Resources Information Center
Miskel, Cecil; Heller, Leonard E.
The investigation attempted to establish the factorial validity and reliability of an industrial selection device based on Herzberg's theory of work motivation related to the school organization. The questionnaire was reworded to reflect an educational work situation; and a random sample of 197 students, 118 administrators, and 432 teachers was…
NASA Technical Reports Server (NTRS)
Berg, M.; Kim, H.; Phan, A.; Seidleck, C.; LaBel, K.; Pellish, J.; Campola, M.
2015-01-01
Space applications are complex systems that require intricate trade analyses for optimum implementations. We focus on a subset of the trade process, using classical reliability theory and single-event upset (SEU) data, to illustrate appropriate triple modular redundancy (TMR) scheme selection.
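The classical-reliability backbone of such a trade is compact: with a perfect voter, a TMR system works while at least two of its three modules work. A minimal sketch under an exponential failure law (illustrative rate and mission time, not mission data):

```python
from math import exp

def r_tmr(r_module):
    """Reliability of triple modular redundancy with a perfect voter:
    the system survives if at least 2 of 3 modules survive."""
    return 3 * r_module**2 - 2 * r_module**3

# Exponential failure law R(t) = exp(-lambda * t); illustrative values.
lam, t = 1e-5, 1e4
r = exp(-lam * t)
print(f"single module: {r:.4f}   TMR: {r_tmr(r):.4f}")
# Note: TMR only improves on a single module while r > 0.5.
```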
Generalizability Theory Analysis of CBM Maze Reliability in Third- through Fifth-Grade Students
ERIC Educational Resources Information Center
Mercer, Sterett H.; Dufrene, Brad A.; Zoder-Martell, Kimberly; Harpole, Lauren Lestremau; Mitchell, Rachel R.; Blaze, John T.
2012-01-01
Despite growing use of CBM Maze in universal screening and research, little information is available regarding the number of CBM Maze probes needed for reliable decisions. The current study extends existing research on the technical adequacy of CBM Maze by investigating the number of probes and assessment durations (1-3 min) needed for reliable…
A Note on the Reliability Coefficients for Item Response Model-Based Ability Estimates
ERIC Educational Resources Information Center
Kim, Seonghoon
2012-01-01
Assuming item parameters on a test are known constants, the reliability coefficient for item response theory (IRT) ability estimates is defined for a population of examinees in two different ways: as (a) the product-moment correlation between ability estimates on two parallel forms of a test and (b) the squared correlation between the true…
Reliability measures in item response theory: manifest versus latent correlation functions.
Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Verbeke, Geert; De Boeck, Paul
2015-02-01
For item response theory (IRT) models, which belong to the class of generalized linear or non-linear mixed models, reliability at the scale of observed scores (i.e., manifest correlation) is more difficult to calculate than latent correlation based reliability, but usually of greater scientific interest. This is not least because it cannot be calculated explicitly when the logit link is used in conjunction with normal random effects. As such, approximations such as Fisher's information coefficient, Cronbach's α, or the latent correlation are calculated, allegedly because it is easy to do so. Cronbach's α has well-known and serious drawbacks, Fisher's information is not meaningful under certain circumstances, and there is an important but often overlooked difference between latent and manifest correlations. Here, manifest correlation refers to correlation between observed scores, while latent correlation refers to correlation between scores at the latent (e.g., logit or probit) scale. Thus, using one in place of the other can lead to erroneous conclusions. Taylor series based reliability measures, which are based on manifest correlation functions, are derived and a careful comparison of reliability measures based on latent correlations, Fisher's information, and exact reliability is carried out. The latent correlations are virtually always considerably higher than their manifest counterparts, Fisher's information measure shows no coherent behaviour (it is even negative in some cases), while the newly introduced Taylor series based approximations reflect the exact reliability very closely. Comparisons among the various types of correlations, for various IRT models, are made using algebraic expressions, Monte Carlo simulations, and data analysis. Given the light computational burden and the performance of Taylor series based reliability measures, their use is recommended. © 2014 The British Psychological Society.
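The latent-versus-manifest gap is easy to demonstrate by simulation. In the sketch below, two parallel Rasch-type forms share the same latent trait exactly (latent correlation 1), yet the manifest correlation between observed sum scores is visibly attenuated by response noise; the setup is illustrative, not the paper's Taylor-series derivation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 5000, 20
theta = rng.normal(size=n_persons)        # latent trait shared by both forms
b = rng.normal(scale=0.8, size=n_items)   # item difficulties (Rasch model)

def sum_scores(theta, b, rng):
    """Simulate observed sum scores on one form of a Rasch-type test."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
    return (rng.random(p.shape) < p).sum(axis=1)

s1 = sum_scores(theta, b, rng)
s2 = sum_scores(theta, b, rng)            # parallel form, same theta

# Latent correlation is exactly 1 here (both forms share theta);
# the manifest correlation between observed scores is attenuated.
print(f"manifest r = {np.corrcoef(s1, s2)[0, 1]:.3f}")
```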
Huang, Xintao; Yang, Jucai
2017-12-26
The most stable structures and electronic properties of TmSi_n (n = 3-10) clusters and their anions have been probed by using the ABCluster global search technique combined with the PBE, TPSSh, and B3LYP density functional methods. The results revealed that the most stable structures of neutral TmSi_n clusters and their anions can be regarded as substituting a Si atom of the ground-state structure of Si_(n+1) with a Tm atom. Reliable adiabatic electron affinities (AEAs), vertical detachment energies (VDEs) and simulated photoelectron spectra (PES) of TmSi_n (n = 3-10) are presented. Calculations of the HOMO-LUMO gap revealed that introducing a Tm atom into a Si cluster can improve the photochemical reactivity of the cluster. The NPA analyses indicated that the 4f electrons of the Tm atom in TmSi_n (n = 3-10) clusters and their anions do not participate in bonding. The total magnetic moments of TmSi_n are mainly provided by the 4f electrons of the Tm atom. The dissociation energy of the Tm atom from the most stable structure of TmSi_n and their anions has been calculated to examine relative stability.
Whittleton, Sarah R; Otero-de-la-Roza, A; Johnson, Erin R
2017-02-14
Accurate energy ranking is a key facet to the problem of first-principles crystal-structure prediction (CSP) of molecular crystals. This work presents a systematic assessment of B86bPBE-XDM, a semilocal density functional combined with the exchange-hole dipole moment (XDM) dispersion model, for energy ranking using 14 compounds from the first five CSP blind tests. Specifically, the set of crystals studied comprises 11 rigid, planar compounds and 3 co-crystals. The experimental structure was correctly identified as the lowest in lattice energy for 12 of the 14 total crystals. One of the exceptions is 4-hydroxythiophene-2-carbonitrile, for which the experimental structure was correctly identified once a quasi-harmonic estimate of the vibrational free-energy contribution was included, evidencing the occasional importance of thermal corrections for accurate energy ranking. The other exception is an organic salt, where charge-transfer error (also called delocalization error) is expected to cause the base density functional to be unreliable. Provided the choice of base density functional is appropriate and an estimate of temperature effects is used, XDM-corrected density-functional theory is highly reliable for the energetic ranking of competing crystal structures.
Accuracy and Transferability of Ab Initio Electronic Band Structure Calculations for Doped BiFeO3
NASA Astrophysics Data System (ADS)
Gebhardt, Julian; Rappe, Andrew M.
2017-11-01
BiFeO3 is a multiferroic material and is therefore highly interesting with respect to future oxide electronics. In order to realize such devices, pn junctions need to be fabricated, which are currently impeded by the lack of successful p-type doping in this material. In order to guide the numerous research efforts in this field, we recently finished a comprehensive computational study investigating the influence of many dopants on the electronic structure of BiFeO3. In order to allow for this large-scale ab initio study, the computational setup had to be accurate and efficient. Here we discuss the details of this assessment, showing that standard density-functional theory (DFT) yields good structural properties. The obtained electronic structure, however, suffers from well-known shortcomings. By comparing the conventional DFT results for alkali and alkaline-earth metal doping with more accurate hybrid-DFT calculations, we show that, in this case, the problems of standard DFT go beyond a simple systematic error. Conventional DFT shows bad transferability, and the more reliable hybrid DFT has to be chosen for a qualitatively correct prediction of doping-induced changes in the electronic structure of BiFeO3.
On-chip high frequency reliability and failure test structures
Snyder, Eric S.; Campbell, David V.
1997-01-01
Self-stressing test structures provide realistic high-frequency reliability characterizations. An on-chip high-frequency oscillator, controlled by DC signals from off-chip, provides a range of high-frequency pulses to test structures. The test structures provide information on a variety of reliability failure mechanisms, including hot carriers, electromigration, and oxide breakdown. The system is normally integrated at the wafer level to predict the failure mechanisms of the production integrated circuits on the same wafer.
The Reliability of Psychiatric Diagnosis Revisited
Rankin, Eric; France, Cheryl; El-Missiry, Ahmed; John, Collin
2006-01-01
Background: The authors reviewed the topic of reliability of psychiatric diagnosis from the turn of the 20th century to the present. The objectives of this paper are to explore the reasons for unreliability of psychiatric diagnosis and propose ways to improve it. Method: The authors reviewed the literature on the concept of reliability of psychiatric diagnosis with emphasis on the impact of interviewing skills, use of diagnostic criteria, and structured interviews on the reliability of psychiatric diagnosis. Results: Causes of diagnostic unreliability are attributed to the patient, the clinician and psychiatric nomenclature. The reliability of psychiatric diagnosis can be enhanced by using diagnostic criteria, defining psychiatric symptoms and structuring the interviews. Conclusions: The authors propose the acronym 'DR.SED', which stands for diagnostic criteria, reference definitions, structuring the interview, clinical experience, and data. The authors recommend that clinicians use the DR.SED paradigm to improve the reliability of psychiatric diagnoses. PMID:21103149
Analysis of whisker-toughened CMC structural components using an interactive reliability model
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Palko, Joseph L.
1992-01-01
Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic William-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program, which has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.
2011-01-01
Background: The Consultation and Relational Empathy (CARE) Measure is a widely used patient-rated experience measure which has recently been translated into Chinese and has undergone preliminary qualitative and quantitative validation. The objective of this study was to determine the reliability of the Chinese version of the CARE Measure in differentiating between doctors in a primary care setting in Hong Kong. Methods: Data were collected from 984 primary care patients attending 20 doctors with differing levels of training in family medicine in 5 public clinics in Hong Kong. The acceptability of the Chinese CARE Measure to patients was assessed. The reliability of the measure in discriminating effectively between doctors was analysed by generalisability theory (G-theory). Results: The items in the Chinese CARE Measure were regarded as important by patients and there were few 'not applicable' responses. The measure showed high internal reliability (coefficient 0.95) and effectively differentiated between doctors with only 15-20 patient ratings per doctor (inter-rater reliability > 0.8). Doctors' mean CARE Measure scores varied widely, ranging from 24.1 to 45.9 (maximum possible score 50) with a mean of 34.6. CARE Measure scores were positively correlated with level of training in family medicine (Spearman's rho 0.493, p < 0.05). Conclusion: These data demonstrate the acceptability, feasibility and reliability of using the Chinese CARE Measure in primary care in Hong Kong to differentiate between doctors' interpersonal competencies. Training in family medicine appears to enhance these key interpersonal skills. PMID:21631927
On the design of high-rise buildings with a specified level of reliability
NASA Astrophysics Data System (ADS)
Dolganov, Andrey; Kagan, Pavel
2018-03-01
High-rise buildings have a specificity which significantly distinguishes them from traditional multi-storey buildings. Steel structures are advisable in high-rise buildings in earthquake-prone regions, since steel, due to its plasticity, provides damping of the kinetic energy of seismic impacts. These aspects should be taken into account when choosing the structural scheme of a high-rise building and designing its load-bearing structures. Currently, modern regulatory documents do not quantify the reliability of structures, although the problem of assigning an optimal level of reliability has existed for a long time. The article shows the possibility of designing metal structures of high-rise buildings with a specified reliability. It is proposed to establish a reliability value of 0.99865 (3σ) for structures of buildings of a normal level of responsibility in calculations for the first group of limiting states. For increased (high-rise construction) and reduced levels of responsibility, target reliabilities of 0.99997 (4σ) and 0.97725 (2σ), respectively, are proposed for the provision of load-bearing capacity. Cross-section utilization coefficients for a metal beam at the different reliability levels are given.
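The quoted reliability levels are one-sided standard normal probabilities at 2σ, 3σ and 4σ, which a few lines of Python confirm:

```python
from scipy.stats import norm

# One-sided standard normal probabilities at the quoted sigma levels.
for k in (2, 3, 4):
    print(f"{k} sigma: reliability {norm.cdf(k):.5f}, "
          f"failure probability {norm.sf(k):.2e}")
```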
Compressive sensing based wireless sensor for structural health monitoring
NASA Astrophysics Data System (ADS)
Bao, Yuequan; Zou, Zilong; Li, Hui
2014-03-01
Data loss is a common problem for monitoring systems based on wireless sensors. Reliable communication protocols, which enhance communication reliability by repetitively transmitting unreceived packets, are one approach to tackling the problem of data loss. An alternative approach allows data loss to some extent and seeks to recover the lost data from an algorithmic point of view. Compressive sensing (CS) provides such a data-loss recovery technique. This technique can be embedded into smart wireless sensors and effectively increases wireless communication reliability without retransmitting the data. The basic idea of the CS-based approach is that, instead of transmitting the raw signal acquired by the sensor, a transformed signal, generated by projecting the raw signal onto a random matrix, is transmitted. Some data loss may occur during the transmission of this transformed signal. However, according to the theory of CS, the raw signal can be effectively reconstructed from the received incomplete transformed signal, given that the raw signal is compressible in some basis and the data loss ratio is low. This CS-based technique is implemented on the Imote2 smart sensor platform using the foundation of the Illinois Structural Health Monitoring Project (ISHMP) Service Tool-suite. To overcome the constraints of the limited onboard resources of wireless sensor nodes, a method called the random demodulator (RD) is employed to provide memory- and power-efficient construction of the random sampling matrix. The RD sampling matrix is adapted to accommodate data loss in wireless transmission and meet the objectives of the data recovery. The embedded program is tested in a series of sensing and communication experiments. Examples and a parametric study are presented to demonstrate the applicability of the embedded program as well as the efficacy of CS-based data loss recovery for real wireless SHM systems.
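A minimal end-to-end sketch of the CS idea follows: a sparse signal is projected through a random matrix, a fraction of the measurements is "lost", and the signal is recovered from the received subset by orthogonal matching pursuit (scikit-learn's implementation). The signal, loss ratio and sparsity here are invented, and the actual ISHMP firmware uses a random-demodulator construction rather than the dense Gaussian matrix shown.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(2)
n, m, k = 256, 96, 5        # signal length, measurements, sparsity

# A raw signal that is k-sparse (sparse in the identity basis, for brevity).
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)

Phi = rng.normal(size=(m, n)) / np.sqrt(m)  # random projection matrix
y = Phi @ x                                  # transformed signal to transmit

keep = rng.random(m) > 0.2                   # ~20% of packets lost in transit

# Recover the raw signal from the received subset of measurements.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi[keep], y[keep])
print(f"reconstruction error: {np.linalg.norm(omp.coef_ - x):.2e}")
```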
A Proposed Model of Jazz Theory Knowledge Acquisition
ERIC Educational Resources Information Center
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. Reliability estimates for the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
Why Are You Learning a Second Language? Motivational Orientations and Self-Determination Theory.
ERIC Educational Resources Information Center
Noels, Kimberly A.; Pelletier, Luc G.; Clement, Richard; Vallerand, Robert J.
2003-01-01
Examined self-determination theory (SDT) in the language learning context. Involved the development of a valid and reliable instrument to assess different subtypes of intrinsic and extrinsic motivation and explored the link between these motivational sub-types and various orientations to language learning. Showed instrumental orientation and the…
Bias Reduction in Quasi-Experiments with Little Selection Theory but Many Covariates
ERIC Educational Resources Information Center
Steiner, Peter M.; Cook, Thomas D.; Li, Wei; Clark, M. H.
2015-01-01
In observational studies, selection bias will be completely removed only if the selection mechanism is ignorable, namely, all confounders of treatment selection and potential outcomes are reliably measured. Ideally, well-grounded substantive theories about the selection process and outcome-generating model are used to generate the sample of…
Toward a Sociology of Criminological Theory
ERIC Educational Resources Information Center
Hauhart, Robert C.
2012-01-01
It is a truism to remind ourselves that scientific theory is a human product subject to many of the same social processes that govern other social acts. Science, however, whether social or natural, pretends to claim a higher mission, a more sophisticated methodology, and more consequential and reliable outcomes than human efforts arising from…
Wu, Jiayuan; Hu, Liren; Zhang, Gaohua; Liang, Qilian; Meng, Qiong; Wan, Chonghua
2016-08-01
This research was designed to develop a nasopharyngeal cancer (NPC) scale based on the quality of life (QOL) instruments for cancer patients (QLICP-NA). This scale was developed by using a modular approach and was evaluated by classical test and generalizability theories. Programmed decision procedures and theories on instrument development were applied to create QLICP-NA V2.0. A total of 121 NPC inpatients were assessed using QLICP-NA V2.0 to measure their QOL data from hospital admission until discharge. Scale validity, reliability, and responsiveness were evaluated by correlation, factor, parallel, multi-trait scaling, and t test analyses, as well as by generalizability (G) and decision (D) studies of generalizability theory. Results of multi-trait scaling, correlation, factor, and parallel analyses indicated that QLICP-NA V2.0 exhibited good construct validity. The significant difference in QOL between the treated and untreated NPC patients indicated good clinical validity of the questionnaire. The internal consistency (α) and test-retest reliability coefficients (intra-class correlations) of each domain, as well as the overall scale, were all >0.70. Ceiling effects were not found in any domain or in most facets, except for common side effects (24.8 %) in the domain of common symptoms and side effects, and tumor early symptoms (27.3 %) and therapeutic side effects (23.2 %) in the specific domain, whereas floor effects did not exist in any domain/facet. The overall changes in the physical and social domains were significantly different between pre- and post-treatment with a moderate effect size (standard response mean) ranging from 0.21 to 0.27 (p < 0.05), but these changes were not obvious in the other domains or in the overall scale. Scale reliability was further confirmed by G coefficients and the index of dependability, with more exact variance components based on generalizability theory. QLICP-NA V2.0 exhibited reasonable degrees of validity, reliability, and responsiveness. However, this scale must be further improved before it can be used as a practical instrument to evaluate the QOL of NPC patients in China.
Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient.
Shi, Fengjian; Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua
2017-10-16
To meet increasing accuracy and system reliability requirements, information fusion for multi-sensor systems is of growing concern. Dempster-Shafer evidence theory (D-S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that pieces of evidence are independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even to wrong decisions. This assumption severely prevents D-S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on the rank correlation coefficient is proposed. The model first uses the rank correlation coefficient to measure the dependence degree between different pieces of evidence. Then, a total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of the evidence. Finally, the discounted evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method.
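A toy version of the proposed pipeline can be sketched in a few lines: measure dependence between two sensors with Spearman's rho, discount each basic probability assignment accordingly, then combine with Dempster's rule. The frame, mass values, and the specific discounting formula below are invented for illustration, not the paper's exact model.

```python
from itertools import product
from scipy.stats import spearmanr

def discount(mass, alpha):
    """Shafer discounting: keep alpha of each focal element's mass and
    move the remaining (1 - alpha) onto the whole frame {A, B}."""
    frame = frozenset({'A', 'B'})
    out = {f: alpha * m for f, m in mass.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule of combination for two basic probability assignments."""
    combined, conflict = {}, 0.0
    for (f1, v1), (f2, v2) in product(m1.items(), m2.items()):
        inter = f1 & f2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

# Two sensor BPAs over the frame {A, B} (illustrative values).
m1 = {frozenset({'A'}): 0.7, frozenset({'B'}): 0.2, frozenset({'A', 'B'}): 0.1}
m2 = {frozenset({'A'}): 0.6, frozenset({'B'}): 0.3, frozenset({'A', 'B'}): 0.1}

# Spearman's rho between the sensors' raw reading histories measures
# their dependence; more dependent evidence gets discounted harder.
rho, _ = spearmanr([1.2, 2.3, 3.1, 4.0, 5.2], [1.0, 2.1, 3.3, 4.4, 4.2])
alpha = 1.0 - 0.5 * abs(rho)       # hypothetical discounting rule
print(dempster(discount(m1, alpha), discount(m2, alpha)))
```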
Research on the Fusion of Dependent Evidence Based on Rank Correlation Coefficient
Su, Xiaoyan; Qian, Hong; Yang, Ning; Han, Wenhua
2017-01-01
To meet increasing accuracy and system reliability requirements, information fusion for multi-sensor systems is of growing concern. Dempster–Shafer evidence theory (D–S theory) has been investigated for many applications in multi-sensor information fusion due to its flexibility in uncertainty modeling. However, classical evidence theory assumes that pieces of evidence are independent of each other, which is often unrealistic. Ignoring the relationship between the evidence may lead to unreasonable fusion results, and even to wrong decisions. This assumption severely prevents D–S evidence theory from practical application and further development. In this paper, an innovative evidence fusion model to deal with dependent evidence based on the rank correlation coefficient is proposed. The model first uses the rank correlation coefficient to measure the dependence degree between different pieces of evidence. Then, a total discount coefficient is obtained based on the dependence degree, which also considers the impact of the reliability of the evidence. Finally, the discounted evidence fusion model is presented. An example is illustrated to show the use and effectiveness of the proposed method. PMID:29035341
Osman, Augustine; Lamis, Dorian A; Bagge, Courtney L; Freedenthal, Stacey; Barnes, Sean M
2016-01-01
We examined the factor structure and psychometric properties of the Mindful Attention Awareness Scale (MAAS) in a sample of 810 undergraduate students. Using common exploratory factor analysis (EFA), we obtained evidence for a 1-factor solution (41.84% common variance). To confirm unidimensionality of the 15-item MAAS, we conducted a 1-factor confirmatory factor analysis (CFA). Results of the EFA and CFA, respectively, provided support for a unidimensional model. Using differential item functioning analysis methods within item response theory modeling (IRT-based DIF), we found that individuals with high and low levels of nonattachment responded similarly to the MAAS items. Following a detailed item analysis, we proposed a 5-item short version of the instrument and present descriptive statistics and composite score reliability for the short and full versions of the MAAS. Finally, correlation analyses showed that scores on the full and short versions of the MAAS were associated with measures assessing related constructs. The 5-item MAAS is as useful as the original MAAS in enhancing our understanding of the mindfulness construct.
Schultz, Peter A.
2016-03-01
For the purposes of making reliable first-principles predictions of defect energies in semiconductors, it is crucial to distinguish between effective-mass-like defects, which cannot be treated accurately with existing supercell methods, and deep defects, for which density functional theory calculations can yield reliable predictions of defect energy levels. The gallium antisite defect Ga_As is often associated with the 78/203 meV shallow double acceptor in Ga-rich gallium arsenide. Within a conceptual framework of level patterns, analyses of structure and spin stabilization can be used within a supercell approach to distinguish localized deep defect states from shallow acceptors such as B_As. This systematic approach determines that the gallium antisite supercell results have signatures inconsistent with an effective-mass state and that the defect cannot be the 78/203 shallow double acceptor. Lastly, the properties of the Ga antisite in GaAs are described: total energy calculations that explicitly map onto asymptotic discrete localized bulk states predict that the Ga antisite is a deep double acceptor and has at least one deep donor state.
Determination of the coronal magnetic field from vector magnetograph data
NASA Technical Reports Server (NTRS)
Mikic, Zoran
1991-01-01
A new algorithm was developed, tested, and applied to determine coronal magnetic fields above solar active regions. The coronal field above NOAA active region AR5747 was successfully estimated on 20 Oct. 1989 from data taken at the Mees Solar Observatory of the Univ. of Hawaii. It was shown that observational data can be used to obtain realistic estimates of coronal magnetic fields. The model has significantly extended the realism with which the coronal magnetic field can be inferred from observations. The understanding of coronal phenomena will be greatly advanced by a reliable technique, such as the one presented, for deducing the detailed spatial structure of the coronal field. The payoff from major current and proposed NASA observational efforts is heavily dependent on the success with which the coronal field can be inferred from vector magnetograms. In particular, the present inability to reliably obtain the coronal field has been a major obstacle to the theoretical advancement of solar flare theory and prediction. The results have shown that the evolutionary algorithm can be used to estimate coronal magnetic fields.
Zhou, Hongmei; Romero, Stephanie Ballon; Qin, Xiao
2016-10-01
This paper aimed to examine pedestrians' self-reported intentions to cross the road in violation of pedestrian signals by applying the theory of planned behavior (TPB). We studied behavior intentions with regard to instrumental attitude, subjective norm, and perceived behavioral control, the three basic components of TPB, and extended the theory by adding new factors, including descriptive norm, perceived risk and conformity tendency, to evaluate their respective impacts on pedestrians' behavior intentions. A questionnaire presenting a scenario in which pedestrians crossed the road against the pedestrian lights at an intersection was designed, and the survey was conducted in Dalian, China. Based on the 260 complete and valid responses, the reliability and validity of the data for each question were evaluated. The data were then analyzed using structural equation modeling (SEM). The results showed that people had a negative attitude toward the behavior of violating road-crossing rules; they perceived social influences from their family and friends; and they believed that this kind of risky behavior could potentially harm them in a traffic accident. The results also showed that instrumental attitude and subjective norm were significant in the basic TPB model. After adding descriptive norm, subjective norm was no longer significant. Other models showed that conformity tendency was a strong predictor, indicating that the presence of other pedestrians would influence behavioral intention. The findings could help in designing more effective interventions and safety campaigns, such as changing people's attitude toward this violation behavior, correcting the social norms, and increasing safety awareness, in order to reduce pedestrians' road-crossing violations. Copyright © 2015 Elsevier Ltd. All rights reserved.