Validation of psychoanalytic theories: towards a conceptualization of references.
Zachrisson, Anders; Zachrisson, Henrik Daae
2005-10-01
The authors discuss criteria for the validation of psychoanalytic theories and develop a heuristic and normative model of the references needed for this. Their core question in this paper is: can psychoanalytic theories be validated exclusively from within psychoanalytic theory (internal validation), or are references to sources of knowledge other than psychoanalysis also necessary (external validation)? They discuss aspects of the classic truth criteria of correspondence and coherence, from the points of view of both contemporary psychoanalysis and contemporary philosophy of science. The authors present arguments for both external and internal validation. Internal validation has to deal with the problems of subjectivity of observations and circularity of reasoning, external validation with the problem of relevance. They recommend a critical attitude towards psychoanalytic theories, which, by carefully scrutinizing weak points and invalidating observations in the theories, reduces the risk of wishful thinking. The authors conclude by sketching a heuristic model of validation. This model combines correspondence and coherence with internal and external validation into a four-leaf model of references for the process of validating psychoanalytic theories.
ERIC Educational Resources Information Center
Teo, Timothy; Tan, Lynde
2012-01-01
This study applies the theory of planned behavior (TPB), a theory that is commonly used in commercial settings, to the educational context to explain pre-service teachers' technology acceptance. It is also interested in examining its validity when used for this purpose. It has found evidence that the TPB is a valid model to explain pre-service…
Hagger, Martin S.; Gucciardi, Daniel F.; Chatzisarantis, Nikos L. D.
2017-01-01
Tests of social cognitive theories provide informative data on the factors that relate to health behavior, and the processes and mechanisms involved. In the present article, we contend that tests of social cognitive theories should adhere to the principles of nomological validity, defined as the degree to which predictions in a formal theoretical network are confirmed. We highlight the importance of nomological validity tests to ensure theory predictions can be disconfirmed through observation. We argue that researchers should be explicit on the conditions that lead to theory disconfirmation, and identify any auxiliary assumptions on which theory effects may be conditional. We contend that few researchers formally test the nomological validity of theories, or outline conditions that lead to model rejection and the auxiliary assumptions that may explain findings that run counter to hypotheses, raising potential for ‘falsification evasion.’ We present a brief analysis of studies (k = 122) testing four key social cognitive theories in health behavior to illustrate deficiencies in reporting theory tests and evaluations of nomological validity. Our analysis revealed that few articles report explicit statements suggesting that their findings support or reject the hypotheses of the theories tested, even when findings point to rejection. We illustrate the importance of explicit a priori specification of fundamental theory hypotheses and associated auxiliary assumptions, and identification of the conditions which would lead to rejection of theory predictions. We also demonstrate the value of confirmatory analytic techniques, meta-analytic structural equation modeling, and Bayesian analyses in providing robust converging evidence for nomological validity. We provide a set of guidelines for researchers on how to adopt and apply the nomological validity approach to testing health behavior models. PMID:29163307
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2007-01-01
The validation of cognitive attributes required for correct answers on binary test items or tasks has been addressed in previous research through the integration of cognitive psychology and psychometric models using parametric or nonparametric item response theory, latent class modeling, and Bayesian modeling. All previous models, each with their…
ERIC Educational Resources Information Center
Schilling, Stephen G.
2007-01-01
In this paper the author examines the role of item response theory (IRT), particularly multidimensional item response theory (MIRT) in test validation from a validity argument perspective. The author provides justification for several structural assumptions and interpretations, taking care to describe the role he believes they should play in any…
Koch, Ina; Junker, Björn H; Heiner, Monika
2005-04-01
Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods, which can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Although this metabolism is among the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
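A standard qualitative check in Petri net validation of metabolic networks is invariant analysis of the incidence matrix. The sketch below uses a toy three-place linear pathway, not the sucrose-breakdown net from the paper; a place invariant (a conservation relation) is any left null vector of the incidence matrix:

```python
from sympy import Matrix

# Incidence matrix C (places x transitions) of a hypothetical linear
# pathway A -t1-> B -t2-> C. A P-invariant y satisfies y^T C = 0.
C = Matrix([
    [-1,  0],   # place A: consumed by t1
    [ 1, -1],   # place B: produced by t1, consumed by t2
    [ 0,  1],   # place C: produced by t2
])

# Left null space of C = null space of C^T.
invariants = C.T.nullspace()
print(invariants[0].T.tolist())  # [[1, 1, 1]] -> A + B + C is conserved
```

A nonnegative invariant like this one states that the total token count over the three places never changes, which is exactly the kind of structural property used to validate a metabolic model.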
Construct validity of the Moral Development Scale for Professionals (MDSP).
Söderhamn, Olle; Bjørnestad, John Olav; Skisland, Anne; Cliffordson, Christina
2011-01-01
The aim of this study was to investigate the construct validity of the Moral Development Scale for Professionals (MDSP) using structural equation modeling. The instrument is a 12-item self-report instrument, developed in the Scandinavian cultural context and based on Kohlberg's theory. A hypothesized simplex structure model underlying the MDSP was tested through structural equation modeling. Validity was also tested as the proportion of respondents older than 20 years that reached the highest moral level, which according to the theory should be small. A convenience sample of 339 nursing students with a mean age of 25.3 years participated. Results confirmed the simplex model structure, indicating that MDSP reflects a moral construct empirically organized from low to high. A minority of respondents >20 years of age (13.5%) scored more than 80% on the highest moral level. The findings support the construct validity of the MDSP and the stages and levels in Kohlberg's theory.
Cultural Geography Model Validation
2010-03-01
the Cultural Geography Model (CGM), a government-owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of...referent determined either from theory or SME opinion. CGM Overview: The CGM is a government-owned, open source, data-driven multi-agent social...HSCB, validation, social network analysis. ABSTRACT: In the current warfighting environment, the military needs robust modeling and simulation (M&S
Likert or Not, Survey (In)Validation Requires Explicit Theories and True Grit
ERIC Educational Resources Information Center
McGrane, Joshua A.; Nowland, Trisha
2017-01-01
From the time of Likert (1932) on, attitudes of expediency regarding both theory and methodology became apparent with reference to survey construction and validation practices. In place of theory and more theoretically minded methods, such as those found in the early work of Thurstone (1928) and Coombs (1964), statistical models and…
Development and Validation of the Sorokin Psychosocial Love Inventory for Divorced Individuals
ERIC Educational Resources Information Center
D'Ambrosio, Joseph G.; Faul, Anna C.
2013-01-01
Objective: This study describes the development and validation of the Sorokin Psychosocial Love Inventory (SPSLI) measuring love actions toward a former spouse. Method: Classical measurement theory and confirmatory factor analysis (CFA) were utilized with an a priori theory and factor model to validate the SPSLI. Results: A 15-item scale…
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
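Nemenman's point can be illustrated with a textbook Bayesian model selection calculation; the coin-flip setup below is an illustrative stand-in, not an example from the abstract. A sharply predictive (falsifiable) theory receives an Occam bonus when its predictions survive the data, while a theory flexible enough to fit anything is penalized:

```python
from math import comb

# Hypothetical data: 5 heads in 10 coin flips.
n, k = 10, 5

# M1 (sharp, falsifiable): p = 0.5 exactly.
evidence_m1 = comb(n, k) * 0.5**n

# M2 (flexible): p ~ Uniform(0, 1). The binomial marginal likelihood
# integrates to 1/(n+1) for every k, so M2 "explains" any outcome weakly.
evidence_m2 = 1 / (n + 1)

bayes_factor = evidence_m1 / evidence_m2
print(round(bayes_factor, 2))  # 2.71: the sharper theory wins when it survives
```

Had the data contradicted M1 (say, 10 heads out of 10), the same calculation would have favored M2; this asymmetry is the quantitative sense in which M1 is falsifiable.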
ERIC Educational Resources Information Center
Li, Ying; Jiao, Hong; Lissitz, Robert W.
2012-01-01
This study investigated the application of multidimensional item response theory (IRT) models to validate test structure and dimensionality. Multiple content areas or domains within a single subject often exist in large-scale achievement tests. Such areas or domains may cause multidimensionality or local item dependence, which both violate the…
Kohlberg's Moral Development Model: Cohort Influences on Validity.
ERIC Educational Resources Information Center
Bechtel, Ashleah
An overview of Kohlberg's theory of moral development is presented; three interviews regarding the theory are reported, and the author's own moral development is compared to the model; finally, a critique of the theory is addressed along with recommendations for future enhancement. Lawrence Kohlberg's model of moral development, also referred to…
Ground-water models: Validate or invalidate
Bredehoeft, J.D.; Konikow, Leonard F.
1993-01-01
The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification is misleading, at best. These terms should be abandoned by the ground-water community.
ERIC Educational Resources Information Center
Mayhew, Matthew J.; Hubbard, Steven M.; Finelli, Cynthia J.; Harding, Trevor S.; Carpenter, Donald D.
2009-01-01
The purpose of this paper is to validate the use of a modified Theory of Planned Behavior (TPB) for predicting undergraduate student cheating. Specifically, we administered a survey assessing how the TPB relates to cheating along with a measure of moral reasoning (DIT- 2) to 527 undergraduate students across three institutions; and analyzed the…
Hunt, Hillary R; Gross, Alan M
2009-11-01
Obesity is a world-wide health concern approaching epidemic proportions. Successful long-term treatment involves a combination of bariatric surgery, diet, and exercise. Social cognitive models, such as the Theory of Reasoned Action (TRA) and the Theory of Planned Behavior (TPB), are among the most commonly tested theories utilized in the prediction of exercise. As exercise is not a completely volitional behavior, it is hypothesized that the TPB is a superior theoretical model for the prediction of exercise intentions and behavior. This study tested the validity of the TPB in a sample of bariatric patients and further validated its improvement over the TRA in predicting exercise adherence at different operative stages. Results generally confirmed research hypotheses. Superiority of the TPB model was validated in this sample of bariatric patients, and Perceived Behavioral Control emerged as the single-best predictor of both exercise intentions and self-reported behavior. Finally, results suggested that both subjective norms and attitudes toward exercise played a larger role in the prediction of intention and behavior than previously reported.
Riding on irrelevant operators
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Rham, Claudia; Ribeiro, Raquel H., E-mail: Claudia.deRham@case.edu, E-mail: RaquelHRibeiro@case.edu
2014-11-01
We investigate the stability of a class of derivative theories known as P(X) and Galileons against corrections generated by quantum effects. We use an exact renormalisation group approach to argue that these theories are stable under quantum corrections at all loops in regions where the kinetic term is large compared to the strong coupling scale. This is the regime of interest for screening or Vainshtein mechanisms, and in inflationary models that rely on large kinetic terms. Next, we clarify the role played by the symmetries. While symmetries protect the form of the quantum corrections, theories equipped with more symmetries do not necessarily have a broader range of scales for which they are valid. We show this by deriving explicitly the regime of validity of the classical solutions for P(X) theories including Dirac-Born-Infeld (DBI) models, both in generic and for specific background field configurations. Indeed, we find that despite the existence of an additional symmetry, the DBI effective field theory has a regime of validity similar to an arbitrary P(X) theory. We explore the implications of our results for both early and late universe contexts. Conversely, when applied to static and spherical screening mechanisms, we deduce that the regime of validity of typical power-law P(X) theories is much larger than that of DBI.
Shilov, V N; Borkovskaja, Y B; Dukhin, A S
2004-09-15
Existing theories of electroacoustic phenomena in concentrated colloids neglect the possibility of double layer overlap and are valid mostly for the "thin double layer," when the double layer thickness is much less than the particle size. In this paper we present a new electroacoustic theory which removes this restriction. This would make this new theory applicable to characterizing a variety of aqueous nanocolloids and of nonaqueous dispersions. There are two versions of the theory leading to the analytical solutions. The first version corresponds to strongly overlapped diffuse layers (so-called quasi-homogeneous model). It yields a simple analytical formula for colloid vibration current (CVI), which is valid for arbitrary ultrasound frequency, but for a restricted κa range. This version of the theory, as well as the Smoluchowski theory for microelectrophoresis, is independent of particle shape and polydispersity. This makes it very attractive for practical use, with the hope that it might be as useful as classical Smoluchowski theory. In order to determine the κa range of the quasi-homogeneous model validity we develop the second version that limits ultrasound frequency, but applies no restriction on κa. The ultrasound frequency should substantially exceed the Maxwell-Wagner relaxation frequency. This limitation makes active conductivity related current negligible compared to the passive dielectric displacement current. It is possible to derive an expression for CVI in the concentrated dispersion as formulae involving definite integrals with integrands depending on equilibrium potential distribution. This second version allowed us to estimate the ranges of the applicability of the first, quasi-homogeneous version. It turns out that the quasi-homogeneous model works for κa values up to almost 1. For instance, at volume fraction 30%, the highest κa limit of the quasi-homogeneous model is 0.65.
Therefore, this version of the electroacoustic theory is valid for almost all nonaqueous dispersions and a wide variety of nanocolloids, especially with sizes under 100 nm.
2012-08-01
biomechanical modeling (e.g. arteries). It is also possible to go still further with the concept and blend shell theories with continuum solid theories in the...spirit of transition elements. Again, biomechanical modeling opportunities present themselves, such as for heart-artery models. We also note that all...these blended theories can be developed within the IGA format of exact CAD modeling. The blended formulation presented here is valid for a broad class
Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goupee, A.; Kimball, R.; de Ridder, E. J.
2015-04-02
In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine—for example, the International Energy Agency Wind Task 30’s Offshore Code Comparison Collaboration Continued, with Correlation project.
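The calibration step described above can be sketched with a toy genetic algorithm. The stand-in model `f`, the "measured" target, and all GA settings below are invented for illustration and are unrelated to the actual blade-element/momentum model or the MARIN data:

```python
import random

def f(c):
    """Stand-in for the aerodynamic code: output for a single coefficient c."""
    return 3.0 * c * (1.0 - c)

target = f(0.42)  # pretend experimental measurement to match

def fitness(c):
    return -abs(f(c) - target)  # closer match -> higher fitness

random.seed(1)
pop = [random.random() for _ in range(30)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                      # keep the 10 best candidates
    children = [min(1.0, max(0.0, random.choice(parents)
                             + random.gauss(0.0, 0.05)))
                for _ in range(20)]         # mutated offspring, clipped to [0, 1]
    pop = parents + children                # elitism + offspring
best = max(pop, key=fitness)
```

In the paper's setting the "model output" would be a full simulation run and the target the DeepCwind measurements; the GA machinery (selection, mutation, elitism) is the same.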
Modality, probability, and mental models.
Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N
2016-10-01
We report 3 experiments investigating novel sorts of inference, such as: A or B or both; therefore, possibly (A and B). The contents were sensible assertions, for example: Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both. Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort—A or B but not both; therefore, A or B or both—are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants’ estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning. PsycINFO Database Record (c) 2016 APA, all rights reserved
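The two inference patterns in the abstract can be checked mechanically by enumerating truth assignments, a minimal sketch of the logic at stake (the formulas are the abstract's, the enumeration code is ours):

```python
from itertools import product

def models_of(formula):
    """All truth assignments (a, b) satisfying the formula."""
    return {(a, b) for a, b in product([True, False], repeat=2) if formula(a, b)}

inclusive = models_of(lambda a, b: a or b)                      # A or B or both
exclusive = models_of(lambda a, b: (a or b) and not (a and b))  # A or B, not both

# "Possibly (A and B)": some model of the inclusive premise makes both true,
# which is why mental-model theory predicts acceptance of the first inference.
print(any(a and b for a, b in inclusive))  # True

# The second inference (exclusive -> inclusive) is classically valid: every
# model of the premise satisfies the conclusion. Reasoners still reject it.
print(exclusive <= inclusive)  # True
```

The gap between what this enumeration licenses and what participants accepted is exactly the dissociation the experiments exploit.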
A signal detection-item response theory model for evaluating neuropsychological measures.
Thomas, Michael L; Brown, Gregory G; Gur, Ruben C; Moore, Tyler M; Patt, Virginie M; Risbrough, Victoria B; Baker, Dewleen G
2018-02-05
Models from signal detection theory are commonly used to score neuropsychological test data, especially tests of recognition memory. Here we show that certain item response theory models can be formulated as signal detection theory models, thus linking two complementary but distinct methodologies. We then use the approach to evaluate the validity (construct representation) of commonly used research measures, demonstrate the impact of conditional error on neuropsychological outcomes, and evaluate measurement bias. Signal detection-item response theory (SD-IRT) models were fitted to recognition memory data for words, faces, and objects. The sample consisted of U.S. Infantry Marines and Navy Corpsmen participating in the Marine Resiliency Study. Data comprised item responses to the Penn Face Memory Test (PFMT; N = 1,338), Penn Word Memory Test (PWMT; N = 1,331), and Visual Object Learning Test (VOLT; N = 1,249), and self-report of past head injury with loss of consciousness. SD-IRT models adequately fitted recognition memory item data across all modalities. Error varied systematically with ability estimates, and distributions of residuals from the regression of memory discrimination onto self-report of past head injury were positively skewed towards regions of larger measurement error. Analyses of differential item functioning revealed little evidence of systematic bias by level of education. SD-IRT models benefit from the measurement rigor of item response theory, which permits the modeling of item difficulty and examinee ability, and from signal detection theory, which provides an interpretive framework encompassing the experimentally validated constructs of memory discrimination and response bias. We used this approach to validate the construct representation of commonly used research measures and to demonstrate how nonoptimized item parameters can lead to erroneous conclusions when interpreting neuropsychological test data.
Future work might include the development of computerized adaptive tests and integration with mixture and random-effects models.
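The signal detection constructs the paper builds on, memory discrimination and response bias, reduce in the classical equal-variance Gaussian case to two probit-transformed statistics. A minimal sketch with hypothetical recognition data (the rates below are invented, not from the study):

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # probit transform

def d_prime(hit_rate, fa_rate):
    """Memory discrimination d' under the equal-variance Gaussian model."""
    return z(hit_rate) - z(fa_rate)

def criterion_c(hit_rate, fa_rate):
    """Response bias c; positive values indicate conservative responding."""
    return -0.5 * (z(hit_rate) + z(fa_rate))

# Hypothetical recognition data: 80% hits, 20% false alarms.
print(round(d_prime(0.8, 0.2), 2))  # 1.68 -> good discrimination
```

The SD-IRT formulation in the paper goes further by letting these quantities vary with item parameters, but the probit link above is the point of contact between the two frameworks.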
Developing and Validating the Socio-Technical Model in Ontology Engineering
NASA Astrophysics Data System (ADS)
Silalahi, Mesnan; Indra Sensuse, Dana; Giri Sucahyo, Yudho; Fadhilah Akmaliah, Izzah; Rahayu, Puji; Cahyaningsih, Elin
2018-03-01
This paper describes results from an attempt to develop a model in ontology engineering methodology and a way to validate the model. The approach to methodology in ontology engineering is from the point of view of socio-technical system theory. Qualitative research synthesis is used to build the model using meta-ethnography. To ensure the objectivity of the measurement, the inter-rater reliability method was applied using a multi-rater Fleiss Kappa. The results show the accordance of the research output with the diamond model in socio-technical system theory, as evidenced by the interdependency of the four socio-technical variables, namely people, technology, structure and task.
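The multi-rater Fleiss Kappa used for validation above is a short computation over a table of rater counts. A self-contained sketch with toy data (the ratings are invented; only the statistic itself is standard):

```python
import numpy as np

def fleiss_kappa(ratings):
    """Fleiss' kappa. ratings: items x categories matrix of rater counts,
    with the same number of raters for every item."""
    ratings = np.asarray(ratings, dtype=float)
    n_items = ratings.shape[0]
    n_raters = ratings.sum(axis=1)[0]
    # Per-category proportions and per-item agreement.
    p_j = ratings.sum(axis=0) / (n_items * n_raters)
    P_i = (np.square(ratings).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), np.square(p_j).sum()
    return (P_bar - P_e) / (1 - P_e)

# Perfect agreement among 4 hypothetical raters on 3 items.
print(fleiss_kappa([[4, 0], [0, 4], [4, 0]]))  # 1.0
```

Kappa corrects the raw agreement `P_bar` for the agreement `P_e` expected by chance, which is why identical marginal ratings can still yield a kappa near zero.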
From theory to experimental design-Quantifying a trait-based theory of predator-prey dynamics.
Laubmeier, A N; Wootton, Kate; Banks, J E; Bommarco, Riccardo; Curtsdotter, Alva; Jonsson, Tomas; Roslin, Tomas; Banks, H T
2018-01-01
Successfully applying theoretical models to natural communities and predicting ecosystem behavior under changing conditions is the backbone of predictive ecology. However, the experiments required to test these models are dictated by practical constraints, and models are often opportunistically validated against data for which they were never intended. Alternatively, we can inform and improve experimental design by an in-depth pre-experimental analysis of the model, generating experiments better targeted at testing the validity of a theory. Here, we describe this process for a specific experiment. Starting from food web ecological theory, we formulate a model and design an experiment to optimally test the validity of the theory, supplementing traditional design considerations with model analysis. The experiment itself will be run and described in a separate paper. The theory we test is that trophic population dynamics are dictated by species traits, and we study this in a community of terrestrial arthropods. We depart from the Allometric Trophic Network (ATN) model and hypothesize that including habitat use, in addition to body mass, is necessary to better model trophic interactions. We therefore formulate new terms which account for micro-habitat use as well as intra- and interspecific interference in the ATN model. We design an experiment and an effective sampling regime to test this model and the underlying assumptions about the traits dominating trophic interactions. We arrive at a detailed sampling protocol to maximize information content in the empirical data obtained from the experiment and, relying on theoretical analysis of the proposed model, explore potential shortcomings of our design. 
Consequently, since this is a "pre-experimental" exercise aimed at improving the links between hypothesis formulation, model construction, experimental design and data collection, we hasten to publish our findings before analyzing data from the actual experiment, thus setting the stage for strong inference.
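The Allometric Trophic Network dynamics the authors start from can be sketched in miniature. The two-species system, parameter values, and functional-response form below are illustrative assumptions only, not the authors' extended model with habitat use and interference terms:

```python
import numpy as np

def atn_step(B, r, x, y, dt=0.01):
    """One Euler step of a minimal two-species trophic model: basal
    resource B[0] with growth rate r; consumer B[1] with mass-scaled
    metabolic rate x and maximum consumption y (names illustrative)."""
    F = B[0]**2 / (B[0]**2 + 0.5**2)            # Hill-type functional response
    dB0 = r * B[0] * (1.0 - B[0]) - x * y * B[1] * F
    dB1 = -x * B[1] + 0.85 * x * y * B[1] * F   # 0.85 = assimilation efficiency
    return B + dt * np.array([dB0, dB1])

B = np.array([0.5, 0.3])            # initial biomasses
for _ in range(1000):               # integrate to t = 10
    B = atn_step(B, r=1.0, x=0.3, y=4.0)
```

Pre-experimental analysis in the paper's sense amounts to probing how trajectories of a model like this respond to the parameters one intends to estimate, before committing to a sampling design.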
An NCME Instructional Module on Item-Fit Statistics for Item Response Theory Models
ERIC Educational Resources Information Center
Ames, Allison J.; Penfield, Randall D.
2015-01-01
Drawing valid inferences from item response theory (IRT) models is contingent upon a good fit of the data to the model. Violations of model-data fit have numerous consequences, limiting the usefulness and applicability of the model. This instructional module provides an overview of methods used for evaluating the fit of IRT models. Upon completing…
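A crude version of the item-fit logic the module surveys compares observed proportions correct in ability groups against model-expected proportions. The sketch below uses the two-parameter logistic (2PL) model with invented group data; it is an illustration of the idea, not one of the module's formal fit statistics:

```python
import math

def p_2pl(theta, a, b):
    """2PL item response model: P(correct | ability theta),
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item with a = 1.2, b = 0.1, and observed proportions
# correct in three ability groups (all numbers invented).
groups = [(-1.0, 0.30), (0.0, 0.55), (1.0, 0.80)]  # (group mean theta, observed p)
a, b = 1.2, 0.1
residuals = [obs - p_2pl(theta, a, b) for theta, obs in groups]
print(residuals)  # modest residuals suggest adequate fit for this toy item
```

Formal item-fit statistics aggregate residuals of this kind, with standardization and reference distributions so that "modest" has a defensible cutoff.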
1997-10-14
Floquet theory is the primary mathematical tool for...In addition, a higher-order plate theory is incorporated into the plate segment constitutive equations. The shear strain correction influences the torsion...behavior while the higher-order plate theory influences the transverse shear behavior. The theory is validated against 3-D finite element results
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
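The descriptive classical-test-theory checks listed above (score distributions, floor and ceiling effects, internal consistency) are short computations. The sketch below uses invented toy scores; only the statistics themselves are standard:

```python
import numpy as np

def cronbach_alpha(scores):
    """Internal-consistency estimate from classical test theory.
    scores: respondents x items matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

def floor_ceiling(totals, lo, hi):
    """Proportions of respondents at the scale minimum and maximum."""
    totals = np.asarray(totals, dtype=float)
    return (totals == lo).mean(), (totals == hi).mean()

# Two perfectly correlated items -> a perfectly consistent toy scale.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```

Large floor or ceiling proportions warn that the instrument cannot discriminate at the extremes of the trait, one of the content-validity red flags the review discusses.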
Sebire, Simon J; Jago, Russell; Fox, Kenneth R; Edwards, Mark J; Thompson, Janice L
2013-09-26
Understanding children's physical activity motivation, its antecedents and associations with behavior is important and can be advanced by using self-determination theory. However, research among youth is largely restricted to adolescents and studies of motivation within certain contexts (e.g., physical education). There are no measures of self-determination theory constructs (physical activity motivation or psychological need satisfaction) for use among children and no previous studies have tested a self-determination theory-based model of children's physical activity motivation. The purpose of this study was to test the reliability and validity of scores derived from scales adapted to measure self-determination theory constructs among children and test a motivational model predicting accelerometer-derived physical activity. Cross-sectional data from 462 children aged 7 to 11 years from 20 primary schools in Bristol, UK were analysed. Confirmatory factor analysis was used to examine the construct validity of adapted behavioral regulation and psychological need satisfaction scales. Structural equation modelling was used to test cross-sectional associations between psychological need satisfaction, motivation types and physical activity assessed by accelerometer. The construct validity and reliability of the motivation and psychological need satisfaction measures were supported. Structural equation modelling provided evidence for a motivational model in which psychological need satisfaction was positively associated with intrinsic and identified motivation types and intrinsic motivation was positively associated with children's minutes in moderate-to-vigorous physical activity. The study provides evidence for the psychometric properties of measures of motivation aligned with self-determination theory among children. 
Children's motivation that is based on enjoyment and inherent satisfaction of physical activity is associated with their objectively-assessed physical activity and such motivation is positively associated with perceptions of psychological need satisfaction. These psychological factors represent potential malleable targets for interventions to increase children's physical activity.
Culture and Parenting: Family Models Are Not One-Size-Fits-All. FPG Snapshot #67
ERIC Educational Resources Information Center
FPG Child Development Institute, 2012
2012-01-01
Family process models guide theories and research about family functioning and child development outcomes. Theory and research, in turn, inform policies and services aimed at families. But are widely accepted models valid across cultural groups? To address these gaps, FPG researchers examined the utility of two family process models for families…
ERIC Educational Resources Information Center
Hidiroglu, Çaglar Naci; Bukova Güzel, Esra
2013-01-01
The aim of the present study is to conceptualize the approaches displayed for validation of model and thought processes provided in mathematical modeling process performed in technology-aided learning environment. The participants of this grounded theory study were nineteen secondary school mathematics student teachers. The data gathered from the…
Validation of Western North America Models based on finite-frequency and ray theory imaging methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larmat, Carene; Maceira, Monica; Porritt, Robert W.
2015-02-02
We validate seismic models developed for western North America with a focus on the effect of imaging methods on data fit. We use the DNA09 models, for which our collaborators provide versions built with both the body-wave finite-frequency (FF) approach and the ray-theory (RT) approach while keeping the data selection, processing and reference models the same.
Goodness-of-Fit Assessment of Item Response Theory Models
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto
2013-01-01
The article provides an overview of goodness-of-fit assessment methods for item response theory (IRT) models. It is now possible to obtain accurate "p"-values of the overall fit of the model if bivariate information statistics are used. Several alternative approaches are described. As the validity of inferences drawn on the fitted model…
Validating for Use and Interpretation: A Mixed Methods Contribution Illustrated
ERIC Educational Resources Information Center
Morell, Linda; Tan, Rachael Jin Bee
2009-01-01
Researchers in the areas of psychology and education strive to understand the intersections among validity, educational measurement, and cognitive theory. Guided by a mixed model conceptual framework, this study investigates how respondents' opinions inform the validation argument. Validity evidence for a science assessment was collected through…
Niu, Ran; Skliar, Mikhail
2012-07-01
In this paper, we develop and validate a method to identify computationally efficient site- and patient-specific models of ultrasound thermal therapies from MR thermal images. The models of the specific absorption rate of the transduced energy and the temperature response of the therapy target are identified in the reduced basis of proper orthogonal decomposition of thermal images, acquired in response to a mild thermal test excitation. The method permits dynamic reidentification of the treatment models during the therapy by recursively utilizing newly acquired images. Such adaptation is particularly important during high-temperature therapies, which are known to substantially and rapidly change tissue properties and blood perfusion. The developed theory was validated for the case of focused ultrasound heating of a tissue phantom. The experimental and computational results indicate that the developed approach produces accurate low-dimensional treatment models despite temporal and spatial noise in the MR images and a slow image acquisition rate.
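The reduced-basis identification above rests on proper orthogonal decomposition (POD) of a snapshot matrix. A minimal, self-contained sketch of that step, using synthetic one-dimensional "snapshots" standing in for MR thermal images (all sizes and names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "snapshots": 50 noisy mixtures of 3 smooth spatial modes
x = np.linspace(0.0, 1.0, 200)
modes = np.stack([np.sin((i + 1) * np.pi * x) for i in range(3)])
coeffs = rng.normal(size=(50, 3))
snapshots = coeffs @ modes + 0.01 * rng.normal(size=(50, 200))

# POD basis = dominant right singular vectors of the snapshot matrix
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
basis = Vt[:3]                      # reduced spatial basis (rank 3)
reduced = snapshots @ basis.T       # low-dimensional coordinates per snapshot
reconstructed = reduced @ basis     # back-projection into full space
rel_err = np.linalg.norm(snapshots - reconstructed) / np.linalg.norm(snapshots)
```

Because the snapshots are dominated by three coherent modes, a rank-3 basis reconstructs them to within roughly the noise level, which is the property the paper's low-dimensional treatment models exploit.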
Kinetic Theories for Biofilms (Preprint)
2011-01-01
…binary complex fluids to develop a set of hydrodynamic models for the two-phase mixture of biofilms and solvent (water). It is aimed to model … kinetics along with the intrinsic molecular elasticity of the EPS network strand, modeled as an elastic dumbbell. This theory is valid in both the biofilm
[Traceability of Wine Varieties Using Near Infrared Spectroscopy Combined with Cyclic Voltammetry].
Li, Meng-hua; Li, Jing-ming; Li, Jun-hui; Zhang, Lu-da; Zhao, Long-lian
2015-06-01
To achieve traceability of wine varieties, a method was proposed to fuse near-infrared (NIR) spectra and cyclic voltammograms (CV), which contain complementary information, using Dempster-Shafer (D-S) evidence theory. NIR spectra and CV curves of three wine varieties (cabernet sauvignon, merlot, cabernet gernischt) from seven different geographical origins were collected separately. Discriminant models were built using the PLS-DA method, and D-S evidence theory was then applied to fuse the two sets of discrimination results. After fusion, the cross-validation accuracy was 95.69% and the validation-set accuracy was 94.12% for wine variety identification. When only wines from Yantai were considered, the cross-validation accuracy was 99.46% and the validation-set accuracy was 100%. All fused traceability models classified better than either individual method. These results suggest that combining electrochemical and spectral information through the D-S evidence combination formula improves model discrimination and is a promising tool for discriminating different kinds of wines.
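The fusion step above uses Dempster's combination rule. The sketch below assumes, purely for illustration, that each classifier emits belief masses over singleton variety hypotheses only; the mass values are invented, not the paper's:

```python
def ds_combine(m1, m2):
    """Dempster's rule for two mass functions over the same singleton frame.
    m1, m2: dicts mapping hypothesis -> mass (masses sum to 1)."""
    combined = {}
    conflict = 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            if h1 == h2:
                combined[h1] = combined.get(h1, 0.0) + v1 * v2
            else:
                conflict += v1 * v2      # mass assigned to incompatible pairs
    k = 1.0 - conflict                   # normalization constant
    return {h: v / k for h, v in combined.items()}

# Hypothetical outputs of the NIR and CV classifiers for one sample
nir = {"cabernet_sauvignon": 0.6, "merlot": 0.3, "cabernet_gernischt": 0.1}
cv = {"cabernet_sauvignon": 0.5, "merlot": 0.2, "cabernet_gernischt": 0.3}
fused = ds_combine(nir, cv)
```

Agreement between the two sources concentrates the fused mass on the variety both favor, which is why fusion can outperform either classifier alone.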
Mazilu, I; Mazilu, D A; Melkerson, R E; Hall-Mejia, E; Beck, G J; Nshimyumukiza, S; da Fonseca, Carlos M
2016-03-01
We present exact and approximate results for a class of cooperative sequential adsorption models using matrix theory, mean-field theory, and computer simulations. We validate our models with two customized experiments using ionically self-assembled nanoparticles on glass slides. We also address the limitations of our models and their range of applicability. The exact results obtained using matrix theory can be applied to a variety of two-state systems with cooperative effects.
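The abstract does not give its equations, but a generic mean-field rate equation for cooperative adsorption, dθ/dt = (1 − θ)(k0 + k1·θ), captures the idea that occupied neighbours enhance the filling rate of empty sites. A sketch under that assumed form (rate constants are illustrative):

```python
def mean_field_coverage(k0=1.0, k1=2.0, dt=1e-3, t_end=5.0):
    """Euler-integrate a mean-field rate equation for cooperative adsorption:
    empty sites fill at base rate k0, enhanced by coverage-dependent term k1*theta."""
    theta, t = 0.0, 0.0
    history = [theta]
    while t < t_end:
        theta += dt * (1.0 - theta) * (k0 + k1 * theta)
        t += dt
        history.append(theta)
    return history

coverage = mean_field_coverage()
```

Coverage rises monotonically toward full occupation, faster than in the non-cooperative (k1 = 0) case; the paper's matrix-theory results refine this picture with exact lattice correlations.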
Dong, Ren G; Welcome, Daniel E; McDowell, Thomas W; Wu, John Z
2013-11-25
The relationship between the vibration transmissibility and driving-point response functions (DPRFs) of the human body is important for understanding vibration exposures of the system and for developing valid models. This study identified their theoretical relationship and demonstrated that the sum of the DPRFs can be expressed as a linear combination of the transmissibility functions of the individual mass elements distributed throughout the system. The relationship is verified using several human vibration models. This study also clarified the requirements for reliably quantifying the transmissibility values used as references for calibrating system models. As an example application, the developed theory was used in a preliminary analysis of a method for calibrating models against both vibration transmissibility and DPRFs. The analysis shows that the combined method can, in theory, yield a unique and valid solution for the model parameters, at least for linear systems. However, a valid calibration method does not by itself guarantee a valid calibrated model, because validity also depends on the model structure and on the reliability and appropriate representation of the reference functions. The basic theory developed in this study is also applicable to the vibration analyses of other structures.
A Model-Based Method for Content Validation of Automatically Generated Test Items
ERIC Educational Resources Information Center
Zhang, Xinxin; Gierl, Mark
2016-01-01
The purpose of this study is to describe a methodology to recover the item model used to generate multiple-choice test items with a novel graph theory approach. Beginning with the generated test items and working backward to recover the original item model provides a model-based method for validating the content used to automatically generate test…
Optical-model abrasion cross sections for high-energy heavy ions
NASA Technical Reports Server (NTRS)
Townsend, L. W.
1981-01-01
Within the context of eikonal scattering theory, a generalized optical model potential approximation to the nucleus-nucleus multiple scattering series is used in an abrasion-ablation collision model to predict abrasion cross sections for relativistic projectile heavy ions. Unlike the optical limit of Glauber theory, which cannot be used for very light nuclei, the abrasion formalism is valid for any projectile target combination at any incident kinetic energy for which eikonal scattering theory can be utilized. Results are compared with experimental results and predictions from Glauber theory.
Assessment of Differential Item Functioning under Cognitive Diagnosis Models: The DINA Model Example
ERIC Educational Resources Information Center
Li, Xiaomin; Wang, Wen-Chung
2015-01-01
The assessment of differential item functioning (DIF) is routinely conducted to ensure test fairness and validity. Although many DIF assessment methods have been developed in the context of classical test theory and item response theory, they are not applicable for cognitive diagnosis models (CDMs), as the underlying latent attributes of CDMs are…
Investigation of a Nonparametric Procedure for Assessing Goodness-of-Fit in Item Response Theory
ERIC Educational Resources Information Center
Wells, Craig S.; Bolt, Daniel M.
2008-01-01
Tests of model misfit are often performed to validate the use of a particular model in item response theory. Douglas and Cohen (2001) introduced a general nonparametric approach for detecting misfit under the two-parameter logistic model. However, the statistical properties of their approach, and empirical comparisons to other methods, have not…
Interpreting Variance Components as Evidence for Reliability and Validity.
ERIC Educational Resources Information Center
Kane, Michael T.
The reliability and validity of measurement is analyzed by a sampling model based on generalizability theory. A model for the relationship between a measurement procedure and an attribute is developed from an analysis of how measurements are used and interpreted in science. The model provides a basis for analyzing the concept of an error of…
The Comprehension and Validation of Social Information.
ERIC Educational Resources Information Center
Wyer, Robert S., Jr.; Radvansky, Gabriel A.
1999-01-01
Proposes a theory of social cognition to account for the comprehension and verification of social information. The theory views comprehension as a process of constructing situation models of new information on the basis of previously formed models about its referents. The comprehension of both single statements and multiple pieces of information…
NASA Astrophysics Data System (ADS)
Sarout, Joël.
2012-04-01
For the first time, a comprehensive and quantitative analysis of the domains of validity of popular wave propagation theories for porous/cracked media is provided. The case of a simple, yet versatile rock microstructure is detailed. The microstructural parameters controlling the applicability of the scattering theories, the effective medium theories, the quasi-static (Gassmann limit) and dynamic (inertial) poroelasticity are analysed in terms of pores/cracks characteristic size, geometry and connectivity. To this end, a new permeability model is devised combining the hydraulic radius and percolation concepts. The predictions of this model are compared to published micromechanical models of permeability for the limiting cases of capillary tubes and penny-shaped cracks. It is also compared to published experimental data on natural rocks in these limiting cases. It explicitly accounts for pore space topology around the percolation threshold and far above it. Thanks to this permeability model, the scattering, squirt-flow and Biot cut-off frequencies are quantitatively compared. This comparison leads to an explicit mapping of the domains of validity of these wave propagation theories as a function of the rock's actual microstructure. How this mapping impacts seismic, geophysical and ultrasonic wave velocity data interpretation is discussed. The methodology demonstrated here and the outcomes of this analysis are meant to constitute a quantitative guide for the selection of the most suitable modelling strategy to be employed for prediction and/or interpretation of rocks elastic properties in laboratory-or field-scale applications when information regarding the rock's microstructure is available.
Mathematical model of the SH-3G helicopter
NASA Technical Reports Server (NTRS)
Phillips, J. D.
1982-01-01
A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.
ERIC Educational Resources Information Center
Gogus, Aytac; Nistor, Nicolae; Riley, Richard W.; Lerche, Thomas
2012-01-01
The Unified Theory of Acceptance and Use of Technology (UTAUT; Venkatesh et al., 2003, 2012) proposes a major model of educational technology acceptance (ETA) that has so far been validated in only a few languages and cultures. Therefore, this study aims at extending the applicability of UTAUT to Turkish culture. Based on acceptance and cultural data…
Butt, Gail; Markle-Reid, Maureen; Browne, Gina
2008-01-01
Introduction Interprofessional health and social service partnerships (IHSSP) are internationally acknowledged as integral for comprehensive chronic illness care. However, the evidence-base for partnership effectiveness is lacking. This paper aims to clarify partnership measurement issues, conceptualize IHSSP at the front-line staff level, and identify tools valid for group process measurement. Theory and methods A systematic literature review utilizing three interrelated searches was conducted. Thematic analysis techniques were supported by NVivo 7 software. Complexity theory was used to guide the analysis, ground the new conceptualization and validate the selected measures. Other properties of the measures were critiqued using established criteria. Results There is a need for a convergent view of what constitutes a partnership and its measurement. The salient attributes of IHSSP and their interorganizational context were described and grounded within complexity theory. Two measures were selected and validated for measurement of proximal group outcomes. Conclusion This paper depicts a novel complexity theory-based conceptual model for IHSSP of front-line staff who provide chronic illness care. The conceptualization provides the underpinnings for a comprehensive evaluative framework for partnerships. Two partnership process measurement tools, the PSAT and TCI are valid for IHSSP process measurement with consideration of their strengths and limitations. PMID:18493591
ERIC Educational Resources Information Center
Skinner, Ellen A.; Chi, Una
2012-01-01
Building on self-determination theory, this study presents a model of intrinsic motivation and engagement as "active ingredients" in garden-based education. The model was used to create reliable and valid measures of key constructs, and to guide the empirical exploration of motivational processes in garden-based learning. Teacher- and…
From Cognitive-Domain Theory to Assessment Practice
ERIC Educational Resources Information Center
Bennett, Randy E.; Deane, Paul; van Rijn, Peter W.
2016-01-01
This article exemplifies how assessment design might be grounded in theory, thereby helping to strengthen validity claims. Spanning work across multiple related projects, the article first briefly summarizes an assessment system model for the elementary and secondary levels. Next the article describes how cognitive-domain theory and principles are…
ERIC Educational Resources Information Center
Ferrando, Pere J.
2008-01-01
This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…
Current Concerns in Validity Theory.
ERIC Educational Resources Information Center
Kane, Michael
Validity is concerned with the clarification and justification of the intended interpretations and uses of observed scores. It has not been easy to formulate a general methodology or set of principles for validation, but progress has been made, especially as the field has moved from relatively limited criterion-related models to sophisticated…
Theoretical Commitment and Implicit Knowledge: Why Anomalies do not Trigger Learning
NASA Astrophysics Data System (ADS)
Ohlsson, Stellan
A theory consists of a mental model, laws that specify parameters of the model and one or more explanatory schemas. Models represent by being isomorphic to real systems. To explain an event is to reenact its genesis by executing the relevant model in the mind's eye. Schemas capture recurring structural features of explanations. To subscribe to a theory is to be committed to explaining a particular class of events with that theory (and nothing else). Given theoretical commitment, an anomaly, i.e., an event that cannot be explained, is an occasion for theory change, but in the absence of commitment, the response is instead to exclude the anomalous event from the domain of application of the theory. Lay people and children hold their theories implicitly and hence without commitment. These observations imply that the analogy between scientist's theories and children's knowledge is valid, but that the analogy between theory change and learning is not.
Wakeling, Helen C
2007-09-01
This study examined the reliability and validity of the Social Problem-Solving Inventory-Revised (SPSI-R; D'Zurilla, Nezu, & Maydeu-Olivares, 2002) with a population of incarcerated sexual offenders. An availability sample of 499 adult male sexual offenders was used. The SPSI-R had good reliability, measured by internal consistency and test-retest reliability, and adequate validity. Construct validity was determined via factor analysis: an exploratory factor analysis extracted a two-factor model, which was then tested against the theory-driven five-factor model using confirmatory factor analysis. The five-factor model was selected as the better fitting of the two, confirming the model proposed by social problem-solving theory (D'Zurilla & Nezu, 1982). The SPSI-R had good convergent validity; significant correlations were found between SPSI-R subscales and measures of self-esteem, impulsivity, and locus of control. However, SPSI-R subscales also correlated significantly with a measure of socially desirable responding. This finding is discussed in relation to recent research suggesting that impression management may not invalidate self-report measures (e.g. Mills & Kroner, 2005). The SPSI-R was sensitive to sexual offender intervention, with problem-solving improving from pre- to post-treatment in both rapists and child molesters. The study concludes that the SPSI-R is a reasonably internally valid and appropriate tool for assessing problem-solving in sexual offenders. However, future research should cross-validate the SPSI-R against other behavioural outcomes to examine the external validity of the measure, and should utilise a control group to determine treatment impact.
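Internal consistency of the kind reported for the SPSI-R is conventionally summarized by Cronbach's alpha. A minimal sketch of the standard formula, applied to synthetic correlated item scores (the data are invented, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_var / total_var)

# Synthetic data: five items driven by one latent trait plus noise
rng = np.random.default_rng(42)
latent = rng.normal(size=(100, 1))
items = latent + 0.3 * rng.normal(size=(100, 5))
alpha_demo = cronbach_alpha(items)
```

Items that share most of their variance with a common trait yield alpha near 1; uncorrelated items drive it toward 0.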
ERIC Educational Resources Information Center
Houde, Joseph
2006-01-01
Andragogy, originally proposed by Malcolm Knowles, has been criticized as an atheoretical model. Validation of andragogy has been advocated by scholars, and this paper explores one method for that process. Current motivation theory, specifically socioemotional selectivity and self-determination theory, corresponds with aspects of andragogy. In…
NASA Astrophysics Data System (ADS)
Hidayati, A.; Rahmi, A.; Yohandri; Ratnawulan
2018-04-01
The need for teaching materials that match students' characteristics was the main motivation for developing a Basic Electronics I module integrating character values, based on the conceptual change teaching model. Module development in this research followed Plomp's procedure: preliminary research, prototyping, and assessment. In the first year of the research, the module was validated. Content validity was judged by the module's conformity with development theory and the demands of the learning model's characteristics. Construct validity was judged by the coherence and consistency of each module component with the characteristics of the character-values-integrated learning model, as obtained through validator assessment. The average validation score fell in the "very valid" category; the module was then revised on the basis of the validators' assessments.
Validity of Multiprocess IRT Models for Separating Content and Response Styles
ERIC Educational Resources Information Center
Plieninger, Hansjörg; Meiser, Thorsten
2014-01-01
Response styles, the tendency to respond to Likert-type items irrespective of content, are a widely known threat to the reliability and validity of self-report measures. However, it is still debated how to measure and control for response styles such as extreme responding. Recently, multiprocess item response theory models have been proposed that…
Development of Additional Hazard Assessment Models
1977-03-01
globules, their trajectory (the distance from the spill point to the impact point on the river bed), and the time required for sinking. Established theories … chemicals, the dissolution rate is estimated by using eddy-diffusivity surface-renewal theories. The validity of predictions of these theories has been … theories and experimental data on aeration of rivers. * Describe dispersion in rivers with a stationary area source and sources moving with the stream
Multivariate Modelling of the Career Intent of Air Force Personnel.
1980-09-01
index (HOPP) was used as a measure of current job satisfaction. As with the Vroom and Fishbein/Graen models, two separate validations were accom… "Organizational Behavior and Human Performance, 23: 251-267, 1979. Lewis, Logan M. "Expectancy Theory as a Predictive Model of Career Intent, Job Satisfaction…" W. Albright. "Expectancy Theory Predictions of the Satisfaction, Effort, Performance, and Retention of Naval Aviation Officers," Organizational…
Etien, Erik
2013-05-01
This paper deals with the design of a speed soft sensor for an induction motor. The sensor is based on the physical model of the motor. Because the validation step highlighted that the sensor could not be validated for all operating points, the model is modified in order to obtain a fully validated sensor over the whole speed range. An original feature of the proposed approach is that the modified model is derived from a stability analysis using automatic control theory.
NASA Astrophysics Data System (ADS)
Mashayekhi, Somayeh; Miles, Paul; Hussaini, M. Yousuff; Oates, William S.
2018-02-01
In this paper, fractional and non-fractional viscoelastic models for elastomeric materials are derived and analyzed in comparison to experimental results. The viscoelastic models are derived by expanding thermodynamic balance equations for both fractal and non-fractal media. The order of the fractional time derivative is shown to strongly affect the accuracy of the viscoelastic constitutive predictions. Model validation uses experimental data describing viscoelasticity of the dielectric elastomer Very High Bond (VHB) 4910. Since these materials are known for their broad applications in smart structures, it is important to characterize and accurately predict their behavior across a large range of time scales. Whereas integer order viscoelastic models can yield reasonable agreement with data, the model parameters often lack robustness in prediction at different deformation rates. Alternatively, fractional order models of viscoelasticity provide an alternative framework to more accurately quantify complex rate-dependent behavior. Prior research that has considered fractional order viscoelasticity lacks experimental validation and contains limited links between viscoelastic theory and fractional order derivatives. To address these issues, we use fractional order operators to experimentally validate fractional and non-fractional viscoelastic models in elastomeric solids using Bayesian uncertainty quantification. The fractional order model is found to be advantageous as predictions are significantly more accurate than integer order viscoelastic models for deformation rates spanning four orders of magnitude.
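A common discrete approximation for the fractional time derivatives such models rely on is the Grünwald-Letnikov scheme. The sketch below is a generic illustration, not the authors' implementation, and checks itself against the known closed form D^α t = t^(1−α)/Γ(2−α):

```python
import math

def gl_fractional_derivative(f, alpha, t, n=2000):
    """Grunwald-Letnikov approximation of the order-alpha derivative of f
    at time t, with lower terminal 0 and n grid steps."""
    h = t / n
    # Weights w_k = (-1)^k * binom(alpha, k), built by the standard recursion
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (k - 1 - alpha) / k)
    return sum(wk * f(t - k * h) for k, wk in enumerate(w)) / h**alpha

# Half-order derivative of f(t) = t at t = 1 vs. the analytic value
approx = gl_fractional_derivative(lambda t: t, 0.5, 1.0)
exact = 1.0 / math.gamma(1.5)
```

At α = 1 the weights reduce to [1, −1, 0, …] and the scheme collapses to the ordinary backward difference, which is the sense in which fractional models interpolate between elastic and viscous limits.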
Boerboom, T B B; Dolmans, D H J M; Jaarsma, A D C; Muijtjens, A M M; Van Beukelen, P; Scherpbier, A J J A
2011-01-01
Feedback to aid teachers in improving their teaching requires validated evaluation instruments. When implementing an evaluation instrument in a different context, it is important to collect validity evidence from multiple sources. We examined the validity and reliability of the Maastricht Clinical Teaching Questionnaire (MCTQ) as an instrument to evaluate individual clinical teachers during short clinical rotations in veterinary education. We examined four sources of validity evidence: (1) Content was examined based on theory of effective learning. (2) Response process was explored in a pilot study. (3) Internal structure was assessed by confirmatory factor analysis using 1086 student evaluations and reliability was examined utilizing generalizability analysis. (4) Relations with other relevant variables were examined by comparing factor scores with other outcomes. Content validity was supported by theory underlying the cognitive apprenticeship model on which the instrument is based. The pilot study resulted in an additional question about supervision time. A five-factor model showed a good fit with the data. Acceptable reliability was achievable with 10-12 questionnaires per teacher. Correlations between the factors and overall teacher judgement were strong. The MCTQ appears to be a valid and reliable instrument to evaluate clinical teachers' performance during short rotations.
Corr, Philip J; Cooper, Andrew J
2016-11-01
We report the development and validation of a questionnaire measure of the revised reinforcement sensitivity theory (rRST) of personality. Starting with qualitative responses to defensive and approach scenarios modeled on typical rodent ethoexperimental situations, exploratory and confirmatory factor analyses (CFAs) revealed a robust 6-factor structure: 2 unitary defensive factors, fight-flight-freeze system (FFFS; related to fear) and the behavioral inhibition system (BIS; related to anxiety); and 4 behavioral approach system (BAS) factors (Reward Interest, Goal-Drive Persistence, Reward Reactivity, and Impulsivity). Theoretically motivated thematic facets were employed to sample the breadth of defensive space, comprising FFFS (Flight, Freeze, and Active Avoidance) and BIS (Motor Planning Interruption, Worry, Obsessive Thoughts, and Behavioral Disengagement). Based on theoretical considerations, and statistically confirmed, a separate scale for Defensive Fight was developed. Validation evidence for the 6-factor structure came from convergent and discriminant validity shown by correlations with existing personality scales. We offer the Reinforcement Sensitivity Theory of Personality Questionnaire to facilitate future research specifically on rRST and, more broadly, on approach-avoidance theories of personality. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Nematollahi, Mahin; Eslami, Ahmad Ali
2018-01-01
Background: Osteoporosis is common among women which may be mostly due to the low intake of calcium. This article reports the development, cultural adaptation and psychometric properties of a Calcium Intake Questionnaire based on the social cognitive theory (CIQ-SCT)among Iranian women. Methods: In 2016, this cross-sectional study was carried out among 400 younger than 50 years old women in Isfahan, Iran. After literature review, a preliminary 35-item questionnaire was developed. Then, forward-backward translation and cultural adaptation of the tool was conducted. Content Validity Index confirmed by an expert panel and Face Validity was evaluated in a pilot study. Exploratory and confirmatory factor analyses (EFA &CFA) were conducted on the calibration and validation sample, respectively. Reliability was also assessed using internal consistency test. Results: After determining content and face validity, 20 items with 5 factors (self-efficacy,outcome expectations, social support and self-regulation) were obtained. Cronbach alpha for the instrument was found to be 0.901. In EFA, we identified a 4-factor model with a total variance of 72.3%. The results related to CFA (CMIN/DF=1.850, CFI =0.946, TLI=0.938, RMSEA=0.069[90% CI: 0.057-0.081]) indicated that the model was fit to the social cognitive theory. Self regulation was detected as the best predictor for calcium intake. Conclusion: The CIQ-SCT showed acceptable levels of reliability and validity in explaining the calcium intake based on the constructs of social cognitive theory. Further psychometric testing is recommended in different population to approve the external validity of the instrument.
Electromagnetic Compatibility Testing Studies
NASA Technical Reports Server (NTRS)
Trost, Thomas F.; Mitra, Atindra K.
1996-01-01
This report discusses results on analytical models and on the measurement and simulation of statistical properties from a study of microwave reverberation (mode-stirred) chambers performed at Texas Tech University. Two analytical models of power transfer vs. frequency in a chamber, one for antenna-to-antenna transfer and the other for antenna to D-dot sensor, were experimentally validated in our chamber. Two examples are presented of the measurement and calculation of chamber Q, one for each of the models. Measurements of EM power density validate a theoretical probability distribution on and away from the chamber walls, and also yield a distribution with larger standard deviation at frequencies below the range of validity of the theory. Measurements of EM power density at pairs of points validate a theoretical spatial correlation function on the chamber walls, and also yield a correlation function with larger correlation length, R(sub corr), at frequencies below the range of validity of the theory. A numerical simulation, employing a rectangular cavity with a moving wall, shows agreement with the measurements. We determined that the lowest frequency at which the theoretical spatial correlation function is valid in our chamber is considerably higher than the lowest frequency recommended by current guidelines for utilizing reverberation chambers in EMC testing. Two suggestions are made for future studies related to EMC testing.
Shen, Minxue; Cui, Yuanwu; Hu, Ming; Xu, Linyong
2017-01-13
The study aimed to validate a scale assessing the severity of the "Yin deficiency, intestine heat" pattern of functional constipation based on modern test theory. Pooled longitudinal data of 237 patients with the "Yin deficiency, intestine heat" pattern of constipation from a prospective cohort study were used to validate the scale. Exploratory factor analysis was used to examine the common factors of the items. A multidimensional item response model was used to assess the scale in the presence of multidimensionality. Cronbach's alpha ranged from 0.79 to 0.89, and the split-half reliability ranged from 0.67 to 0.79 at different measurements. Exploratory factor analysis identified two common factors, and all items had cross factor loadings. The bidimensional model had better goodness of fit than the unidimensional model. The multidimensional item response model showed that all items had moderate to high discrimination parameters. Parameters indicated that the first latent trait signified intestine heat, while the second trait characterized Yin deficiency. The information function showed that items demonstrated the highest discrimination power among patients with moderate to high levels of disease severity. Multidimensional item response theory provides a useful and rational approach to validating scales for assessing the severity of patterns in traditional Chinese medicine.
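The reliability coefficient reported above, Cronbach's alpha, can be computed directly from item-level scores; a minimal sketch with toy data (not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from item-level scores.

    items: one list of scores per item; all lists cover the same respondents.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(items)
    sum_item_var = sum(pvariance(scores) for scores in items)
    totals = [sum(resp) for resp in zip(*items)]  # each respondent's total score
    return k / (k - 1) * (1 - sum_item_var / pvariance(totals))
```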
ERIC Educational Resources Information Center
Garg, Deepti; Garg, Ajay K.
2007-01-01
This study applied the Theory of Reasoned Action and the Technology Acceptance Model to measure outcomes of general education courses (GECs) under the University of Botswana Computer and Information Skills (CIS) program. An exploratory model was validated for responses from 298 students. The results suggest that resources currently committed to…
The Information a Test Provides on an Ability Parameter. Research Report. ETS RR-07-18
ERIC Educational Resources Information Center
Haberman, Shelby J.
2007-01-01
In item-response theory, if a latent-structure model has an ability variable, then elementary information theory may be employed to provide a criterion for evaluation of the information the test provides concerning ability. This criterion may be considered even in cases in which the latent-structure model is not valid, although interpretation of…
ERIC Educational Resources Information Center
Aquino, Cesar A.
2014-01-01
This study represents a research validating the efficacy of Davis' Technology Acceptance Model (TAM) by pairing it with the Organizational Change Readiness Theory (OCRT) to develop another extension to the TAM, using the medical Laboratory Information Systems (LIS)--Electronic Health Records (EHR) interface as the medium. The TAM posits that it is…
Validation of a New Conceptual Model of School Connectedness and Its Assessment Measure
ERIC Educational Resources Information Center
Hirao, Katsura
2011-01-01
A self-report assessment scale of school connectedness was validated in this study based on the data from middle-school children in a northeastern state of the United States (n = 145). The scale was based on the School Bonding Model (Morita, 1991), which was derived reductively from the social control (bond) theory (Hirschi, 1969). This validation…
Comparing theories' performance in predicting violence.
Haas, Henriette; Cusson, Maurice
2015-01-01
The stakes of choosing the best theory as a basis for violence prevention and offender rehabilitation are high. However, no single theory of violence has ever been universally accepted by a majority of established researchers. Psychiatry, psychology and sociology are each subdivided into different schools relying upon different premises. All theories can produce empirical evidence for their validity, some of them stating the opposite of each other. Calculating different models with multivariate logistic regression on a dataset of N = 21,312 observations and ninety-two influences allowed a direct comparison of the performance of operationalizations of some of the most important schools. The psychopathology model ranked as the best model in terms of predicting violence, right after the comprehensive interdisciplinary model. Next came the rational choice and lifestyle model, and third the differential association and learning theory model. Other models, namely the control theory model, the childhood-trauma model and the social conflict and reaction model, turned out to have low sensitivities for predicting violence. Nevertheless, all models produced acceptable results in predictions of a non-violent outcome.
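The model comparison described above rests on sensitivity (correctly flagged violent outcomes) and specificity (correctly flagged non-violent outcomes) computed from each model's predictions; a minimal sketch with toy labels:

```python
def classification_rates(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate).

    Labels: 1 = violent outcome, 0 = non-violent outcome.
    """
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == 1 and p == 1)
    fn = sum(1 for t, p in pairs if t == 1 and p == 0)
    tn = sum(1 for t, p in pairs if t == 0 and p == 0)
    fp = sum(1 for t, p in pairs if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)
```

A model can score well on specificity while missing most violent cases, which is exactly the pattern the abstract reports for the weaker theories.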
Virtual Model Validation of Complex Multiscale Systems: Applications to Nonlinear Elastostatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oden, John Tinsley; Prudencio, Ernest E.; Bauman, Paul T.
We propose a virtual statistical validation process as an aid to the design of experiments for the validation of phenomenological models of the behavior of material bodies, with focus on those cases in which knowledge of the fabrication process used to manufacture the body can provide information on the micro-molecular-scale properties underlying macroscale behavior. One example is given by models of elastomeric solids fabricated using polymerization processes. We describe a framework for model validation that involves Bayesian updates of parameters in statistical calibration and validation phases. The process enables the quantification of uncertainty in quantities of interest (QoIs) and the determination of model consistency using tools of statistical information theory. We assert that microscale information drawn from molecular models of the fabrication of the body provides a valuable source of prior information on parameters as well as a means for estimating model bias and designing virtual validation experiments to provide information gain over calibration posteriors.
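The Bayesian parameter updates in the calibration phase can be illustrated, in the simplest conjugate case, by a normal-normal update of a scalar parameter (a sketch of the general mechanism, not the paper's actual likelihood or model):

```python
def normal_update(prior_mean, prior_var, obs_mean, obs_var):
    """Conjugate normal-normal Bayesian update of a scalar parameter.

    Posterior precision is the sum of prior and data precisions; the
    posterior mean is the precision-weighted average of the two means.
    """
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs_mean / obs_var)
    return post_mean, post_var
```

Each new experiment shrinks the posterior variance, which is the "information gain over calibration posteriors" the abstract refers to.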
Escaño, Mary Clare Sison; Arevalo, Ryan Lacdao; Gyenge, Elod; Kasai, Hideaki
2014-09-03
The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4(-) on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.
NASA Astrophysics Data System (ADS)
Wang, W. L.; Zhou, Z. R.; Yu, D. S.; Qin, Q. H.; Iwnicki, S.
2017-10-01
A full nonlinear physical 'in-service' model was built for a rail vehicle secondary suspension hydraulic damper with shim-pack-type valves. In the modelling process, a shim pack deflection theory with an equivalent-pressure correction factor was proposed, and a Finite Element Analysis (FEA) approach was applied. Bench test results validated the damper model over its full velocity range and thus also proved that the proposed shim pack deflection theory and the FEA-based parameter identification approach are effective. The validated full damper model was subsequently incorporated into a detailed vehicle dynamics simulation to study how its key in-service parameter variations influence the secondary-suspension-related vehicle system dynamics. The obtained nonlinear physical in-service damper model and the vehicle dynamic response characteristics in this study could be used in the product design optimization and nonlinear optimal specifications of high-speed rail hydraulic dampers.
NMR relaxation induced by iron oxide particles: testing theoretical models.
Gossuin, Y; Orlando, T; Basini, M; Henrard, D; Lascialfari, A; Mattea, C; Stapf, S; Vuong, Q L
2016-04-15
Superparamagnetic iron oxide particles find their main application as contrast agents for cellular and molecular magnetic resonance imaging. The contrast they bring is due to the shortening of the transverse relaxation time T2 of water protons. In order to understand their influence on proton relaxation, different theoretical relaxation models have been developed, each of them presenting a certain validity domain, which depends on the particle characteristics and proton dynamics. The validation of these models is crucial since they allow for predicting the ideal particle characteristics for obtaining the best contrast but also because the fitting of T1 experimental data by the theory constitutes an interesting tool for the characterization of the nanoparticles. In this work, T2 of suspensions of iron oxide particles in different solvents and at different temperatures, corresponding to different proton diffusion properties, were measured and were compared to the three main theoretical models (the motional averaging regime, the static dephasing regime, and the partial refocusing model) with good qualitative agreement. However, a real quantitative agreement was not observed, probably because of the complexity of these nanoparticulate systems. The Roch theory, developed in the motional averaging regime (MAR), was also successfully used to fit T1 nuclear magnetic relaxation dispersion (NMRD) profiles, even outside the MAR validity range, and provided a good estimate of the particle size. On the other hand, the simultaneous fitting of T1 and T2 NMRD profiles by the theory was impossible, and this occurrence constitutes a clear limitation of the Roch model. Finally, the theory was shown to satisfactorily fit the deuterium T1 NMRD profile of superparamagnetic particle suspensions in heavy water.
NASA Astrophysics Data System (ADS)
Fisher, Karl B.
1995-08-01
The relation between the galaxy correlation functions in real-space and redshift-space is derived in the linear regime by an appropriate averaging of the joint probability distribution of density and velocity. The derivation recovers the familiar linear theory result on large scales but has the advantage of clearly revealing the dependence of the redshift distortions on the underlying peculiar velocity field; streaming motions give rise to distortions of O(Ω^0.6/b) while variations in the anisotropic velocity dispersion yield terms of order O(Ω^1.2/b^2). This probabilistic derivation of the redshift-space correlation function is similar in spirit to the derivation of the commonly used "streaming" model, in which the distortions are given by a convolution of the real-space correlation function with a velocity distribution function. The streaming model is often used to model the redshift-space correlation function on small, highly nonlinear, scales. There have been claims in the literature, however, that the streaming model is not valid in the linear regime. Our analysis confirms this claim, but we show that the streaming model can be made consistent with linear theory provided that the model for the streaming has the functional form predicted by linear theory and that the velocity distribution is chosen to be a Gaussian with the correct linear theory dispersion.
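For reference, the familiar linear-theory result alluded to above is the Kaiser (1987) relation, and the streaming model is the convolution the abstract describes; in standard notation (plane-parallel approximation):

```latex
% Kaiser (1987) linear-theory relation between redshift-space and
% real-space power spectra (plane-parallel limit):
P_s(k,\mu) = \bigl(1 + \beta\mu^2\bigr)^2 P_r(k),
\qquad \beta \equiv \Omega^{0.6}/b .

% Streaming model: the redshift-space correlation as a convolution of the
% real-space correlation with a line-of-sight pairwise velocity
% distribution f(v \mid r):
1 + \xi_s(s_\perp, s_\parallel) =
\int_{-\infty}^{\infty} \bigl[1 + \xi_r(s_\perp, y)\bigr]\,
f\bigl(s_\parallel - y \mid y\bigr)\, dy .
```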
Decision-Making in Agent-Based Models of Migration: State of the Art and Challenges.
Klabunde, Anna; Willekens, Frans
We review agent-based models (ABM) of human migration with respect to their decision-making rules. The most prominent behavioural theories used as decision rules are the random utility theory, as implemented in the discrete choice model, and the theory of planned behaviour. We identify the critical choices that must be made in developing an ABM, namely the modelling of decision processes and social networks. We also discuss two challenges that hamper the widespread use of ABM in the study of migration and, more broadly, demography and the social sciences: (a) the choice and the operationalisation of a behavioural theory (decision-making and social interaction) and (b) the selection of empirical evidence to validate the model. We offer advice on how these challenges might be overcome.
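The discrete choice model mentioned above turns random-utility differences into choice probabilities via the multinomial logit; a minimal sketch of that decision rule (toy utilities, not calibrated to any migration data):

```python
import math

def logit_choice_probs(utilities):
    """Multinomial logit choice probabilities from systematic utilities V_i:
    P_i = exp(V_i) / sum_j exp(V_j).

    Random utility theory with i.i.d. Gumbel error terms yields this form.
    """
    m = max(utilities)                      # subtract max for numerical stability
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]
```

In an ABM, each agent would draw a destination (or the stay-home option) from these probabilities at every decision step.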
Factorial validity of the Problematic Facebook Use Scale for adolescents and young adults
Marino, Claudia; Vieno, Alessio; Altoè, Gianmarco; Spada, Marcantonio M.
2017-01-01
Background and aims Recent research on problematic Facebook use has highlighted the need to develop a specific theory-driven measure to assess this potential behavioral addiction. The aim of the present study was to examine the factorial validity of the Problematic Facebook Use Scale (PFUS) adapted from Caplan’s Generalized Problematic Internet Scale model. Methods A total of 1,460 Italian adolescents and young adults (aged 14–29 years) participated in the study. Confirmatory factor analyses were performed in order to assess the factorial validity of the scale. Results Results revealed that the factor structure of the PFUS provided a good fit to the data. Furthermore, results of the multiple group analyses supported the invariance of the model across age and gender groups. Discussion and conclusions This study provides evidence supporting the factorial validity of the PFUS. This new scale provides a theory-driven tool to assess problematic use of Facebook among male and female adolescents and young adults. PMID:28198639
A brief overview of the theory and application of the optimal control model of the human operator
NASA Technical Reports Server (NTRS)
Sheldon, B.
1979-01-01
The underlying motivation and concepts are presented, along with a review of the development and application of the model. The structure of the model is described and results validating the model are presented.
Iyigun, Emine; Tastan, Sevinc; Ayhan, Hatice; Kose, Gulsah; Acikel, Cengizhan
2016-06-01
This study aimed to determine the validity and reliability levels of the Planned Behavior Theory Scale as related to a testicular self-examination. The study was carried out in a health-profession higher-education school in Ankara, Turkey, from April to June 2012. The study participants comprised 215 male students. Study data were collected by using a questionnaire, a planned behavior theory scale related to testicular self-examination, and Champion's Health Belief Model Scale (CHBMS). The sub-dimensions of the planned behavior theory scale, namely those of intention, attitude, subjective norms and self-efficacy, were found to have Cronbach's alpha values of between 0.81 and 0.89. Exploratory factor analysis showed that items of the scale had five factors that accounted for 75% of the variance. Of these, the sub-dimension of intention was found to have the highest level of contribution. A significant correlation was found between the sub-dimensions of the testicular self-examination planned behavior theory scale and those of CHBMS (p < 0.05). The findings suggest that the Turkish version of the testicular self-examination Planned Behavior Theory Scale is a valid and reliable measurement for Turkish society.
Mathematical modeling in realistic mathematics education
NASA Astrophysics Data System (ADS)
Riyanto, B.; Zulkardi; Putri, R. I. I.; Darmawijoyo
2017-12-01
The purpose of this paper is to produce mathematical modelling tasks in Realistic Mathematics Education for junior high school. This study used development research consisting of 3 stages, namely analysis, design and evaluation. The success criterion of this study was a local instruction theory for school mathematical modelling learning that is valid and practical for students. The data were analyzed using descriptive analysis as follows: (1) walk-through, analysis based on the expert comments in the expert review, to obtain a Hypothetical Learning Trajectory for valid mathematical modelling learning; (2) analysis of the results of the one-to-one and small-group evaluations to gain practicality. Based on the expert validation and the students' opinions and answers, the obtained mathematical modelling problem in Realistic Mathematics Education was valid and practical.
McCormick, Jessica; Delfabbro, Paul; Denson, Linley A
2012-12-01
The aim of this study was to conduct an empirical investigation of the validity of Jacobs' (in J Gambl Behav 2:15-31, 1986) general theory of addictions in relation to gambling problems associated with electronic gaming machines (EGMs). Regular EGM gamblers (n = 190) completed a series of standardised measures relating to psychological and physiological vulnerability, substance use, dissociative experiences, early childhood trauma and abuse, and problem gambling (the Problem Gambling Severity Index). Statistical analysis using structural equation modelling revealed clear relationships between childhood trauma and life stressors and psychological vulnerability, dissociative-like experiences and problem gambling. These findings confirm and extend a previous model validated by Gupta and Derevensky (in J Gambl Stud 14:17-49, 1998) using an adolescent population. The significance of these findings is discussed for existing pathway models of problem gambling, for Jacobs' theory, and for clinicians engaged in assessment and intervention.
A Criterion-Related Validation Study of the Army Core Leader Competency Model
2007-04-01
2004). Transformational and transactional leadership: A meta-analytic test of their relative validity. Journal of Applied Psychology, 89, 755-768...performance criteria in an attempt to adjust ratings for this influence. Leader survey materials were developed and pilot tested at Ft. Drum and Ft... psychological constructs in the behavioral science realm. Numerous theories, popular literature, websites, assessments, and competency models are
Evaluation of physical activity web sites for use of behavior change theories.
Doshi, Amol; Patrick, Kevin; Sallis, James F; Calfas, Karen
2003-01-01
Physical activity (PA) Web sites were assessed for their use of behavior change theories, including constructs of the health belief model, Transtheoretical Model, social cognitive theory, and the theory of reasoned action and planned behavior. An evaluation template for assessing PA Web sites was developed, and content validity and interrater reliability were demonstrated. Two independent raters evaluated 24 PA Web sites. Web sites varied widely in application of theory-based constructs, ranging from 5 to 48 on a 100-point scale. The most common intervention strategies were general information, social support, and realistic goal areas. Coverage of theory-based strategies was low, varying from 26% for social cognitive theory to 39% for health belief model. Overall, PA Web sites provided little assessment, feedback, or individually tailored assistance for users. They were unable to substantially tailor the on-line experience for users at different stages of change or different demographic characteristics.
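Interrater reliability of the kind reported for the two website raters is often quantified with Cohen's kappa, which corrects raw agreement for chance; a minimal sketch with toy ratings (the abstract does not state which statistic was used):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: agreement between two raters corrected for chance.

    rater1, rater2: equal-length lists of nominal category labels.
    kappa = (observed agreement - expected agreement) / (1 - expected agreement)
    """
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    categories = set(rater1) | set(rater2)
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)
```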
Model Selection Indices for Polytomous Items
ERIC Educational Resources Information Center
Kang, Taehoon; Cohen, Allan S.; Sung, Hyun-Jung
2009-01-01
This study examines the utility of four indices for use in model selection with nested and nonnested polytomous item response theory (IRT) models: a cross-validation index and three information-based indices. Four commonly used polytomous IRT models are considered: the graded response model, the generalized partial credit model, the partial credit…
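Two of the information-based indices commonly used for IRT model selection are AIC and BIC; a sketch assuming the maximized log-likelihood of each fitted model is available (the abstract does not specify which three information indices were studied):

```python
import math

def aic(log_likelihood, n_params):
    """Akaike information criterion; lower values indicate a better model."""
    return -2.0 * log_likelihood + 2.0 * n_params

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion; penalizes extra parameters more
    heavily than AIC once log(n_obs) exceeds 2."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)
```

Because BIC's penalty grows with sample size, AIC and BIC can disagree, which is one reason studies compare several indices alongside cross-validation.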
U.S.A.B.I.L.I.T.Y. Framework for Older Adults.
Caboral-Stevens, Meriam; Whetsell, Martha V; Evangelista, Lorraine S; Cypress, Brigitte; Nickitas, Donna
2015-01-01
The purpose of the current study was to present a framework to determine the potential usability of health websites by older adults. A review of the literature showed a paucity of nursing theory related to the use of technology and usability, particularly in older adults. The Roy Adaptation Model, a widely used nursing theory, was chosen to provide the framework for the new model. Technology constructs from the Technology Acceptance Model and the Unified Theory of Acceptance and Use of Technology, and the behavioral control construct from the Theory of Planned Behavior, were integrated into the construction of the derived model. The Use of Technology for Adaptation by Older Adults and/or Those With Limited Literacy (U.S.A.B.I.L.I.T.Y.) Model was constructed from the integration of diverse theoretical/conceptual perspectives. The four determinants of usability in the conceptual model are (a) efficiency, (b) learnability, (c) perceived user experience, and (d) perceived control. Because of the lack of well-validated survey questionnaires to measure these determinants, a U.S.A.B.I.L.I.T.Y. Survey was developed. A panel of experts evaluated the face and content validity of the new instrument. Internal consistency of the new instrument was 0.96. Usability is key to accepting technology. The derived U.S.A.B.I.L.I.T.Y. framework could serve as a guide for nurses in the formative evaluation of technology.
NASA Astrophysics Data System (ADS)
Landahl, M. T.
1984-08-01
The fundamental ideas behind Prandtl's famous mixing length theory are discussed in the light of newer findings from experimental and theoretical research on coherent turbulence structures in the region near solid walls. A simple theoretical model for 'flat' structures is used to examine the fundamental assumptions behind Prandtl's theory. The model is validated by comparisons with conditionally sampled velocity data obtained in recent channel flow experiments. Particular attention is given to the role of pressure fluctuations on the evolution of flat eddies. The validity of Prandtl's assumption that an element of fluid retains its streamwise momentum as it is moved around by turbulence is confirmed for flat eddies. It is demonstrated that spanwise pressure gradients give rise to a contribution to the vertical displacement of a fluid element which is proportional to the distance from the wall. This contribution is particularly important for eddies that are highly elongated in the streamwise direction.
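For reference, the mixing-length closure under examination is Prandtl's classical relation for the turbulent shear stress (standard form, with κ the von Kármán constant):

```latex
% Prandtl's mixing-length closure for the turbulent shear stress:
\tau_{\mathrm{turb}}
  = \rho\,\ell^2 \left|\frac{\partial \bar{u}}{\partial y}\right|
    \frac{\partial \bar{u}}{\partial y},
\qquad \ell = \kappa y \quad \text{near the wall},\ \kappa \approx 0.41 .
```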
Validation of a condition-specific measure for women having an abnormal screening mammography.
Brodersen, John; Thorsen, Hanne; Kreiner, Svend
2007-01-01
The aim of this study is to assess the validity of a new condition-specific instrument measuring the psychosocial consequences of abnormal screening mammography (PCQ-DK33). The draft version of the PCQ-DK33 was completed on two occasions by 184 women who had received an abnormal screening mammography and on one occasion by 240 women who had received a normal screening result. Item response theory and classical test theory were used to analyze the data. Construct validity, concurrent validity, known-group validity, objectivity and reliability were established by item analysis examining the fit between item responses and Rasch models. Six dimensions covering anxiety, behavioral impact, sense of dejection, impact on sleep, breast examination, and sexuality were identified. One item belonging to the dejection dimension had uniform differential item functioning. Two items not fitting the Rasch models were retained because of high face validity. A sick-leave item added useful information when measuring side effects and socioeconomic consequences of breast cancer screening. Five "poor" items were identified and should be deleted from the final instrument. Preliminary evidence for a valid and reliable condition-specific measure for women having an abnormal screening mammography was established. The measure includes 27 "good" items measuring different attributes of the same overall latent structure: the psychosocial consequences of abnormal screening mammography.
Propagation of an ultrashort, intense laser pulse in a relativistic plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ritchie, B.; Decker, C.D.
1997-12-31
A Maxwell-relativistic fluid model is developed for the propagation of an ultrashort, intense laser pulse through an underdense plasma. The separability of plasma and optical frequencies (ω_p and ω, respectively) for small ω_p/ω is not assumed; thus the validity of multiple-scales theory (MST) can be tested. The theory is valid when ω_p/ω is of order unity, or for cases in which ω_p/ω ≪ 1 but strongly relativistic motion causes higher-order plasma harmonics to be generated which overlap the region of the first-order laser harmonic, such that MST would not be expected to be valid although its principal validity criterion ω_p/ω ≪ 1 holds.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan; Butler, Ricky W. (Technical Monitor)
2001-01-01
The purpose of this task was to provide a mechanism for theory interpretations in the Prototype Verification System (PVS) so that it is possible to demonstrate the consistency of a theory by exhibiting an interpretation that validates the axioms. The mechanization makes it possible to show that one collection of theories is correctly interpreted by another collection of theories under a user-specified interpretation for the uninterpreted types and constants. A theory instance is generated and imported, while the axiom instances are generated as proof obligations to ensure that the interpretation is valid. Interpretations can be used to show that an implementation is a correct refinement of a specification, that an axiomatically defined specification is consistent, or that an axiomatically defined specification captures its intended models. In addition, the theory parameter mechanism has been extended with a notion of theory as parameter, so that a theory instance can be given as an actual parameter to an imported theory. Theory interpretations can thus be used to refine an abstract specification or to demonstrate the consistency of an axiomatic theory. In this report we describe the mechanism in detail. This extension is part of PVS version 3.0, which will be publicly released in mid-2001.
Examining the Cultural Validity of a College Student Engagement Survey for Latinos
ERIC Educational Resources Information Center
Hernandez, Ebelia; Mobley, Michael; Coryell, Gayle; Yu, En-Hui; Martinez, Gladys
2013-01-01
Using critical race theory and a quantitative criticalist stance, this study examines the construct validity of an engagement survey, "Student Experiences in the Research University" (SERU), for Latino college students through exploratory factor analysis. Results support the principal seven-factor SERU model. However, subfactors exhibited…
ERIC Educational Resources Information Center
Hsieh, Chueh-An; von Eye, Alexander A.; Maier, Kimberly S.
2010-01-01
The application of multidimensional item response theory models to repeated observations has demonstrated great promise in developmental research. It allows researchers to take into consideration both the characteristics of item response and measurement error in longitudinal trajectory analysis, which improves the reliability and validity of the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boero, Riccardo; Edwards, Brian Keith
Economists use computable general equilibrium (CGE) models to assess how economies react and self-organize after changes in policies, technology, and other exogenous shocks. CGE models are equation-based, empirically calibrated, and inspired by Neoclassical economic theory. The focus of this work was to validate the National Infrastructure Simulation and Analysis Center (NISAC) CGE model and apply it to the problem of assessing the economic impacts of severe events. We used the 2012 Hurricane Sandy event as our validation case. In particular, this work first introduces the model and then describes the validation approach and the empirical data available for studying the event of focus. Shocks to the model are then formalized and applied. Finally, model results and limitations are presented and discussed, pointing out both the model degree of accuracy and the assessed total damage caused by Hurricane Sandy.
Evaluating the Dimensionality of Self-Determination Theory's Relative Autonomy Continuum.
Sheldon, Kennon M; Osin, Evgeny N; Gordeeva, Tamara O; Suchkov, Dmitry D; Sychev, Oleg A
2017-09-01
We conducted a theoretical and psychometric evaluation of self-determination theory's "relative autonomy continuum" (RAC), an important aspect of the theory whose validity has recently been questioned. We first derived a Comprehensive Relative Autonomy Index (C-RAI) containing six subscales and 24 items, by conducting a paired paraphrase content analysis of existing RAI measures. We administered the C-RAI to multiple U.S. and Russian samples, assessing motivation to attend class, study a major, and take responsibility. Item-level and scale-level multidimensional scaling analyses, confirmatory factor analyses, and simplex/circumplex modeling analyses reaffirmed the validity of the RAC, across multiple samples, stems, and studies. Validation analyses predicting subjective well-being and trait autonomy from the six separate subscales, in combination with various higher order composites (weighted and unweighted), showed that an aggregate unweighted RAI score provides the most unbiased and efficient indicator of the overall quality of motivation within the behavioral domain being assessed.
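The relative autonomy index at issue can be sketched numerically. The weighted form below uses the traditional four-subscale weights (-2, -1, +1, +2) from the classic RAI literature, not necessarily the weights examined for the six-subscale C-RAI; the unweighted form mirrors the aggregate the study's conclusion favors:

```python
def rai_weighted(external, introjected, identified, intrinsic):
    """Classic four-subscale relative autonomy index with the traditional
    weights (-2, -1, +1, +2); controlled motives count negatively."""
    return -2 * external - 1 * introjected + 1 * identified + 2 * intrinsic

def rai_unweighted(controlled, autonomous):
    """Unweighted aggregate: mean of the autonomous subscale scores minus
    the mean of the controlled ones."""
    return sum(autonomous) / len(autonomous) - sum(controlled) / len(controlled)
```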
Construct Validity: Advances in Theory and Methodology
Strauss, Milton E.; Smith, Gregory T.
2008-01-01
Measures of psychological constructs are validated by testing whether they relate to measures of other constructs as specified by theory. Each test of relations between measures reflects on the validity of both the measures and the theory driving the test. Construct validation concerns the simultaneous process of measure and theory validation. In this chapter, we review the recent history of validation efforts in clinical psychological science that has led to this perspective, and we review five recent advances in validation theory and methodology of importance for clinical researchers. These are: the emergence of nonjustificationist philosophy of science; an increasing appreciation for theory and the need for informative tests of construct validity; valid construct representation in experimental psychopathology; the need to avoid representing multidimensional constructs with a single score; and the emergence of effective new statistical tools for the evaluation of convergent and discriminant validity. PMID:19086835
Chen, Zhao; Cao, Yanfeng; He, Shuaibing; Qiao, Yanjiang
2018-01-01
Action ("gongxiao" in Chinese) of traditional Chinese medicine (TCM) is the high-level recapitulation of therapeutic and health-preserving effects under the guidance of TCM theory. TCM-defined herbal properties ("yaoxing" in Chinese) were used in this research. A TCM herbal property (TCM-HP) is a high-level generalization and summary of actions, both of which come from more than two thousand years of effective clinical practice in China. However, the specific relationship between TCM-HPs and actions of TCM is complex and unclear from a scientific perspective. Research on this relationship is conducive to expounding the connotation of TCM-HP theory and is of great significance for the development of the theory. One hundred and thirty-three herbs, including 88 heat-clearing herbs (HCHs) and 45 blood-activating stasis-resolving herbs (BASRHs), were collected from reputable TCM literature, and their corresponding TCM-HP/action information was collected from the Chinese pharmacopoeia (2015 edition). The Kennard-Stone (K-S) algorithm was used to split the 133 herbs into 100 calibration samples and 33 validation samples. Then, machine learning methods including the support vector machine (SVM) and k-nearest neighbor (kNN), and deep learning methods including the deep belief network (DBN) and convolutional neural network (CNN), were adopted to develop action classification models based on TCM-HP theory. To ensure robustness, these four classification methods were evaluated using tenfold cross validation and 20 external validation samples for prediction. As a result, 72.7-100% of the 33 validation samples, including 17 HCHs and 16 BASRHs, were correctly predicted by these four types of methods. Both the DBN and CNN methods gave the best results, and their sensitivity, specificity, precision, and accuracy were all 100.00%.
In particular, the predicted results on the external validation set showed that the deep learning methods (DBN, CNN) outperformed the traditional machine learning methods (kNN, SVM) in sensitivity, specificity, precision, and accuracy. Moreover, the distribution patterns of the TCM-HPs of HCHs and BASRHs were analyzed to detect the featured TCM-HPs of these two types of herbs: the featured TCM-HPs of HCHs were cold, bitter, and liver and stomach meridians entered, while those of BASRHs were warm, bitter and pungent, and liver meridian entered. The deep learning classification methods showed better generalization ability and accuracy when predicting the heat-clearing and blood-activating stasis-resolving actions based on TCM-HP theory. These methods would also help improve our understanding of the relationship between herbal property and action, as well as enrich and develop TCM-HP theory scientifically.
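The classification setup described above can be sketched with a minimal k-nearest-neighbor classifier. This is an illustration only: the binary herbal-property encoding (cold, warm, bitter, pungent, liver meridian, stomach meridian) and the tiny training set are hypothetical, not the paper's actual 133-herb data.

```python
# Minimal kNN sketch of the TCM-HP -> action classification task.
# Feature vectors and labels are hypothetical illustrations.
from collections import Counter

def knn_predict(train, query, k=3):
    """Predict a label by majority vote among the k nearest training vectors."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(vec, query)), label)
        for vec, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical herbs encoded as (cold, warm, bitter, pungent, liver, stomach):
# HCH pattern ~ cold/bitter/liver/stomach; BASRH pattern ~ warm/bitter/pungent/liver.
train = [
    ((1, 0, 1, 0, 1, 1), "heat-clearing"),
    ((1, 0, 1, 0, 1, 0), "heat-clearing"),
    ((1, 0, 1, 1, 0, 1), "heat-clearing"),
    ((0, 1, 1, 1, 1, 0), "blood-activating"),
    ((0, 1, 0, 1, 1, 0), "blood-activating"),
    ((0, 1, 1, 1, 0, 0), "blood-activating"),
]
print(knn_predict(train, (1, 0, 1, 0, 1, 1)))  # heat-clearing
```

The paper's deep learning models (DBN, CNN) replace this distance-based vote with learned feature hierarchies, which the authors report generalized better.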
Cosmological constraints on Brans-Dicke theory.
Avilez, A; Skordis, C
2014-07-04
We report strong cosmological constraints on the Brans-Dicke (BD) theory of gravity using cosmic microwave background data from Planck. We consider two types of models. First, the initial condition of the scalar field is fixed to give the same effective gravitational strength Geff today as the one measured on Earth, GN. In this case, the BD parameter ω is constrained to ω>692 at the 99% confidence level, an order of magnitude improvement over previous constraints. In the second type, the initial condition for the scalar is a free parameter leading to a somewhat stronger constraint of ω>890, while Geff is constrained to 0.981
Analysis of general power counting rules in effective field theory
Gavela, Belen; Jenkins, Elizabeth E.; Manohar, Aneesh V.; ...
2016-09-02
We derive the general counting rules for a quantum effective field theory (EFT) in d dimensions. The rules are valid for strongly and weakly coupled theories, and they predict that all kinetic energy terms are canonically normalized. They determine the energy dependence of scattering cross sections in the range of validity of the EFT expansion. We show that the size of the cross sections is controlled by the Λ power counting of EFT, not by chiral counting, even for chiral perturbation theory (χPT). The relation between Λ and f is generalized to d dimensions. We show that the naive dimensional analysis 4π counting is related to ℏ counting. The EFT counting rules are applied to χPT, low-energy weak interactions, Standard Model EFT and the non-trivial case of Higgs EFT.
Influence of Learner Beliefs and Gender on the Motivating Power of L2 Selves
ERIC Educational Resources Information Center
Yashima, Tomoko; Nishida, Rieko; Mizumoto, Atsushi
2017-01-01
This study investigates 3 unexplored issues regarding Second Language (L2) Motivational Self System theory. It further validates the theory using multiple structural equation modeling (SEM) along with a procedure comparing the strength of corresponding paths. Japanese university freshmen (N = 2,631) responded to a questionnaire and took the…
Teachers and Technology: Development of an Extended Theory of Planned Behavior
ERIC Educational Resources Information Center
Teo, Timothy; Zhou, Mingming; Noyes, Jan
2016-01-01
This study tests the validity of an extended theory of planned behaviour (TPB) to explain teachers' intention to use technology for teaching and learning. Five hundred and ninety two participants completed a survey questionnaire measuring their responses to eight constructs which form an extended TPB. Using structural equation modelling, the…
ERIC Educational Resources Information Center
Wyker, Brett A.; Jordan, Patricia; Quigley, Danielle L.
2012-01-01
Objective: Application of the Transtheoretical Model (TTM) to Supplemental Nutrition Assistance Program Education (SNAP-Ed) evaluation and development and validation of an evaluation tool used to measure TTM constructs is described. Methods: Surveys were collected from parents of children receiving food at Summer Food Service Program sites prior…
NASA Astrophysics Data System (ADS)
Choji, Niri Martha; Sek, Siok Kun
2017-11-01
The purchasing power parity (PPP) theory says that the exchange rate between two nations ought to equal the ratio of their aggregate price levels. For more than a decade, there has been substantial interest in testing the validity of PPP empirically. This paper performs a series of tests to see whether PPP is valid for the ASEAN-5 nations over the period 2000-2016 using monthly data. For this purpose, we conducted four different tests of stationarity, two cointegration tests (Pedroni and Westerlund), and also estimated a VAR model. The stationarity (unit root) tests reveal that the variables are not stationary in levels but are stationary at first difference. The cointegration test results did not reject the H0 of no cointegration, implying the absence of a long-run association among the variables, and the results of the VAR model did not reveal a strong short-run relationship. Based on the data, we therefore conclude that PPP is not valid in the long or short run for the ASEAN-5 during 2000-2016.
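The absolute-PPP relation being tested above can be written down directly: the PPP-implied exchange rate is the price-level ratio, and the (log) deviation from it is the quantity whose stationarity the unit-root and cointegration tests probe. A minimal sketch, using hypothetical price-index values rather than ASEAN-5 data:

```python
# Sketch of the absolute purchasing power parity (PPP) relation.
# Price levels below are hypothetical index values, not ASEAN-5 data.
import math

def ppp_implied_rate(domestic_price_level, foreign_price_level):
    """Exchange rate (domestic currency per foreign unit) implied by PPP."""
    return domestic_price_level / foreign_price_level

def ppp_deviation(observed_rate, domestic_price_level, foreign_price_level):
    """Log deviation of the observed rate from its PPP-implied value.

    Zero means absolute PPP holds exactly; PPP tests ask whether this
    deviation is stationary (mean-reverting) over time.
    """
    return math.log(observed_rate) - math.log(
        ppp_implied_rate(domestic_price_level, foreign_price_level)
    )

print(ppp_implied_rate(132.0, 110.0))  # 1.2
print(ppp_deviation(1.2, 132.0, 110.0))  # 0.0 (PPP holds exactly here)
```

In the paper's setting, a time series of such deviations failing the stationarity and cointegration tests is what leads to the conclusion that PPP does not hold.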
Liu, Hong; Zhu, Jingping; Wang, Kai
2015-08-24
The geometrical attenuation model given by Blinn has been widely used in geometrical optics bidirectional reflectance distribution function (BRDF) models. Blinn's geometrical attenuation model, based on the symmetrical V-groove assumption and ray scalar theory, causes obvious inaccuracies in BRDF curves and neglects the effects of polarization. To address these issues, a modified polarized geometrical attenuation model based on random surface microfacet theory is presented, combining masking and shadowing effects with polarization effects. The p-polarized, s-polarized and unpolarized geometrical attenuation functions are given as separate expressions and are validated against experimental data from two samples. The results show that the modified polarized geometrical attenuation function achieves better physical rationality, improves the precision of the BRDF model, and widens its applicability to different polarization states.
Interpersonal Harmony and Conflict for Chinese People: A Yin-Yang Perspective.
Huang, Li-Li
2016-01-01
This article provides an overview on a series of original studies conducted by the author. The aim here is to present the ideas that the author reconstructed, based on the dialectics of harmonization, regarding harmony and conflict embodied in traditional Chinese thought, and to describe how a formal psychological theory/model on interpersonal harmony and conflict was developed based on the Yin-Yang perspective. The paper also details how essential theories on interpersonal harmony and conflict were constructed under this formal model by conducting a qualitative study involving in-depth interviews with 30 adults. Psychological research in Western society has, intriguingly, long been focused more on interpersonal conflict than on interpersonal harmony. By contrast, the author's work started from the viewpoint of a materialist conception of history and dialectics of harmonization in order to reinterpret traditional Chinese thought. Next, a "dynamic model of interpersonal harmony and conflict" was developed, as a formal psychological theory, based on the real-virtual notions in the Yin-Yang perspective. Under this model, interpersonal harmony and conflict can be classified into genuine versus superficial harmony and authentic versus virtual focus conflict, and implicit/hidden conflict is regarded as superficial harmony. Subsequently, the author conducted a series of quantitative studies on interpersonal harmony and conflict within parent-child, supervisor-subordinate, and friend-friend relationships in order to verify the construct validity and the predictive validity of the dynamic model of interpersonal harmony and conflict. The claim presented herein is that Chinese traditional thought and the psychological theory/model based on the Yin-Yang perspective can be combined. Accordingly, by combining qualitative and quantitative empirical research, the relative substantial theory can be developed and the concepts can be validated. 
Thus, this work represents the realization of a series of modern Chinese indigenous psychological research studies rooted in traditional cultural thought and the Yin-Yang perspective. The work also mirrors the current conflict-management research that has incorporated the Chinese notion of harmony and adopted the Yin-Yang perspective on culture.
Self-determination, smoking, diet and health.
Williams, Geoffrey C; Minicucci, Daryl S; Kouides, Ruth W; Levesque, Chantal S; Chirkov, Valery I; Ryan, Richard M; Deci, Edward L
2002-10-01
A Clinical Trial will test (1) a Self-Determination Theory (SDT) model of maintained smoking cessation and diet improvement, and (2) an SDT intervention, relative to usual care, for facilitating maintained behavior change and decreasing depressive symptoms for those who quit smoking. SDT is the only empirically derived theory which emphasizes patient autonomy and has a validated measure for each of its constructs, and this is the first trial to evaluate an SDT intervention. Adult smokers will be stratified for whether they are at National Cholesterol Education Program (1996) recommended goal for low-density lipoprotein cholesterol (LDL-C). Those with elevated LDL-C will be studied for diet improvement as well as smoking cessation. Six-month interventions involve a behavior-change counselor using principles of SDT to facilitate autonomous motivation and perceived competence for healthier behaving. Cotinine-validated smoking cessation and LDL-C-validated dietary recall of reduced fat intake, as well as depressive symptoms, will be assessed at 6 and 18 months. Structural equation modeling will test the model for both behaviors within the intervention and usual-care conditions.
Comparison of Geant4 multiple Coulomb scattering models with theory for radiotherapy protons
NASA Astrophysics Data System (ADS)
Makarova, Anastasia; Gottschalk, Bernard; Sauerwein, Wolfgang
2017-08-01
Usually, Monte Carlo models are validated against experimental data. However, models of multiple Coulomb scattering (MCS) in the Gaussian approximation are exceptional in that we have theories which are probably more accurate than the experiments which have, so far, been done to test them. In problems directly sensitive to the distribution of angles leaving the target, the relevant theory is the Molière/Fano/Hanson variant of Molière theory (Gottschalk et al 1993 Nucl. Instrum. Methods Phys. Res. B 74 467-90). For transverse spreading of the beam in the target itself, the theory of Preston and Koehler (Gottschalk (2012 arXiv:1204.4470)) holds. Therefore, in this paper we compare Geant4 simulations, using the Urban and Wentzel models of MCS, with theory rather than experiment, revealing trends which would otherwise be obscured by experimental scatter. For medium-energy (radiotherapy) protons, and low-Z (water-like) target materials, Wentzel appears to be better than Urban in simulating the distribution of outgoing angles. For beam spreading in the target itself, the two models are essentially equal.
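A widely used Gaussian approximation related to the Molière theory discussed above is Highland's parameterization of the characteristic multiple-scattering angle. The formula below is the standard Particle Data Group form from the broader MCS literature, not a result of this paper, and the beam values are illustrative only.

```python
# Sketch: Highland's parameterization of the RMS multiple-scattering angle,
# a Gaussian approximation to Moliere theory (standard PDG form, not this
# paper's result). Beam parameters below are illustrative.
import math

def highland_theta0(beta, pc_mev, thickness_over_x0, z=1):
    """Characteristic projected scattering angle (radians) for a particle of
    charge z traversing a slab of thickness x/X0 radiation lengths."""
    t = thickness_over_x0
    return (13.6 / (beta * pc_mev)) * z * math.sqrt(t) * (1 + 0.038 * math.log(t))

# Roughly a 160 MeV therapy proton: pc ~ 571 MeV, beta ~ 0.52 (illustrative)
theta0 = highland_theta0(beta=0.52, pc_mev=571.0, thickness_over_x0=0.01)
print(f"{theta0 * 1e3:.2f} mrad")  # a few mrad for a thin water-like slab
```

Comparisons like the one in this paper ask how closely a Monte Carlo MCS model reproduces such theory-level predictions for the outgoing angular distribution.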
Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-01-01
Background The eHealth Literacy Scale (eHEALS) is a tool to assess consumers’ comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. Objective The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Methods Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. Results CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. Conclusions The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers’ eHealth literacy. PMID:28400356
Harradine, Paul; Gates, Lucy; Bowen, Catherine
2018-03-01
The use of subtalar joint neutral (STJN) in the assessment and treatment of foot-related musculoskeletal symptomology is common in daily practice and still widely taught. The main pioneer of this theory was Dr Merton L. Root, and it has been labeled with a variety of names: "the foot morphology theory," "the subtalar joint neutral theory," or simply "Rootian theory" or "Root model." The theory's core concepts still underpin a common approach to musculoskeletal assessment of the foot, as well as the consequent design of foot orthoses. The available literature continues to point to Dr Root's theory as the most prevalently utilized. Concurrently, the worth of this theory has been challenged due to its poor reliability and limited external validity. This Viewpoint reviews the main clinical areas of the STJN theory, and concludes with a possible explanation and concerns for its ongoing use. To support our view, we will discuss (1) historical inaccuracies, (2) challenges with reliability, and (3) concerns with validity. J Orthop Sports Phys Ther 2018;48(3):130-132. doi:10.2519/jospt.2018.0604.
The dominance of introspective measures and what this implies: The example of environmental attitude
Otto, Siegmar; Kröhne, Ulf; Richter, David
2018-01-01
The behavioral sciences, including most of psychology, seek to explain and predict behavior with the help of theories and models that involve concepts (e.g., attitudes) that are subsequently translated into measures. Currently, some subdisciplines such as social psychology focus almost exclusively on measures that demand reflection or even introspection when administered to persons. We argue that such a focus hinders progress in explaining behavior. One major reason is that such an exclusive focus on reflections results in common method bias, which then produces spurious relations, or in other words, low discriminant validity. Without the valid measurement of theoretical concepts, theoretical assumptions cannot be tested, and hence, theory development will be hampered. We argue that the use of a greater variety of methods would reduce these problems and would in turn foster theory building. Using a representative sample of N = 472 participants (age: M = 51.0, SD = 17.7; 54% female), we compared the validity of a classical introspective attitude measure (i.e., the New Ecological Paradigm) with that of an alternative attitude measure (i.e., the General Ecological Behavior scale). The latter measure, which was based on self-reported behavior, showed substantially better validity that we argue could aid theory development.
Medhi, Amal; Shenoy, Vijay B
2012-09-05
We develop a continuum theory to model low energy excitations of a generic four-band time reversal invariant electronic system with boundaries. We propose a variational energy functional for the wavefunctions which allows us to derive natural boundary conditions valid for such systems. Our formulation is particularly suited for developing a continuum theory of the protected edge/surface excitations of topological insulators both in two and three dimensions. By a detailed comparison of our analytical formulation with tight binding calculations of ribbons of topological insulators modelled by the Bernevig-Hughes-Zhang (BHZ) Hamiltonian, we show that the continuum theory with a natural boundary condition provides an appropriate description of the low energy physics.
Theory of low frequency noise transmission through turbines
NASA Technical Reports Server (NTRS)
Matta, R. K.; Mani, R.
1979-01-01
Improvements of the existing theory of low frequency noise transmission through turbines and development of a working prediction tool are described. The existing actuator-disk model and a new finite-chord model were utilized in an analytical study. The interactive effect of adjacent blade rows, higher order spinning modes, blade-passage shocks, and duct area variations were considered separately. The improved theory was validated using the data acquired in an earlier NASA program. Computer programs incorporating the improved theory were produced for transmission loss prediction purposes. The programs were exercised parametrically and charts constructed to define approximately the low frequency noise transfer through turbines. The loss through the exhaust nozzle and flow(s) was also considered.
The Elastic Behaviour of Sintered Metallic Fibre Networks: A Finite Element Study by Beam Theory
Bosbach, Wolfram A.
2015-01-01
Background The finite element method has complemented research in the field of network mechanics in the past years in numerous studies about various materials. Numerical predictions and the planning efficiency of experimental procedures are two of the motivational aspects for these numerical studies. The widespread availability of high performance computing facilities has been the enabler for the simulation of sufficiently large systems. Objectives and Motivation In the present study, finite element models were built for sintered, metallic fibre networks and validated by previously published experimental stiffness measurements. The validated models were the basis for predictions about so far unknown properties. Materials and Methods The finite element models were built by transferring previously published skeletons of fibre networks into finite element models. Beam theory was applied as a simplification method. Results and Conclusions The obtained material stiffness isn’t a constant but rather a function of variables such as sample size and boundary conditions. Beam theory offers an efficient finite element method for the simulated fibre networks. The experimental results can be approximated by the simulated systems. Two worthwhile aspects for future work will be the influence of size and shape and the mechanical interaction with matrix materials. PMID:26569603
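The beam-theory simplification mentioned above rests on textbook Euler-Bernoulli relations for individual fibre struts. A minimal sketch of one such relation, the tip stiffness of a cantilevered fibre segment; the fibre modulus and dimensions below are illustrative assumptions, not the paper's network data:

```python
# Sketch: Euler-Bernoulli bending stiffness of a single cylindrical fibre
# strut modelled as a cantilever (textbook beam theory; values illustrative).
import math

def cylinder_second_moment(d):
    """Second moment of area of a circular cross-section, I = pi*d^4/64."""
    return math.pi * d ** 4 / 64.0

def cantilever_stiffness(E, d, L):
    """Tip stiffness k = 3*E*I/L^3 of a cantilevered fibre of length L."""
    return 3.0 * E * cylinder_second_moment(d) / L ** 3

# Hypothetical steel fibre: E = 200 GPa, d = 100 um, L = 1 mm
k = cantilever_stiffness(E=200e9, d=100e-6, L=1e-3)
print(f"{k:.1f} N/m")
```

Representing each strut by such a one-dimensional beam element, rather than a full 3-D solid mesh, is what makes networks of many fibres computationally tractable.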
Linear and nonlinear acoustic wave propagation in the atmosphere
NASA Technical Reports Server (NTRS)
Hariharan, S. I.; Yu, Ping
1988-01-01
The investigation of acoustic wave propagation theory and its numerical implementation for an isothermal atmosphere is described. A one-dimensional model to validate an asymptotic theory and a three-dimensional model to relate to a realistic situation are considered. In addition, nonlinear wave propagation and its numerical treatment are included. It is known that gravitational effects play a crucial role in low frequency acoustic wave propagation. Such waves propagate large distances and, as such, the numerical treatment of those problems becomes difficult in terms of posing boundary conditions which are valid for all frequencies.
Developing, Testing, and Using Theoretical Models for Promoting Quality in Education
ERIC Educational Resources Information Center
Creemers, Bert; Kyriakides, Leonidas
2015-01-01
This paper argues that the dynamic model of educational effectiveness can be used to establish stronger links between educational effectiveness research (EER) and school improvement. It provides research evidence to support the validity of the model. Thus, the importance of using the dynamic model to establish an evidence-based and theory-driven…
Application of Game Theory to Improve the Defense of the Smart Grid
2012-03-01
…systems. In this environment, developers assumed deterministic communications mediums rather than the “best effort” models provided in most modern… models or computational models to validate the SPSs design. Finally, the study reveals concerns about the performance of load rejection schemes
Subjective Expected Utility: A Model of Decision-Making.
ERIC Educational Resources Information Center
Fischoff, Baruch; And Others
1981-01-01
Outlines a model of decision making known to researchers in the field of behavioral decision theory (BDT) as subjective expected utility (SEU). The descriptive and predictive validity of the SEU model, probability and values assessment using SEU, and decision contexts are examined, and a 54-item reference list is provided. (JL)
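The SEU rule the abstract refers to has a standard formulation in the behavioral decision theory literature: an option's subjective expected utility is the probability-weighted sum of the utilities of its outcomes, and the decision maker chooses the option with the highest SEU. A minimal sketch with hypothetical probabilities and utilities:

```python
# Sketch of the subjective expected utility (SEU) decision rule.
# Probabilities and utilities below are hypothetical.

def seu(option):
    """SEU = sum over outcomes of subjective probability x utility."""
    return sum(p * u for p, u in option)

options = {
    "sure_thing": [(1.0, 50.0)],
    "gamble":     [(0.5, 120.0), (0.5, 0.0)],
}
best = max(options, key=lambda name: seu(options[name]))
print(best)  # gamble (SEU 60 > 50)
```

Descriptive tests of SEU, of the kind the reference list surveys, ask whether people's observed choices actually maximize such a weighted sum.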
The Twin-Cycle Experiential Learning Model: Reconceptualising Kolb's Theory
ERIC Educational Resources Information Center
Bergsteiner, Harald; Avery, Gayle C.
2014-01-01
Experiential learning styles remain popular despite criticisms about their validity, usefulness, fragmentation and poor definitions and categorisation. After examining four prominent models and building on Bergsteiner, Avery, and Neumann's suggestion of a dual cycle, this paper proposes a twin-cycle experiential learning model to overcome…
ERIC Educational Resources Information Center
Wark, David M.
The initial means for arriving at a dynamic model of reading were suggested in the form of "behaviormetric" research. A review of valid reading models noted those of Smith and Carrigan, Delacato, and Holmes as eminent, and it distinguished between models based on concrete evidence and metaphors of the reading process which are basically…
Einstein’s gravity from a polynomial affine model
NASA Astrophysics Data System (ADS)
Castillo-Felisola, Oscar; Skirzewski, Aureliano
2018-03-01
We show that the effective field equations for a recently formulated polynomial affine model of gravity, in the sector of a torsion-free connection, accept general Einstein manifolds—with or without cosmological constant—as solutions. Moreover, the effective field equations are partially those obtained from a gravitational Yang–Mills theory known as Stephenson–Kilmister–Yang theory. Additionally, we find a generalization of a minimally coupled massless scalar field in General Relativity within a ‘minimally’ coupled scalar field in this affine model. Finally, we present a brief (perturbative) analysis of the propagators of the gravitational theory, and count the degrees of freedom. For completeness, we prove that a Birkhoff-like theorem is valid for the analyzed sector.
ERIC Educational Resources Information Center
Kuhl, Julius
1978-01-01
A formal elaboration of the original theory of achievement motivation (Atkinson, 1957; Atkinson & Feather, 1966) is proposed that includes personal standards as determinants of motivational tendencies. The results of an experiment are reported that examines the validity of some of the implications of the elaborated model proposed here. (Author/RK)
Optimal fiber design for large capacity long haul coherent transmission [Invited].
Hasegawa, Takemi; Yamamoto, Yoshinori; Hirano, Masaaki
2017-01-23
Fiber figure of merit (FOM), derived from the GN-model theory and validated by several experiments, can predict improvement in OSNR or transmission distance using advanced fibers. We review the FOM theory and present design results of optimal fiber for large capacity long haul transmission, showing variation in design results according to system configuration.
On the dangers of model complexity without ecological justification in species distribution modeling
David M. Bell; Daniel R. Schlaepfer
2016-01-01
Although biogeographic patterns are the product of complex ecological processes, the increasing complexity of correlative species distribution models (SDMs) is not always motivated by ecological theory, but by model fit. The validity of model projections, such as shifts in a species' climatic niche, becomes questionable particularly during extrapolations, such as for...
Bonomo, Anthony L; Isakson, Marcia J; Chotiros, Nicholas P
2015-04-01
The finite element method is used to model acoustic scattering from rough poroelastic surfaces. Both monostatic and bistatic scattering strengths are calculated and compared with three analytic models: perturbation theory, the Kirchhoff approximation, and the small-slope approximation. It is found that the small-slope approximation is in very close agreement with the finite element results for all cases studied and that perturbation theory and the Kirchhoff approximation can be considered valid in those instances where their predictions match those given by the small-slope approximation.
Sun, Xiaoling; Kaur, Jasleen; Milojević, Staša; Flammini, Alessandro; Menczer, Filippo
2013-01-01
The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? No quantitative model to date allows us to validate competing theories on the different roles of endogenous processes, such as social collaborations, and exogenous events, such as scientific discoveries. Here we propose an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several "science of science" theories exist, this is the first account for the emergence of disciplines that is validated on the basis of empirical data.
NASA Astrophysics Data System (ADS)
Kruk, D.; Earle, K. A.; Mielczarek, A.; Kubica, A.; Milewska, A.; Moscicki, J.
2011-12-01
A general theory of lineshapes in nuclear quadrupole resonance (NQR), based on the stochastic Liouville equation, is presented. The description is valid for arbitrary motional conditions (particularly beyond the valid range of perturbation approaches) and interaction strengths. It can be applied to the computation of NQR spectra for any spin quantum number and for any applied magnetic field. The treatment presented here is an adaptation of the "Swedish slow motion theory," [T. Nilsson and J. Kowalewski, J. Magn. Reson. 146, 345 (2000), 10.1006/jmre.2000.2125] originally formulated for paramagnetic systems, to NQR spectral analysis. The description is formulated for simple (Brownian) diffusion, free diffusion, and jump diffusion models. The two latter models account for molecular cooperativity effects in dense systems (such as liquids of high viscosity or molecular glasses). The sensitivity of NQR slow motion spectra to the mechanism of the motional processes modulating the nuclear quadrupole interaction is discussed.
Ashton, Michael C; Lee, Kibeom; de Vries, Reinout E
2014-05-01
We review research and theory on the HEXACO personality dimensions of Honesty-Humility (H), Agreeableness (A), and Emotionality (E), with particular attention to the following topics: (1) the origins of the HEXACO model in lexical studies of personality structure, and the content of the H, A, and E factors in those studies; (2) the operationalization of the H, A, and E factors in the HEXACO Personality Inventory-Revised; (3) the construct validity of self-reports on scales measuring the H factor; (4) the theoretical distinction between H and A; (5) similarity and assumed similarity between social partners in personality, with a focus on H and A; (6) the extent to which H (and A and E) variance is represented in instruments assessing the "Five-Factor Model" of personality; and (7) the relative validity of scales assessing the HEXACO and Five-Factor Model dimensions in predicting criteria conceptually relevant to H, A, and E.
ERIC Educational Resources Information Center
Van Orden, Kimberly A.; Cukrowicz, Kelly C.; Witte, Tracy K.; Joiner, Thomas E., Jr.
2012-01-01
The present study examined the psychometric properties and construct validity of scores derived from the Interpersonal Needs Questionnaire (INQ) using latent variable modeling with 5 independent samples varying in age and level of psychopathology. The INQ was derived from the interpersonal theory of suicide and was developed to measure thwarted…
Distance Education in Taiwan: A Model Validated.
ERIC Educational Resources Information Center
Shih, Mei-Yau; Zvacek, Susan M.
The Triad Perspective Model of Distance Education (TPMDE) guides researchers in developing research questions, gathering data, and producing a comprehensive description of a distance education program. It was developed around three theoretical perspectives: (1) curriculum development theory (Tyler's four questions, 1949); (2) systems theory…
Validity of Sensory Systems as Distinct Constructs
Su, Chia-Ting
2014-01-01
This study investigated the validity of sensory systems as distinct measurable constructs as part of a larger project examining Ayres’s theory of sensory integration. Confirmatory factor analysis (CFA) was conducted to test whether sensory questionnaire items represent distinct sensory system constructs. Data were obtained from clinical records of two age groups, 2- to 5-yr-olds (n = 231) and 6- to 10-yr-olds (n = 223). With each group, we tested several CFA models for goodness of fit with the data. The accepted model was identical for each group and indicated that tactile, vestibular–proprioceptive, visual, and auditory systems form distinct, valid factors that are not age dependent. In contrast, alternative models that grouped items according to sensory processing problems (e.g., over- or underresponsiveness within or across sensory systems) did not yield valid factors. Results indicate that distinct sensory system constructs can be measured validly using questionnaire data. PMID:25184467
A Navier-Stokes phase-field crystal model for colloidal suspensions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Praetorius, Simon, E-mail: simon.praetorius@tu-dresden.de; Voigt, Axel, E-mail: axel.voigt@tu-dresden.de
2015-04-21
We develop a fully continuous model for colloidal suspensions with hydrodynamic interactions. The Navier-Stokes Phase-Field Crystal model combines ideas of dynamic density functional theory with particulate flow approaches and is derived in detail and related to other dynamic density functional theory approaches with hydrodynamic interactions. The derived system is numerically solved using adaptive finite elements and is used to analyze colloidal crystallization in flowing environments demonstrating a strong coupling in both directions between the crystal shape and the flow field. We further validate the model against other computational approaches for particulate flow systems for various colloidal sedimentation problems.
Zielinski, Michal W; McGann, Locksley E; Nychka, John A; Elliott, Janet A W
2017-11-22
The prediction of nonideal chemical potentials in aqueous solutions is important in fields such as cryobiology, where models of water and solute transport (that is, osmotic transport) are used to help develop cryopreservation protocols and where solutions contain many varied solutes and are generally highly concentrated and thus thermodynamically nonideal. In this work, we further the development of a nonideal multisolute solution theory that has found application across a broad range of aqueous systems. This theory is based on the osmotic virial equation and does not depend on multisolute data. Specifically, we derive herein a novel solute chemical potential equation that is thermodynamically consistent with the existing model, and we establish the validity of a grouped solute model for the intracellular space. With this updated solution theory, it is now possible to model cellular osmotic behavior in nonideal solutions containing multiple permeating solutes, such as those commonly encountered by cells during cryopreservation. In addition, because we show here that for the osmotic virial equation the grouped solute approach is mathematically equivalent to treating each solute separately, multisolute solutions in other applications with fixed solute mass ratios can now be treated rigorously with such a model, even when all of the solutes cannot be enumerated.
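As a rough illustration of the osmotic-virial approach described above, a minimal sketch follows; the coefficients are purely illustrative placeholders, not fitted virial coefficients from the cited work, and the grouped-solute comment paraphrases the abstract's claim.

```python
def osmolality(m, coeffs=(1.0, 0.044)):
    """Osmotic virial expansion in molality m (mol solute / kg solvent):
    osmolality = B1*m + B2*m**2 + ...
    The coefficients here are illustrative, not fitted values."""
    return sum(B * m ** (i + 1) for i, B in enumerate(coeffs))

# Per the grouped-solute result described above, a mixture with fixed
# solute mass ratios can be treated as a single effective solute with
# its own (grouped) virial coefficients in the same expansion.
```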
Liou, Shwu-Ru
2009-01-01
To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.
Morton, Katie L; Barling, Julian; Rhodes, Ryan E; Mâsse, Louise C; Zumbo, Bruno D; Beauchamp, Mark R
2011-10-01
We draw upon transformational leadership theory to develop an instrument to measure transformational parenting for use with adolescents. First, potential items were generated that were developmentally appropriate, and evidence for content validity was provided through the use of focus groups with parents and adolescents. We subsequently provide evidence for several aspects of construct validity of measures derived from the Transformational Parenting Questionnaire (TPQ). Data were collected from 857 adolescents (mean age = 14.70 years), who rated the behaviors of their mothers and fathers. The results provided support for a second-order measurement model of transformational parenting. In addition, positive relationships between mothers' and fathers' transformational parenting behaviors, adolescents' self-regulatory efficacy for physical activity and healthy eating, and life satisfaction were found. The results of this research support the application of transformational leadership theory to parenting behaviors, as well as the construct validity of measures derived from the TPQ.
Ardestani, M S; Niknami, S; Hidarnia, A; Hajizadeh, E
2016-08-18
This research examined the validity and reliability of a researcher-developed questionnaire based on Social Cognitive Theory (SCT) to assess the physical activity behaviour of Iranian adolescent girls (SCT-PAIAGS). Psychometric properties of the SCT-PAIAGS were assessed by determining its face validity, content and construct validity, as well as its reliability. In order to evaluate factor structure, cross-sectional research was conducted on 400 high-school girls in Tehran. The content validity index, content validity ratio and impact score for the SCT-PAIAGS varied between 0.97-1, 0.91-1 and 4.6-4.9, respectively. Confirmatory factor analysis supported a six-factor structure comprising self-efficacy, self-regulation, family support, friend support, outcome expectancy and self-efficacy for overcoming impediments. Factor loadings, t-values and fit indices showed that the SCT model fitted the data. Cronbach's α-coefficient ranged from 0.78 to 0.85 and the intraclass correlation coefficient from 0.73 to 0.90.
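For context, the content validity ratio reported above is conventionally computed with Lawshe's formula; whether the authors used this exact variant is an assumption, but the standard computation is a one-liner:

```python
def content_validity_ratio(n_essential, n_experts):
    # Lawshe's CVR = (n_e - N/2) / (N/2), ranging from -1 to +1,
    # where n_e is the number of panellists rating an item "essential"
    # and N is the total number of panellists.
    half = n_experts / 2
    return (n_essential - half) / half
```

A CVR of 1.0 means every expert rated the item essential; 0.0 means exactly half did.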
Asakura, Nobuhiko; Inui, Toshio
2016-01-01
Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities. PMID:28082941
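The multiplicative prediction stated in this abstract reduces to a one-line computation; this is a sketch of the stated relationship only, not the authors' full causal Bayesian network.

```python
def predicted_false_belief(p_diverse_beliefs, p_knowledge_access):
    # Model prediction: a child's proportion correct on the false belief
    # task is approximately the product of the proportions correct on the
    # two earlier-developing theory-of-mind tasks (diverse beliefs and
    # knowledge access).
    return p_diverse_beliefs * p_knowledge_access
```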
Macroscopic Fluctuation Theory for Stationary Non-Equilibrium States
NASA Astrophysics Data System (ADS)
Bertini, L.; de Sole, A.; Gabrielli, D.; Jona-Lasinio, G.; Landim, C.
2002-05-01
We formulate a dynamical fluctuation theory for stationary non-equilibrium states (SNS) which is tested explicitly in stochastic models of interacting particles. In our theory a crucial role is played by the time reversed dynamics. Within this theory we derive the following results: the modification of the Onsager-Machlup theory in the SNS; a general Hamilton-Jacobi equation for the macroscopic entropy; a non-equilibrium, nonlinear fluctuation dissipation relation valid for a wide class of systems; an H theorem for the entropy. We discuss in detail two models of stochastic boundary driven lattice gases: the zero range and the simple exclusion processes. In the first model the invariant measure is explicitly known and we verify the predictions of the general theory. For the one dimensional simple exclusion process, as recently shown by Derrida, Lebowitz, and Speer, it is possible to express the macroscopic entropy in terms of the solution of a nonlinear ordinary differential equation; by using the Hamilton-Jacobi equation, we obtain a logically independent derivation of this result.
Kunicki, Zachary J; Schick, Melissa R; Spillane, Nichea S; Harlow, Lisa L
2018-06-01
Those who binge drink are at increased risk for alcohol-related consequences when compared to non-binge drinkers. Research shows individuals may face barriers to reducing their drinking behavior, but few measures exist to assess these barriers. This study created and validated the Barriers to Alcohol Reduction (BAR) scale. Participants were college students (n = 230) who endorsed at least one instance of past-month binge drinking (4+ drinks for women or 5+ drinks for men). Using classical test theory, exploratory structural equation modeling found a two-factor structure of personal/psychosocial barriers and perceived program barriers. The sub-factors and full scale had reasonable internal consistency (coefficient omega = 0.78 for personal/psychosocial, 0.82 for program barriers, and 0.83 for the full measure). The BAR also showed evidence for convergent validity with the Brief Young Adult Alcohol Consequences Questionnaire (r = 0.39, p < .001) and discriminant validity with Barriers to Physical Activity (r = -0.02, p = .81). Item Response Theory (IRT) analysis showed the two factors separately met the unidimensionality assumption, and provided further evidence for severity of the items on the two factors. Results suggest that the BAR measure appears reliable and valid for use in an undergraduate student population of binge drinkers. Future studies may want to re-examine this measure in a more diverse sample.
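The internal-consistency values above are coefficient omega. One common estimator is McDonald's omega total, computed from factor loadings and error variances; whether this exact estimator was used here is an assumption, and the example values below are illustrative.

```python
def mcdonalds_omega(loadings, error_variances):
    # Omega total: the proportion of total score variance attributable
    # to the common factor(s):
    #   omega = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
    common = sum(loadings) ** 2
    return common / (common + sum(error_variances))
```

For instance, four standardized items each loading 0.7 (error variance 1 - 0.49 = 0.51) yield an omega of about 0.79, in the range reported above.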
Modeling thermal infrared (2-14 micrometer) reflectance spectra of frost and snow
NASA Technical Reports Server (NTRS)
Wald, Andrew E.
1994-01-01
Existing theories of radiative transfer in close-packed media assume that each particle scatters independently of its neighbors. For opaque particles, such as are common in the thermal infrared, this assumption is not valid, and these radiative transfer theories will not be accurate. A new method is proposed, called 'diffraction subtraction', which modifies the scattering cross section of close-packed large, opaque spheres to account for the effect of close packing on the diffraction cross section of a scattering particle. This method predicts the thermal infrared reflectance of coarse (greater than 50 micrometers radius), disaggregated granular snow. However, such coarse snow is typically old and metamorphosed, with adjacent grains welded together. The reflectance of such a welded block can be described as partly Fresnel in nature and cannot be predicted using Mie inputs to radiative transfer theory. Owing to the high absorption coefficient of ice in the thermal infrared, a rough surface reflectance model can be used to calculate reflectance from such a block. For very small (less than 50 micrometers), disaggregated particles, it is incorrect in principle to treat diffraction independently of reflection and refraction, and the theory fails. However, for particles larger than 50 micrometers, independent scattering is a valid assumption, and standard radiative transfer theory works.
NASA Astrophysics Data System (ADS)
Wu, Chenglin
Bond between deformed rebar and concrete is affected by rebar deformation pattern, concrete properties, concrete confinement, and rebar-concrete interfacial properties. Two distinct groups of bond models were traditionally developed based on the dominant effects of concrete splitting and near-interface shear-off failures. Their accuracy depended heavily on the test data sets selected for analysis and calibration. In this study, a unified bond model is proposed and developed based on an analogy to the indentation problem around the rib front of deformed rebar. This mechanics-based model can take into account the combined effect of concrete splitting and interface shear-off failures, resulting in average bond strengths for all practical scenarios. To understand the fracture process associated with bond failure, a probabilistic meso-scale model of concrete is proposed and its sensitivity to interface and confinement strengths is investigated. Both the mechanical and finite element models are validated with the available test data sets and are superior to existing models in prediction of average bond strength (< 6% error) and crack spacing (< 6% error). The validated bond model is applied to derive various interrelations among concrete crushing, concrete splitting, interfacial behavior, and the rib spacing-to-height ratio of deformed rebar. It can accurately predict the transition of failure modes from concrete splitting to rebar pullout and predict the effect of rebar surface characteristics as the rib spacing-to-height ratio increases. Based on the unified theory, a global bond model is proposed and developed by introducing bond-slip laws, and validated with testing of concrete beams with spliced reinforcement, achieving a load capacity prediction error of less than 26%. The optimal rebar parameters and concrete cover in structural designs can be derived from this study.
Analysis of delamination related fracture processes in composites
NASA Technical Reports Server (NTRS)
Armanios, Erian A.
1992-01-01
This is a final report that summarizes the results achieved under this grant. The first major accomplishment is the development of the sublaminate modeling approach and shear deformation theory. The sublaminate approach allows the flexibility of considering one ply or groups of plies as a single laminated unit with effective properties. This approach is valid when the characteristic length of the response is small compared to the sublaminate thickness. The sublaminate approach was validated by comparing its predictions with a finite element solution. A shear deformation theory represents an optimum compromise between accuracy and computational effort in delamination analysis of laminated composites. This conclusion was reached by applying several theories of increasing complexity to the prediction of interlaminar stresses and strain energy release rate in a double cracked-lap-shear configuration.
NASA Astrophysics Data System (ADS)
Farrell, Kathryn; Oden, J. Tinsley
2014-07-01
Coarse-grained models of atomic systems, created by aggregating groups of atoms into molecules to reduce the number of degrees of freedom, have been used for decades in important scientific and technological applications. In recent years, interest in developing a more rigorous theory for coarse graining and in assessing the predictivity of coarse-grained models has arisen. In this work, Bayesian methods for the calibration and validation of coarse-grained models of atomistic systems in thermodynamic equilibrium are developed. For specificity, only configurational models of systems in canonical ensembles are considered. Among major challenges in validating coarse-grained models are (1) the development of validation processes that lead to information essential in establishing confidence in the model's ability predict key quantities of interest and (2), above all, the determination of the coarse-grained model itself; that is, the characterization of the molecular architecture, the choice of interaction potentials and thus parameters, which best fit available data. The all-atom model is treated as the "ground truth," and it provides the basis with respect to which properties of the coarse-grained model are compared. This base all-atom model is characterized by an appropriate statistical mechanics framework in this work by canonical ensembles involving only configurational energies. The all-atom model thus supplies data for Bayesian calibration and validation methods for the molecular model. To address the first challenge, we develop priors based on the maximum entropy principle and likelihood functions based on Gaussian approximations of the uncertainties in the parameter-to-observation error. To address challenge (2), we introduce the notion of model plausibilities as a means for model selection. This methodology provides a powerful approach toward constructing coarse-grained models which are most plausible for given all-atom data. 
We demonstrate the theory and methods through applications to representative atomic structures and we discuss extensions to the validation process for molecular models of polymer structures encountered in certain semiconductor nanomanufacturing processes. The powerful method of model plausibility as a means for selecting interaction potentials for coarse-grained models is discussed in connection with a coarse-grained hexane molecule. Discussions of how all-atom information is used to construct priors are contained in an appendix.
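The model-plausibility idea above (ranking candidate coarse-grained models by their Bayesian evidence) can be sketched as posterior model probabilities; the uniform prior over models and the interface below are assumptions for illustration, not the authors' implementation.

```python
import math

def model_plausibilities(log_evidences, priors=None):
    # Posterior probability of each candidate model M_i given data D:
    # P(M_i | D) proportional to P(D | M_i) * P(M_i), normalized over
    # the candidate set. Uniform model priors are assumed by default.
    n = len(log_evidences)
    priors = priors if priors is not None else [1.0 / n] * n
    # Subtract the max log-evidence before exponentiating for stability.
    m = max(log_evidences)
    weights = [math.exp(le - m) * p for le, p in zip(log_evidences, priors)]
    total = sum(weights)
    return [w / total for w in weights]
```

The most plausible model is simply the one with the largest posterior probability in the returned list.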
Beam-tracing model for predicting sound fields in rooms with multilayer bounding surfaces
NASA Astrophysics Data System (ADS)
Wareing, Andrew; Hodgson, Murray
2005-10-01
This paper presents the development of a wave-based room-prediction model for predicting steady-state sound fields in empty rooms with specularly reflecting, multilayer surfaces. A triangular beam-tracing model with phase, and a transfer-matrix approach to model the surfaces, were involved. Room surfaces were modeled as multilayers of fluid, solid, or porous materials. Biot theory was used in the transfer-matrix formulation of the porous layer. The new model consisted of the transfer-matrix model integrated into the beam-tracing algorithm. The transfer-matrix model was validated by comparing predictions with those by theory, and with experiment. The test surfaces were a glass plate, double drywall panels, double steel panels, a carpeted floor, and a suspended-acoustical ceiling. The beam-tracing model was validated in the cases of three idealized room configurations (a small office, a corridor, and a small industrial workroom) with simple boundary conditions. The number of beams, the reflection order, and the frequency resolution required to obtain accurate results were investigated. Beam-tracing predictions were compared with those by a method-of-images model with phase. The model will be used to study sound fields in rooms with local- or extended-reaction multilayer surfaces.
Shen, Minxue; Hu, Ming; Sun, Zhenqiu
2017-01-01
Objectives: To develop and validate brief scales to measure common emotional and behavioural problems among adolescents in the examination-oriented education system and collectivistic culture of China. Setting: Middle schools in Hunan province. Participants: 5442 middle school students aged 11–19 years were sampled; 4727 valid questionnaires were collected and used for validation of the scales. The final sample included 2408 boys and 2319 girls. Primary and secondary outcome measures: The tools were assessed by item response theory, classical test theory (reliability and construct validity) and differential item functioning. Results: Four scales measuring anxiety, depression, study problems and sociality problems were established. Exploratory factor analysis yielded a two-factor solution for each scale. Confirmatory factor analysis showed acceptable to good model fit for each scale. Internal consistency and test–retest reliability of all scales were above 0.7. Item response theory showed that all items had acceptable discrimination parameters and most items had appropriate difficulty parameters. Ten items demonstrated differential item functioning with respect to gender. Conclusions: Four brief scales were developed and validated among adolescents in middle schools of China. The scales have good psychometric properties with minor differential item functioning. They can be used in middle school settings and will help school officials assess students' emotional/behavioural problems. PMID:28062469
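The discrimination and difficulty parameters reported above correspond to a logistic item response model; a minimal two-parameter logistic (2PL) sketch follows. The assumption that the authors used the 2PL specifically (rather than another IRT model) is ours.

```python
import math

def irt_2pl(theta, a, b):
    # Probability of endorsing an item under the 2PL IRT model:
    #   theta = latent trait level, a = discrimination, b = difficulty
    # P(theta) = 1 / (1 + exp(-a * (theta - b)))
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

At theta equal to the item difficulty b, the endorsement probability is exactly 0.5; larger a makes the curve steeper around b, which is what "acceptable discrimination" refers to.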
Drive: Theory and Construct Validation
Petrides, K. V.
2016-01-01
This article explicates the theory of drive and describes the development and validation of two measures. A representative set of drive facets was derived from an extensive corpus of human attributes (Study 1). Operationalised using an International Personality Item Pool version (the Drive:IPIP), a three-factor model was extracted from the facets in two samples and confirmed on a third sample (Study 2). The multi-item IPIP measure showed congruence with a short form, based on single-item ratings of the facets, and both demonstrated cross-informant reliability. Evidence also supported the measures’ convergent, discriminant, concurrent, and incremental validity (Study 3). Based on very promising findings, the authors hope to initiate a stream of research in what is argued to be a rather neglected niche of individual differences and non-cognitive assessment. PMID:27409773
2005-05-01
[Extraction fragment of a table of contents: verification of the BondJo bonded-joint analysis tool against Ansys solid-model finite-element analysis and Delale and Erdogan plate theory, covering homogeneous isotropic and orthotropic adherends (six examples from the Delale and Erdogan publication), including adhesive stress comparisons.]
1999-12-01
[Extraction fragment of a reference list: Ajzen, I. and M. Fishbein. Understanding Attitudes and Predicting Social Behavior. Prentice-Hall, Englewood Cliffs, NJ: 1980. Alwang, Greg. "Speech...Decline: Computer Introduction in the Financial Industry." Technology Forecasting and Social Change. 31: 143-154. Fishbein, M. and I. Ajzen. Belief... Theory of Reasoned Action (TRA) (Fishbein and Ajzen, 1980); Theory of Planned Behavior (TPB) (Ajzen, 1991); Technology Acceptance Model...]
Development of a Brief Questionnaire to Assess Contraceptive Intent
Raine-Bennett, Tina R; Rocca, Corinne H
2015-01-01
Objective We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Methods Item response theory based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15–24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. Results All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants’ overall scores covered the full range of the scale (0–15), and items appropriately matched the range of participants’ contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. Conclusion The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. Practice Implications The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. PMID:26104994
Development of a brief questionnaire to assess contraceptive intent.
Raine-Bennett, Tina R; Rocca, Corinne H
2015-11-01
We sought to develop and validate an instrument that can enable providers to identify young women who may be at risk of contraceptive non-adherence. Item response theory based methods were used to evaluate the psychometric properties of the Contraceptive Intent Questionnaire, a 15-item self-administered questionnaire, based on theory and prior qualitative and quantitative research. The questionnaire was administered to 200 women aged 15-24 years who were initiating contraceptives. We assessed item fit to the item response model, internal consistency, internal structure validity, and differential item functioning. All items fit a one-dimensional model. The separation reliability coefficient was 0.73. Participants' overall scores covered the full range of the scale (0-15), and items appropriately matched the range of participants' contraceptive intent. Items met the criteria for internal structure validity and most items functioned similarly between groups of women. The Contraceptive Intent Questionnaire appears to be a reliable and valid tool. Future testing is needed to assess predictive ability and clinical utility. The Contraceptive Intent Questionnaire may serve as a valid tool to help providers identify women who may have problems with contraceptive adherence, as well as to pinpoint areas in which counseling may be directed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
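As a hedged sketch of the item response theory machinery such validations rely on (not the authors' analysis), the one-parameter (Rasch) logistic model can be simulated and item ordering crudely recovered from endorsement rates. The abilities, difficulties, and sample size below are made-up illustration values:

```python
import numpy as np

def rasch_prob(theta, b):
    """P(endorse item) under the one-parameter (Rasch) logistic model."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

rng = np.random.default_rng(1)
theta = rng.normal(size=2000)                # person trait levels
b = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])    # item difficulties
resp = rng.random((2000, 5)) < rasch_prob(theta[:, None], b)

# crude recovery: logit of the marginal endorsement rate per item;
# this preserves the item ordering, though not the exact scale
p = resp.mean(axis=0)
b_hat = -np.log(p / (1 - p))
print(np.round(b_hat, 2))
```

Easier items are endorsed more often, so the recovered values increase with the true difficulties, illustrating what "appropriate difficulty parameters" summarize.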
Tsai, Chung-Hung
2014-05-07
Telehealth has become an increasingly applied solution for delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions of the telehealth system. Social capital factors (social trust, institutional trust, and social participation) had significant positive effects on the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities.
Tsai, Chung-Hung
2014-01-01
Telehealth has become an increasingly applied solution for delivering health care to rural and underserved areas by remote health care professionals. This study integrated social capital theory, social cognitive theory, and the technology acceptance model (TAM) to develop a comprehensive behavioral model for analyzing the relationships among social capital factors (social capital theory), technological factors (TAM), and system self-efficacy (social cognitive theory) in telehealth. The proposed framework was validated with 365 respondents from Nantou County, located in Central Taiwan. Structural equation modeling (SEM) was used to assess the causal relationships that were hypothesized in the proposed model. The findings indicate that elderly residents generally reported positive perceptions of the telehealth system. Social capital factors (social trust, institutional trust, and social participation) had significant positive effects on the technological factors (perceived ease of use and perceived usefulness, respectively), which in turn influenced usage intention. This study also confirmed that system self-efficacy was the salient antecedent of perceived ease of use. In addition, the proposed model fitted the sample data well. The proposed integrative psychosocial-technological model may serve as a theoretical basis for future research and can also offer empirical foresight to practitioners and researchers in the health departments of governments, hospitals, and rural communities. PMID:24810577
The Determinants of Student Effort at Learning ERP: A Cultural Perspective
ERIC Educational Resources Information Center
Alshare, Khaled A.; El-Masri, Mazen; Lane, Peggy L.
2015-01-01
This paper develops a research model based on the Unified Theory of Acceptance and Use of Technology model (UTAUT) and Hofstede's cultural dimensions to explore factors that influence student effort at learning Enterprise Resource Planning (ERP) systems. A Structural Equation Model (SEM) using LISREL was utilized to validate the proposed research…
ERIC Educational Resources Information Center
Nair, Pradeep Kumar; Ali, Faizan; Leong, Lim Chee
2015-01-01
Purpose: This study aims to explain the factors affecting students' acceptance and usage of a lecture capture system (LCS)--ReWIND--in a Malaysian university based on the extended unified theory of acceptance and use of technology (UTAUT2) model. Technological advances have become an important feature of universities' plans to improve the…
ERIC Educational Resources Information Center
Yuan, Ke-Hai; Lu, Laura
2008-01-01
This article provides the theory and application of the 2-stage maximum likelihood (ML) procedure for structural equation modeling (SEM) with missing data. The validity of this procedure does not require the assumption of a normally distributed population. When the population is normally distributed and all missing data are missing at random…
ERIC Educational Resources Information Center
Pongsophon, Pongprapan; Herman, Benjamin C.
2017-01-01
Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching…
Diviani, Nicola; Dima, Alexandra Lelia; Schulz, Peter Johannes
2017-04-11
The eHealth Literacy Scale (eHEALS) is a tool to assess consumers' comfort and skills in using information technologies for health. Although evidence exists of reliability and construct validity of the scale, less agreement exists on structural validity. The aim of this study was to validate the Italian version of the eHealth Literacy Scale (I-eHEALS) in a community sample with a focus on its structural validity, by applying psychometric techniques that account for item difficulty. Two Web-based surveys were conducted among a total of 296 people living in the Italian-speaking region of Switzerland (Ticino). After examining the latent variables underlying the observed variables of the Italian scale via principal component analysis (PCA), fit indices for two alternative models were calculated using confirmatory factor analysis (CFA). The scale structure was examined via parametric and nonparametric item response theory (IRT) analyses accounting for differences between items regarding the proportion of answers indicating high ability. Convergent validity was assessed by correlations with theoretically related constructs. CFA showed a suboptimal model fit for both models. IRT analyses confirmed all items measure a single dimension as intended. Reliability and construct validity of the final scale were also confirmed. The contrasting results of factor analysis (FA) and IRT analyses highlight the importance of considering differences in item difficulty when examining health literacy scales. The findings support the reliability and validity of the translated scale and its use for assessing Italian-speaking consumers' eHealth literacy. ©Nicola Diviani, Alexandra Lelia Dima, Peter Johannes Schulz. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 11.04.2017.
Non-Equilibrium Turbulence and Two-Equation Modeling
NASA Technical Reports Server (NTRS)
Rubinstein, Robert
2011-01-01
Two-equation turbulence models are analyzed from the perspective of spectral closure theories. Kolmogorov theory provides useful information for models, but it is limited to equilibrium conditions in which the energy spectrum has relaxed to a steady state consistent with the forcing at large scales; it does not describe transient evolution between such states. Transient evolution is necessarily through nonequilibrium states, which can only be found from a theory of turbulence evolution, such as one provided by a spectral closure. When the departure from equilibrium is small, perturbation theory can be used to approximate the evolution by a two-equation model. The perturbation theory also gives explicit conditions under which this model can be valid, and when it will fail. Implications of the non-equilibrium corrections for the classic Tennekes-Lumley balance in the dissipation rate equation are drawn: it is possible to establish both the cancellation of the leading-order Re^(1/2) divergent contributions to vortex stretching and enstrophy destruction, and the existence of a nonzero difference which is finite in the limit of infinite Reynolds number.
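For decaying isotropic turbulence, the equilibrium two-equation model discussed above reduces to the standard k-epsilon ODE pair. The sketch below (a textbook illustration, not the paper's spectral closure) integrates that pair and checks it against the exact power-law decay implied by the model constant C_eps2 = 1.92:

```python
# decaying homogeneous turbulence under the standard k-epsilon model:
#   dk/dt   = -eps
#   deps/dt = -C_eps2 * eps^2 / k
C_eps2 = 1.92
k, eps = 1.0, 1.0          # initial turbulent kinetic energy and dissipation
dt, T = 1e-4, 10.0
for _ in range(int(T / dt)):   # forward-Euler integration
    dk = -eps
    deps = -C_eps2 * eps**2 / k
    k += dk * dt
    eps += deps * dt

# exact self-similar solution with k0 = eps0 = 1:
#   k(t) = (1 + (C_eps2 - 1) t)^(-n),  n = 1/(C_eps2 - 1) ~ 1.09
n = 1.0 / (C_eps2 - 1.0)
k_exact = (1.0 + (C_eps2 - 1.0) * T) ** (-n)
print(k, k_exact)
```

The decay exponent n depends only on C_eps2, which is one way the equilibrium assumption constrains the model.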
ERIC Educational Resources Information Center
Santor, Darcy A.
2006-01-01
In this article, the author outlines six recommendations that may guide the continued development and validation of measures of depression. These are (a) articulate and revise a formal theory of signs and symptoms; (b) differentiate complex theoretical goals from pragmatic evaluation needs; (c) invest heavily in new methods and analytic models;…
ERIC Educational Resources Information Center
O'Sullivan, Deirdre Elizabeth Mary
2009-01-01
The current demands of the global economy has led to an increased focus on personality and behaviors as they relate to employment outcomes for the rising number of people living with disabilities and chronic illness. There are a number of well-established and validated theories, models, and instruments that have been implemented to improve work…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Parsons, T.; King, R.
This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture the primary drivers for the sizing and design of major drivetrain components.
NASA Astrophysics Data System (ADS)
Kadyshevich, E. A.; Dzyabchenko, A. V.; Ostrovskii, V. E.
2014-04-01
Size compatibility of the CH4-hydrate structure II and multi-component DNA fragments is confirmed by three-dimensional simulation; this constitutes a validation of the Life Origination Hydrate Theory (LOH-Theory).
Basic Brackets of a 2D Model for the Hodge Theory Without its Canonical Conjugate Momenta
NASA Astrophysics Data System (ADS)
Kumar, R.; Gupta, S.; Malik, R. P.
2016-06-01
We deduce the canonical brackets for a two (1+1)-dimensional (2D) free Abelian 1-form gauge theory by exploiting the beauty and strength of the continuous symmetries of a Becchi-Rouet-Stora-Tyutin (BRST) invariant Lagrangian density that respects, in totality, six continuous symmetries. These symmetries make this model a field-theoretic example of Hodge theory. Taken together, these symmetries enforce the existence of exactly the same canonical brackets amongst the creation and annihilation operators that are found to exist within the standard canonical quantization scheme. These creation and annihilation operators appear in the normal mode expansion of the basic fields of this theory. In other words, we provide an alternative to the canonical method of quantization for our present model of Hodge theory, where the continuous internal symmetries play a decisive role. We conjecture that our method of quantization is valid for a class of field theories that are tractable physical examples of Hodge theory. This statement is true in any arbitrary dimension of spacetime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparisons with industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Duality, Gauge Symmetries, Renormalization Groups and the BKT Transition
NASA Astrophysics Data System (ADS)
José, Jorge V.
2017-03-01
In this chapter, I will briefly review, from my own perspective, the situation within theoretical physics at the beginning of the 1970s, and the advances that played an important role in providing a solid theoretical and experimental foundation for the Berezinskii-Kosterlitz-Thouless theory (BKT). Over this period, it became clear that the Abelian gauge symmetry of the 2D-XY model had to be preserved to get the right phase structure of the model. In previous analyses, this symmetry was broken when using low order calculational approximations. Duality transformations at that time for two-dimensional models with compact gauge symmetries were introduced by José, Kadanoff, Nelson and Kirkpatrick (JKKN). Their goal was to analyze the phase structure and excitations of XY and related models, including symmetry breaking fields which are experimentally important. In a separate context, Migdal had earlier developed an approximate Renormalization Group (RG) algorithm to implement Wilson’s RG for lattice gauge theories. Although Migdal’s RG approach, later extended by Kadanoff, did not produce a true phase transition for the XY model, it almost did asymptotically in terms of a non-perturbative expansion in the coupling constant with an essential singularity. Using these advances, including work done on instantons (vortices), JKKN analyzed the behavior of the spin-spin correlation functions of the 2D XY-model in terms of an expansion in temperature and vortex-pair fugacity. Their analysis led to a perturbative derivation of RG equations for the XY model which are the same as those first derived by Kosterlitz for the two-dimensional Coulomb gas. JKKN’s results gave a theoretical foundation and justification for BKT’s sound physical assumptions and for the validity of their calculational approximations that were, in principle, strictly valid only at very low temperatures, away from the critical T_BKT temperature.
The theoretical predictions were soon tested successfully against experimental results on superfluid helium films. The success of the BKT theory also gave one of the first quantitative proofs of the validity of the RG theory.
Duality, Gauge Symmetries, Renormalization Groups and the BKT Transition
NASA Astrophysics Data System (ADS)
José, Jorge V.
2013-06-01
In this chapter, I will briefly review, from my own perspective, the situation within theoretical physics at the beginning of the 1970s, and the advances that played an important role in providing a solid theoretical and experimental foundation for the Berezinskii-Kosterlitz-Thouless theory (BKT). Over this period, it became clear that the Abelian gauge symmetry of the 2D-XY model had to be preserved to get the right phase structure of the model. In previous analyses, this symmetry was broken when using low order calculational approximations. Duality transformations at that time for two-dimensional models with compact gauge symmetries were introduced by José, Kadanoff, Nelson and Kirkpatrick (JKKN). Their goal was to analyze the phase structure and excitations of XY and related models, including symmetry breaking fields which are experimentally important. In a separate context, Migdal had earlier developed an approximate Renormalization Group (RG) algorithm to implement Wilson's RG for lattice gauge theories. Although Migdal's RG approach, later extended by Kadanoff, did not produce a true phase transition for the XY model, it almost did asymptotically in terms of a non-perturbative expansion in the coupling constant with an essential singularity. Using these advances, including work done on instantons (vortices), JKKN analyzed the behavior of the spin-spin correlation functions of the 2D XY-model in terms of an expansion in temperature and vortex-pair fugacity. Their analysis led to a perturbative derivation of RG equations for the XY model which are the same as those first derived by Kosterlitz for the two-dimensional Coulomb gas. JKKN's results gave a theoretical foundation and justification for BKT's sound physical assumptions and for the validity of their calculational approximations that were, in principle, strictly valid only at very low temperatures, away from the critical T_BKT temperature.
The theoretical predictions were soon tested successfully against experimental results on superfluid helium films. The success of the BKT theory also gave one of the first quantitative proofs of the validity of the RG theory.
Billon, Alexis; Foy, Cédric; Picaut, Judicaël; Valeau, Vincent; Sakout, Anas
2008-06-01
In this paper, a modification of the diffusion model for room acoustics is proposed to account for sound transmission between two rooms, a source room and an adjacent room, which are coupled through a partition wall. A system of two diffusion equations, one for each room, together with a set of two boundary conditions, one for the partition wall and one for the other walls of a room, is obtained and numerically solved. The modified diffusion model is validated by numerical comparisons with the statistical theory for several coupled-room configurations by varying the coupling surface area, the absorption coefficient of each room, and the volume of the adjacent room. An experimental comparison is also carried out for two coupled classrooms. The modified diffusion model results agree very well with both the statistical theory and the experimental data. The diffusion model can then be used as an alternative to the statistical theory, especially when the statistical theory is not applicable, that is, when the reverberant sound field is not diffuse. Moreover, the diffusion model allows the prediction of the spatial distribution of sound energy within each coupled room, while the statistical theory gives only one sound level for each room.
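The statistical theory used as the reference solution above can be illustrated by a steady-state power balance between two diffuse-field rooms coupled through an aperture. This is a simplified sketch, not the paper's diffusion model: the source power, absorptions, and coupling area are invented values, and the partition's transmission loss is ignored:

```python
import numpy as np

c = 343.0            # speed of sound (m/s)
W = 1e-3             # source power in room 1 (W), hypothetical
A1, A2 = 20.0, 10.0  # total absorption of each room (m^2 Sabine), hypothetical
Sc = 2.0             # coupling (open) area between the rooms (m^2), hypothetical

# steady-state power balance under the diffuse-field assumption:
#   W = (c/4) * (A1*E1 + Sc*(E1 - E2))        (source room)
#   0 = (c/4) * (Sc*(E1 - E2) - A2*E2)        (adjacent room)
M = (c / 4.0) * np.array([[A1 + Sc, -Sc],
                          [-Sc, A2 + Sc]])
E1, E2 = np.linalg.solve(M, np.array([W, 0.0]))

# level difference between the rooms follows from E1/E2 = (A2 + Sc)/Sc
print(round(float(10 * np.log10(E1 / E2)), 2))  # → 7.78 (dB)
```

As the abstract notes, this statistical approach yields a single energy density per room; the diffusion model additionally resolves the spatial energy distribution.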
Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H
2018-07-01
Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.
White, Steven M; White, K A Jane
2005-08-21
Recently there has been a great deal of interest within the ecological community in the interactions of local populations that are coupled only by dispersal. Models have been developed to consider such scenarios, but the theory needed to validate model outcomes has been somewhat lacking. In this paper, we present theory which can be used to understand these types of interaction when populations exhibit discrete-time dynamics. In particular, we consider a spatial extension of discrete-time models, known as coupled map lattices (CMLs), which are discrete in space. We introduce a general form of the CML and link this to integro-difference equations via a special redistribution kernel. General conditions are then derived for dispersal-driven instabilities. We then apply this theory to two discrete-time models: a predator-prey model and a host-pathogen model.
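A minimal coupled map lattice of the kind described, local discrete-time dynamics followed by nearest-neighbour dispersal, can be sketched as follows. The Ricker map and all parameter values are illustrative choices, not the paper's predator-prey or host-pathogen models, and the parameters are deliberately in the stable regime, where no dispersal-driven instability arises:

```python
import numpy as np

def ricker(n, r=0.5):
    """Local (within-patch) discrete-time growth; stable for 0 < r < 2."""
    return n * np.exp(r * (1.0 - n))

def cml_step(n, d=0.1):
    """One generation: local dynamics, then nearest-neighbour dispersal
    with fraction d leaving each patch (periodic boundary)."""
    grown = ricker(n)
    left, right = np.roll(grown, 1), np.roll(grown, -1)
    return (1 - d) * grown + (d / 2.0) * (left + right)

rng = np.random.default_rng(2)
n = 1.0 + 0.01 * rng.standard_normal(64)   # perturbed uniform state on 64 patches
for _ in range(200):
    n = cml_step(n)
print(round(float(n.min()), 3), round(float(n.max()), 3))  # → 1.0 1.0
```

Here every linearized mode decays, so the lattice returns to the homogeneous fixed point; a dispersal-driven (Turing-like) instability would instead require conditions of the kind the paper derives.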
Testing the Validity of a Cognitive Behavioral Model for Gambling Behavior.
Raylu, Namrata; Oei, Tian Po S; Loo, Jasmine M Y; Tsai, Jung-Shun
2016-06-01
Currently, cognitive behavioral therapies appear to be among the most studied treatments for gambling problems, and studies show they are effective in treating gambling problems. However, cognitive behavioral models have not been widely tested using statistical means. Thus, the aim of this study was to test the validity of the pathways postulated in the cognitive behavioral theory of gambling behavior using structural equation modeling (AMOS 20). Several questionnaires assessing a range of gambling-specific variables (e.g., gambling urges, cognitions and behaviors) and gambling correlates (e.g., psychological states and coping styles) were distributed to 969 participants from the community. Results showed that negative psychological states (i.e., depression, anxiety and stress) only directly predicted gambling behavior, whereas gambling urges predicted gambling behavior directly as well as indirectly via gambling cognitions. Avoidance coping predicted gambling behavior only indirectly via gambling cognitions. Negative psychological states were significantly related to gambling cognitions as well as avoidance coping. In addition, significant gender differences were found. The results confirmed the validity of the pathways postulated in the cognitive behavioral theory of gambling behavior and highlighted the importance of gender differences in conceptualizing gambling behavior.
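The direct-versus-indirect structure described above (urges predicting behavior both directly and via cognitions) is, at its core, a mediation model. A hedged product-of-coefficients sketch on simulated data follows; the path values are made up for illustration and are unrelated to the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5000
urges = rng.normal(size=n)
cognitions = 0.6 * urges + rng.normal(size=n)                    # "a" path
behavior = 0.3 * urges + 0.5 * cognitions + rng.normal(size=n)   # "c'" and "b" paths

def ols(y, *xs):
    """Least-squares slopes (intercept dropped) of y on the given predictors."""
    X = np.column_stack([np.ones(n), *xs])
    return np.linalg.lstsq(X, y, rcond=None)[0][1:]

a = ols(cognitions, urges)[0]                  # urges -> cognitions
c_prime, b = ols(behavior, urges, cognitions)  # direct path and mediator path
indirect = a * b                               # product-of-coefficients estimate
print(round(float(c_prime), 2), round(float(indirect), 2))
```

Full structural equation modeling, as used in the study, estimates all paths simultaneously with latent variables and fit indices, but the direct/indirect decomposition is the same idea.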
Concurrent Validity of Holland's Theory for College-Degreed Black Women.
ERIC Educational Resources Information Center
Bingham, Rosie P.; Walsh, W. Bruce
1978-01-01
This study, using the Vocational Preference Inventory and the Self-Directed Search, explored the concurrent validity of Holland's theory for employed college-degreed Black women. The findings support the validity of Holland's theory for this population. (Author)
Hong, Quan Nha; Coutu, Marie-France; Berbiche, Djamal
2017-01-01
The Work Role Functioning Questionnaire (WRFQ) was developed to assess workers' perceived ability to perform job demands and is used to monitor presenteeism. Still, few studies on its validity can be found in the literature. The purpose of this study was to assess the items and factorial composition of the Canadian French version of the WRFQ (WRFQ-CF). Two measurement approaches were used to test the WRFQ-CF: Classical Test Theory (CTT) and non-parametric Item Response Theory (IRT). A total of 352 completed questionnaires were analyzed. Four-factor and three-factor models were tested and showed good fit with 14 items (Root Mean Square Error of Approximation (RMSEA) = 0.06, Standardized Root Mean Square Residual (SRMR) = 0.04, Bentler Comparative Fit Index (CFI) = 0.98) and with 17 items (RMSEA = 0.059, SRMR = 0.048, CFI = 0.98), respectively. Using IRT, 13 problematic items were identified, 9 of which were also flagged by CTT. This study tested different models, with fewer problematic items found in a three-factor model. Using non-parametric IRT alongside CTT for item purification gave complementary results. IRT is still scarcely used and can be an interesting alternative method to enhance the quality of a measurement instrument. More studies are needed on the WRFQ-CF to refine its items and factorial composition.
Classen, Sherrilene; Winter, Sandra M.; Velozo, Craig A.; Bédard, Michel; Lanford, Desiree N.; Brumback, Babette; Lutz, Barbara J.
2010-01-01
OBJECTIVE We report on item development and validity testing of a self-report older adult safe driving behaviors measure (SDBM). METHOD On the basis of theoretical frameworks (Precede–Proceed Model of Health Promotion, Haddon’s matrix, and Michon’s model), existing driving measures, and previous research and guided by measurement theory, we developed items capturing safe driving behavior. Item development was further informed by focus groups. We established face validity using peer reviewers and content validity using expert raters. RESULTS Peer review indicated acceptable face validity. Initial expert rater review yielded a scale content validity index (CVI) rating of 0.78, with 44 of 60 items rated ≥0.75. Sixteen unacceptable items (≤0.5) required major revision or deletion. The next CVI scale average was 0.84, indicating acceptable content validity. CONCLUSION The SDBM has relevance as a self-report to rate older drivers. Future pilot testing of the SDBM comparing results with on-road testing will define criterion validity. PMID:20437917
Campos, Juliana Alvares Duarte Bonini; Spexoto, Maria Cláudia Bernardes; da Silva, Wanderson Roberto; Serrano, Sergio Vicente; Marôco, João
2018-01-01
ABSTRACT Objective To evaluate the psychometric properties of the seven theoretical models proposed in the literature for European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Core 30 (EORTC QLQ-C30), when applied to a sample of Brazilian cancer patients. Methods Content and construct validity (factorial, convergent, discriminant) were estimated. Confirmatory factor analysis was performed. Convergent validity was analyzed using the average variance extracted. Discriminant validity was analyzed using correlational analysis. Internal consistency and composite reliability were used to assess the reliability of instrument. Results A total of 1,020 cancer patients participated. The mean age was 53.3±13.0 years, and 62% were female. All models showed adequate factorial validity for the study sample. Convergent and discriminant validities and the reliability were compromised in all of the models for all of the single items referring to symptoms, as well as for the “physical function” and “cognitive function” factors. Conclusion All theoretical models assessed in this study presented adequate factorial validity when applied to Brazilian cancer patients. The choice of the best model for use in research and/or clinical protocols should be centered on the purpose and underlying theory of each model. PMID:29694609
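The convergent-validity and reliability statistics used in studies like the one above, average variance extracted (AVE) and composite reliability, have simple closed forms in terms of standardized factor loadings. The loadings below are hypothetical illustration values, not the EORTC QLQ-C30 results:

```python
import numpy as np

def ave(loadings):
    """Average variance extracted from standardized loadings."""
    lam = np.asarray(loadings)
    return np.mean(lam ** 2)

def composite_reliability(loadings):
    """Composite (construct) reliability from standardized loadings."""
    lam = np.asarray(loadings)
    theta = 1.0 - lam ** 2              # standardized error variances
    return lam.sum() ** 2 / (lam.sum() ** 2 + theta.sum())

lam = [0.8, 0.75, 0.7, 0.65]            # hypothetical factor loadings
print(round(float(ave(lam)), 2),
      round(float(composite_reliability(lam)), 2))  # → 0.53 0.82
```

A common rule of thumb treats AVE above 0.5 and composite reliability above 0.7 as adequate, which is the kind of criterion behind statements that these properties were "compromised" for some factors.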
Validation of Kinetic-Turbulent-Neoclassical Theory for Edge Intrinsic Rotation in DIII-D Plasmas
NASA Astrophysics Data System (ADS)
Ashourvan, Arash
2017-10-01
Recent experiments on DIII-D with low-torque neutral beam injection (NBI) have provided a validation of a new model of momentum generation in a wide range of conditions spanning L- and H-mode with direct ion and electron heating. A challenge in predicting the bulk rotation profile for ITER has been to capture the physics of momentum transport near the separatrix and steep gradient region. A recent theory has presented a model for edge momentum transport which predicts the value and direction of the main-ion intrinsic velocity at the pedestal top, generated by the passing orbits in the inhomogeneous turbulent field. In this study, this model-predicted velocity is tested on DIII-D for a database of 44 low-torque NBI discharges comprising both L- and H-mode plasmas. For moderate NBI powers (P_NBI < 4 MW), the model prediction agrees well with the experiments for both L- and H-mode. At higher NBI power the experimental rotation is observed to saturate and even degrade compared with theory. TRANSP-NUBEAM simulations performed for the database show that for discharges with nominally balanced, but high-powered, NBI, the net injected torque through the edge can exceed 1 N·m in the counter-current direction. The theory model has been extended to compute the rotation degradation from this counter-current NBI torque by solving a reduced momentum evolution equation for the edge, and the revised velocity prediction is found to be in agreement with experiment. Projecting to the ITER baseline scenario, this model predicts a pedestal-top rotation (ρ ≈ 0.9) of roughly 4 krad/s. Using the theory-modeled, and now tested, velocity to predict the bulk plasma rotation opens a path to more confidently projecting the confinement and stability in ITER. Supported by the US DOE under DE-AC02-09CH11466 and DE-FC02-04ER54698.
De Bondt, Niki; Van Petegem, Peter
2015-01-01
The Overexcitability Questionnaire-Two (OEQ-II) measures the degree and nature of overexcitability, which assists in determining the developmental potential of an individual according to Dabrowski's Theory of Positive Disintegration. Previous validation studies using frequentist confirmatory factor analysis, which postulates exact parameter constraints, led to model rejection and a long series of model modifications. Bayesian structural equation modeling (BSEM) allows the application of zero-mean, small-variance priors for cross-loadings, residual covariances, and differences in measurement parameters across groups, better reflecting substantive theory and leading to better model fit and less overestimation of factor correlations. Our BSEM analysis with a sample of 516 students in higher education yields positive results regarding the factorial validity of the OEQ-II. Likewise, applying BSEM-based alignment with approximate measurement invariance, the absence of non-invariant factor loadings and intercepts across gender is supportive of the psychometric quality of the OEQ-II. Compared to males, females scored significantly higher on emotional and sensual overexcitability, and significantly lower on psychomotor overexcitability. PMID:26733931
Loop models, modular invariance, and three-dimensional bosonization
NASA Astrophysics Data System (ADS)
Goldman, Hart; Fradkin, Eduardo
2018-05-01
We consider a family of quantum loop models in 2+1 spacetime dimensions with marginally long-ranged and statistical interactions mediated by a U(1) gauge field, both purely in 2+1 dimensions and on a surface in a (3+1)-dimensional bulk system. In the absence of fractional spin, these theories have been shown to be self-dual under particle-vortex duality and shifts of the statistical angle of the loops by 2π, which form a subgroup of the modular group PSL(2, Z). We show that careful consideration of fractional spin in these theories completely breaks their statistical periodicity and describe how this occurs, resolving a disagreement with the conformal field theories they appear to approach at criticality. We show explicitly that incorporation of fractional spin leads to loop model dualities which parallel the recent web of (2+1)-dimensional field theory dualities, providing a nontrivial check on its validity.
Continuum Mean-Field Theories for Molecular Fluids, and Their Validity at the Nanoscale
NASA Astrophysics Data System (ADS)
Hanna, C. B.; Peyronel, F.; MacDougall, C.; Marangoni, A.; Pink, D. A.; AFMNet-NCE Collaboration
2011-03-01
We present a calculation of the physical properties of solid triglyceride particles dispersed in an oil phase, using atomic-scale molecular dynamics. Significant equilibrium density oscillations in the oil appear when the interparticle distance, d, becomes sufficiently small, with a global minimum in the free energy found at d ~ 1.4 nm. We compare the simulation values of the Hamaker coefficient with those of models which assume that the oil is a homogeneous continuum: (i) Lifshitz theory, (ii) the Fractal Model, and (iii) a Lennard-Jones 6-12 potential model. The last-named yields a minimum in the free energy at d ~ 0.26 nm. We conclude that, at the nanoscale, continuum Lifshitz theory and other continuum mean-field theories based on the assumption of homogeneous fluid density can lead to erroneous conclusions. CBH supported by NSF DMR-0906618. DAP supported by NSERC. This work supported by AFMNet-NCE.
A Nonparametric Approach for Assessing Goodness-of-Fit of IRT Models in a Mixed Format Test
ERIC Educational Resources Information Center
Liang, Tie; Wells, Craig S.
2015-01-01
Investigating the fit of a parametric model plays a vital role in validating an item response theory (IRT) model. An area that has received little attention is the assessment of multiple IRT models used in a mixed-format test. The present study extends the nonparametric approach, proposed by Douglas and Cohen (2001), to assess model fit of three…
A model for plant lighting system selection.
Ciolkosz, D E; Albright, L D; Sager, J C; Langhans, R W
2002-01-01
A decision model is presented that compares lighting systems for a plant growth scenario and chooses the most appropriate system from a given set of possible choices. The model utilizes a Multiple Attribute Utility Theory approach, and incorporates expert input and performance simulations to calculate a utility value for each lighting system being considered. The system with the highest utility is deemed the most appropriate system. The model was applied to a greenhouse scenario, and analyses were conducted to test the model's output for validity. Parameter variation indicates that the model performed as expected. Analysis of model output indicates that differences in utility among the candidate lighting systems were sufficiently large to give confidence that the model's order of selection was valid.
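The Multiple Attribute Utility Theory step described above reduces, in its simplest additive form, to a weighted sum of normalized single-attribute utilities; a toy sketch (the attribute names, weights, and scores are hypothetical placeholders, not the paper's expert input):

```python
def maut_utility(scores, weights):
    """Additive multi-attribute utility: sum of weight * normalized score.
    Assumes scores are already scaled to [0, 1] and weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * scores[a] for a in weights)

# Hypothetical lighting-system comparison (attributes and numbers invented).
weights = {"energy_cost": 0.4, "light_quality": 0.35, "capital_cost": 0.25}
systems = {
    "HPS": {"energy_cost": 0.6, "light_quality": 0.7, "capital_cost": 0.9},
    "LED": {"energy_cost": 0.9, "light_quality": 0.8, "capital_cost": 0.4},
}
# The system with the highest utility is deemed most appropriate.
best = max(systems, key=lambda s: maut_utility(systems[s], weights))
print(best)  # LED
```

In the study, the weights came from expert input and the attribute scores from performance simulations; the selection rule itself is just the argmax over the aggregated utilities.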
Bai, Yeon K; Dinour, Lauren M
2017-11-01
A proper assessment of the multidimensional needs of breastfeeding mothers in various settings is crucial to facilitate and support breastfeeding and its exclusivity. The theory of planned behavior (TPB) has been used frequently to measure factors associated with breastfeeding. Full utility of the TPB requires accurate measurement of theory constructs. Research aim: This study aimed to develop an instrument, Milk Expression on Campus, based on the TPB, and to establish its reliability and validity. In spring 2015, 218 breastfeeding (current or recent) employees and students at one university campus in northern New Jersey completed an online questionnaire containing demographic and theory-based items. Internal consistency (α) and split-half reliability (r) tests and factor analyses established and confirmed the reliability and construct validity of the instrument. Milk Expression on Campus showed strong and significant reliabilities as a full scale (α = .78, r = .74, p < .001) and as theory-construct subscales. Validity was confirmed as psychometric properties corresponded to the factors extracted from the scale. Four factors extracted from the direct construct subscales accounted for 79.49% of the total variability; four distinct factors from the indirect construct subscales accounted for 73.68%. Milk Expression on Campus can serve as a model TPB-based instrument to examine factors associated with women's milk expression behavior. Its utility extends to designing effective promotion programs that foster breastfeeding and milk expression behaviors in diverse settings.
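The internal-consistency statistic cited (Cronbach's α) can be computed directly from an items-by-respondents score matrix; a minimal sketch with toy data (not the study's responses):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a score matrix given as a list of per-item
    score lists (items x respondents):
        alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    where k is the number of items."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Toy matrix: 2 items answered by 4 respondents (illustrative only).
print(round(cronbach_alpha([[1, 2, 3, 4], [1, 3, 2, 4]]), 3))  # 0.889
```

Perfectly correlated items give α = 1; α = .78 as reported above indicates acceptable but imperfect internal consistency.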
Lopes; Oden
1999-06-01
In recent years, descriptive models of risky choice have incorporated features that reflect the importance of particular outcome values in choice. Cumulative prospect theory (CPT) does this by inserting a reference point in the utility function. SP/A (security-potential/aspiration) theory uses aspiration level as a second criterion in the choice process. Experiment 1 compares the ability of the CPT and SP/A models to account for the same within-subjects data set and finds in favor of SP/A. Experiment 2 replicates the main finding of Experiment 1 in a between-subjects design. The final discussion brackets the SP/A result by showing the impact on fit of both decreasing and increasing the number of free parameters. We also suggest how the SP/A approach might be useful in modeling investment decision making in a descriptively more valid way and conclude with comments on the relation between descriptive and normative theories of risky choice.
Sensitivity of Precipitation in Coupled Land-Atmosphere Models
NASA Technical Reports Server (NTRS)
Neelin, David; Zeng, N.; Suarez, M.; Koster, R.
2004-01-01
The project objective was to understand mechanisms by which atmosphere-land-ocean processes impact precipitation in the mean climate and interannual variations, focusing on tropical and subtropical regions. A combination of modeling tools was used: an intermediate complexity land-atmosphere model developed at UCLA known as the QTCM and the NASA Seasonal-to-Interannual Prediction Program general circulation model (NSIPP GCM). The intermediate complexity model was used to develop hypotheses regarding the physical mechanisms and theory for the interplay of large-scale dynamics, convective heating, cloud radiative effects and land surface feedbacks. The theoretical developments were to be confronted with diagnostics from the more complex GCM to validate or modify the theory.
Stochastic field-line wandering in magnetic turbulence with shear. I. Quasi-linear theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shalchi, A.; Negrea, M.; Petrisor, I.
2016-07-15
We investigate the random walk of magnetic field lines in magnetic turbulence with shear. In this first part of the series, we develop a quasi-linear theory in order to compute the diffusion coefficient of magnetic field lines, deriving general formulas for the diffusion coefficients in the different directions of space. We emphasize that quasi-linear theory is expected to be valid only if the so-called Kubo number is small. We consider two turbulence models as examples, namely a noisy slab model and a Gaussian decorrelation model. For both models we compute the field line diffusion coefficients and show how they depend on the aforementioned Kubo number as well as a shear parameter. It is demonstrated that the shear effect reduces all field line diffusion coefficients.
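The field-line diffusion coefficient in question is defined through the mean-square transverse displacement, D ≈ ⟨(Δx)²⟩ / (2z), for large distance z along the mean field; a toy numerical estimate for an uncorrelated random walk (purely illustrative, unrelated to the slab or Gaussian turbulence models of the paper):

```python
import random

def diffusion_coefficient(n_lines=1500, n_steps=1000, dz=1.0,
                          step_rms=0.1, seed=7):
    """Estimate D = <dx^2> / (2 z) by following an ensemble of field
    lines that take independent Gaussian transverse steps. For this
    uncorrelated walk the exact answer is step_rms**2 / (2 * dz)."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(n_lines):
        x = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0.0, step_rms)
        total_sq += x * x
    z = n_steps * dz
    return total_sq / n_lines / (2.0 * z)

D = diffusion_coefficient()
print(D)  # close to 0.1**2 / 2 = 0.005
```

In the quasi-linear regime the turbulent field supplies correlated rather than independent steps, which is what the paper's closed-form coefficients capture; the ensemble-average definition of D is the same.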
Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model
NASA Astrophysics Data System (ADS)
Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung
2017-12-01
This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect: a voltage is induced when a temperature gradient is applied across the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study of a solar thermoelectric generator (Kraemer et al. in Nat Mater 10:532, 2011) reported a high efficiency of 4.6%. That system had a vacuum glass enclosure, a flat-panel absorber, a thermoelectric generator, and water circulation on the cold side. The theoretical and numerical approach of the current study validated the experimental results of Kraemer's study to a high degree. The numerical simulation uses a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems, with the solar load model technique applying solar radiation under AM 1.5G conditions in Fluent. The analytical model applies Ho Sung Lee's theory of optimal design, using dimensionless parameters, to improve the performance of the STEG system. Applying this theory with two cover glasses and radiation shields, the STEG model achieves a maximum efficiency of 7%.
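The Seebeck relation the device rests on is easy to illustrate numerically: a module of N thermocouples in series develops an open-circuit voltage N·S·ΔT, and delivers at most V²/4R into a matched load. A back-of-the-envelope sketch (all numbers are typical illustrative values for a bismuth-telluride module, not the study's measurements):

```python
# Seebeck effect: open-circuit voltage V = S * dT for one thermocouple;
# a module of n couples in series gives V = n * S * dT.
def open_circuit_voltage(n_couples, seebeck_v_per_k, dT):
    return n_couples * seebeck_v_per_k * dT

# Maximum (matched-load) electrical power: P = V^2 / (4 * R_internal).
def matched_load_power(v_oc, r_internal):
    return v_oc ** 2 / (4 * r_internal)

# Illustrative numbers: 127 couples, S ~ 400 uV/K per couple, dT = 100 K,
# internal resistance 2 ohm (all hypothetical).
v = open_circuit_voltage(127, 400e-6, 100.0)
p = matched_load_power(v, 2.0)
print(round(v, 2), round(p, 2))  # 5.08 V, 3.23 W
```

Efficiency optimization, as in the study, then trades this electrical output against the solar heat absorbed, which is where the cover glasses and radiation shields enter.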
Sweet, Shane N.; Fortier, Michelle S.; Strachan, Shaelyn M.; Blanchard, Chris M.; Boulay, Pierre
2014-01-01
Self-determination theory and self-efficacy theory are prominent theories in the physical activity literature, and studies have begun integrating their concepts. Sweet, Fortier, Strachan and Blanchard (2012) have integrated these two theories in a cross-sectional study. Therefore, this study sought to test a longitudinal integrated model to predict physical activity at the end of a 4-month cardiac rehabilitation program based on theory, research and Sweet et al.’s cross-sectional model. Participants from two cardiac rehabilitation programs (N=109) answered validated self-report questionnaires at baseline, two and four months. Data were analyzed using Amos to assess the path analysis and model fit. Prior to integration, perceived competence and self-efficacy were combined, and labeled as confidence. After controlling for 2-month physical activity and cardiac rehabilitation site, no motivational variables significantly predicted residual change in 4-month physical activity. Although confidence at two months did not predict residual change in 4-month physical activity, it had a strong positive relationship with 2-month physical activity (β=0.30, P<0.001). The overall model retained good fit indices. In conclusion, results diverged from theoretical predictions of physical activity, but self-determination and self-efficacy theory were still partially supported. Because the model had good fit, this study demonstrated that theoretical integration is feasible. PMID:26973926
A New Similarity theory for Strongly Unstable Atmospheric Surface Layer
NASA Astrophysics Data System (ADS)
Ji, Yong; She, Zhen-Su
2017-11-01
We apply the structural ensemble dynamics (SED) theory to analyze the mean velocity and streamwise turbulence intensity distributions in the unstable atmospheric surface layer (ASL). The turbulent kinetic energy balance equation in the ASL asserts that above a critical height z_L, the buoyancy production cannot be neglected. The SED theory predicts that a stress length function displays a generalized scaling law transitioning from z to z^{4/3}. The z_L derived from observational data shows a two-regime form with Obukhov length L, including a linear dependence for moderate heat flux and a constant regime for large heat flux, extending the Monin-Obukhov similarity theory, which is only valid for large |L|. This two-regime description is further extended to model turbulence intensity, with a new similarity coordinate Lz such that the observational data collapse for all L. Finally, we propose a phase diagram for characterizing different ASL flow regimes, and the corresponding flow structures are discussed. In summary, a new similarity theory for the unstable atmosphere is constructed and validated by observational data of the mean velocity and streamwise turbulence intensity distributions for all heat flux regimes.
Holgado-Tello, Fco P; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana; Pérez-Gil, José A
2016-01-01
The Campbellian tradition provides a conceptual framework to assess threats to validity. On the other hand, different models of causal analysis have been developed to control estimation biases in different research designs. However, the link between design features, measurement issues, and concrete impact estimation analyses is weak. In order to provide an empirical solution to this problem, we use Structural Equation Modeling (SEM) as a first approximation to operationalize the analytical implications of threats to validity in quasi-experimental designs. Based on the analogies established between the Classical Test Theory (CTT) and causal analysis, we describe an empirical study based on SEM in which range restriction and statistical power have been simulated in two different models: (1) A multistate model in the control condition (pre-test); and (2) A single-trait-multistate model in the control condition (post-test), adding a new mediator latent exogenous (independent) variable that represents a threat to validity. Results show, empirically, how the differences between both the models could be partially or totally attributed to these threats. Therefore, SEM provides a useful tool to analyze the influence of potential threats to validity.
Nursing intellectual capital theory: operationalization and empirical validation of concepts.
Covell, Christine L; Sidani, Souraya
2013-08-01
To present the operationalization of concepts in the nursing intellectual capital theory and the results of a methodological study aimed at empirically validating the concepts. The nursing intellectual capital theory proposes that the stocks of nursing knowledge in an organization are embedded in two concepts, nursing human capital and nursing structural capital. The theory also proposes that two concepts in the work environment, nurse staffing and employer support for nursing continuing professional development, influence nursing human capital. A cross-sectional design. A systematic three-step process was used to operationalize the concepts of the theory. In 2008, data were collected for 147 inpatient units from administrative departments and unit managers in 6 Canadian hospitals. Exploratory factor analyses were conducted to determine whether the indicator variables accurately reflect their respective concepts. The proposed indicator variables collectively measured the nurse staffing concept. Three indicators were retained to construct the "nursing human capital: clinical expertise and experience" concept. The nursing structural capital and employer support for nursing continuing professional development concepts were not validated empirically. The nurse staffing and "nursing human capital: clinical expertise and experience" concepts will be brought forward for further model testing. Refinement of some of the indicator variables is indicated, and additional research with different sources of data is required to confirm the findings.
The formal verification of generic interpreters
NASA Technical Reports Server (NTRS)
Windley, P.; Levitt, K.; Cohen, G. C.
1991-01-01
Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied; this task concerns the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification, together with a hierarchical decomposition strategy for specifying microprocessors. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.
Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang
2015-10-29
Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. The calibration within the framework of thin plate theory undoubtedly has a higher accuracy and broader scope than that within the well-established beam theory. However, thin plate theory-based accurate analytic determination of the constant has been perceived as an extremely difficult issue. In this paper, we implement the thin plate theory-based analytic modeling for the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and Poisson effect play important roles in accurate determination of the spring constants. A quantitative scaling law is found that the normalized spring constant depends only on the Poisson's ratio, normalized dimension and normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as the benchmark for accurate calibration of rectangular AFM cantilevers.
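For contrast with the plate-theory calibration discussed above, the standard beam-theory estimate for the normal spring constant of a rectangular cantilever loaded at its free end is k = Ewt³/(4L³); a quick sketch (the dimensions are illustrative, not from the paper):

```python
def beam_spring_constant(E, w, t, L):
    """Euler-Bernoulli beam estimate for a rectangular cantilever loaded
    at its free end: k = E * w * t**3 / (4 * L**3). This ignores the
    three-dimensional and Poisson effects that the plate model captures,
    which is why the plate-theory calibration has a broader scope."""
    return E * w * t ** 3 / (4 * L ** 3)

# Typical silicon AFM cantilever (illustrative values):
# E = 169 GPa, w = 30 um, t = 2 um, L = 225 um.
k = beam_spring_constant(169e9, 30e-6, 2e-6, 225e-6)
print(round(k, 2))  # ~0.89 N/m
```

The strong t³ and L³ dependence is what makes calibration so sensitive to dimensional uncertainty, motivating the accurate plate-theory treatment.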
Thomas, Michael L
2012-03-01
There is growing evidence that psychiatric disorders maintain hierarchical associations where general and domain-specific factors play prominent roles (see D. Watson, 2005). Standard, unidimensional measurement models can fail to capture the meaningful nuances of such complex latent variable structures. The present study examined the ability of the multidimensional item response theory bifactor model (see R. D. Gibbons & D. R. Hedeker, 1992) to improve construct validity by serving as a bridge between measurement and clinical theories. Archival data consisting of 688 outpatients' psychiatric diagnoses and item-level responses to the Brief Symptom Inventory (BSI; L. R. Derogatis, 1993) were extracted from files at a university mental health clinic. The bifactor model demonstrated superior fit for the internal structure of the BSI and improved overall diagnostic accuracy in the sample (73%) compared with unidimensional (61%) and oblique simple structure (65%) models. Consistent with clinical theory, multiple sources of item variance were drawn from individual test items. Test developers and clinical researchers are encouraged to consider model-based measurement in the assessment of psychiatric distress.
Measuring Constructs in Family Science: How Can Item Response Theory Improve Precision and Validity?
Gordon, Rachel A.
2014-01-01
This article provides family scientists with an understanding of contemporary measurement perspectives and the ways in which item response theory (IRT) can be used to develop measures with desired evidence of precision and validity for research uses. The article offers a nontechnical introduction to some key features of IRT, including its orientation toward locating items along an underlying dimension and toward estimating precision of measurement for persons with different levels of that same construct. It also offers a didactic example of how the approach can be used to refine conceptualization and operationalization of constructs in the family sciences, using data from the National Longitudinal Survey of Youth 1979 (n = 2,732). Three basic models are considered: (a) the Rasch and (b) two-parameter logistic models for dichotomous items and (c) the Rating Scale Model for multicategory items. Throughout, the author highlights the potential for researchers to elevate measurement to a level on par with theorizing and testing about relationships among constructs. PMID:25663714
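The two-parameter logistic (2PL) model named above gives the probability of endorsing an item as P(θ) = 1/(1 + exp(−a(θ − b))), with the Rasch model as the special case a = 1; a minimal sketch:

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of endorsing an item for a
    person at latent trait level theta, given item discrimination a and
    item difficulty (location) b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def p_rasch(theta, b):
    """Rasch model: the 2PL with discrimination fixed at 1."""
    return p_2pl(theta, 1.0, b)

# At theta == b the endorsement probability is exactly 0.5 in both models;
# a larger discrimination a makes the curve steeper around b.
print(p_rasch(0.0, 0.0))                # 0.5
print(round(p_2pl(1.0, 2.0, 0.0), 3))   # 0.881
```

Locating items by their b parameters along the latent dimension is what allows IRT to report measurement precision separately for persons at different trait levels.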
Koller, Ingrid; Levenson, Michael R.; Glück, Judith
2017-01-01
The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777
A conceptual model of children's cognitive adaptation to physical disability.
Bernardo, M L
1982-11-01
Increasing numbers of children are being required to adapt to lifelong illness and disability. While numerous studies exist on theories of adaptation, reaction to illness, and children's concepts of self and of illness, an integrated view of children's ability to conceptualize themselves, their disabilities and possible adaptations has not been formulated. In this article an attempt has been made to integrate models of adaptation to disability and knowledge about children's cognitive development using Piagetian theory of cognitive development and Crate's stages of adaptation to chronic illness. This conceptually integrated model can be used as a departure point for studies to validate the applicability of Piaget's theory to the development of the physically disabled child and to clinically assess the adaptational stages available to the child at various developmental stages.
Field theories and fluids for an interacting dark sector
NASA Astrophysics Data System (ADS)
Carrillo González, Mariana; Trodden, Mark
2018-02-01
We consider the relationship between fluid models of an interacting dark sector and the field theoretical models that underlie such descriptions. This question is particularly important in light of suggestions that such interactions may help alleviate a number of current tensions between different cosmological datasets. We construct consistent field theory models for an interacting dark sector that behave exactly like the coupled fluid ones, even at the level of linear perturbations, and can be trusted deep in the nonlinear regime. As a specific example, we focus on the case of a Dirac-Born-Infeld (DBI) field conformally coupled to a quintessence field. We show that the fluid linear regime breaks down before the field gradients become large; this means that the field theory is valid inside a large region of the fluid nonlinear regime.
Nursing Care Interpersonal Relationship Questionnaire: elaboration and validation.
Borges, José Wicto Pereira; Moreira, Thereza Maria Magalhães; Andrade, Dalton Franscisco de
2018-01-08
To elaborate and validate an instrument for measuring the interpersonal relationship in nursing care through Item Response Theory. A methodological study that followed the three poles of psychometrics: theoretical, empirical, and analytical. The Nursing Care Interpersonal Relationship Questionnaire was developed in light of Imogene King's Interpersonal Conceptual Model, and its psychometric properties were studied through Item Response Theory in a sample of 950 patients attended in primary, secondary, and tertiary health care. The final instrument consisted of 31 items, with a Cronbach's alpha of 0.90 and a McDonald's omega of 0.92. The Item Response Theory parameters demonstrated high discrimination for 28 items, and a five-level interpretive scale was developed. At the first level, the communication process begins, gaining a wealth of interaction; subsequent levels demonstrate qualitatively the points of effectiveness of the interpersonal relationship, involving behaviors related to the concepts of transaction and interaction, followed by the concept of role. The instrument was created and proved consistent for measuring the interpersonal relationship in nursing care, presenting adequate reliability and validity parameters.
Invariant operators, orthogonal bases and correlators in general tensor models
NASA Astrophysics Data System (ADS)
Diaz, Pablo; Rey, Soo-Jong
2018-07-01
We study invariant operators in general tensor models. We show that representation theory provides an efficient framework to count and classify invariants in tensor models of (gauge) symmetry G_d = U(N_1) ⊗ ⋯ ⊗ U(N_d). As a continuation and completion of our earlier work, we present two natural ways of counting invariants, one for arbitrary G_d and another valid for large rank of G_d. We construct bases of invariant operators based on the counting, and compute correlators of their elements. The basis associated with finite rank of G_d diagonalizes the two-point function of the free theory. It is analogous to the restricted Schur basis used in matrix models. We show that the constructions become almost identical as we swap the Littlewood-Richardson numbers in multi-matrix models for Kronecker coefficients in general tensor models. We explore the parallelism between matrix models and tensor models in depth from the perspective of representation theory and comment on several ideas for future investigation.
Antoneli, Fernando; Ferreira, Renata C; Briones, Marcelo R S
2016-06-01
Here we propose a new approach to modeling gene expression based on the theory of random dynamical systems (RDS) that provides a general coupling prescription between the nodes of any given regulatory network, given that the dynamics of each node is modeled by an RDS. The main virtues of this approach are the following: (i) it provides a natural way to obtain arbitrarily large networks by coupling together simple basic pieces, thus revealing the modularity of regulatory networks; (ii) the assumptions about the stochastic processes used in the modeling are fairly general, in the sense that the only requirement is stationarity; (iii) there is a well-developed mathematical theory, which is a blend of smooth dynamical systems theory, ergodic theory and stochastic analysis, that allows one to extract relevant dynamical and statistical information without solving the system; (iv) one may obtain the classical rate equations from the corresponding stochastic version by averaging the dynamic random variables (small noise limit). It is important to emphasize that unlike the deterministic case, where coupling two equations is a trivial matter, coupling two RDS is non-trivial, especially in our case, where the coupling is performed between a state variable of one gene and the switching stochastic process of another gene and, hence, it is not a priori true that the resulting coupled system will satisfy the definition of a random dynamical system. We shall provide the necessary arguments that ensure that our coupling prescription does indeed furnish a coupled regulatory network of random dynamical systems. Finally, the fact that classical rate equations are the small noise limit of our stochastic model ensures that any validation or prediction made on the basis of the classical theory is also a validation or prediction of our model. We illustrate our framework with some simple examples of single-gene systems and network motifs. Copyright © 2016 Elsevier Inc. All rights reserved.
Brod, Meryl; Højbjerre, Lise; Adalsteinsson, Johan Erpur; Rasmussen, Michael Højby
2014-04-01
Approximately 50 000 adults in the United States are diagnosed with GH deficiency, which has negative impacts on cognitive functioning, psychological well-being, and quality of life. This paper presents the development and validation of a patient-reported outcome (PRO) measure, the Treatment-Related Impact Measure-Adult Growth Hormone Deficiency (TRIM-AGHD), developed to measure the impact of GH deficiency and its treatment. The development and validation of the TRIM-AGHD were conducted according to the Food and Drug Administration guidance on the development of PROs. Concept elicitation, conducted in three countries, included interviews with patients and clinical experts as well as a literature review. Qualitative data were analyzed based on grounded theory principles, and draft items were cognitively debriefed. The measure underwent psychometric validation in a US clinic-based population. An a priori statistical analysis plan included assessment of the measurement model, reliability, and validity. Item functioning was reviewed using item response theory analyses. Forty-eight patients and six clinical experts participated in concept elicitation, and 169 patients completed the validation study. Factor analysis resulted in four domains: energy level, physical health, emotional health, and cognitive ability. Item response theory confirmed adequate item fit and placement within each domain. Internal consistency ranged from 0.82 to 0.95 and test-retest reliability from 0.80 to 0.92. All prespecified hypotheses for convergent validity, and all but two for discriminant validity, were met. The final 26-item TRIM-AGHD can be considered a reliable and valid PRO measure of the impact of disease and treatment for adult GH deficiency.
Development and validation of a piloted simulation of a helicopter and external sling load
NASA Technical Reports Server (NTRS)
Shaughnessy, J. D.; Deaux, T. N.; Yenni, K. R.
1979-01-01
A generalized, real-time, piloted, visual simulation of a single-rotor helicopter, suspension system, and external load is described and validated for the full flight envelope of the U.S. Army CH-54 helicopter and cargo container as an example. The mathematical model uses modified nonlinear classical rotor theory for both the main rotor and tail rotor, nonlinear fuselage aerodynamics, an elastic suspension system, nonlinear load aerodynamics, and a load-ground contact model. The implementation of the mathematical model on a large digital computing system is described, and validation of the simulation is discussed. The mathematical model is validated by comparing measured flight data with simulated data, by comparing linearized system matrices, eigenvalues, and eigenvectors with manufacturers' data, and by the subjective comparison of handling characteristics by experienced pilots. A visual landing display system for use in simulation, which generates the pilot's forward-looking real-world display, is examined, and a special head-up, down-looking load/landing-zone display is described.
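One of the validation steps described, comparing linearized system matrices and their eigenvalues against manufacturers' data, can be sketched numerically. The 2x2 state matrices below are hypothetical placeholders, not CH-54 data:

```python
import numpy as np

# Hypothetical longitudinal-dynamics state matrices: one from the
# simulation's linearization, one from (assumed) manufacturer data.
A_sim = np.array([[-0.02,  0.10],
                  [-0.30, -1.50]])
A_ref = np.array([[-0.021,  0.11],
                  [-0.29,  -1.48]])

eig_sim = np.sort_complex(np.linalg.eigvals(A_sim))
eig_ref = np.sort_complex(np.linalg.eigvals(A_ref))

# per-mode relative discrepancy between simulated and reference eigenvalues
rel_err = np.abs(eig_sim - eig_ref) / np.abs(eig_ref)
```

Agreement of mode frequencies and damping (eigenvalue real/imaginary parts) within a tolerance is the quantitative half of such a check; pilot evaluation supplies the subjective half.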
Bernard, Larry C
2010-04-01
There are few multidimensional measures of individual differences in motivation available. The Assessment of Individual Motives-Questionnaire assesses 15 putative dimensions of motivation. The dimensions are based on evolutionary theory and preliminary evidence suggests the motive scales have good psychometric properties. The scales are reliable and there is evidence of their consensual validity (convergence of self-other ratings) and behavioral validity (relationships with self-other reported behaviors of social importance). Additional validity research is necessary, however, especially with respect to current models of personality. The present study tested two general and 24 specific hypotheses based on proposed evolutionary advantages/disadvantages and fitness benefits/costs of the five-factor model of personality together with the new motive scales in a sample of 424 participants (M age=28.8 yr., SD=14.6). Results were largely supportive of the hypotheses. These results support the validity of new motive dimensions and increase understanding of the five-factor model of personality.
An Analysis of Measured Pressure Signatures From Two Theory-Validation Low-Boom Models
NASA Technical Reports Server (NTRS)
Mack, Robert J.
2003-01-01
Two wing/fuselage/nacelle/fin concepts were designed to check the validity and applicability of sonic-boom minimization theory, sonic-boom analysis methods, and the low-boom design methodology in use at the end of the 1980s. Models of these concepts were built, and the pressure signatures they generated were measured in the wind tunnel. The results of these measurements led to three conclusions: (1) the existing methods could adequately predict sonic-boom characteristics of wing/fuselage/fin(s) configurations if the equivalent area distributions of each component were smooth and continuous; (2) these methods needed revision so that engine-nacelle volume and nacelle-wing interference-lift disturbances could be accurately predicted; and (3) current nacelle-configuration integration methods had to be updated. With these changes in place, the existing sonic-boom analysis and minimization methods could be effectively applied to supersonic-cruise concepts for acceptable/tolerable sonic-boom overpressures during cruise.
Riemannian geometry of Hamiltonian chaos: hints for a general theory.
Cerruti-Sola, Monica; Ciraolo, Guido; Franzosi, Roberto; Pettini, Marco
2008-10-01
We aim at assessing the validity limits of some simplifying hypotheses that, within a Riemannian geometric framework, have provided an explanation of the origin of Hamiltonian chaos and have made it possible to develop a method of analytically computing the largest Lyapunov exponent of Hamiltonian systems with many degrees of freedom. Therefore, numerical hypothesis testing has been performed for the Fermi-Pasta-Ulam beta model and for a chain of coupled rotators. These models, for which analytic computations of the largest Lyapunov exponents have been carried out in the mentioned Riemannian geometric framework, appear as paradigmatic examples for unveiling why the main hypothesis of quasi-isotropy of the mechanical manifolds sometimes breaks down. The breakdown is expected whenever the topology of the mechanical manifolds is nontrivial. This is an important step forward in view of developing a geometric theory of Hamiltonian chaos of general validity.
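The largest Lyapunov exponent against which such analytic computations are checked is usually estimated numerically by evolving a tangent vector with the Jacobian and averaging its log-growth (a Benettin-type scheme). A minimal sketch on the Chirikov standard map, used here only as a simple stand-in for the Hamiltonian systems discussed:

```python
import math

def standard_map_step(x, p, K):
    """One iteration of the Chirikov standard map."""
    p = (p + K * math.sin(x)) % (2 * math.pi)
    x = (x + p) % (2 * math.pi)
    return x, p

def largest_lyapunov(K, n_steps=20000):
    """Benettin-type estimate: push a tangent vector through the map's
    Jacobian, renormalize each step, and average the log of its growth."""
    x, p = 1.0, 0.5
    dx, dp = 1.0, 0.0
    log_sum = 0.0
    for _ in range(n_steps):
        # Jacobian of (p' = p + K sin x, x' = x + p') applied to (dx, dp)
        dp_new = dp + K * math.cos(x) * dx
        dx_new = dx + dp_new
        x, p = standard_map_step(x, p, K)
        norm = math.hypot(dx_new, dp_new)
        log_sum += math.log(norm)
        dx, dp = dx_new / norm, dp_new / norm
    return log_sum / n_steps

lam = largest_lyapunov(6.0)  # strongly chaotic regime
```

For large K the estimate approaches ln(K/2), a standard consistency check on the implementation.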
Cooperative Learning: Improving University Instruction by Basing Practice on Validated Theory
ERIC Educational Resources Information Center
Johnson, David W.; Johnson, Roger T.; Smith, Karl A.
2014-01-01
Cooperative learning is an example of how theory validated by research may be applied to instructional practice. The major theoretical base for cooperative learning is social interdependence theory. It provides clear definitions of cooperative, competitive, and individualistic learning. Hundreds of research studies have validated its basic…
ERIC Educational Resources Information Center
AL-Dossary, Saeed Abdullah
2017-01-01
Cheating on tests is a serious problem in education. The purpose of this study was to test the efficacy of a modified form of the theory of planned behavior (TPB) to predict cheating behavior among a sample of Saudi university students. This study also sought to test the influence of cheating in high school on cheating in college within the…
The music of gold: can gold counterfeited coins be detected by ear?
NASA Astrophysics Data System (ADS)
Manas, Arnaud
2015-07-01
In this paper I investigate whether, and to what extent, counterfeit coins can be detected by their sound frequency. I describe the different types of counterfeit coins encountered and their respective characteristics. I then use Kirchhoff thin-plate theory to model a coin, and confirm the validity of the theory by listening to the tone of genuine and counterfeit coins.
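In Kirchhoff thin-plate theory, a coin-like free disc rings at f = (λ²/2πa²)·√(D/ρh), with bending rigidity D = Eh³/12(1−ν²). The sketch below compares a gold blank with a tungsten one (a classic counterfeiting scenario); the coin dimensions, material constants, and the modal constant λ² are rough assumptions for illustration, not values from the paper:

```python
import math

def plate_frequency(E, nu, rho, radius, thickness, lam2=5.253):
    """Fundamental flexural frequency (Hz) of a thin free circular plate
    per Kirchhoff theory; lam2 is an approximate modal constant for the
    lowest free-edge mode."""
    D = E * thickness**3 / (12.0 * (1.0 - nu**2))  # bending rigidity
    return lam2 / (2.0 * math.pi * radius**2) * math.sqrt(D / (rho * thickness))

# hypothetical 22 mm diameter, 1.5 mm thick blanks
f_gold = plate_frequency(E=79e9, nu=0.42, rho=19300, radius=0.011, thickness=0.0015)
f_tungsten = plate_frequency(E=411e9, nu=0.28, rho=19250, radius=0.011, thickness=0.0015)
```

Tungsten nearly matches gold's density but is far stiffer, so its blank rings noticeably higher, which is why the ear (or a frequency measurement) can separate the two.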
Continuum-mechanics-based rheological formulation for debris flow
Chen, Cheng-lung; Ling, Chi-Hai
1993-01-01
This paper aims to assess the validity of the generalized viscoplastic fluid (GVF) model in the light of both the classical relative-viscosity versus concentration relation and the dimensionless stress versus shear-rate squared relations based on kinetic theory, thereby addressing how to evaluate the rheological parameters of the GVF model using Bagnold's data.
ERIC Educational Resources Information Center
Bergner, Yoav; Droschler, Stefan; Kortemeyer, Gerd; Rayyan, Saif; Seaton, Daniel; Pritchard, David E.
2012-01-01
We apply collaborative filtering (CF) to dichotomously scored student response data (right, wrong, or no interaction), finding optimal parameters for each student and item based on cross-validated prediction accuracy. The approach is naturally suited to comparing different models, both unidimensional and multidimensional in ability, including a…
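A cross-validated fit to dichotomously scored response data can be sketched with a one-parameter logistic (Rasch-like) model trained by gradient ascent on observed cells and scored on held-out cells. This is an illustrative stand-in with synthetic data, not the authors' collaborative-filtering algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_items = 60, 40

# synthetic truth: P(correct) = sigmoid(ability - difficulty)
theta_true = rng.normal(0, 1, n_students)
b_true = rng.normal(0, 1, n_items)
p_true = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
X = (rng.random((n_students, n_items)) < p_true).astype(float)

train = rng.random(X.shape) < 0.8  # cell-wise train/validation split

theta = np.zeros(n_students)
b = np.zeros(n_items)
for _ in range(500):
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    g = (X - p) * train  # Bernoulli log-likelihood gradient, train cells only
    theta += 0.05 * g.sum(axis=1) / train.sum(axis=1)
    b -= 0.05 * g.sum(axis=0) / train.sum(axis=0)

p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
acc = ((p > 0.5) == (X > 0.5))[~train].mean()  # held-out prediction accuracy
```

Comparing held-out accuracy across unidimensional and multidimensional variants is the model-selection logic the abstract describes.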
Formulaic Language in Computer-Supported Communication: Theory Meets Reality.
ERIC Educational Resources Information Center
Wray, Alison
2002-01-01
Attempts to validate a psycholinguistic model of language processing. One application designed to provide insight into the model is TALK, a system developed to promote conversational fluency in non-speaking individuals, designed primarily for people with cerebral palsy and motor neuron disease. TALK is demonstrated to be a viable tool for…
NASA Astrophysics Data System (ADS)
Tscharnuter, W. M.
1980-02-01
Modes and model concepts of star formation are reviewed, beginning with the theory of Kant (1755), via Newton's exact mathematical formulation of the laws of motion and his recognition of the universal validity of gravitation, to modern concepts and hypotheses. Axisymmetric and spherically symmetric collapse models are discussed, and the origin of double and multiple star systems is examined.
ERIC Educational Resources Information Center
Peeraer, Jef; Van Petegem, Peter
2012-01-01
This research describes the development and validation of an instrument to measure integration of Information and Communication Technology (ICT) in education. After literature research on definitions of integration of ICT in education, a comparison is made between the classical test theory and the item response modeling approach for the…
Changes in Arctic Sea Ice Thickness and Floe Size
NASA Astrophysics Data System (ADS)
Zhang, J.; Schweiger, A. J. B.; Stern, H. L., III; Steele, M.
2016-12-01
A thickness, floe size, and enthalpy distribution sea ice model was implemented into the Pan-arctic Ice-Ocean Modeling and Assimilation System (PIOMAS) by coupling the Zhang et al. [2015] sea ice floe size distribution (FSD) theory with the Thorndike et al. [1975] ice thickness distribution (ITD) theory in order to explicitly simulate multicategory FSD and ITD simultaneously. A range of ice thickness and floe size observations were used for model calibration and validation. The expanded, validated PIOMAS was used to study sea ice response to atmospheric and oceanic changes in the Arctic, focusing on the interannual variability and trends of ice thickness and floe size over the period 1979-2015. It is found that over the study period both ice thickness and floe size have been decreasing steadily in the Arctic. The simulated ice thickness shows considerable spatiotemporal variability in recent years. As the ice cover becomes thinner and weaker, the model simulates an increasing number of small floes (at the low end of the FSD), which affects sea ice properties, particularly in the marginal ice zone.
NASA Astrophysics Data System (ADS)
Cotté, B.
2018-05-01
This study proposes to couple a source model based on Amiet's theory and a parabolic equation code in order to model wind turbine noise emission and propagation in an inhomogeneous atmosphere. Two broadband noise generation mechanisms are considered, namely trailing edge noise and turbulent inflow noise. The effects of wind shear and atmospheric turbulence are taken into account using the Monin-Obukhov similarity theory. The coupling approach, based on the backpropagation method to preserve the directivity of the aeroacoustic sources, is validated by comparison with an analytical solution for the propagation over a finite impedance ground in a homogeneous atmosphere. The influence of refraction effects is then analyzed for different directions of propagation. The spectrum modification related to the ground effect and the presence of a shadow zone for upwind receivers are emphasized. The validity of the point source approximation that is often used in wind turbine noise propagation models is finally assessed. This approximation exaggerates the interference dips in the spectra, and is not able to correctly predict the amplitude modulation.
Non-linear assessment and deficiency of linear relationship for healthcare industry
NASA Astrophysics Data System (ADS)
Nordin, N.; Abdullah, M. M. A. B.; Razak, R. C.
2017-09-01
This paper presents the development of a non-linear service satisfaction model that assumes patients are not necessarily satisfied or dissatisfied with good or poor service delivery. Accordingly, compliment and complaint assessments are considered simultaneously. Non-linear service satisfaction instruments, called Kano-Q and Kano-SS, are developed based on the Kano model and the Theory of Quality Attributes (TQA) to map unexpected, hidden and unspoken patient satisfaction and dissatisfaction onto service quality attributes. A new Kano-Q and Kano-SS algorithm for quality attribute assessment is developed based on satisfaction impact theories, and the instruments were found to pass reliability and validity tests. The results were also validated with the standard Kano model procedure before the Kano model and Quality Function Deployment (QFD) were integrated for patient-attribute and service-attribute prioritization. An algorithm for the Kano-QFD matrix operation is developed to compose the prioritized complaint and compliment indexes. Finally, the prioritized service attributes are mapped to service delivery categories to determine which aspect of service delivery the healthcare provider should improve first.
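In the standard Kano procedure, compliment and complaint indexes correspond to the "better"/"worse" indexes computed from the Kano evaluation table over paired functional/dysfunctional answers. The sketch below assumes the conventional table and invented survey answers; it is not the paper's Kano-Q/Kano-SS algorithm:

```python
from collections import Counter

# Standard Kano evaluation table: (functional, dysfunctional) answer pair
# -> category. Answers coded 1 "like" .. 5 "dislike". Categories:
# A attractive, O one-dimensional, M must-be, I indifferent, R reverse,
# Q questionable.
KANO_TABLE = {
    (1, 1): "Q", (1, 2): "A", (1, 3): "A", (1, 4): "A", (1, 5): "O",
    (2, 1): "R", (2, 2): "I", (2, 3): "I", (2, 4): "I", (2, 5): "M",
    (3, 1): "R", (3, 2): "I", (3, 3): "I", (3, 4): "I", (3, 5): "M",
    (4, 1): "R", (4, 2): "I", (4, 3): "I", (4, 4): "I", (4, 5): "M",
    (5, 1): "R", (5, 2): "R", (5, 3): "R", (5, 4): "R", (5, 5): "Q",
}

def kano_indexes(answers):
    """Better/worse (compliment/complaint) indexes for one attribute from
    a list of (functional, dysfunctional) answer pairs."""
    counts = Counter(KANO_TABLE[pair] for pair in answers)
    a, o, m, i = (counts[c] for c in "AOMI")
    denom = a + o + m + i
    better = (a + o) / denom    # compliment potential, 0..1
    worse = -(o + m) / denom    # complaint potential, -1..0
    return better, worse

# invented answers for one attribute, e.g. "short waiting time"
pairs = [(1, 5), (1, 4), (2, 5), (1, 5), (3, 4), (1, 3)]
better, worse = kano_indexes(pairs)
```

Attributes with high |worse| are complaint drivers; high better values mark compliment opportunities, which is the prioritization input the QFD step consumes.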
NASA Astrophysics Data System (ADS)
Fredette, Luke; Singh, Rajendra
2017-02-01
A spectral element approach is proposed to determine the multi-axis dynamic stiffness terms of elastomeric isolators with fractional damping over a broad range of frequencies. The dynamic properties of a class of cylindrical isolators are modeled by using the continuous system theory in terms of homogeneous rods or Timoshenko beams. The transfer matrix type dynamic stiffness expressions are developed from exact harmonic solutions given translational or rotational displacement excitations. Broadband dynamic stiffness magnitudes (say up to 5 kHz) are computationally verified for axial, torsional, shear, flexural, and coupled stiffness terms using a finite element model. Some discrepancies are found between finite element and spectral element models for the axial and flexural motions, illustrating certain limitations of each method. Experimental validation is provided for an isolator with two cylindrical elements (that work primarily in the shear mode) using dynamic measurements, as reported in the prior literature, up to 600 Hz. Superiority of the fractional damping formulation over structural or viscous damping models is illustrated via experimental validation. Finally, the strengths and limitations of the spectral element approach are briefly discussed.
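A transfer-matrix-type dynamic stiffness for a continuous rod with fractional damping can be sketched as follows. The fractional Kelvin-Voigt modulus E* = E(1 + b(iω)^α), the boundary conditions, and all material numbers here are assumptions for illustration, not the paper's exact isolator formulation (which also covers shear, torsion, and flexure):

```python
import cmath
import math

def axial_dynamic_stiffness(omega, E, rho, A, L, b, alpha):
    """Driving-point axial stiffness of a continuous rod with its far end
    fixed: K(w) = E* A k cot(kL), complex wavenumber k = w sqrt(rho/E*),
    fractional Kelvin-Voigt modulus E* = E (1 + b (i w)^alpha)."""
    E_star = E * (1.0 + b * (1j * omega) ** alpha)
    k = omega * cmath.sqrt(rho / E_star)
    return E_star * A * k / cmath.tan(k * L)

# hypothetical rubber cylinder: 20 mm diameter, 30 mm long
A = math.pi * 0.01 ** 2
E, rho, L = 5e6, 1100.0, 0.03
K_low = axial_dynamic_stiffness(2 * math.pi * 10, E, rho, A, L, b=0.01, alpha=0.6)
K_high = axial_dynamic_stiffness(2 * math.pi * 600, E, rho, A, L, b=0.01, alpha=0.6)

static = E * A / L  # low-frequency magnitude should approach this
```

The fractional exponent α gives the weak, broadband frequency dependence of stiffness and damping that integer-order viscous or structural models cannot reproduce with a single parameter set.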
Hattori, Masasi
2016-12-01
This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
Trait-specific dependence in romantic relationships.
Ellis, Bruce J; Simpson, Jeffry A; Campbell, Lorne
2002-10-01
Informed by three theoretical frameworks--trait psychology, evolutionary psychology, and interdependence theory--we report four investigations designed to develop and test the reliability and validity of a new construct and accompanying multiscale inventory, the Trait-Specific Dependence Inventory (TSDI). The TSDI assesses comparisons between present and alternative romantic partners on major dimensions of mate value. In Study 1, principal components analyses revealed that the provisional pool of theory-generated TSDI items were represented by six factors: Agreeable/Committed, Resource Accruing Potential, Physical Prowess, Emotional Stability, Surgency, and Physical Attractiveness. In Study 2, confirmatory factor analysis replicated these results on a different sample and tested how well different structural models fit the data. Study 3 provided evidence for the convergent and discriminant validity of the six TSDI scales by correlating each one with a matched personality trait scale that did not explicitly incorporate comparisons between partners. Study 4 provided further validation evidence, revealing that the six TSDI scales successfully predicted three relationship outcome measures--love, time investment, and anger/upset--above and beyond matched sets of traditional personality trait measures. These results suggest that the TSDI is a reliable, valid, and unique construct that represents a new trait-specific method of assessing dependence in romantic relationships. The construct of trait-specific dependence is introduced and linked with other theories of mate value.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korayem, M. H.; Khaksar, H.; Taheri, M.
2013-11-14
This article deals with the development and modeling of various contact theories for biological nanoparticles shaped as cylinders and circular crowned rollers, for application in the manipulation of different biological micro/nanoparticles based on the Atomic Force Microscope. First, the effective contact forces were simulated, and their impact on contact mechanics simulation was investigated. In the next step, the Hertz contact model was simulated and compared for gold and DNA nanoparticles with the three types of contact geometry: spherical, cylindrical, and circular crowned roller. Then, by reducing the length of the cylindrical section in the circular crowned roller geometry, the geometry of the body was made to approach that of a sphere, and the results were compared for DNA nanoparticles. To provisionally validate the developed theories, the results of the cylindrical and circular crowned roller contacts were compared with the results of the existing spherical contact simulations. Following the development of these contact models for the manipulation of various biological micro/nanoparticles, the cylindrical and circular crowned roller type contact theories were modeled based on the theories of Lundberg, Dowson, Nikpur, Heoprich, and Hertz. Then, for a more accurate validation, the results obtained from the simulations were compared with those obtained by the finite element method and with the experimental results available in previous articles. Previous research on the simulation of nanomanipulation has mainly investigated the contact theories used in the manipulation of spherical micro/nanoparticles.
However, since in real biomanipulation situations biological micro/nanoparticles of more complex shapes need to be displaced in biological environments, this article models and compares, for the first time, different contact theories for use in the biomanipulation of cylindrical and circular crowned roller shaped micro/nanoparticles. The results of the models indicate that the Hertz contact model yields the largest deformation for the DNA nanoparticle in cylindrical form, and the Heoprich contact model yields the largest deformation for the circular crowned roller shaped DNA. This finding is not always true for the other nanoparticles; depending on the mechanical and environmental characteristics, different results can be obtained. Also, by comparing the deformations of different types of nanoparticles, it was determined that the platelet-type nanoparticles display the highest degree of deformation in all the considered models, due to their particular mechanical characteristics.
Ketcham, Jonathan D; Kuminoff, Nicolai V; Powers, Christopher A
2016-12-01
Consumers' enrollment decisions in Medicare Part D can be explained by Abaluck and Gruber’s (2011) model of utility maximization with psychological biases or by a neoclassical version of their model that precludes such biases. We evaluate these competing hypotheses by applying nonparametric tests of utility maximization and model validation tests to administrative data. We find that 79 percent of enrollment decisions from 2006 to 2010 satisfied basic axioms of consumer theory under the assumption of full information. The validation tests provide evidence against widespread psychological biases. In particular, we find that precluding psychological biases improves the structural model's out-of-sample predictions for consumer behavior.
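The "basic axioms of consumer theory" in such nonparametric tests are typically checked via GARP (the Generalized Axiom of Revealed Preference): build the direct revealed-preference relation from observed prices and choices, take its transitive closure, and look for violations. A small sketch with invented two-good data (not the Medicare Part D data):

```python
import itertools

def satisfies_garp(prices, choices):
    """GARP check: x_i is directly revealed preferred to x_j when
    p_i . x_i >= p_i . x_j. After the transitive closure, a violation is
    x_i revealed preferred to x_j while x_j is strictly cheaper than x_i
    at prices p_j (p_j . x_j > p_j . x_i)."""
    n = len(choices)
    cost = [[sum(p * x for p, x in zip(prices[i], choices[j]))
             for j in range(n)] for i in range(n)]
    R = [[cost[i][i] >= cost[i][j] for j in range(n)] for i in range(n)]
    for k, i, j in itertools.product(range(n), repeat=3):  # Floyd-Warshall
        R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return all(not (R[i][j] and cost[j][j] > cost[j][i])
               for i in range(n) for j in range(n))

# invented two-good example: consistent choices under swapped prices
prices = [(1.0, 2.0), (2.0, 1.0)]
choices = [(4.0, 1.0), (1.0, 4.0)]
consistent = satisfies_garp(prices, choices)
```

The share of choice sequences passing such a test is the kind of statistic behind the "79 percent of enrollment decisions" figure.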
Juarez, Juan C; Brown, David M; Young, David W
2014-05-19
Current Strehl ratio models for actively compensated free-space optical communications terminals do not accurately predict system performance under strong turbulence conditions as they are based on weak turbulence theory. For evaluation of compensated systems, we present an approach for simulating the Strehl ratio with both low-order (tip/tilt) and higher-order (adaptive optics) correction. Our simulation results are then compared to the published models and their range of turbulence validity is assessed. Finally, we propose a new Strehl ratio model and antenna gain equation that are valid for general turbulence conditions independent of the degree of compensation.
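A common weak-turbulence baseline that such a general model would be compared against is the extended Maréchal approximation S = exp(−σ²), with Noll's residual phase variance σ² = c·(D/r0)^(5/3) after removing low-order Zernike modes. The sketch below uses Noll's classic coefficients as an assumed baseline; it is not the authors' proposed model:

```python
import math

# Noll residual-variance coefficients: number of Zernike modes removed
# -> c in sigma^2 = c * (D/r0)^(5/3)
NOLL_RESIDUAL = {0: 1.0299, 2: 0.134}  # 0 = no correction, 2 = tip/tilt removed

def strehl_marechal(D_over_r0, modes_removed=0):
    """Extended Marechal estimate S = exp(-sigma^2); weak turbulence only."""
    c = NOLL_RESIDUAL[modes_removed]
    return math.exp(-c * D_over_r0 ** (5.0 / 3.0))

s_none = strehl_marechal(1.5)                    # uncompensated aperture
s_tilt = strehl_marechal(1.5, modes_removed=2)   # after tip/tilt correction
```

Because σ² grows without bound in strong turbulence, this estimate collapses there, which is precisely the regime the abstract says existing models fail to cover.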
New Phenomena in NC Field Theory and Emergent Spacetime Geometry
NASA Astrophysics Data System (ADS)
Ydri, Badis
2010-10-01
We give a brief review of two nonperturbative phenomena typical of noncommutative field theory which are known to lead to the perturbative instability known as the UV-IR mixing. The first phenomenon concerns the emergence/evaporation of spacetime geometry in matrix models which describe perturbative noncommutative gauge theory on fuzzy backgrounds. In particular we show that the transition from a geometrical background to a matrix phase makes the description of noncommutative gauge theory in terms of fields via the Weyl map only valid below a critical value g*. The second phenomenon concerns the appearance of a nonuniform ordered phase in noncommutative scalar φ4 field theory and the spontaneous symmetry breaking of translational/rotational invariance which happens even in two dimensions. We argue that this phenomenon also originates in the underlying matrix degrees of freedom of the noncommutative field theory. Furthermore it is conjectured that in addition to the usual Wilson-Fisher (WF) fixed point at θ = 0 there must exist a novel fixed point at θ = ∞ corresponding to the quartic hermitian matrix model.
Validation of the Sexual Orientation Microaggression Inventory In Two Diverse Samples of LGBTQ Youth
Swann, Gregory; Minshew, Reese; Newcomb, Michael E.; Mustanski, Brian
2016-01-01
Critical race theory asserts that microaggressions, or low-level, covert acts of aggression, are commonplace in the lives of people of color. These theorists also assert a taxonomy of microaggressions, which includes “microassaults,” “microinsults,” and “microinvalidations.” The theory of microaggressions has been adopted by researchers of LGBTQ communities. This study investigated the three-factor taxonomy as it relates to a diverse sample of LGBTQ youth using the newly developed Sexual Orientation Microaggression Inventory (SOMI). Exploratory factor analysis was used to determine the number of factors that exist in SOMI in a sample of 206 LGBTQ-identifying youth. Follow up confirmatory factor analyses (CFAs) were conducted in order to compare single factor, unrestricted four factor, second order, and bi-factor models in a separate sample of 363 young men who have sex with men. The best fitting model was used to predict victimization, depressive symptoms, and depression diagnosis in order to test validity. The best fitting model was a bi-factor model utilizing 19 of the original 26 items with a general factor and four specific factors representing anti-gay attitudes (“microinsults”), denial of homosexuality, heterosexism (“microinvalidations”), and societal disapproval (“microassaults”). Reliability analyses found that the majority of reliable variance was accounted for by the general factor. The general factor was a significant predictor of victimization and depressive symptoms, as well as unrelated to social desirability, suggesting convergent, criterion-related, and discriminant validity. SOMI emerged as a scale with evidence of validity for assessing exposure to microaggressions in a diverse sample of LGBTQ youth. PMID:27067241
A philosophy for big-bang cosmology.
McCrea, W H
1970-10-03
According to recent developments in cosmology we seem bound to find a model universe like the observed universe, almost independently of how we suppose it started. Such ideas, if valid, provide fresh justification for the procedures of current cosmological theory.
Development and construct validity of the Classroom Strategies Scale-Observer Form.
Reddy, Linda A; Fabiano, Gregory; Dudek, Christopher M; Hsu, Louis
2013-12-01
Research on progress monitoring has almost exclusively focused on student behavior and not on teacher practices. This article presents the development and validation of a new teacher observational assessment of classroom instructional and behavioral management practices, the Classroom Strategies Scale (CSS). The theoretical underpinnings and empirical basis for the instructional and behavioral management scales are presented. The CSS evidenced overall good reliability estimates, including internal consistency, interrater reliability, test-retest reliability, and freedom from item bias on important teacher demographics (age, educational degree, years of teaching experience). Confirmatory factor analyses (CFAs) of CSS data from 317 classrooms were carried out to assess the level of empirical support for (a) a theory positing four first-order factors of teachers' instructional practices, and (b) a theory positing four first-order factors of teachers' behavior management practices. Several fit indices indicated acceptable fit of the (a) and (b) CFA models to the data, as well as acceptable fit of less parsimonious alternative CFA models that included one or two second-order factors. Information-theory-based indices generally suggested that the (a) and (b) CFA models fit better than some more parsimonious alternative CFA models that included constraints on relations of first-order factors. Overall, CFA first-order and higher-order factor results support the CSS-Observer Total, Composite, and subscales. Suggestions for future measurement development efforts are outlined. PsycINFO Database Record (c) 2013 APA, all rights reserved.
Osmotic pressure beyond concentration restrictions.
Grattoni, Alessandro; Merlo, Manuele; Ferrari, Mauro
2007-10-11
Osmosis is a fundamental physical process that involves the transit of solvent molecules across a membrane separating two liquid solutions. Osmosis plays a role in many biological processes such as fluid exchange in animal cells (Cell Biochem. Biophys. 2005, 42, 277-345; J. Periodontol. 2007, 78, 757-763) and water transport in plants. It is also involved in many technological applications such as drug delivery systems (Crit. Rev. Ther. Drug. 2004, 21, 477-520; J. Micro-Electromech. Syst. 2004, 13, 75-82) and water purification. Extensive attention has been dedicated in the past to the modeling of osmosis, starting with the classical theories of van't Hoff and Morse. These are predictive, in the sense that they do not involve adjustable parameters; however, they are directly applicable only to limited regimes of dilute solute concentrations. Extensions beyond the domains of validity of these classical theories have required recourse to fitting parameters, transitioning therefore to semiempirical, or nonpredictive, models. A novel approach was presented by Granik et al., which is not a priori restricted in concentration domains, presents no adjustable parameters, and is mechanistic, in the sense that it is based on a coupled diffusion model. In this work, we examine the validity of predictive theories of osmosis by comparison with our new experimental results and a meta-analysis of literature data.
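The classical van't Hoff law mentioned above is simple enough to state in a few lines, which also makes its dilute-solution restriction easy to see. A minimal sketch, with an arbitrarily chosen solute and conditions:

```python
R = 8.314  # gas constant, J/(mol*K)

def vant_hoff_pressure(molarity_mol_per_L: float, T_kelvin: float, i: float = 1.0) -> float:
    """Osmotic pressure in pascals from the classical van't Hoff law,
    Pi = i * c * R * T. Predictive (no fitted parameters) but, as the
    abstract notes, valid only for dilute solutions."""
    c = molarity_mol_per_L * 1000.0  # convert mol/L to mol/m^3
    return i * c * R * T_kelvin

# 0.1 M sucrose (non-dissociating, van't Hoff factor i = 1) at 298 K
pi = vant_hoff_pressure(0.1, 298.0)
print(f"{pi/101325:.2f} atm")  # → 2.45 atm
```

At higher concentrations the linear dependence on c breaks down, which is exactly where the fitted semiempirical extensions, or mechanistic models like Granik et al.'s, take over.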
Farkas, József; Kovács, László Á; Gáspár, László; Nafz, Anna; Gaszner, Tamás; Ujvári, Balázs; Kormos, Viktória; Csernus, Valér; Hashimoto, Hitoshi; Reglődi, Dóra; Gaszner, Balázs
2017-06-23
Major depression is a common cause of chronic disability. Despite decades of efforts, no unequivocally accepted animal model is available for studying depression. We tested the validity of a new model based on the three-hit concept of vulnerability and resilience. Genetic predisposition (hit 1, mutation of the pituitary adenylate cyclase-activating polypeptide, PACAP, gene), early-life adversity (hit 2, 180-min maternal deprivation, MD180) and chronic variable mild stress (hit 3, CVMS) were combined. Physical, endocrinological, behavioral and functional morphological tools were used to validate the model. Body- and adrenal-weight changes as well as corticosterone titers proved that CVMS was effective. The forced swim test indicated increased depression in CVMS PACAP heterozygous (Hz) mice with MD180 history, accompanied by an elevated anxiety level in the marble burying test. Corticotropin-releasing factor neurons in the oval division of the bed nucleus of the stria terminalis showed increased FosB expression, which was refractive to CVMS exposure in wild-type and Hz mice. Urocortin1 neurons became over-active in CVMS-exposed PACAP knockout (KO) mice with MD180 history, suggesting the contribution of the centrally projecting Edinger-Westphal nucleus to the reduced depression and anxiety level of stressed KO mice. Serotoninergic neurons of the dorsal raphe nucleus lost their ability to adapt to CVMS in MD180 mice. In conclusion, the construct and face validity criteria suggest that MD180 PACAP Hz mice on a CD1 background upon CVMS may be used as a reliable model for the three-hit theory. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
From Planck Data to Planck Era: Observational Tests of Holographic Cosmology
NASA Astrophysics Data System (ADS)
Afshordi, Niayesh; Corianò, Claudio; Delle Rose, Luigi; Gould, Elizabeth; Skenderis, Kostas
2017-01-01
We test a class of holographic models for the very early Universe against cosmological observations and find that they are competitive to the standard cold dark matter model with a cosmological constant (Λ CDM ) of cosmology. These models are based on three-dimensional perturbative superrenormalizable quantum field theory (QFT), and, while they predict a different power spectrum from the standard power law used in Λ CDM , they still provide an excellent fit to the data (within their regime of validity). By comparing the Bayesian evidence for the models, we find that Λ CDM does a better job globally, while the holographic models provide a (marginally) better fit to the data without very low multipoles (i.e., l ≲30 ), where the QFT becomes nonperturbative. Observations can be used to exclude some QFT models, while we also find models satisfying all phenomenological constraints: The data rule out the dual theory being a Yang-Mills theory coupled to fermions only but allow for a Yang-Mills theory coupled to nonminimal scalars with quartic interactions. Lattice simulations of 3D QFTs can provide nonperturbative predictions for large-angle statistics of the cosmic microwave background and potentially explain its apparent anomalies.
Computing decay rates for new physics theories with FEYNRULES and MADGRAPH 5_AMC@NLO
NASA Astrophysics Data System (ADS)
Alwall, Johan; Duhr, Claude; Fuks, Benjamin; Mattelaer, Olivier; Öztürk, Deniz Gizem; Shen, Chia-Hsien
2015-12-01
We present new features of the FEYNRULES and MADGRAPH 5_AMC@NLO programs for the automatic computation of decay widths that consistently include channels of arbitrary final-state multiplicity. The implementations are generic enough so that they can be used in the framework of any quantum field theory, possibly including higher-dimensional operators. We extend at the same time the conventions of the Universal FEYNRULES Output (or UFO) format to include decay tables and information on the total widths. We finally provide a set of representative examples of the usage of the new functions of the different codes in the framework of the Standard Model, the Higgs Effective Field Theory, the Strongly Interacting Light Higgs model and the Minimal Supersymmetric Standard Model and compare the results to available literature and programs for validation purposes.
SurfKin: an ab initio kinetic code for modeling surface reactions.
Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K
2014-10-05
In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
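Transition state theory, one of the rate-constant methods the abstract names, can be sketched in its simplest Eyring form. The barrier height and temperature below are hypothetical, and SurfKin's actual prefactors come from DFT-derived partition functions rather than this unit-prefactor form.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314            # gas constant, J/(mol*K)

def tst_rate(barrier_J_per_mol: float, T: float) -> float:
    """Eyring-form transition-state-theory rate constant,
    k = (kB*T/h) * exp(-Ea / (R*T)), with the transmission coefficient
    and partition-function ratio folded into a unit prefactor.
    Illustrative sketch only, not SurfKin's implementation."""
    return (KB * T / H) * math.exp(-barrier_J_per_mol / (R * T))

# Hypothetical 100 kJ/mol surface-reaction barrier at 800 K
k = tst_rate(100e3, 800.0)
print(f"k = {k:.3g} 1/s")
```

Raising the temperature or lowering the barrier increases the rate exponentially, which is why barrier heights from DFT dominate the kinetics of such microkinetic mechanisms.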
Recent modelling advances for ultrasonic TOFD inspections
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darmon, Michel; Ferrand, Adrien; Dorval, Vincent
The ultrasonic TOFD (Time of Flight Diffraction) technique is commonly used to detect and characterize disoriented cracks using their edge diffraction echoes. An overview of the models integrated in the CIVA software platform and devoted to TOFD simulation is presented. CIVA can predict diffraction echoes from complex 3D flaws using a model based on the PTD (Physical Theory of Diffraction). Other dedicated developments have been added to simulate lateral waves in 3D on planar entry surfaces and in 2D on irregular surfaces by a ray approach. Calibration echoes from Side Drilled Holes (SDHs), specimen echoes, and shadowing effects from flaws can also be modelled. Some examples of theoretical validation of the models are presented. In addition, experimental validations have been performed both on planar blocks containing calibration holes and various notches and on a specimen with an irregular entry surface, allowing conclusions to be drawn on the validity of all the developed models.
ERIC Educational Resources Information Center
Romine, William L.; Walter, Emily M.; Bosse, Ephiram; Todd, Amber N.
2017-01-01
We validate the Measure of Acceptance of the Theory of Evolution (MATE) on undergraduate students using the Rasch model and utilize the MATE to explore qualitatively how students express their acceptance of evolution. At least 24 studies have used the MATE, most with the assumption that it is unidimensional. However, we found that the MATE is best…
Implicit theories of writing and their impact on students' response to a SRSD intervention.
Limpo, Teresa; Alves, Rui A
2014-12-01
In the field of intelligence research, it has been shown that some people conceive intelligence as a fixed trait that cannot be changed (entity beliefs), whereas others conceive it as a malleable trait that can be developed (incremental beliefs). What about writing? Do people hold similar implicit theories about the nature of their writing ability? Furthermore, are these beliefs likely to influence students' response to a writing intervention? We aimed to develop a scale to measure students' implicit theories of writing (pilot study) and to test whether these beliefs influence strategy-instruction effectiveness (intervention study). The pilot and intervention studies included 128 and 192 students (Grades 5-6), respectively. Based on existing instruments that measure self-theories of intelligence, we developed the Implicit Theories of Writing (ITW) scale, which was tested with the pilot sample. In the intervention study, 109 students received planning instruction based on the self-regulated strategy development model, whereas 83 students received standard writing instruction. Students were evaluated before, in the middle of, and after instruction. The ITW's validity was supported by the piloting results and their successful cross-validation in the intervention study, in which intervention students wrote longer and better texts than control students. Moreover, latent growth curve modelling showed that the more the intervention students conceived writing as a malleable skill, the more the quality of their texts improved. This research is of educational relevance because it provides a measure to evaluate students' implicit theories of writing and shows their impact on response to intervention. © 2014 The British Psychological Society.
ERIC Educational Resources Information Center
Gomez, Laura E.; Arias, Benito; Verdugo, Miguel Angel; Navas, Patricia
2012-01-01
Background: Most instruments that assess quality of life have been validated by means of the classical test theory (CTT). However, CTT limitations have resulted in the development of alternative models, such as the Rasch rating scale model (RSM). The main goal of this paper is testing and improving the psychometric properties of the INTEGRAL…
ERIC Educational Resources Information Center
Kaufman, Alan S.
1984-01-01
A response to 13 articles on the Kaufman Assessment Battery for Children addresses seven areas: validity, theory underlying the intelligence portion, role of the clinicians in intellectual assessment, distinction between ability and achievement, evaluation of alternate models, remedial applications of the sequential-simultaneous processing…
ERIC Educational Resources Information Center
Watson, Kathy; Baranowski, Tom; Thompson, Debbe
2006-01-01
Perceived self-efficacy (SE) for eating fruit and vegetables (FV) is a key variable mediating FV change in interventions. This study applies item response modeling (IRM) to a fruit, juice and vegetable self-efficacy questionnaire (FVSEQ) previously validated with classical test theory (CTT) procedures. The 24-item (five-point Likert scale) FVSEQ…
ERIC Educational Resources Information Center
Ebert, Ashlee A.
2009-01-01
Ehri's developmental model of word recognition outlines early reading development that spans from the use of logos to advanced knowledge of oral and written language to read words. Henderson's developmental spelling theory presents stages of word knowledge that progress in a similar manner to Ehri's phases. The purpose of this research study was…
Linking Outcomes from Peabody Picture Vocabulary Test Forms Using Item Response Models
ERIC Educational Resources Information Center
Hoffman, Lesa; Templin, Jonathan; Rice, Mabel L.
2012-01-01
Purpose: The present work describes how vocabulary ability as assessed by 3 different forms of the Peabody Picture Vocabulary Test (PPVT; Dunn & Dunn, 1997) can be placed on a common latent metric through item response theory (IRT) modeling, by which valid comparisons of ability between samples or over time can then be made. Method: Responses…
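The IRT linking idea above rests on modeling each item's response probability as a function of ability on a common latent metric, so that the same ability value yields comparable expected performance on any calibrated form. A minimal sketch of the two-parameter logistic model follows; the item parameters are invented for illustration, not PPVT calibrations.

```python
import math

def irt_2pl(theta: float, a: float, b: float) -> float:
    """Two-parameter-logistic IRT probability of a correct response:
    P(theta) = 1 / (1 + exp(-a * (theta - b))),
    where a is item discrimination and b is item difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# For an examinee of average ability (theta = 0), an easy item (b = -1.0)
# and a hard item (b = 1.5) give very different success probabilities.
# The (a, b) values are hypothetical, chosen only to illustrate the model.
p_easy = irt_2pl(theta=0.0, a=1.2, b=-1.0)
p_hard = irt_2pl(theta=0.0, a=1.2, b=1.5)
print(round(p_easy, 3), round(p_hard, 3))  # → 0.769 0.142
```

Once items from different forms are placed on the same theta scale, comparing examinees tested with different forms reduces to comparing their theta estimates.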
Gaudioso, Fulvio; Turel, Ofir; Galimberti, Carlo
2015-01-01
The purpose of this study is to theoretically develop and empirically examine a general coping theory model which explicates the indirect effects of key job-related techno-stressors on job exhaustion. Through this study, we show that techno-stress creators are detrimental to employee well-being and should be treated accordingly. Specifically, we first argue that key techno-stress creators on the job, namely techno-invasion and techno-overload, drive unpleasant states such as work-family conflict and distress. Next, we rely on general coping theory and argue that people respond to these states differently, but with both adaptive and maladaptive technology-specific coping strategies. Adaptive coping behaviors are argued to ultimately reduce work exhaustion, and maladaptive coping strategies are argued to increase it. The proposed model was tested and validated with structural equation modeling techniques applied to self-reported data obtained from a sample of 242 employees of a large organization in the United States. Implications for theory and practice are discussed.
Implicit theories of a desire for fame.
Maltby, John; Day, Liz; Giles, David; Gillett, Raphael; Quick, Marianne; Langcaster-James, Honey; Linley, P Alex
2008-05-01
The aim of the present studies was to generate implicit theories of a desire for fame among the general population. In Study 1, we developed a factor-analytic model of conceptions of the desire to be famous that initially comprised nine separate factors: ambition, meaning derived through comparison with others, psychological vulnerability, attention seeking, conceitedness, social access, altruism, positive affect, and glamour. Analysis that sought to examine replicability among these factors suggested that three factors (altruism, positive affect, and glamour) displayed neither factor congruence nor adequate internal reliability. A second study examined the validity of these factors in predicting profiles of individuals who may desire fame. The findings from this study suggested that two of the nine factors (positive affect and altruism) could not be considered strong factors within the model. Overall, the findings suggest that implicit theories of a desire for fame comprise six factors. The discussion focuses on how an implicit model of a desire for fame might progress into formal theories of a desire for fame.
Konik, R. M.; Palmai, T.; Takacs, G.; ...
2015-08-24
We study the SU(2)_k Wess-Zumino-Novikov-Witten (WZNW) theory perturbed by the trace of the primary field in the adjoint representation, a theory governing the low-energy behaviour of a class of strongly correlated electronic systems. While the model is non-integrable, its dynamics can be investigated using the numerical technique of the truncated conformal spectrum approach combined with numerical and analytical renormalization groups (TCSA+RG). The numerical results so obtained provide support for a semiclassical analysis valid at k ≫ 1. Namely, we find that the low-energy behavior is sensitive to the sign of the coupling constant, λ. Moreover, for λ > 0 this behavior depends on whether k is even or odd. With k even, we find definitive evidence that the model at low energies is equivalent to the massive O(3) sigma model. For k odd, the numerical evidence is more equivocal, but we find indications that the low-energy effective theory is critical.
Coherence bandwidth loss in transionospheric radio propagation
NASA Technical Reports Server (NTRS)
Rino, C. L.; Gonzalez, V. H.; Hessing, A. R.
1980-01-01
In this report a theoretical model is developed that predicts the single-point, two-frequency coherence function for transionospheric radio waves. The theoretical model is compared to measured complex frequency correlation coefficients using data from the seven equispaced, phase-coherent UHF signals transmitted by the Wideband satellite. The theory and data are in excellent agreement. The theory is critically dependent upon the power-law index, and the frequency coherence data clearly favor the comparatively small spectral indices that have been consistently measured from the Wideband satellite phase data. A model for estimating the pulse delay jitter induced by the coherence bandwidth loss is also developed and compared with the actual delay jitter observed on synthesized pulses obtained from the Wideband UHF comb. The results are in good agreement with the theory. The results presented in this report, which are based on an asymptotic theory, are compared with the more commonly used quadratic theory. The model developed and validated in this report can be used to predict the effects of coherence bandwidth loss in disturbed nuclear environments. Simple formulas for the resultant pulse delay jitter are derived that can be used in predictive codes.
A New Higher-Order Composite Theory for Analysis and Design of High Speed Tilt-Rotor Blades
NASA Technical Reports Server (NTRS)
McCarthy, Thomas Robert
1996-01-01
A higher-order theory is developed to model composite box beams with arbitrary wall thicknesses. The theory, based on a refined displacement field, represents a three-dimensional model which approximates the elasticity solution. Therefore, the cross-sectional properties are not reduced to one-dimensional beam parameters. Both inplane and out-of-plane warping are automatically included in the formulation. The model accurately captures the transverse shear stresses through the thickness of each wall while satisfying all stress-free boundary conditions. Several numerical results are presented to validate the present theory. The developed theory is then used to model the load carrying member of a tilt-rotor blade which has thick-walled sections. The composite structural analysis is coupled with an aerodynamic analysis to compute the aeroelastic stability of the blade. Finally, a multidisciplinary optimization procedure is developed to improve the aerodynamic, structural and aeroelastic performance of the tilt-rotor aircraft. The Kreisselmeier-Steinhauser function is used to formulate the multiobjective function problem and a hybrid approximate analysis is used to reduce the computational effort. The optimum results are compared with the baseline values and show significant improvements in the overall performance of the tilt-rotor blade.
Validation of the Physician Teaching Motivation Questionnaire (PTMQ).
Dybowski, Christoph; Harendza, Sigrid
2015-10-02
Physicians play a major role as teachers in undergraduate medical education. Studies indicate that different forms and degrees of motivation can influence work performance in general and that teachers' motivation to teach can influence students' academic achievements in particular. Therefore, the aim of this study was to develop and to validate an instrument measuring teaching motivations in hospital-based physicians. We chose self-determination theory as a theoretical framework for item and scale development. It distinguishes between different dimensions of motivation depending on the amount of self-regulation and autonomy involved and its empirical evidence has been demonstrated in other areas of research. To validate the new instrument (PTMQ = Physician Teaching Motivation Questionnaire), we used data from a sample of 247 physicians from internal medicine and surgery at six German medical faculties. Structural equation modelling was conducted to confirm the factorial structure, correlation analyses and linear regressions were performed to examine concurrent and incremental validity. Structural equation modelling confirmed a good global fit for the factorial structure of the final instrument (RMSEA = .050, TLI = .957, SRMR = .055, CFI = .966). Cronbach's alphas indicated good internal consistencies for all scales (α = .75 - .89) except for the identified teaching motivation subscale with an acceptable internal consistency (α = .65). Tests of concurrent validity with global work motivation, perceived teaching competence, perceived teaching involvement and voluntariness of lesson allocation delivered theory-consistent results with slight deviations for some scales. Incremental validity over global work motivation in predicting perceived teaching involvement was also confirmed. Our results indicate that the PTMQ is a reliable, valid and therefore suitable instrument for assessing physicians' teaching motivation.
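Of the fit indices reported above, the RMSEA has a simple closed form given a model chi-square, degrees of freedom, and sample size. A sketch with hypothetical chi-square and df values (only the sample size N = 247 comes from the study):

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root-mean-square error of approximation:
    RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    Values around .05-.06 or below are conventionally read as good fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Hypothetical chi-square for a model tested on N = 247 respondents;
# chi2 and df below are invented, not the PTMQ results.
print(round(rmsea(chi2=300.0, df=200, n=247), 3))  # → 0.045
```

A model whose chi-square does not exceed its degrees of freedom gets RMSEA = 0, which is why the max(..., 0) guard appears in the formula.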
Multisample cross-validation of a model of childhood posttraumatic stress disorder symptomatology.
Anthony, Jason L; Lonigan, Christopher J; Vernberg, Eric M; La Greca, Annette M; Silverman, Wendy K; Prinstein, Mitchell J
2005-12-01
This study is the latest advancement of our research aimed at best characterizing children's posttraumatic stress reactions. In a previous study, we compared existing nosologic and empirical models of PTSD dimensionality and determined the superior model was a hierarchical one with three symptom clusters (Intrusion/Active Avoidance, Numbing/Passive Avoidance, and Arousal; Anthony, Lonigan, & Hecht, 1999). In this study, we cross-validate this model in two populations. Participants were 396 fifth graders who were exposed to either Hurricane Andrew or Hurricane Hugo. Multisample confirmatory factor analysis demonstrated the model's factorial invariance across populations who experienced traumatic events that differed in severity. These results show the model's robustness to characterize children's posttraumatic stress reactions. Implications for diagnosis, classification criteria, and an empirically supported theory of PTSD are discussed.
NASA Technical Reports Server (NTRS)
Hart-Smith, L. J.
1992-01-01
The irrelevance of most composite failure criteria to conventional fiber-polymer composites is claimed to have remained undetected primarily because the experiments that can either validate or disprove them are difficult to perform. Uniaxial tests are considered inherently incapable of validating or refuting any composite failure theory because so much of the total load is carried by the fibers aligned in the direction of the load. The Ten-Percent Rule, a simple rule-of-mixtures analysis method, is said to work well only because of this phenomenon. It is stated that failure criteria can be verified for fibrous composites only by biaxial tests, with orthogonal in-plane stresses of the same as well as different signs, because these particular states of combined stress reveal substantial differences between the predictions of laminate strength made by various theories. Three scientifically plausible failure models for fibrous composites are compared, and it is shown that only the in-plane shear test (orthogonal tension and compression) is capable of distinguishing between them. This is because most theories are 'calibrated' against the measured uniaxial tension and compression tests and any cross-plied laminate tests dominated by those same states of stress must inevitably 'confirm' the theory.
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Need Theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession - client, health/oral health, environment and dental hygiene actions - and includes eleven validated human needs that have evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
From atoms to steps: The microscopic origins of crystal evolution
NASA Astrophysics Data System (ADS)
Patrone, Paul N.; Einstein, T. L.; Margetis, Dionisios
2014-07-01
The Burton-Cabrera-Frank (BCF) theory of crystal growth has been successful in describing a wide range of phenomena in surface physics. Typical crystal surfaces are slightly misoriented with respect to a facet plane; thus, the BCF theory views such systems as composed of staircase-like structures of steps separating terraces. Adsorbed atoms (adatoms), which are represented by a continuous density, diffuse on terraces, and steps move by absorbing or emitting these adatoms. Here we shed light on the microscopic origins of the BCF theory by deriving a simple, one-dimensional (1D) version of the theory from an atomistic, kinetic restricted solid-on-solid (KRSOS) model without external material deposition. We define the time-dependent adatom density and step position as appropriate ensemble averages in the KRSOS model, thereby exposing the non-equilibrium statistical mechanics origins of the BCF theory. Our analysis reveals that the BCF theory is valid in a low adatom-density regime, much in the same way that an ideal gas approximation applies to dilute gases. We find conditions under which the surface remains in a low-density regime and discuss the microscopic origin of corrections to the BCF model.
Physical Projections in BRST Treatments of Reparametrization Invariant Theories
NASA Astrophysics Data System (ADS)
Marnelius, Robert; Sandström, Niclas
Any regular quantum mechanical system may be cast into an Abelian gauge theory by simply reformulating it as a reparametrization invariant theory. We present a detailed study of the BRST quantization of such reparametrization invariant theories within a precise operator version of BRST which is related to the conventional BFV path integral formulation. Our treatments lead us to propose general rules for how physical wave functions and physical propagators are to be projected from the BRST singlets and propagators in the ghost extended BRST theory. These projections are performed by boundary conditions which are specified by the ingredients of BRST charge and precisely determined by the operator BRST. We demonstrate explicitly the validity of these rules for the considered class of models.
The direct simulation of acoustics on Earth, Mars, and Titan.
Hanford, Amanda D; Long, Lyle N
2009-02-01
With the recent success of the Huygens lander on Titan, a moon of Saturn, there has been renewed interest in further exploring the acoustic environments of the other planets in the solar system. The direct simulation Monte Carlo (DSMC) method is used here for modeling sound propagation in the atmospheres of Earth, Mars, and Titan at a variety of altitudes above the surface. DSMC is a particle method that describes gas dynamics through direct physical modeling of particle motions and collisions. The validity of DSMC for the entire range of Knudsen numbers (Kn), where Kn is defined as the mean free path divided by the wavelength, allows for the exploration of sound propagation in planetary environments for all values of Kn. DSMC results at a variety of altitudes on Earth, Mars, and Titan including the details of nonlinearity, absorption, dispersion, and molecular relaxation in gas mixtures are given for a wide range of Kn showing agreement with various continuum theories at low Kn and deviation from continuum theory at high Kn. Despite large computation time and memory requirements, DSMC is the method best suited to study high altitude effects or where continuum theory is not valid.
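The Knudsen number used throughout the abstract can be estimated from kinetic theory using a hard-sphere mean free path. A sketch with air-like values; the molecular diameter and the chosen conditions are illustrative assumptions, not inputs from the paper.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def knudsen(T: float, p: float, wavelength: float, d: float = 3.7e-10) -> float:
    """Kn = mean free path / acoustic wavelength, with the hard-sphere
    mean free path lambda = kB*T / (sqrt(2) * pi * d^2 * p).
    The default molecular diameter d is a rough air-like value."""
    mfp = KB * T / (math.sqrt(2) * math.pi * d**2 * p)
    return mfp / wavelength

# 1 kHz sound near Earth's surface: c ~ 340 m/s gives a 0.34 m wavelength.
kn_surface = knudsen(T=288.0, p=101325.0, wavelength=0.34)
print(f"Kn ~ {kn_surface:.1e}")  # deep in the continuum regime (Kn << 1)
```

As pressure drops with altitude the mean free path grows, pushing Kn toward and past unity, which is the regime where the continuum theories fail and particle methods like DSMC remain valid.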
Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D
2016-08-01
Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
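The mediation logic described above (standardized path coefficients estimated from pooled correlations, with indirect effects as products of paths) can be sketched for a single attitude-intention-behavior chain. The correlations below are invented for illustration, not the meta-analytic estimates from the paper.

```python
import numpy as np

# Hypothetical pooled correlations among attitude (A), intention (I),
# and behavior (B); these numbers are made up for illustration.
r_AI, r_AB, r_IB = 0.5, 0.3, 0.45

# Path A -> I: with a single standardized predictor, beta equals r.
b_AI = r_AI

# Paths for B regressed on A and I jointly: solve the normal
# equations R * beta = r for the standardized coefficients.
R = np.array([[1.0, r_AI],
              [r_AI, 1.0]])
b_AB_direct, b_IB = np.linalg.solve(R, np.array([r_AB, r_IB]))

indirect = b_AI * b_IB           # mediated effect A -> I -> B
total = b_AB_direct + indirect   # recovers r_AB in this saturated model
print(round(indirect, 3), round(total, 3))  # → 0.2 0.3
```

Decomposing the total effect this way is what lets a meta-analytic path analysis say whether an antecedent works through intention or retains a direct effect, which a table of bivariate correlations alone cannot.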
Kang, Xiaofeng; Dennison Himmelfarb, Cheryl R; Li, Zheng; Zhang, Jian; Lv, Rong; Guo, Jinyu
2015-01-01
The Self-care of Heart Failure Index (SCHFI) is an empirically tested instrument for measuring the self-care of patients with heart failure. The aim of this study was to develop a simplified Chinese version of the SCHFI and provide evidence for its construct validity. A total of 182 Chinese patients with heart failure were surveyed. A 2-step structural equation modeling procedure was applied to test construct validity. Factor analysis showed 3 factors explaining 43% of the variance. The structural equation model confirmed that self-care maintenance, self-care management, and self-care confidence are indeed indicators of self-care, and that self-care confidence was a positive and equally strong predictor of self-care maintenance and self-care management. Moreover, self-care scores were correlated with the Partners in Health Scale, indicating satisfactory concurrent validity. The Chinese version of the SCHFI is a theory-based instrument for assessing self-care of Chinese patients with heart failure.
Criticality Calculations with MCNP6 - Practical Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2016-11-29
These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The following are the lecture topics: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: develop an input model for MCNP; describe how cross section data impact Monte Carlo and deterministic codes; describe the importance of validation of computer codes and how it is accomplished; describe the methodology supporting Monte Carlo codes and deterministic codes; describe pitfalls of Monte Carlo calculations; and discuss the strengths and weaknesses of Monte Carlo and discrete ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, identify a fissile system for which a diffusion theory solution would be adequate.
[Health promotion. Instrument development for the application of the theory of planned behavior].
Lee, Y O
1993-01-01
The purpose of this article is to describe the operationalization of the Theory of Planned Behavior (TPB). The quest to understand determinants of health behaviors has intensified as evidence accumulates concerning the impact of personal behavior on health. The majority of theory-based research has used the Health Belief Model (HBM). The HBM components have had limited success in explaining health-related behaviors. There are several advantages of the TPB over the HBM. The TPB is an expansion of the Theory of Reasoned Action (TRA) with the addition of the construct of perceived behavioral control. The revised model has been shown to yield greater explanatory power than the original TRA for goal-directed behaviors. The process of TPB instrument development is described, using an example from a study of smoking cessation behavior in military smokers. This is followed by a discussion of reliability and validity issues in operationalizing the TPB. The TPB is a useful model for understanding and predicting health-related behaviors when carefully operationalized. The model holds promise in the development of prescriptive nursing approaches.
Nursing Care Interpersonal Relationship Questionnaire: elaboration and validation 1
Borges, José Wicto Pereira; Moreira, Thereza Maria Magalhães; de Andrade, Dalton Francisco
2018-01-01
ABSTRACT Objective: to elaborate and validate an instrument for measuring the interpersonal relationship in nursing care through Item Response Theory. Method: methodological study, which followed the three poles of psychometry: theoretical, empirical and analytical. The Nursing Care Interpersonal Relationship Questionnaire was developed in light of Imogene King's Interpersonal Conceptual Model, and its psychometric properties were studied through Item Response Theory in a sample of 950 patients attended in Primary, Secondary and Tertiary Health Care. Results: the final instrument consisted of 31 items, with a Cronbach's alpha of 0.90 and a McDonald's omega of 0.92. The Item Response Theory parameters demonstrated high discrimination for 28 items, and a five-level interpretive scale was developed. At the first level, the communication process begins, gaining a wealth of interaction. Subsequent levels demonstrate qualitatively the points of effectiveness of the interpersonal relationship, involving behaviors related to the concepts of transaction and interaction, followed by the concept of role. Conclusion: the instrument was created and proved consistent for measuring the interpersonal relationship in nursing care, as it presented adequate reliability and validity parameters. PMID:29319743
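Cronbach's alpha, the internal-consistency index reported in this and several of the surrounding abstracts, has a short closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch on invented toy scores (not data from the study):

```python
# Cronbach's alpha from first principles. The item scores below are
# invented toy data, not the questionnaire data from the study.
def cronbach_alpha(items):
    """items: one list of scores per item, all over the same respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):                        # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    total = [sum(item[j] for item in items) for j in range(n)]
    item_var = sum(var(item) for item in items)
    return k / (k - 1) * (1 - item_var / var(total))

# 3 items, 4 respondents (illustrative only).
scores = [[2, 4, 3, 5],
          [3, 5, 3, 4],
          [2, 4, 4, 5]]
print(round(cronbach_alpha(scores), 3))   # -> 0.875
```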
Implementation of a Smeared Crack Band Model in a Micromechanics Framework
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Bednarcyk, Brett A.; Waas, Anthony M.; Arnold, Steven M.
2012-01-01
The smeared crack band theory is implemented within the generalized method of cells and high-fidelity generalized method of cells micromechanics models to capture progressive failure within the constituents of a composite material while retaining objectivity with respect to the size of the discretization elements used in the model. A repeating unit cell containing 13 randomly arranged fibers is modeled and subjected to a combination of transverse tension/compression and transverse shear loading. The implementation is verified against experimental data (where available) and against an equivalent finite element model utilizing the same implementation of the crack band theory. To evaluate the performance of the crack band theory within a repeating unit cell that is more amenable to a multiscale implementation, a single fiber is modeled with the generalized method of cells and high-fidelity generalized method of cells using a relatively coarse subcell mesh, which is subjected to the same loading scenarios as the multiple-fiber repeating unit cell. The generalized method of cells and high-fidelity generalized method of cells models are validated against a very refined finite element model.
The Trunk Impairment Scale - modified to ordinal scales in the Norwegian version.
Gjelsvik, Bente; Breivik, Kyrre; Verheyden, Geert; Smedal, Tori; Hofstad, Håkon; Strand, Liv Inger
2012-01-01
To translate the Trunk Impairment Scale (TIS), a measure of trunk control in patients after stroke, into Norwegian (TIS-NV), and to explore its construct validity, internal consistency, intertester and test-retest reliability. TIS was translated according to international guidelines. The validity study was performed on data from 201 patients with acute stroke. Fifty patients with stroke and acquired brain injury were recruited to examine intertester and test-retest reliability. Construct validity was analyzed with exploratory and confirmatory factor analysis and item response theory, internal consistency with Cronbach's alpha test, and intertester and test-retest reliability with kappa and intraclass correlation coefficient tests. The back-translated version of TIS-NV was validated by the original developer. The subscale Static sitting balance was removed. By combining items from the subscales Dynamic sitting balance and Coordination, six ordinal superitems (testlets) were constructed. The TIS-NV was renamed the modified TIS-NV (TIS-modNV). After modifications the TIS-modNV fitted well to a locally dependent unidimensional item response theory model. It demonstrated good construct validity, excellent internal consistency, and high intertester and test-retest reliability for the total score. This study supports that the TIS-modNV is a valid and reliable scale for use in clinical practice and research.
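Intertester agreement of the kind reported for the TIS-NV is commonly quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal sketch on invented ratings (not the study's data):

```python
# Cohen's kappa for two raters scoring the same cases on ordinal categories.
# The ratings below are invented toy data.
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    cats = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_exp = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

a = [0, 1, 2, 2, 1, 0, 1, 2]
b = [0, 1, 2, 1, 1, 0, 1, 2]
print(round(cohens_kappa(a, b), 3))   # raw agreement 7/8, chance-corrected lower
```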
Yang-Baxter deformations of W2,4 × T1,1 and the associated T-dual models
NASA Astrophysics Data System (ADS)
Sakamoto, Jun-ichi; Yoshida, Kentaroh
2017-08-01
Recently, for principal chiral models and symmetric coset sigma models, Hoare and Tseytlin proposed an interesting conjecture that the Yang-Baxter deformations with the homogeneous classical Yang-Baxter equation are equivalent to non-abelian T-dualities with topological terms. It is significant to examine this conjecture for non-symmetric (i.e., non-integrable) cases. Such an example is the W2,4 × T1,1 background. In this note, we study Yang-Baxter deformations of type IIB string theory defined on W2,4 × T1,1 and the associated T-dual models, and show that this conjecture is valid even for this case. Our result indicates that the conjecture would be valid beyond integrability.
Li, Tsung-Lung; Lu, Wen-Cai
2015-10-05
In this work, Koopmans' theorem for Kohn-Sham density functional theory (KS-DFT) is applied to the photoemission spectra (PES) modeling over the entire valence-band. To examine the validity of this application, a PES modeling scheme is developed to facilitate a full valence-band comparison of theoretical PES spectra with experiments. The PES model incorporates the variations of electron ionization cross-sections over atomic orbitals and a linear dispersion of spectral broadening widths. KS-DFT simulations of pristine rubrene (5,6,11,12-tetraphenyltetracene) and potassium-rubrene complex are performed, and the simulation results are used as the input to the PES models. Two conclusions are reached. First, decompositions of the theoretical total spectra show that the dissociated electron of the potassium mainly remains on the backbone and has little effect on the electronic structures of phenyl side groups. This and other electronic-structure results deduced from the spectral decompositions have been qualitatively obtained with the anionic approximation to potassium-rubrene complexes. The qualitative validity of the anionic approximation is thus verified. Second, comparison of the theoretical PES with the experiments shows that the full-scale simulations combined with the PES modeling methods greatly enhance the agreement on spectral shapes over the anionic approximation. This agreement of the theoretical PES spectra with the experiments over the full valence-band can be regarded, to some extent, as a collective validation of the application of Koopmans' theorem for KS-DFT to valence-band PES, at least, for this hydrocarbon and its alkali-adsorbed complex. Copyright © 2015 Elsevier B.V. All rights reserved.
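The PES modeling scheme described above reduces, at its core, to broadening each occupied level with a Gaussian weighted by an ionization cross-section, with widths following a linear dispersion. The sketch below illustrates that structure only; all eigenvalues, weights, and width parameters are invented, not taken from the rubrene calculations.

```python
import math

# Toy PES broadening model: each level contributes a normalized Gaussian
# weighted by a relative cross-section, with a width that grows linearly
# with binding energy. All numbers here are invented for illustration.
levels = [-3.2, -4.1, -5.0, -6.8]       # hypothetical KS eigenvalues (eV)
weights = [1.0, 0.8, 1.2, 0.6]          # hypothetical cross-section weights

def width(e, w0=0.3, slope=0.05):
    return w0 + slope * abs(e)          # linear dispersion of widths (eV)

def spectrum(E):
    total = 0.0
    for e, w in zip(levels, weights):
        s = width(e)
        total += w * math.exp(-(E - e) ** 2 / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
    return total

grid = [-8.0 + 0.05 * k for k in range(121)]    # binding-energy grid, -8 to -2 eV
intensities = [spectrum(E) for E in grid]
print(round(max(intensities), 3))
```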
Statistical turbulence theory and turbulence phenomenology
NASA Technical Reports Server (NTRS)
Herring, J. R.
1973-01-01
The application of deductive turbulence theory to assessing the validity of turbulence phenomenology at the level of second-order, single-point moments is considered. Particular emphasis is placed on the phenomenological formula relating the dissipation to the turbulence energy and on the Rotta-type formula for the return to isotropy. Methods which deal directly with most or all of the scales of motion explicitly are reviewed briefly. The statistical theory of turbulence is presented as an expansion about randomness. Two concepts are involved: (1) a modeling of the turbulence as nearly multipoint Gaussian, and (2) a simultaneous introduction of a generalized eddy viscosity operator.
Development and validation of instrument for ergonomic evaluation of tablet arm chairs
Tirloni, Adriana Seára; dos Reis, Diogo Cunha; Bornia, Antonio Cezar; de Andrade, Dalton Francisco; Borgatto, Adriano Ferreti; Moro, Antônio Renato Pereira
2016-01-01
The purpose of this study was to develop and validate an evaluation instrument for tablet arm chairs based on ergonomic requirements, focused on user perceptions and using Item Response Theory (IRT). This exploratory study involved 1,633 participants (university students and professors) in four steps: a pilot study (n=26), semantic validation (n=430), content validation (n=11) and construct validation (n=1,166). Samejima's graded response model was applied to validate the instrument. The results showed that all the steps (theoretical and practical) of the instrument's development and validation processes were successful and that the group of remaining items (n=45) had a high consistency (0.95). This instrument can be used in the furniture industry by engineers and product designers and in the purchasing process of tablet arm chairs for schools, universities and auditoriums. PMID:28337099
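Samejima's graded response model, used for the construct validation above, models the cumulative probability of responding in category k or higher as a logistic function of the latent trait, so category probabilities are differences of adjacent cumulative curves. A minimal sketch with a hypothetical item (the discrimination and threshold values are invented):

```python
import math

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def grm_category_probs(theta, a, bs):
    """Samejima graded response model.
    theta: latent trait; a: discrimination; bs: ordered thresholds b_1 < ... < b_{m-1}.
    Returns the probabilities of responding in each of the m ordered categories."""
    cum = [1.0] + [logistic(a * (theta - b)) for b in bs] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(bs) + 1)]

# Hypothetical 4-category item: discrimination a = 2.0, thresholds -1, 0, 1.
probs = grm_category_probs(theta=0.5, a=2.0, bs=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])   # category probabilities, summing to 1
```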
Optimal Decision Stimuli for Risky Choice Experiments: An Adaptive Approach.
Cavagnaro, Daniel R; Gonzalez, Richard; Myung, Jay I; Pitt, Mark A
2013-02-01
Collecting data to discriminate between models of risky choice requires careful selection of decision stimuli. Models of decision making aim to predict decisions across a wide range of possible stimuli, but practical limitations force experimenters to select only a handful of them for actual testing. Some stimuli are more diagnostic between models than others, so the choice of stimuli is critical. This paper provides the theoretical background and a methodological framework for adaptive selection of optimal stimuli for discriminating among models of risky choice. The approach, called Adaptive Design Optimization (ADO), adapts the stimulus in each experimental trial based on the results of the preceding trials. We demonstrate the validity of the approach with simulation studies aiming to discriminate Expected Utility, Weighted Expected Utility, Original Prospect Theory, and Cumulative Prospect Theory models.
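The intuition behind stimulus diagnosticity can be shown with a drastically simplified stand-in for ADO: among candidate gamble pairs, prefer the one on which two models disagree most. Real ADO maximizes expected information gain over model posteriors trial by trial; this sketch only ranks stimuli by the gap between model predictions, and all parameter values are assumptions.

```python
import math

def expected_utility(gamble, alpha=0.8):
    # EU with a power utility; alpha is an assumed parameter.
    return sum(p * (x ** alpha) for p, x in gamble)

def prospect_value(gamble, alpha=0.8, gamma=0.6):
    # Prospect-theory-style value with a simple probability-weighting function.
    def w(p):
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    return sum(w(p) * (x ** alpha) for p, x in gamble)

def choice_prob(v_a, v_b, temp=1.0):
    # Softmax choice rule: probability of choosing gamble A over B.
    return 1.0 / (1.0 + math.exp(-(v_a - v_b) / temp))

# Candidate stimuli: pairs of all-gain gambles, each [(prob, payoff), ...].
stimuli = [
    ([(0.5, 100)], [(1.0, 45)]),
    ([(0.01, 5000)], [(1.0, 40)]),    # long-shot vs sure thing
    ([(0.9, 50)], [(0.8, 60)]),
]

def diagnosticity(stim):
    a, b = stim
    p_eu = choice_prob(expected_utility(a), expected_utility(b))
    p_pt = choice_prob(prospect_value(a), prospect_value(b))
    return abs(p_eu - p_pt)           # how strongly the models disagree

best = max(stimuli, key=diagnosticity)
print(best)
```

Under these assumed parameters the long-shot pair is most diagnostic, because probability weighting inflates the 1% chance far more than expected utility does.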
NASA Astrophysics Data System (ADS)
Moriarty, Patrick; Sanz Rodrigo, Javier; Gancarski, Pawel; Churchfield, Matthew; Naughton, Jonathan W.; Hansen, Kurt S.; Machefaux, Ewan; Maguire, Eoghan; Castellani, Francesco; Terzi, Ludovico; Breton, Simon-Philippe; Ueda, Yuko
2014-06-01
Researchers within the International Energy Agency (IEA) Task 31: Wakebench have created a framework for the evaluation of wind farm flow models operating at the microscale level. The framework consists of a model evaluation protocol integrated with a web-based portal for model benchmarking (www.windbench.net). This paper provides an overview of the building-block validation approach applied to wind farm wake models, including best practices for the benchmarking and data processing procedures for validation datasets from wind farm SCADA and meteorological databases. A hierarchy of test cases has been proposed for wake model evaluation, from similarity theory of the axisymmetric wake and idealized infinite wind farm, to single-wake wind tunnel (UMN-EPFL) and field experiments (Sexbierum), to wind farm arrays in offshore (Horns Rev, Lillgrund) and complex terrain conditions (San Gregorio). A summary of results from the axisymmetric wake, Sexbierum, Horns Rev and Lillgrund benchmarks are used to discuss the state-of-the-art of wake model validation and highlight the most relevant issues for future development.
On the Connection between Kinetic Monte Carlo and the Burton-Cabrera-Frank Theory
NASA Astrophysics Data System (ADS)
Patrone, Paul; Margetis, Dionisios; Einstein, T. L.
2013-03-01
In the many years since it was first proposed, the Burton-Cabrera-Frank (BCF) model of step-flow has been experimentally established as one of the cornerstones of surface physics. However, many questions remain regarding the underlying physical processes and theoretical assumptions that give rise to the BCF theory. In this work, we formally derive the BCF theory from an atomistic, kinetic Monte Carlo model of the surface in 1+1 dimensions with one step. Our analysis (i) shows how the BCF theory describes a surface with a low density of adsorbed atoms, and (ii) establishes a set of near-equilibrium conditions ensuring that the theory remains valid for all times. Support for PP was provided by the NIST-ARRA Fellowship Award No. 70NANB10H026 through UMD. Support for TLE and PP was also provided by the CMTC at UMD, with ancillary support from the UMD MRSEC. Support for DM was provided by NSF DMS0847587 at UMD.
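The flavor of such an atomistic model can be conveyed with a toy 1+1-dimensional kinetic Monte Carlo sketch: adatoms deposit and hop on a one-dimensional lattice and attach irreversibly at the step site, advancing the step. The rates, lattice size, and rules below are invented for illustration and are much cruder than the model analyzed in the paper.

```python
import random

# Toy 1+1-D kinetic Monte Carlo sketch (illustrative only): adatoms hop on a
# periodic 1-D lattice; an adatom reaching the step site is incorporated,
# advancing the step by one site per attachment.
random.seed(1)
L = 50                       # lattice sites
step = 0                     # step position (advances on attachment)
adatoms = []                 # positions of diffusing adatoms
deposited = 0

for _ in range(2000):
    if not adatoms or random.random() < 0.1:
        adatoms.append(random.randrange(L))   # deposition event
        deposited += 1
    else:
        i = random.randrange(len(adatoms))    # hop event: random neighbor
        adatoms[i] = (adatoms[i] + random.choice((-1, 1))) % L
    # Attachment: any adatom sitting at the step site is incorporated.
    attached = sum(1 for x in adatoms if x == step % L)
    adatoms = [x for x in adatoms if x != step % L]
    step += attached

# Mass conservation: every deposited atom is either attached or still diffusing.
print(step, len(adatoms))
```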
The Nottingham Adjustment Scale: a validation study.
Dodds, A G; Flannigan, H; Ng, L
1993-09-01
The concept of adjustment to acquired sight loss is examined in the context of existing loss models. An alternative conceptual framework is presented which addresses the 'blindness experience', and which suggests that the depression so frequently encountered in those losing their sight can be understood better by recourse to cognitive factors than to psychoanalytically based theories of grieving. A scale to measure psychological status before and after rehabilitation is described, its factorial validity is demonstrated, and its validity in enabling changes to be measured. Practitioners are encouraged to adopt a similar perspective in other areas of acquired disability.
Propeller aircraft interior noise model utilization study and validation
NASA Technical Reports Server (NTRS)
Pope, L. D.
1984-01-01
Utilization and validation of a computer program designed for aircraft interior noise prediction is considered. The program, entitled PAIN (an acronym for Propeller Aircraft Interior Noise), permits (in theory) predictions of sound levels inside propeller driven aircraft arising from sidewall transmission. The objective of the work reported was to determine the practicality of making predictions for various airplanes and the extent of the program's capabilities. The ultimate purpose was to discern the quality of predictions for tonal levels inside an aircraft occurring at the propeller blade passage frequency and its harmonics. The effort involved three tasks: (1) program validation through comparisons of predictions with scale-model test results; (2) development of utilization schemes for large (full scale) fuselages; and (3) validation through comparisons of predictions with measurements taken in flight tests on a turboprop aircraft. Findings should enable future users of the program to efficiently undertake and correctly interpret predictions.
SHERMAN, a shape-based thermophysical model. I. Model description and validation
NASA Astrophysics Data System (ADS)
Magri, Christopher; Howell, Ellen S.; Vervack, Ronald J.; Nolan, Michael C.; Fernández, Yanga R.; Marshall, Sean E.; Crowell, Jenna L.
2018-03-01
SHERMAN, a new thermophysical modeling package designed for analyzing near-infrared spectra of asteroids and other solid bodies, is presented. The model's features, the methods it uses to solve for surface and subsurface temperatures, and the synthetic data it outputs are described. A set of validation tests demonstrates that SHERMAN produces accurate output in a variety of special cases for which correct results can be derived from theory. These cases include a family of solutions to the heat equation for which thermal inertia can have any value and thermophysical properties can vary with depth and with temperature. An appendix describes a new approximation method for estimating surface temperatures within spherical-section craters, more suitable for modeling infrared beaming at short wavelengths than the standard method.
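At the heart of any such thermophysical model is a solver for the subsurface heat equation. A toy explicit finite-difference sketch with a fixed surface temperature and an insulated bottom boundary is shown below; all parameter values are illustrative, and SHERMAN's actual solver (time-varying insolation, depth- and temperature-dependent properties) is far more general.

```python
# Toy explicit finite-difference solution of the 1-D subsurface heat equation
# dT/dt = kappa * d2T/dz2, with a fixed surface temperature and an insulated
# lower boundary. All parameter values are invented for illustration.
kappa, dz, dt = 1e-6, 0.01, 10.0         # m^2/s, m, s
assert kappa * dt / dz ** 2 <= 0.5       # explicit-scheme stability limit

T = [150.0] * 50                         # initial uniform profile (K)
T_surface = 200.0

for _ in range(5000):
    T[0] = T_surface                     # Dirichlet surface boundary
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + kappa * dt / dz ** 2 * (T[i + 1] - 2 * T[i] + T[i - 1])
    new[-1] = new[-2]                    # insulated (zero-flux) bottom
    T = new

print(round(T[1], 1), round(T[-1], 1))   # warmth diffuses downward over time
```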
On the validation of seismic imaging methods: Finite frequency or ray theory?
Maceira, Monica; Larmat, Carene; Porritt, Robert W.; ...
2015-01-23
We investigate the merits of the more recently developed finite-frequency approach to tomography against the more traditional and approximate ray-theoretical approach for state-of-the-art seismic models developed for western North America. To this end, we employ the spectral element method to assess the agreement between observations on real data and measurements made on synthetic seismograms predicted by the models under consideration. We check for phase delay agreement as well as waveform cross-correlation values. Based on statistical analyses of S-wave phase delay measurements, finite frequency shows an improvement over ray theory. Random sampling using cross-correlation values identifies regions where synthetic seismograms computed with ray theory and finite-frequency models differ the most. Our study suggests that finite-frequency approaches to seismic imaging exhibit measurable improvement for pronounced low-velocity anomalies such as mantle plumes.
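The cross-correlation measurement used in such comparisons can be sketched compactly: slide the synthetic trace against the observed one and report the lag with maximum correlation. The traces below are toy signals, not seismograms from the study.

```python
import math

# Minimal cross-correlation delay measurement: return the integer lag (in
# samples) at which the synthetic trace best matches the observed trace.
def xcorr_delay(obs, syn, max_lag):
    def corr(lag):
        return sum(obs[i] * syn[i + lag] for i in range(len(obs))
                   if 0 <= i + lag < len(syn))
    return max(range(-max_lag, max_lag + 1), key=corr)

# Toy "observed" wavelet and a synthetic delayed by 7 samples.
n, shift = 200, 7
wave = [math.exp(-((i - 100) / 10.0) ** 2) * math.sin(0.3 * i) for i in range(n)]
obs = wave
syn = [wave[i - shift] if 0 <= i - shift < n else 0.0 for i in range(n)]
print(xcorr_delay(obs, syn, 20))   # recovers the 7-sample delay
```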
Dambrun, Michaël; Duarte, Sandra; Guimond, Serge
2004-06-01
Arguing from a sociobiological perspective, Sidanius and Pratto (1999) have shown that the male/female difference in social dominance orientation (SDO) is largely invariant across cultural, situational and contextual boundaries. The main objective of this study was to test the validity of Social Dominance Theory (SDT) by contrasting it with a model derived from Social Identity Theory (SIT). More specifically, while SIT predicts that gender identification mediates the effect of gender on SDO, SDT predicts the reverse. According to SDT, the degree to which men and women endorse status legitimizing ideology should determine to what extent they identify with their gender group. Using structural equation modelling, the results provide strong support for the SIT model and no support for SDT predictions. Implications of these results for social dominance theory and its sociobiologically based invariance hypothesis are discussed.
Vrotsou, Kalliopi; Cuéllar, Ricardo; Silió, Félix; Rodriguez, Miguel Ángel; Garay, Daniel; Busto, Gorka; Trancho, Ziortza; Escobar, Antonio
2016-10-18
The aim of the current study was to adapt and validate the self-report section of the American Shoulder and Elbow Surgeons questionnaire (ASES-p) in Spanish. Shoulder pathology patients were recruited and followed up to 6 months post treatment. The ASES-p, Constant, SF-36 and Barthel scales were filled in pre- and post-treatment. Reliability was tested with Cronbach's alpha, and convergent validity with Spearman's correlation coefficients. Confirmatory factor analysis (CFA) and the Rasch model were implemented for assessing structural validity and unidimensionality of the scale. Models with and without the pain item were considered. Responsiveness to change was explored via standardised effect sizes. Results were acceptable for both tested models. Cronbach's alpha was 0.91, and total scale correlations with the Constant and physical SF-36 dimensions were >0.50. Factor loadings for CFA were >0.40. The Rasch model confirmed the unidimensionality of the scale, even though item 10, "do usual sport", was suggested as non-informative. Finally, patients with improved post-treatment shoulder function and those receiving surgery had higher standardised effect sizes. The adapted Spanish ASES-p version is a valid and reliable tool for shoulder evaluation, and its unidimensionality is supported by the data.
The Development and Validation of the Measure of Acceptance of the Theory of Evolution Instrument.
ERIC Educational Resources Information Center
Rutledge, Michael L.; Warden, Melissa A.
1999-01-01
Describes the development and validation of the Measure of Acceptance of the Theory of Evolution (MATE), a 20-item, Likert-scaled instrument that assesses teachers' overall acceptance of evolutionary theory. (Author/CCM)
ERIC Educational Resources Information Center
Schroeders, Ulrich; Robitzsch, Alexander; Schipolowski, Stefan
2014-01-01
C-tests are a specific variant of cloze tests that are considered time-efficient, valid indicators of general language proficiency. They are commonly analyzed with models of item response theory assuming local item independence. In this article we estimated local interdependencies for 12 C-tests and compared the changes in item difficulties,…
ERIC Educational Resources Information Center
Rieger, Marc Oliver; Wang, Mei
2008-01-01
Comments on the article by E. Brandstatter, G. Gigerenzer, and R. Hertwig (2006). The authors discuss the priority heuristic, a recent model for decisions under risk. They reanalyze the experimental validity of this approach and discuss how these results compare with cumulative prospect theory, the currently most established model in behavioral…
Theory-based interventions in physical activity: a systematic review of literature in Iran.
Abdi, Jalal; Eftekhar, Hassan; Estebsari, Fatemeh; Sadeghi, Roya
2014-11-30
Lack of physical activity is ranked fourth among the causes of human death and chronic diseases. Using models and theories to design, implement, and evaluate health education and health promotion interventions has many advantages. Focusing on models and theories of physical activity, we systematically studied the educational and promotional interventions carried out in Iran from 2003 to 2013. Three information databases were used to systematically select papers using key words: the Iranian Magazine Database (MAGIRAN), the Iran Medical Library (MEDLIB), and the Scientific Information Database (SID). Twenty papers were selected and studied. Having been applied in 9 studies, the Transtheoretical Model (TTM) was the most widespread model in Iran (PENDER in 3 studies, BASNEF in 2, and the Theory of Planned Behavior in 2 studies). With regard to educational methods, almost all studies used a combination of methods; the most widely used was group discussion. Only one integrated study was done. Behavior maintenance was not addressed in 75% of the studies. Almost all studies used self-reporting instruments. The effectiveness of the educational methods was assessed in none of the studies. Most of the included studies had several methodological weaknesses, which hinder the validity and applicability of their results. According to the findings, we suggest needs assessment when using models, consultation on epidemiology and methodology, addressing maintenance of physical activity, using other theories and models such as social marketing and social-cognitive theory, and using other educational methods, such as empirical and complementary approaches.
NASA Astrophysics Data System (ADS)
Nikurashin, Maxim; Gunn, Andrew
2017-04-01
The meridional overturning circulation (MOC) is a planetary-scale oceanic flow which is of direct importance to the climate system: it transports heat meridionally and regulates the exchange of CO2 with the atmosphere. The MOC is forced by wind, heat and freshwater fluxes at the surface, and turbulent mixing in the ocean interior. A number of conceptual theories for the sensitivity of the MOC to changes in forcing have recently been developed and tested with idealized numerical models. However, the skill of these simple conceptual theories in describing the MOC simulated with higher-complexity global models remains largely unknown. In this study, we present a systematic comparison of the theoretical and modelled sensitivity of the MOC and associated deep-ocean stratification to vertical mixing and southern hemisphere westerlies. The results show that theories that simplify the ocean into a single-basin, zonally symmetric box are generally in good agreement with a realistic, global ocean circulation model. Some disagreement occurs in the abyssal ocean, where complex bottom topography is not taken into account by the simple theories. Distinct regimes, where the MOC has a different sensitivity to wind or mixing, as predicted by the simple theories, are also clearly shown by the global ocean model. The sensitivity of the Indo-Pacific, Atlantic, and global basins is analysed separately to validate the conceptual understanding of the upper and lower overturning cells in the theory.
Sheldon, Lisa Kennedy; Ellington, Lee
2008-11-01
This paper is a report of a study to assess the applicability of a theoretical model of social information processing in expanding a nursing theory addressing how nurses respond to patients. Nursing communication affects patient outcomes such as anxiety, adherence to treatments and satisfaction with care. Orlando's theory of nursing process describes nurses' reactions to patients' behaviour as generating a perception, thought and feeling in the nurse and then action by the nurse. A model of social information processing describes the sequential steps in the cognitive processes used to respond to social cues and may be useful in describing the nursing process. Cognitive interviews were conducted in 2006 with a convenience sample of 5 nurses in the United States of America. The data were interpreted using the Crick and Dodge model of social information processing. Themes arising from the cognitive interviews validated concepts of the nursing theory and the constructs of the model of social information processing. The interviews revealed that the support of peers was an additional construct involved in the development of communication skills, creation of a database and enhancement of self-efficacy. Models of social information processing enhance understanding of how nurses respond to patients and help to develop nursing theories further. In combination, the theories are useful in developing research into nurse-patient communication. Future research based on the expansion of nursing theory may identify effective and culturally appropriate nurse response patterns to specific patient interactions, with implications for nursing care and patient outcomes.
Atombo, Charles; Wu, Chaozhong; Zhang, Hui; Wemegah, Tina D
2017-10-03
Road accidents are an important public health concern, and speeding is a major contributor. Although flow theory (FLT) is a valid model for understanding behavior, currently the nature of the roles and interplay of FLT constructs within the theory of planned behavior (TPB) framework when attempting to explain the determinants of motivations for intention to speed and speeding behavior of car drivers is not yet known. The study aims to synthesize TPB and FLT in explaining drivers of advanced vehicles intentions to speed and speed violation behaviors and evaluate factors that are critical for explaining intention and behavior. The hypothesized model was validated using a sample collected from 354 fully licensed drivers of advanced vehicles, involving 278 males and 76 females on 2 occasions separated by a 3-month interval. During the first of the 2 occasions, participants completed questionnaire measures of TPB and FLT variables. Three months later, participants' speed violation behaviors were assessed. The study observed a significant positive relationship between the constructs. The proposed model accounted for 51 and 45% of the variance in intention to speed and speed violation behavior, respectively. The independent predictors of intention were enjoyment, attitude, and subjective norm. The independent predictors of speed violation behavior were enjoyment, concentration, intention, and perceived behavioral control. The findings suggest that safety interventions for preventing speed violation behaviors should be aimed at underlying beliefs influencing the speeding behaviors of drivers of advanced vehicles. Furthermore, perceived enjoyment is of equal importance to driver's intention, influencing speed violation behavior.
A Chemistry Concept Reasoning Test
ERIC Educational Resources Information Center
Cloonan, Carrie A.; Hutchinson, John S.
2011-01-01
A Chemistry Concept Reasoning Test was created and validated providing an easy-to-use tool for measuring conceptual understanding and critical scientific thinking of general chemistry models and theories. The test is designed to measure concept understanding comparable to that found in free-response questions requiring explanations over…
Control Theory based Shape Design for the Incompressible Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Cowles, G.; Martinelli, L.
2003-12-01
A design method for shape optimization in incompressible turbulent viscous flow has been developed and validated for inverse design. The gradient information is determined using a control theory based algorithm. With such an approach, the cost of computing the gradient is negligible. An additional adjoint system must be solved which requires the cost of a single steady state flow solution. Thus, this method has an enormous advantage over traditional finite-difference based algorithms. The method of artificial compressibility is utilized to solve both the flow and adjoint systems. An algebraic turbulence model is used to compute the eddy viscosity. The method is validated using several inverse wing design test cases. In each case, the program must modify the shape of the initial wing such that its pressure distribution matches that of the target wing. Results are shown for the inversion of both finite thickness wings as well as zero thickness wings which can be considered a model of yacht sails.
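The cost comparison made in this abstract (one extra adjoint solve versus one flow solve per design variable for finite differences) can be sketched on a toy discrete problem. This is a generic adjoint-method illustration with invented matrices, not the authors' incompressible Navier-Stokes solver:

```python
import numpy as np

# For a discrete state equation A(x) u = b with cost J(u) = c . u, the
# adjoint method gives dJ/dx_i = -lam . (dA/dx_i) u, where A^T lam = c.
# One adjoint solve yields the whole gradient, regardless of len(x).

def build_A(x, n):
    # Illustrative parameterization: A(x) = 4*I + sum_i x_i * E_i
    A0 = np.eye(n) * 4.0
    return A0 + sum(xi * basis(i, n) for i, xi in enumerate(x))

def basis(i, n):
    # dA/dx_i for the parameterization above (a single off-diagonal entry)
    E = np.zeros((n, n))
    E[i % n, (i + 1) % n] = 1.0
    return E

def adjoint_gradient(x, b, c):
    n = len(b)
    A = build_A(x, n)
    u = np.linalg.solve(A, b)        # one state solve
    lam = np.linalg.solve(A.T, c)    # one adjoint solve
    return np.array([-lam @ (basis(i, n) @ u) for i in range(len(x))])
```

A finite-difference gradient of the same cost requires two extra state solves per design variable, which is the "enormous advantage" the abstract refers to.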
Harman, Elena; Azzam, Tarek
2018-02-01
This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants is asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
(In)validity of the constant field and constant currents assumptions in theories of ion transport.
Syganow, A; von Kitzing, E
1999-01-01
Constant electric fields and constant ion currents are often considered in theories of ion transport. Therefore, it is important to understand the validity of these helpful concepts. The constant field assumption requires that the charge density of permeant ions and flexible polar groups is virtually voltage independent. We present analytic relations that indicate the conditions under which the constant field approximation applies. Barrier models are frequently fitted to experimental current-voltage curves to describe ion transport. These models are based on three fundamental characteristics: a constant electric field, negligible concerted motions of ions inside the channel (an ion can enter only an empty site), and concentration-independent energy profiles. An analysis of those fundamental assumptions of barrier models shows that those approximations require large barriers because the electrostatic interaction is strong and has a long range. In the constant currents assumption, the current of each permeating ion species is considered to be constant throughout the channel; thus ion pairing is explicitly ignored. In inhomogeneous steady-state systems, the association rate constant determines the strength of ion pairing. Among permeable ions, however, the ion association rate constants are not small, according to modern diffusion-limited reaction rate theories. A mathematical formulation of a constant currents condition indicates that ion pairing very likely has an effect but does not dominate ion transport. PMID:9929480
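The constant-field assumption discussed here is the basis of the classic Goldman-Hodgkin-Katz voltage equation, which can be sketched as follows; the function and the ionic values in the test are textbook illustrations, not taken from this paper:

```python
import math

def ghk_voltage(P, conc_out, conc_in, T=310.0):
    """Goldman-Hodgkin-Katz voltage equation (constant-field assumption).

    P: dict of relative permeabilities for 'K', 'Na', 'Cl'.
    conc_out / conc_in: extracellular / intracellular concentrations (mM).
    Returns the membrane potential in volts. The Cl- terms are swapped
    between numerator and denominator because of its negative charge.
    """
    R, F = 8.314, 96485.0
    num = (P['K'] * conc_out['K'] + P['Na'] * conc_out['Na']
           + P['Cl'] * conc_in['Cl'])
    den = (P['K'] * conc_in['K'] + P['Na'] * conc_in['Na']
           + P['Cl'] * conc_out['Cl'])
    return (R * T / F) * math.log(num / den)
```

With typical mammalian permeabilities and concentrations this yields a resting potential of roughly -65 mV, which is why the constant-field approximation has remained such a durable working tool despite the caveats the abstract raises.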
Neural activity in the hippocampus during conflict resolution.
Sakimoto, Yuya; Okada, Kana; Hattori, Minoru; Takeda, Kozue; Sakata, Shogo
2013-01-15
This study examined configural association theory and conflict resolution models in relation to hippocampal neural activity during positive patterning tasks. According to configural association theory, the hippocampus is important for responses to compound stimuli in positive patterning tasks. In contrast, according to the conflict resolution model, the hippocampus is important for responses to single stimuli in positive patterning tasks. We hypothesized that if configural association theory is applicable, and not the conflict resolution model, the hippocampal theta power should be increased when compound stimuli are presented. If, on the other hand, the conflict resolution model is applicable, but not configural association theory, then the hippocampal theta power should be increased when single stimuli are presented. If both models are valid and applicable in the positive patterning task, we predict that the hippocampal theta power should be increased by presentation of both compound and single stimuli during the positive patterning task. To examine our hypotheses, we measured hippocampal theta power in rats during a positive patterning task. The results showed that hippocampal theta power increased during the presentation of a single stimulus, but did not increase during the presentation of a compound stimulus. This finding suggests that the conflict resolution model is more applicable than the configural association theory for describing neural activity during positive patterning tasks. Copyright © 2012 Elsevier B.V. All rights reserved.
Dynamic Simulation of Human Gait Model With Predictive Capability.
Sun, Jinming; Wu, Shaoli; Voglewede, Philip A
2018-03-01
In this paper, it is proposed that the central nervous system (CNS) controls human gait using a predictive control approach in conjunction with classical feedback control instead of exclusive classical feedback control theory that controls based on past error. To validate this proposition, a dynamic model of human gait is developed using a novel predictive approach to investigate the principles of the CNS. The model developed includes two parts: a plant model that represents the dynamics of human gait and a controller that represents the CNS. The plant model is a seven-segment, six-joint model that has nine degrees-of-freedom (DOF). The plant model is validated using data collected from able-bodied human subjects. The proposed controller utilizes model predictive control (MPC). MPC uses an internal model to predict the output in advance, compare the predicted output to the reference, and optimize the control input so that the predicted error is minimal. To decrease the complexity of the model, two joints are controlled using a proportional-derivative (PD) controller. The developed predictive human gait model is validated by simulating able-bodied human gait. The simulation results show that the developed model is able to simulate the kinematic output close to experimental data.
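The predictive step described above (predict the next output, compare it to the reference, optimize the input) can be sketched for a scalar linear plant. This one-step sketch uses invented plant constants and weights, not the paper's seven-segment, nine-DOF gait model:

```python
def one_step_mpc(x, r, a=0.9, b=0.5, rho=0.01):
    """Choose u minimizing (x_next - r)^2 + rho*u^2 for x_next = a*x + b*u.

    Setting the derivative with respect to u to zero gives the closed form
    u = b*(r - a*x) / (b^2 + rho). a, b, rho are illustrative values.
    """
    return b * (r - a * x) / (b * b + rho)

def simulate(x0, r, steps=50, a=0.9, b=0.5, rho=0.01):
    """Run the plant under the one-step predictive controller."""
    x = x0
    for _ in range(steps):
        u = one_step_mpc(x, r, a, b, rho)
        x = a * x + b * u
    return x
```

A full MPC, like the one the paper uses, extends this by predicting over a multi-step horizon with an internal model and re-optimizing at every step; the one-step case just makes the predict-compare-optimize loop explicit.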
Computational Fluid Dynamics Modeling of Nickel Hydrogen Batteries
NASA Technical Reports Server (NTRS)
Cullion, R.; Gu, W. B.; Wang, C. Y.; Timmerman, P.
2000-01-01
An electrochemical Ni-H2 battery model has been expanded to include thermal effects. A thermal energy conservation equation was derived from first principles. An electrochemical and thermal coupled model was created by the addition of this equation to an existing multiphase, electrochemical model. Charging at various rates was investigated and the results validated against experimental data. Reaction currents, pressure changes, temperature profiles, and concentration variations within the cell are predicted numerically and compared with available data and theory.
NASA Astrophysics Data System (ADS)
Tang, Jinjun; Zhang, Shen; Chen, Xinqiang; Liu, Fang; Zou, Yajie
2018-03-01
Understanding the Origin-Destination (OD) distribution of taxi trips is very important for improving the effects of transportation planning and enhancing the quality of taxi services. This study proposes a new method based on Entropy-Maximizing theory to model the OD distribution in Harbin city using large-scale taxi GPS trajectories. Firstly, a K-means clustering method is utilized to partition raw pick-up and drop-off locations into different zones, and trips are assumed to start from and end at zone centers. A generalized cost function is further defined by considering travel distance, time and fee between each OD pair. GPS data collected from more than 1000 taxis at an interval of 30 s during one month are divided into two parts: data from the first twenty days are treated as the training dataset and data from the last ten days as the testing dataset. The training dataset is used to calibrate the model, while the testing dataset is used to validate it. Furthermore, three indicators, mean absolute error (MAE), root mean square error (RMSE) and mean percentage absolute error (MPAE), are applied to evaluate the training and testing performance of the Entropy-Maximizing model versus the Gravity model. The results demonstrate that the Entropy-Maximizing model is superior to the Gravity model. Findings of the study validate the feasibility of deriving OD distributions from taxi GPS data in urban systems.
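The three evaluation indicators named above have standard definitions and can be sketched directly; the function names are mine and the implementation is a plain illustration, not the study's code:

```python
import math

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def rmse(obs, pred):
    """Root mean square error."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mpae(obs, pred):
    """Mean percentage absolute error, in percent.

    Zero observations are excluded to avoid division by zero (one common
    convention; the paper does not state how it handles this case).
    """
    pairs = [(o, p) for o, p in zip(obs, pred) if o != 0]
    return 100.0 * sum(abs(o - p) / abs(o) for o, p in pairs) / len(pairs)
```

Here `obs` would hold the observed OD flows from the testing dataset and `pred` the flows predicted by the Entropy-Maximizing or Gravity model.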
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
Consumer preference models: fuzzy theory approach
NASA Astrophysics Data System (ADS)
Turksen, I. B.; Wilson, I. A.
1993-12-01
Consumer preference models are widely used in new product design, marketing management, pricing and market segmentation. The purpose of this article is to develop and test a fuzzy set preference model which can represent linguistic variables in individual-level models implemented in parallel with existing conjoint models. The potential improvements in market share prediction and predictive validity can substantially improve management decisions about what to make (product design), for whom to make it (market segmentation) and how much to make (market share prediction).
NASA Astrophysics Data System (ADS)
Nolet, G.; Mercerat, D.; Zaroli, C.
2012-12-01
We present the first complete test of finite frequency tomography with banana-doughnut kernels, from the generation of seismograms in a 3D model to the final inversion, and are able to lay to rest all of the so-called `controversies' that have slowed down its adoption. Cross-correlation delay times are influenced by energy arriving in a time window that includes later arrivals, either scattered from, or diffracted around, lateral heterogeneities. We present here the results of a 3D test in which we generate 1716 seismograms using the spectral element method in a cross-borehole experiment conducted in a checkerboard box. Delays are determined for the broadband signals as well as for five frequency bands (each one octave apart) by cross-correlating seismograms for a homogeneous pattern with those for a checkerboard. The large (10 per cent) velocity contrast and the regularity of the checkerboard pattern cause severe reverberations that arrive late in the cross-correlation window. Data errors are estimated by comparing linearity between delays measured for a model with 10 per cent velocity contrast with those with a 4 per cent contrast. Sensitivity kernels are efficiently computed with ray theory using the `banana-doughnut' kernels from Dahlen et al. (GJI 141:157, 2000). The model resulting from the inversion with a data fit of reduced χ² = 1 shows an excellent correspondence with the input model and allows for a complete validation of the theory. Amplitudes in the (well resolved) top part of the model are close to the input amplitudes. Comparing a model derived from one band only shows the power of using multiple frequency bands in resolving detail - essentially the observed dispersion captures some of the waveform information. Finite frequency theory also allows us to image the checkerboard at some distance from the borehole plane. Most disconcerting for advocates of ray theory are the results obtained when we interpret cross-correlation delays with ray theory.
We shall present an extreme case of the devil's checkerboard (the term is from Jacobsen and Sigloch), in which the sign of the anomalies in the checkerboard is reversed in the ray-theoretical solution, a clear demonstration of the reality of effects of the doughnut hole. We conclude that the test fully validates `banana-doughnut' theory, and disqualifies ray theoretical inversions of cross-correlation delays.
The Development and Validation of the Online Shopping Addiction Scale.
Zhao, Haiyan; Tian, Wei; Xin, Tao
2017-01-01
We report the development and validation of a scale to measure online shopping addiction. Inspired by previous theories and research on behavioral addiction, the Griffiths's widely accepted six-factor component model was referred to and an 18-item scale was constructed, with each component measured by three items. The results of exploratory factor analysis, based on Sample 1 (999 college students) and confirmatory factor analysis, based on Sample 2 (854 college students) showed the Griffiths's substantive six-factor structure underlay the online shopping addiction scale. Cronbach's alpha suggested that the resulting scale was highly reliable. Concurrent validity, based on Sample 3 (328 college students), was also satisfactory as indicated by correlations between the scale and measures of similar constructs. Finally, self-perceived online shopping addiction can be predicted to a relatively high degree. The present 18-item scale is a solid theory-based instrument to empirically measure online shopping addiction and can be used for understanding the phenomena among young adults. PMID:28559864
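The internal-consistency statistic reported here (Cronbach's alpha) can be sketched from its standard formula; this is a generic illustration, not the authors' analysis code:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item scores.

    items: list of items, each a list of scores (one per respondent).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Population (1/N) variances are used throughout; the 1/N factor cancels.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))
```

For the 18-item scale described above, `items` would be the eighteen lists of respondent scores; alpha near 1 indicates the high reliability the authors report.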
Iyioha, Ireh
2011-01-01
This paper examines the (in)compatibility between the diagnostic and therapeutic theories of complementary and alternative medicine (CAM) and a science-based regulatory framework. Specifically, the paper investigates the nexus between statutory legitimacy and scientific validation of health systems, with an examination of its impact on the development of complementary and alternative therapies. The paper evaluates competing theories for validating CAM ranging from the RCT methodology to anthropological perspectives and contends that while the RCT method might be beneficial in the regulation of many CAM therapies, dogmatic adherence to this paradigm as the exclusive method for legitimizing CAM will be adverse to the independent development of many CAM therapies whose philosophies and mechanisms of action are not scientifically interpretable. Drawing on history and research evidence to support this argument, the paper argues for a regulatory model that is accommodative of different evidential paradigms in support of a pluralistic healthcare system that balances the imperative of quality assurance with the need to ensure access. PMID:20953428
Nurmi, Johanna; Hagger, Martin S; Haukkala, Ari; Araújo-Soares, Vera; Hankonen, Nelli
2016-04-01
This study tested the predictive validity of a multitheory process model in which the effect of autonomous motivation from self-determination theory on physical activity participation is mediated by the adoption of self-regulatory techniques based on control theory. Finnish adolescents (N = 411, aged 17-19) completed a prospective survey including validated measures of the predictors and physical activity, at baseline and after one month (N = 177). A subsample used an accelerometer to objectively measure physical activity and further validate the physical activity self-report assessment tool (n = 44). Autonomous motivation statistically significantly predicted action planning, coping planning, and self-monitoring. Coping planning and self-monitoring mediated the effect of autonomous motivation on physical activity, although self-monitoring was the most prominent. Controlled motivation had no effect on self-regulation techniques or physical activity. Developing interventions that support autonomous motivation for physical activity may foster increased engagement in self-regulation techniques and positively affect physical activity behavior.
Applicability of the Continuum-Shell Theories to the Mechanics of Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Harik, V. M.; Gates, T. S.; Nemeth, M. P.
2002-01-01
Validity of the assumptions relating the applicability of continuum shell theories to the global mechanical behavior of carbon nanotubes is examined. The present study focuses on providing a basis that can be used to qualitatively assess the appropriateness of continuum-shell models for nanotubes. To address the effect of nanotube structure on their deformation, all nanotube geometries are divided into four major classes that require distinct models. Criteria for the applicability of continuum models are presented. The key parameters that control the buckling strains and deformation modes of these classes of nanotubes are determined. In an analogy with continuum mechanics, mechanical laws of geometric similitude are presented. A parametric map is constructed for a variety of nanotube geometries as a guide for the applicability of different models. The continuum assumptions made in representing a nanotube as a homogeneous thin shell are analyzed to identify possible limitations of applying shell theories and using their bifurcation-buckling equations at the nano-scale.
A Theory and Experiments for Detecting Shock Locations
NASA Technical Reports Server (NTRS)
Hariharan, S. I.; Johnson, D. K.; Adamovsky, G.
1994-01-01
In this paper we present a simplified one-dimensional theory for predicting locations of normal shocks in a converging diverging nozzle. The theory assumes that the flow is quasi one-dimensional and is accelerated in the throat area. Optical aspects of the model consider propagation of electromagnetic fields transverse to the shock front. The theory consists of an inverse problem in which, from the measured intensity, it reconstructs an index of refraction profile for the shock. From this profile and the Gladstone-Dale relation, the density in the flow field is determined, thus determining the shock location. Experiments show agreement with the theory. In particular, the shock location is determined to within 10 percent accuracy. Both the theoretical as well as the experimental results are presented to validate the procedures in this work.
Tan, Christine L; Hassali, Mohamed A; Saleem, Fahad; Shafie, Asrul A; Aljadhey, Hisham; Gan, Vincent B
2015-01-01
(i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish reliability and validity of the questionnaire instrument. Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs of pharmacy value-added services. The PVASQ was constructed initially in English incorporating important themes and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice, one week apart. Internal consistency was measured by Cronbach's alpha and construct validity between two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410) was conducted to assess construct validity of the PVASQ. The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of CFA (N=410) showed most items loaded strongly and correctly into corresponding factors. Only one item was eliminated. This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay language version of PVASQ is reliable and valid to predict Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply.
Elvén, Maria; Hochwälder, Jacek; Dean, Elizabeth; Söderlund, Anne
2015-05-01
A biopsychosocial approach and behaviour change strategies have long been proposed to serve as a basis for addressing current multifaceted health problems. This emphasis has implications for clinical reasoning of health professionals. This study's aim was to develop and validate a conceptual model to guide physiotherapists' clinical reasoning focused on clients' behaviour change. Phase 1 consisted of the exploration of existing research and the research team's experiences and knowledge. Phases 2a and 2b consisted of validation and refinement of the model based on input from physiotherapy students in two focus groups (n = 5 per group) and from experts in behavioural medicine (n = 9). Phase 1 generated theoretical and evidence bases for the first version of a model. Phases 2a and 2b established the validity and value of the model. The final model described clinical reasoning focused on clients' behaviour change as a cognitive, reflective, collaborative and iterative process with multiple interrelated levels that included input from the client and physiotherapist, a functional behavioural analysis of the activity-related target behaviour and the selection of strategies for behaviour change. This unique model, theory- and evidence-informed, has been developed to help physiotherapists to apply clinical reasoning systematically in the process of behaviour change with their clients.
Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory
NASA Technical Reports Server (NTRS)
Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.
2015-01-01
An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.
NASA Technical Reports Server (NTRS)
Flower, D. A.; Peckham, G. E.; Bradford, W. J.
1984-01-01
Experiments with a millimeter wave radar operating on the NASA CV-990 aircraft which validate the technique for remotely sensing atmospheric pressure at the Earth's surface are described. Measurements show that the precise millimeter wave observations needed to deduce pressure from space with an accuracy of 1 mb are possible, that sea surface reflection properties agree with theory and that the measured variation of differential absorption with altitude corresponds to that expected from spectroscopic models.
Proving Properties of Rule-Based Systems
1990-12-01
in these systems and enable us to use them with more confidence. Each system of rules is encoded as a set of axioms that define the system theory. The...operation of the rule language and information about the subject domain are also described in the system theory. Validation tasks, such as...the validity of the conjecture in the system theory, we have carried out the corresponding validation task. If the proof is restricted to be
Wing Shape Sensing from Measured Strain
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi
2015-01-01
A new two-step theory is investigated for predicting the deflection and slope of an entire structure using strain measurements at discrete locations. In the first step, a measured strain is fitted using a piecewise least squares curve fitting method together with the cubic spline technique. These fitted strains are integrated twice to obtain deflection data along the fibers. In the second step, computed deflections along the fibers are combined with a finite element model of the structure in order to extrapolate the deflection and slope of the entire structure through the use of System Equivalent Reduction and Expansion Process. The theory is first validated on a computational model, a cantilevered rectangular wing. It is then applied to test data from a cantilevered swept wing model.
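The double-integration step can be sketched minimally, assuming Euler-Bernoulli bending (curvature = strain / c) and simple trapezoidal integration in place of the paper's piecewise least-squares and cubic-spline fit:

```python
import numpy as np

def deflection_from_strain(x, strain, c, w0=0.0, s0=0.0):
    """Double-integrate bending strain to deflection along a fiber.

    Assumes Euler-Bernoulli bending, curvature = strain / c, where c is
    the distance from the neutral axis to the strain sensor. w0 and s0
    are the deflection and slope at the root (zero for a cantilever).
    Cumulative trapezoidal integration is a simple stand-in for the
    paper's fitting procedure.
    """
    curvature = np.asarray(strain) / c
    slope = s0 + np.concatenate(([0.0], np.cumsum(
        0.5 * (curvature[1:] + curvature[:-1]) * np.diff(x))))
    w = w0 + np.concatenate(([0.0], np.cumsum(
        0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
    return w
```

For constant curvature k this recovers the textbook deflection w = k x^2 / 2, which is a convenient sanity check before applying the method to measured wing strains.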
Optimal Decision Stimuli for Risky Choice Experiments: An Adaptive Approach
Cavagnaro, Daniel R.; Gonzalez, Richard; Myung, Jay I.; Pitt, Mark A.
2014-01-01
Collecting data to discriminate between models of risky choice requires careful selection of decision stimuli. Models of decision making aim to predict decisions across a wide range of possible stimuli, but practical limitations force experimenters to select only a handful of them for actual testing. Some stimuli are more diagnostic between models than others, so the choice of stimuli is critical. This paper provides the theoretical background and a methodological framework for adaptive selection of optimal stimuli for discriminating among models of risky choice. The approach, called Adaptive Design Optimization (ADO), adapts the stimulus in each experimental trial based on the results of the preceding trials. We demonstrate the validity of the approach with simulation studies aiming to discriminate Expected Utility, Weighted Expected Utility, Original Prospect Theory, and Cumulative Prospect Theory models. PMID:24532856
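The models being discriminated here have standard closed forms; as an illustration, Original Prospect Theory's value and weighting functions can be sketched with Tversky and Kahneman's published median parameter estimates (illustrative defaults, not values from this paper):

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function: concave for gains, convex and
    steeper (loss aversion, lam > 1) for losses."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

def pt_weight(p, gamma=0.61):
    """Probability weighting w(p) = p^g / (p^g + (1-p)^g)^(1/g):
    overweights small probabilities, underweights moderate-to-large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_value(outcomes):
    """Simple (non-cumulative) prospect value of [(outcome, prob), ...]."""
    return sum(pt_weight(p) * pt_value(x) for x, p in outcomes)
```

Because Expected Utility omits the weighting function and loss aversion, gambles that pit those features against each other are exactly the diagnostic stimuli an ADO procedure tends to select.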
ERIC Educational Resources Information Center
Lucas, Aaron D.; Voss, Roger Alan; Krumwiede, Dennis W.
2015-01-01
Fractal vertical polarization (FVP) has joined leader-member exchange (LMX) and team member exchange (TMX) as one of the available models of communication dynamics based on complexity theory, which now all benefit from valid scales for use in organizational settings. The purpose of these models is to assess the quality of interpersonal information…
Validation of a Measure of Non-Commissioned Officer Leadership
2006-01-20
trauma, aside from injury, include cardiovascular distress and somatic complaints (Belkic, Emdad, & Theorell, 1998; Zatzick, Russo, & Katon, 2003...hypothesis emerged from the DC model, and posits that high strain jobs are characterized as high in demands and low in control (Karasek, 1979). NCO...leadership three months following a combat deployment, in the context of Leader-Member Exchange theory (Graen, 1976), Demand-Control-Support model (Karasek
Validating a Theory-Based Survey to Evaluate Teaching Effectiveness in Higher Education
ERIC Educational Resources Information Center
Amrein-Beardsley, A.; Haladyna, T.
2012-01-01
Surveys to evaluate instructor effectiveness are commonly used in higher education. Yet the survey items included are often drawn from other surveys without reference to a theory of adult learning. The authors present the results from a validation study of such a theory-based survey. They evidence that an evaluation survey based on a theory that…
Chatzisarantis, Nikos L D; Hagger, Martin S
2008-01-01
Previous research has suggested that the theory of planned behaviour is insufficient in capturing all the antecedents of physical activity participation and that continuation intentions or personality traits may improve the predictive validity of the model. The present study examined the combined effects of continuation intentions and personality traits on health behaviour within the theory of planned behaviour. To examine these effects, 180 university students (N = 180, Male = 87, Female = 93, Age = 19.14 years, SD = 0.94) completed self-report measures of the theory of planned behaviour, personality traits and continuation intentions. After 5 weeks, perceived achievement of behavioural outcomes and actual participation in physical activities were assessed. Results supported discriminant validity between continuation intentions, conscientiousness and extroversion and indicated that perceived achievement of behavioural outcomes and continuation intentions of failure predicted physical activity participation after controlling for personality effects, past behaviour and other variables in the theory of planned behaviour. In addition, results indicated that conscientiousness moderated the effects of continuation intentions of failure on physical activity such that continuation intentions of failure predicted physical activity participation among conscientious and not among less conscientious individuals. These findings suggest that the effects of continuation intentions on health behaviour are contingent on personality characteristics.
Breakdown parameter for kinetic modeling of multiscale gas flows.
Meng, Jianping; Dongari, Nishanth; Reese, Jason M; Zhang, Yonghao
2014-06-01
Multiscale methods built purely on the kinetic theory of gases provide information about the molecular velocity distribution function. It is therefore both important and feasible to establish new breakdown parameters for assessing the appropriateness of a fluid description at the continuum level by utilizing kinetic information rather than macroscopic flow quantities alone. We propose a new kinetic criterion to indirectly assess the errors introduced by a continuum-level description of the gas flow. The analysis, which includes numerical demonstrations, focuses on the validity of the Navier-Stokes-Fourier equations and corresponding kinetic models and reveals that the new criterion can consistently indicate the validity of continuum-level modeling in both low-speed and high-speed flows at different Knudsen numbers.
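The paper proposes a kinetic-information criterion; the classical macroscopic alternative it improves on is the Knudsen number Kn = λ/L, with Kn ≲ 0.01 the usual continuum-validity rule of thumb. A hard-sphere sketch (the molecular diameter is an assumed illustrative value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mean_free_path(T, p, d):
    """Hard-sphere mean free path: lambda = kT / (sqrt(2) * pi * d^2 * p)."""
    return K_B * T / (math.sqrt(2) * math.pi * d**2 * p)

def knudsen(T, p, d, L):
    """Kn = lambda / L; Kn <~ 0.01 is the common continuum-validity threshold."""
    return mean_free_path(T, p, d) / L

# Air-like gas at 300 K, 1 atm (d = 3.7e-10 m is an assumed effective diameter).
lam = mean_free_path(300.0, 101325.0, 3.7e-10)       # ~7e-8 m
kn_duct = knudsen(300.0, 101325.0, 3.7e-10, 1e-2)    # 1 cm duct: continuum holds
kn_micro = knudsen(300.0, 101325.0, 3.7e-10, 1e-7)   # 100 nm channel: it breaks down
```

The kinetic breakdown parameter of the paper replaces the single length scale L with information from the velocity distribution function itself.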
Calculation of Optical Parameters of Liquid Crystals
NASA Astrophysics Data System (ADS)
Kumar, A.
2007-12-01
Validation of a modified four-parameter model describing the temperature dependence of liquid crystal refractive indices is reported in the present article. This model is based upon the Vuks equation. Experimental data of ordinary and extraordinary refractive indices for two liquid crystal samples, MLC-9200-000 and MLC-6608, are used to validate the above-mentioned theoretical model. Using these experimental data, birefringence, order parameter, normalized polarizabilities, and the temperature gradient of refractive indices are determined. Two methods are adopted for the determination of the order parameter: directly using birefringence measurements, and using Haller's extrapolation procedure. Both approaches to order parameter calculation are compared. The temperature dependences of all these parameters are discussed. A close agreement between theory and experiment is obtained.
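Haller's extrapolation procedure mentioned above fits the birefringence to Δn(T) = Δn₀(1 − T/T_c)^β and then estimates the order parameter as S = Δn/Δn₀. A sketch on synthetic data with known parameters (the numerical values are illustrative, not fits to the MLC samples):

```python
import numpy as np

# Synthetic birefringence data following the Haller form
#   dn(T) = dn0 * (1 - T/Tc)**beta
# (dn0, beta, Tc are illustrative, not the MLC-9200-000 / MLC-6608 values).
dn0_true, beta_true, Tc = 0.20, 0.22, 350.0
T = np.linspace(290.0, 345.0, 12)
dn = dn0_true * (1.0 - T / Tc) ** beta_true

# Haller extrapolation: linear fit of ln(dn) against ln(1 - T/Tc).
x = np.log(1.0 - T / Tc)
y = np.log(dn)
beta_fit, ln_dn0 = np.polyfit(x, y, 1)
dn0_fit = np.exp(ln_dn0)   # birefringence extrapolated to perfect order

# Order parameter directly from birefringence: S(T) = dn(T) / dn0.
S = dn / dn0_fit
```

S decreases toward zero as T approaches the clearing point T_c, as expected for a nematic order parameter.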
NASA Technical Reports Server (NTRS)
Bellan, Josette; Harstad, Kenneth; Ohsaka, Kenichi
2003-01-01
Although the high pressure multicomponent fluid conservation equations have already been derived and approximately validated for binary mixtures by this PI, the validation of the multicomponent theory is hampered by the lack of existing mixing rules for property calculations. Classical gas dynamics theory can provide property mixing rules only at low pressures. While thermal conductivity and viscosity high-pressure mixing rules have been documented in the literature, there is no such equivalent for the diffusion coefficients and the thermal diffusion factors. The primary goal of this investigation is to extend the low pressure mixing rule theory to high pressures and validate the new theory with experimental data from levitated single drops. The two properties that will be addressed are the diffusion coefficients and the thermal diffusion factors. To validate/determine the property calculations, ground-based experiments from levitated drops are being conducted.
Whose Consensus Is It Anyway? Scientific versus Legalistic Conceptions of Validity
ERIC Educational Resources Information Center
Borsboom, Denny
2012-01-01
Paul E. Newton provides an insightful and scholarly overview of central issues in validity theory. As he notes, many of the conceptual problems in validity theory derive from the fact that the word "validity" has two meanings. First, it indicates "whether a test measures what it purports to measure." This is a factual claim about the psychometric…
The design of patient decision support interventions: addressing the theory-practice gap.
Elwyn, Glyn; Stiel, Mareike; Durand, Marie-Anne; Boivin, Jacky
2011-08-01
Although an increasing number of decision support interventions for patients (including decision aids) are produced, few make explicit use of theory. We argue the importance of using theory to guide design. The aim of this work was to address this theory-practice gap and to examine how a range of selected decision-making theories could inform the design and evaluation of decision support interventions. We reviewed the decision-making literature and selected relevant theories. We assessed their key principles, theoretical pathways and predictions in order to determine how they could inform the design of two core components of decision support interventions, namely, information and deliberation components, and to specify theory-based outcome measures. Eight theories were selected: (1) the expected utility theory; (2) the conflict model of decision making; (3) prospect theory; (4) fuzzy-trace theory; (5) the differentiation and consolidation theory; (6) the ecological rationality theory; (7) the rational-emotional model of decision avoidance; and finally, (8) the Attend, React, Explain, Adapt model of affective forecasting. Some theories have strong relevance to information design (e.g. prospect theory); some are more relevant to deliberation processes (conflict theory, differentiation theory and ecological rationality). None of the theories in isolation was sufficient to inform the design of all the necessary components of decision support interventions. It was also clear that most work in theory-building has focused on explaining or describing how humans think rather than on how tools could be designed to help humans make good decisions. It is not surprising, therefore, that a large theory-practice gap exists as we consider decision support for patients. There was no relevant theory that integrated all the necessary contributions to the task of making good decisions in collaborative interactions.
Initiatives such as the International Patient Decision Aids Standards Collaboration influence standards for the design of decision support interventions. However, this analysis points to the need to undertake more work in providing theoretical foundations for these interventions. © 2010 Blackwell Publishing Ltd.
Jin, Xinfang; White, Ralph E.; Huang, Kevin
2016-10-04
With the assumption that the Fermi level (electrochemical potential of electrons) is uniform across the thickness of a mixed ionic and electronic conducting (MIEC) electrode, the charge-transport model in the electrode domain can be reduced to the modified Fick's first law, which includes a thermodynamic factor A. A transient numerical solution of the Nernst-Planck theory was obtained for a symmetric cell with MIEC electrodes to illustrate the validity of the assumption of a uniform Fermi level. Subsequently, an impedance numerical solution based on the modified Fick's first law is compared with that from the Nernst-Planck theory. The results show that the Nernst-Planck charge-transport model is essentially the same as the modified Fick's first law model as long as the MIEC electrodes have a predominant electronic conductivity. However, because of the invalidity of the uniform Fermi level assumption for a MIEC electrolyte with a predominant ionic conductivity, the Nernst-Planck theory is needed to describe the charge transport behaviors.
Zheng, Ya; Yang, Zhong; Jin, Chunlan; Qi, Yue; Liu, Xun
2017-01-01
Fairness-related decision making is an important issue in the field of decision making. Traditional theories emphasize the roles of inequity aversion and reciprocity, whereas recent research increasingly shows that emotion plays a critical role in this type of decision making. In this review, we summarize the influences of three types of emotions (i.e., the integral emotion experienced at the time of decision making, the incidental emotion aroused by a task-unrelated dispositional or situational source, and the interaction of emotion and cognition) on fairness-related decision making. Specifically, we first introduce three dominant theories that describe how emotion may influence fairness-related decision making (i.e., the wounded pride/spite model, affect infusion model, and dual-process model). Next, we collect behavioral and neural evidence for and against these theories. Finally, we propose that future research on fairness-related decision making should focus on inducing incidental social emotion, avoiding irrelevant emotion when regulating, exploring the individual differences in emotional dispositions, and strengthening the ecological validity of the paradigm. PMID:28974937
Structured Uncertainty Bound Determination From Data for Control and Performance Validation
NASA Technical Reports Server (NTRS)
Lim, Kyong B.
2003-01-01
This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with a reasonable confidence, a near-optimal robust closed loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, a new software package, the Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determines uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state-of-the-art in uncertainty bound determination and in turn facilitates benchmarking of robust control technology. To help clarify the methodology and use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of flexible structure dynamics, and the second example involves a closed loop performance validation of a ducted fan based on an uncertainty bound from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.
NASA Astrophysics Data System (ADS)
Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.
2017-12-01
Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship across mosquito species and regions.
Simulation of Left Atrial Function Using a Multi-Scale Model of the Cardiovascular System
Pironet, Antoine; Dauby, Pierre C.; Paeme, Sabine; Kosta, Sarah; Chase, J. Geoffrey; Desaive, Thomas
2013-01-01
During a full cardiac cycle, the left atrium successively behaves as a reservoir, a conduit and a pump. This complex behavior makes it unrealistic to apply the time-varying elastance theory to characterize the left atrium, first, because this theory has known limitations, and second, because it is still uncertain whether the load independence hypothesis holds. In this study, we aim to bypass this uncertainty by relying on another kind of mathematical model of the cardiac chambers. In the present work, we describe both the left atrium and the left ventricle with a multi-scale model. The multi-scale property of this model comes from the fact that pressure inside a cardiac chamber is derived from a model of the sarcomere behavior. Macroscopic model parameters are identified from reference dog hemodynamic data. The multi-scale model of the cardiovascular system including the left atrium is then simulated to show that the physiological roles of the left atrium are correctly reproduced. This includes a biphasic pressure wave and a figure-of-eight-shaped pressure-volume loop. We also test the validity of our model in non-basal conditions by reproducing a preload reduction experiment (inferior vena cava occlusion) with the model. We compute the variation of eight indices before and after this experiment and obtain the same variation as experimentally observed for seven out of the eight indices. In summary, the multi-scale mathematical model presented in this work is able to correctly account for the three roles of the left atrium and also exhibits a realistic left atrial pressure-volume loop. Furthermore, the model has been previously presented and validated for the left ventricle. This makes it a proper alternative to the time-varying elastance theory if the focus is set on precisely representing the left atrial and left ventricular behaviors. PMID:23755183
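The time-varying elastance theory that the authors bypass describes chamber pressure as P(t) = E(t)·(V(t) − V₀), with E(t) a periodic elastance. A minimal sketch of that baseline model (activation shape and parameter values are illustrative, not fitted to data):

```python
import math

def elastance(t, period=0.8, e_min=0.08, e_max=2.5):
    """Time-varying elastance E(t) in mmHg/ml: sine-squared activation over
    the first 40% of the cycle, diastolic floor e_min otherwise.
    All parameter values are illustrative."""
    phase = (t % period) / period
    act = math.sin(math.pi * phase / 0.4) ** 2 if phase < 0.4 else 0.0
    return e_min + (e_max - e_min) * act

def pressure(t, volume, v0=10.0):
    """Chamber pressure from the elastance relation P = E(t) * (V - V0)."""
    return elastance(t) * (volume - v0)

p_dia = pressure(0.0, 120.0)    # end-diastole: elastance at its floor
p_sys = pressure(0.16, 120.0)   # mid-activation: elastance at its peak
```

The single E(t) curve is load-independent by construction, which is exactly the hypothesis the abstract notes is uncertain for the atrium.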
Differential Cross Sections for Proton-Proton Elastic Scattering
NASA Technical Reports Server (NTRS)
Norman, Ryan B.; Dick, Frank; Norbury, John W.; Blattnig, Steve R.
2009-01-01
Proton-proton elastic scattering is investigated within the framework of the one pion exchange model in an attempt to model nucleon-nucleon interactions spanning the large range of energies important to cosmic ray shielding. A quantum field theoretic calculation is used to compute both differential and total cross sections. A scalar theory is then presented and compared to the one pion exchange model. The theoretical cross sections are compared to proton-proton scattering data to determine the validity of the models.
NASA Astrophysics Data System (ADS)
Williams, Karen Ann
One section of college students (N = 25) enrolled in an algebra-based physics course was selected for a Piagetian-based learning cycle (LC) treatment while a second section (N = 25) studied in an Ausubelian-based meaningful verbal reception learning treatment (MVRL). This study examined the students' overall (concept + problem solving + mental model) meaningful understanding of force, density/Archimedes Principle, and heat. Also examined were students' meaningful understanding as measured by conceptual questions, problems, and mental models. In addition, students' learning orientations were examined. There were no significant posttest differences between the LC and MVRL groups for students' meaningful understanding or learning orientation. Piagetian and Ausubelian theories explain meaningful understanding for each treatment. Students from each treatment increased their meaningful understanding. However, neither group altered their learning orientation. The results of meaningful understanding as measured by conceptual questions, problem solving, and mental models were mixed. Differences were attributed to the weaknesses and strengths of each treatment. This research also examined four variables (treatment, reasoning ability, learning orientation, and prior knowledge) to find which best predicted students' overall meaningful understanding of physics concepts. None of these variables were significant predictors at the .05 level. However, when the same variables were used to predict students' specific understanding (i.e. concept, problem solving, or mental model understanding), the results were mixed. For forces and density/Archimedes Principle, prior knowledge and reasoning ability significantly predicted students' conceptual understanding. For heat, however, reasoning ability was the only significant predictor of concept understanding. Reasoning ability and treatment were significant predictors of students' problem solving for heat and forces.
For density/Archimedes Principle, treatment was the only significant predictor of students' problem solving. None of the variables were significant predictors of mental model understanding. This research suggested that Piaget and Ausubel used different terminology to describe learning yet these theories are similar. Further research is needed to validate this premise and validate the blending of the two theories.
Ensemble method: Community detection based on game theory
NASA Astrophysics Data System (ADS)
Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.
2014-08-01
Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many businesses and government endeavors. Community detection is an active research area of relevance to analyzing online social networks. The problem of selecting a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network. The choice of a given methodology could affect the outcome of the experiments because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on game theory, which can effectively combine the advantages of previous algorithms to obtain a better community classification result. Experiments on standard datasets verify that our game-theory-based community detection model is valid and performs better.
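Candidate partitions, whether produced by competing algorithms or by agents' strategies in a game-theoretic scheme, are commonly scored with Newman's modularity Q. A minimal pure-Python scorer choosing the better of two partitions on a toy graph (the graph, partitions, and selection rule are illustrative, not the paper's algorithm):

```python
def modularity(edges, partition):
    """Newman modularity Q = sum_c (e_c/m - (d_c/(2m))^2) for an undirected
    graph given as an edge list plus a node -> community mapping."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    q = 0.0
    for c in set(partition.values()):
        e_c = sum(1 for u, v in edges
                  if partition[u] == c and partition[v] == c)   # intra-community edges
        d_c = sum(d for n, d in deg.items() if partition[n] == c)  # total degree in c
        q += e_c / m - (d_c / (2 * m)) ** 2
    return q

# Toy graph: two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
two_communities = {0: 'a', 1: 'a', 2: 'a', 3: 'b', 4: 'b', 5: 'b'}
one_community = {n: 'a' for n in range(6)}

best = max([two_communities, one_community],
           key=lambda p: modularity(edges, p))
```

Splitting at the bridge scores Q ≈ 0.357 versus 0 for the trivial partition, so the two-triangle division is selected.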
Constraint on reconstructed f(R) gravity models from gravitational waves
NASA Astrophysics Data System (ADS)
Lee, Seokcheon
2018-06-01
The gravitational wave (GW) detection of a binary neutron star inspiral made by the Advanced LIGO and Advanced Virgo paves an unprecedented way for multi-messenger observations. The propagation speed of this GW can be scrutinized by comparing the arrival times between the GW and neutrinos or photons, which provides a constraint on the mass of the graviton. f(R) gravity theories generically contain massive gravitons in addition to the usual massless ones. Previously, we showed that model-independent f(R) gravity theories can be constructed from both the background evolution and the matter growth with one undetermined parameter. Here we show that this parameter can be constrained by the graviton mass bound obtained from GW detection. Thus, GW detection provides an invaluable constraint on the validity of f(R) gravity theories.
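The arrival-time comparison works because any fractional speed difference accumulates over the source distance: |v_gw − c|/c ≲ cΔt/D for a time lag Δt over distance D. A quick order-of-magnitude check with rounded GW170817-like numbers (1.7 s lag, 40 Mpc; values illustrative):

```python
C = 2.998e8      # speed of light, m/s
MPC = 3.086e22   # one megaparsec in metres

def fractional_speed_bound(delta_t, distance_mpc):
    """|v_gw - c| / c <= c * delta_t / D: the fractional propagation-speed
    difference implied by an arrival-time lag delta_t over distance D."""
    return C * delta_t / (distance_mpc * MPC)

# GW170817-like: ~1.7 s gamma-ray delay over ~40 Mpc (rounded values).
bound = fractional_speed_bound(1.7, 40.0)   # of order 1e-16 to 1e-15
```

A bound this tight on the propagation speed is what translates into the graviton mass constraint used to test the reconstructed f(R) parameter.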
Ocean tides for satellite geodesy
NASA Technical Reports Server (NTRS)
Dickman, S. R.
1990-01-01
Spherical harmonic tidal solutions have been obtained at the frequencies of the 32 largest luni-solar tides using prior theory of the author. That theory was developed for turbulent, nonglobal, self-gravitating, and loading oceans possessing realistic bathymetry and linearized bottom friction; the oceans satisfy no-flow boundary conditions at coastlines. In this theory the eddy viscosity and bottom drag coefficients are treated as spatially uniform. Comparison of the predicted degree-2 components of the Mf, P1, and M2 tides with those from numerical and satellite-based tide models allows the ocean friction parameters to be estimated at long and short periods. Using the 32 tide solutions, the frequency dependence of tidal admittance is investigated, and the validity of sideband tide models used in satellite orbit analysis is examined. The implications of admittance variability for oceanic resonances are also explored.
Classical and non-classical effective medium theories: New perspectives
NASA Astrophysics Data System (ADS)
Tsukerman, Igor
2017-05-01
Future research in electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine what effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.
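The Maxwell Garnett rule cited above gives the effective permittivity of dilute spherical inclusions (permittivity eps_i, volume fraction f) in a host medium (eps_m). A minimal sketch of the classical formula (the example material values are illustrative):

```python
def maxwell_garnett(eps_i, eps_m, f):
    """Maxwell Garnett effective permittivity for spherical inclusions of
    permittivity eps_i at volume fraction f embedded in a host eps_m."""
    num = eps_i + 2 * eps_m + 2 * f * (eps_i - eps_m)
    den = eps_i + 2 * eps_m - f * (eps_i - eps_m)
    return eps_m * num / den

# Example: glass-like spheres (eps_i = 4) in air (eps_m = 1) at 10% filling.
eps_eff = maxwell_garnett(4.0, 1.0, 0.1)
```

Sanity checks on the limits: at f = 0 the formula returns the host permittivity, at f = 1 the inclusion permittivity, with intermediate fractions interpolating between them.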
Laboratory Investigation of Space and Planetary Dust Grains
NASA Technical Reports Server (NTRS)
Spann, James
2005-01-01
Dust in space is ubiquitous and impacts diverse observed phenomena in various ways. Understanding the dominant mechanisms that control dust grain properties, and their impact on surrounding environments, is basic to improving our understanding of the processes observed at work in space. There is a substantial body of work on the theory and modeling of dust in space and dusty plasmas. To substantiate and validate theory and models, laboratory investigations and space-borne observations have been conducted. Laboratory investigations are largely confined to an assembly of dust grains immersed in a plasma environment. Frequently, the behaviors of these complex dusty plasmas in the laboratory have raised more questions than they have verified theories. Space-borne observations have helped us characterize planetary environments. The complex behavior of dust grains in space indicates the need to understand the microphysics of individual grains immersed in a plasma or space environment.
A Study of Hierarchical Classification in Concrete and Formal Thought.
ERIC Educational Resources Information Center
Lowell, Walter E.
This researcher investigated the relationship of hierarchical classification processes in subjects categorized as to developmental level as defined by Piaget's theory, and explored the validity of the hierarchical model and test used in the study. A hierarchical classification test and a battery of four Piaget-type tasks were administered…
Adaptive Patterns of Stress Responsivity: A Preliminary Investigation
ERIC Educational Resources Information Center
Del Giudice, Marco; Hinnant, J. Benjamin; Ellis, Bruce J.; El-Sheikh, Mona
2012-01-01
The adaptive calibration model (ACM) is an evolutionary-developmental theory of individual differences in stress responsivity. In this article, we tested some key predictions of the ACM in a middle childhood sample (N = 256). Measures of autonomic nervous system activity across the sympathetic and parasympathetic branches validated the 4-pattern…
Self-Concept Disconfirmation, Psychological Distress, and Marital Happiness.
ERIC Educational Resources Information Center
Schafer, Robert B.; And Others
1996-01-01
Uses self-verification and self-discrepancy theories to test a model of subjective and objective self-disconfirmation, self-efficacy, depression, and marital happiness. Expands issues of self-validation by evaluating self-efficacy in the relationship between self-disconfirmation and depression, and the effect of self-concept disconfirmation of…
Change Detection, Multiple Controllers, and Dynamic Environments: Insights from the Brain
ERIC Educational Resources Information Center
Pearson, John M.; Platt, Michael L.
2013-01-01
Foundational studies in decision making focused on behavior as the most accessible and reliable data on which to build theories of choice. More recent work, however, has incorporated neural data to provide insights unavailable from behavior alone. Among other contributions, these studies have validated reinforcement learning models by…
Ethnic Differences in Decisional Balance and Stages of Mammography Adoption
ERIC Educational Resources Information Center
Otero-Sabogal, Regina; Stewart, Susan; Shema, Sarah J.; Pasick, Rena J.
2007-01-01
Behavioral theories developed through research with mainstream, English-speaking populations have been applied to ethnically diverse and underserved communities in the effort to eliminate disparities in early breast cancer detection. This study tests the validity of the transtheoretical model (TTM) decisional balance measure and the application of…
A low-dimensional analogue of holographic baryons
NASA Astrophysics Data System (ADS)
Bolognesi, Stefano; Sutcliffe, Paul
2014-04-01
Baryons in holographic QCD correspond to topological solitons in the bulk. The most prominent example is the Sakai-Sugimoto model, where the bulk soliton in the five-dimensional spacetime of AdS-type can be approximated by the flat space self-dual Yang-Mills instanton with a small size. Recently, the validity of this approximation has been verified by comparison with the numerical field theory solution. However, multi-solitons and solitons with finite density are currently beyond numerical field theory computations. Various approximations have been applied to investigate these important issues and have led to proposals for finite density configurations that include dyonic salt and baryonic popcorn. Here we introduce and investigate a low-dimensional analogue of the Sakai-Sugimoto model, in which the bulk soliton can be approximated by a flat space sigma model instanton. The bulk theory is a baby Skyrme model in a three-dimensional spacetime with negative curvature. The advantage of the lower-dimensional theory is that numerical simulations of multi-solitons and finite density solutions can be performed and compared with flat space instanton approximations. In particular, analogues of dyonic salt and baryonic popcorn configurations are found and analysed.
Theory and experiments in model-based space system anomaly management
NASA Astrophysics Data System (ADS)
Kitts, Christopher Adam
This research program consists of an experimental study of model-based reasoning methods for detecting, diagnosing and resolving anomalies that occur when operating a comprehensive space system. Using a first principles approach, several extensions were made to the existing field of model-based fault detection and diagnosis in order to develop a general theory of model-based anomaly management. Based on this theory, a suite of algorithms was developed and computationally implemented in order to detect, diagnose and identify resolutions for anomalous conditions occurring within an engineering system. The theory and software suite were experimentally verified and validated in the context of a simple but comprehensive, student-developed, end-to-end space system, which was developed specifically to support such demonstrations. This space system consisted of the Sapphire microsatellite which was launched in 2001, several geographically distributed and Internet-enabled communication ground stations, and a centralized mission control complex located in the Space Technology Center in the NASA Ames Research Park. Results of both ground-based and on-board experiments demonstrate the speed, accuracy, and value of the algorithms compared to human operators, and they highlight future improvements required to mature this technology.
Determinants of voluntary carbon disclosure in the corporate real estate sector of Malaysia.
Kalu, Joseph Ufere; Buang, Alias; Aliagha, Godwin Uche
2016-11-01
Corporate real estate management holds the tenet that risk that is not understood cannot be measured or managed. The effect of global warming on real estate investment, and the need for climate change mitigation through companies' disclosure of carbon emission information, have made such disclosure a sine qua non for managing companies' carbon footprints and reducing their overall effect on global warming. This study applied the structural equation modeling technique to determine the factors influencing carbon disclosure in real estate companies in a developing economy. The analysis was based on the 2013 annual reports of 126 property sector companies listed on the Malaysian stock exchange. The model was validated through convergent validity, discriminant validity, composite reliability and goodness of fit. The result reveals that social and financial market factors were critical determinants of carbon disclosure, while the economic and institutional factors did not achieve a significant effect on voluntary carbon disclosure. The result is consistent with legitimacy and agency theories. The implication of this finding is that an increase in public education and awareness will enhance community demand for disclosure from companies, and they will increase their level of disclosure; also, as financial institutions come to consider sustainability practice a viable investment and a term for credit financing, companies will be motivated to increase disclosure. Copyright © 2016 Elsevier Ltd. All rights reserved.
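The convergent validity and composite reliability checks reported above are conventionally computed from standardized factor loadings. A minimal sketch with hypothetical loadings (not the study's estimates):

```python
def composite_reliability(loadings):
    """CR = (sum L)^2 / ((sum L)^2 + sum(1 - L^2)) for standardized loadings;
    CR > 0.7 is the usual reliability benchmark."""
    s = sum(loadings)
    theta = sum(1.0 - l * l for l in loadings)  # standardized error variances
    return s * s / (s * s + theta)

def average_variance_extracted(loadings):
    """AVE = mean squared standardized loading; AVE > 0.5 is the usual
    convergent-validity benchmark."""
    return sum(l * l for l in loadings) / len(loadings)

# Hypothetical loadings for a three-indicator latent factor.
loadings = [0.70, 0.80, 0.75]
cr = composite_reliability(loadings)
ave = average_variance_extracted(loadings)
```

With these illustrative loadings, both conventional thresholds (CR > 0.7, AVE > 0.5) are met, the kind of evidence the abstract describes as model validation.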
Zeng, Liang; Proctor, Robert W; Salvendy, Gavriel
2011-06-01
This research is intended to empirically validate a general model of creative product and service development proposed in the literature. A current research gap inspired construction of a conceptual model to capture fundamental phases and pertinent facilitating metacognitive strategies in the creative design process. The model also depicts the mechanism by which design creativity affects consumer behavior. The validity and assets of this model have not yet been investigated. Four laboratory studies were conducted to demonstrate the value of the proposed cognitive phases and associated metacognitive strategies in the conceptual model. Realistic product and service design problems were used in creativity assessment to ensure ecological validity. Design creativity was enhanced by explicit problem analysis, whereby one formulates problems from different perspectives and at different levels of abstraction. Remote association in conceptual combination spawned more design creativity than did near association. Abstraction led to greater creativity in conducting conceptual expansion than did specificity, which induced mental fixation. Domain-specific knowledge and experience enhanced design creativity, indicating that design can be of a domain-specific nature. Design creativity added integrated value to products and services and positively influenced customer behavior. The validity and value of the proposed conceptual model is supported by empirical findings. The conceptual model of creative design could underpin future theory development. Propositions advanced in this article should provide insights and approaches to facilitate organizations pursuing product and service creativity to gain competitive advantage.
On multiscale moving contact line theory.
Li, Shaofan; Fan, Houfu
2015-07-08
In this paper, a multiscale moving contact line (MMCL) theory is presented and employed to simulate liquid droplet spreading and capillary motion. The proposed MMCL theory combines a coarse-grained adhesive contact model with a fluid interface membrane theory, so that it can couple molecular scale adhesive interaction and surface tension with hydrodynamics of microscale flow. By doing so, the intermolecular force, the van der Waals or double layer force, separates and levitates the liquid droplet from the supporting solid substrate, which avoids the shear stress singularity caused by the no-slip condition in conventional hydrodynamics theory of moving contact line. Thus, the MMCL allows the difference of the surface energies and surface stresses to drive droplet spreading naturally. To validate the proposed MMCL theory, we have employed it to simulate droplet spreading over various elastic substrates. The numerical simulation results obtained by using MMCL are in good agreement with the molecular dynamics results reported in the literature.
Scalco, Andrea; Ceschi, Andrea; Sartori, Riccardo
2018-01-01
It is likely that computer simulations will assume a greater role in the near future in investigating and understanding reality (Rand & Rust, 2011). In particular, agent-based models (ABMs) represent a method of investigating social phenomena that blends the knowledge of the social sciences with the advantages of virtual simulations. Within this context, the development of algorithms able to recreate the reasoning engine of autonomous virtual agents is one of the most fragile aspects, and it is crucial to ground such models in well-supported psychological theoretical frameworks. For this reason, the present work discusses the application of the theory of planned behavior (TPB; Ajzen, 1991) in the context of agent-based modeling: It is argued that this framework might be more helpful than others for developing a valid representation of human behavior in computer simulations. Accordingly, the current contribution considers issues related to applying the model proposed by the TPB inside computer simulations and suggests potential solutions, in the hope of helping to shorten the distance between the fields of psychology and computer science.
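The TPB reasoning engine the authors propose embedding in agents is commonly operationalized as a weighted additive rule over attitude, subjective norm, and perceived behavioral control. A minimal agent sketch; the weights, threshold, and moderation rule are illustrative assumptions, not the authors' specification:

```python
class TPBAgent:
    """Minimal agent whose behavioral intention follows the additive
    form of the theory of planned behavior (weights are illustrative)."""

    def __init__(self, attitude, subjective_norm, perceived_control,
                 weights=(0.4, 0.3, 0.3)):
        self.attitude = attitude                    # evaluation of the behavior, 0..1
        self.subjective_norm = subjective_norm      # perceived social pressure, 0..1
        self.perceived_control = perceived_control  # perceived ease of acting, 0..1
        self.weights = weights

    def intention(self):
        wa, wn, wc = self.weights
        return (wa * self.attitude + wn * self.subjective_norm
                + wc * self.perceived_control)

    def acts(self, threshold=0.5):
        # In Ajzen's account, perceived control also moderates the
        # intention-behavior link; a simple gate stands in for that here.
        return self.intention() > threshold and self.perceived_control > 0.2

agent = TPBAgent(attitude=0.8, subjective_norm=0.6, perceived_control=0.7)
print(round(agent.intention(), 2), agent.acts())
```

In an ABM, a population of such agents would be given heterogeneous attitudes and norms, with the norm terms updated from neighbors at each step.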
Adaptive modeling, identification, and control of dynamic structural systems. I. Theory
Safak, Erdal
1989-01-01
A concise review of the theory of adaptive modeling, identification, and control of dynamic structural systems based on discrete-time recordings is presented. Adaptive methods have four major advantages over the classical methods: (1) Removal of the noise from the signal is done over the whole frequency band; (2) time-varying characteristics of systems can be tracked; (3) systems with unknown characteristics can be controlled; and (4) a small segment of the data is needed during the computations. Included in the paper are the discrete-time representation of single-input single-output (SISO) systems, models for SISO systems with noise, the concept of stochastic approximation, recursive prediction error method (RPEM) for system identification, and the adaptive control. Guidelines for model selection and model validation and the computational aspects of the method are also discussed in the paper. The present paper is the first of two companion papers. The theory given in the paper is limited to that which is necessary to follow the examples for applications in structural dynamics presented in the second paper.
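For a model that is linear in its parameters, the recursive prediction error method described above reduces to recursive least squares. A sketch on a noise-free first-order ARX system; the system coefficients, input signal, and forgetting factor are illustrative:

```python
def rls_identify(u, y, lam=1.0):
    """Recursive least-squares estimation of [a, b] in
    y[k] = a*y[k-1] + b*u[k-1], updating on one sample at a time."""
    theta = [0.0, 0.0]                      # parameter estimates [a, b]
    P = [[1000.0, 0.0], [0.0, 1000.0]]      # large initial covariance
    for k in range(1, len(y)):
        phi = [y[k - 1], u[k - 1]]          # regressor vector
        Pphi = [P[0][0]*phi[0] + P[0][1]*phi[1],
                P[1][0]*phi[0] + P[1][1]*phi[1]]
        denom = lam + phi[0]*Pphi[0] + phi[1]*Pphi[1]
        K = [Pphi[0]/denom, Pphi[1]/denom]  # gain vector
        err = y[k] - (theta[0]*phi[0] + theta[1]*phi[1])  # prediction error
        theta = [theta[0] + K[0]*err, theta[1] + K[1]*err]
        # Covariance update: P = (P - K phi' P) / lam
        P = [[(P[0][0] - K[0]*Pphi[0])/lam, (P[0][1] - K[0]*Pphi[1])/lam],
             [(P[1][0] - K[1]*Pphi[0])/lam, (P[1][1] - K[1]*Pphi[1])/lam]]
    return theta

# Data from the true system a=0.8, b=0.5 (noise-free, persistently excited).
u = [1.0 if k % 4 < 2 else -1.0 for k in range(200)]
y = [0.0]
for k in range(1, 200):
    y.append(0.8*y[k-1] + 0.5*u[k-1])
a_hat, b_hat = rls_identify(u, y)
print(round(a_hat, 3), round(b_hat, 3))
```

Setting lam below 1 introduces the exponential forgetting that lets the estimator track the time-varying characteristics mentioned in the abstract.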
Generalizability Theory and the Fair and Valid Assessment of Linguistic Minorities
ERIC Educational Resources Information Center
Solano-Flores, Guillermo; Li, Min
2013-01-01
We discuss generalizability (G) theory and the fair and valid assessment of linguistic minorities, especially emergent bilinguals. G theory allows examination of the relationship between score variation and language variation (e.g., variation of proficiency across languages, language modes, and social contexts). Studies examining score variation…
Center for Modeling of Turbulence and Transition (CMOTT): Research Briefs, 1992
NASA Technical Reports Server (NTRS)
Liou, William W. (Editor)
1992-01-01
The progress of the Center for Modeling of Turbulence and Transition (CMOTT) is reported. The main objective of the CMOTT is to develop, validate and implement turbulence and transition models for practical engineering flows. The flows of interest are three-dimensional, incompressible and compressible flows with chemical reaction. The research covers two-equation (e.g., k-e) and algebraic Reynolds-stress models, second-moment closure models, probability density function (pdf) models, Renormalization Group Theory (RNG), Large Eddy Simulation (LES) and Direct Numerical Simulation (DNS).
Validation of the Mindful Coping Scale
ERIC Educational Resources Information Center
Tharaldsen, Kjersti B.; Bru, Edvin
2011-01-01
The aim of this research is to develop and validate a self-report measure of mindfulness and coping, the mindful coping scale (MCS). Dimensions of mindful coping were theoretically deduced from mindfulness theory and coping theory. The MCS was empirically evaluated by use of factor analyses, reliability testing and nomological network validation.…
Mangaraj, S; K Goswami, T; Mahajan, P V
2015-07-01
MAP is a dynamic system in which respiration of the packaged product and gas permeation through the packaging film take place simultaneously. The desired levels of O2 and CO2 in a package are achieved by matching the film permeation rates for O2 and CO2 with the respiration rate of the packaged product. A mathematical model for MAP of fresh fruits was developed, applying an enzyme-kinetics-based respiration equation coupled with an Arrhenius-type model. The model was solved numerically using a MATLAB program. It was used to determine the time to reach the equilibrium concentration inside the MA package and the levels of O2 and CO2 at the equilibrium state. The developed model for predicting equilibrium O2 and CO2 concentrations was validated using experimental data for MA packaging of apple, guava and litchi.
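The coupled respiration-permeation balance described above can be sketched as a pair of ODEs integrated with forward Euler. The lumped permeation coefficients and Michaelis-Menten constants below are illustrative placeholders, not the paper's fitted values:

```python
def simulate_map(hours=300.0, dt=0.01):
    """Euler integration of the package gas balances:
      dyO2/dt  = kO2  * (yO2_air  - yO2)  - R(yO2)
      dyCO2/dt = kCO2 * (yCO2_air - yCO2) + RQ * R(yO2)
    with Michaelis-Menten respiration R(y) = Vm * y / (Km + y)."""
    kO2, kCO2 = 0.05, 0.20       # lumped film-permeation coefficients, 1/h
    Vm, Km, RQ = 1.0, 5.0, 0.9   # respiration parameters (hypothetical)
    yO2, yCO2 = 20.9, 0.03       # start at ambient composition, % v/v
    for _ in range(int(hours / dt)):
        R = Vm * yO2 / (Km + yO2)
        yO2 += dt * (kO2 * (20.9 - yO2) - R)
        yCO2 += dt * (kCO2 * (0.03 - yCO2) + RQ * R)
    return yO2, yCO2

yO2_eq, yCO2_eq = simulate_map()
print(round(yO2_eq, 1), round(yCO2_eq, 1))
```

With these parameters the package settles to a modified atmosphere of roughly 8% O2 and 3% CO2; the equilibrium is exactly where permeation through the film balances respiratory uptake and production.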
Sierra-Fitzgerald, O; Quevedo-Caicedo, J
The aim of this article is to relate two theories regarding the structure of the human mind. We suggest that the theory of multiple intelligences, a neurocognitive theory of the psychologist Howard Gardner, provides a suitable context for the theoretical understanding and validation of the hypothesis of the pathology of superiority, a neuropsychological hypothesis formulated by the neuropsychologists Norman Geschwind and Albert Galaburda. Similarly, we show that, apart from providing a context, the first theory enriches the second. We review the essential elements of both theories together with the arguments for them so that readers may judge for themselves. We also review the factors determining intelligence; the association between neuropathology and intellectual dysfunction, general and specific; and the new directions in the understanding of human cognition. We propose considering the first theory as a fertile domain and broad methodological framework for investigation in neuropsychology. This simultaneously shows the relevance of situating neuropsychological investigation within broader cognitive and neuropsychological theories and models.
Estimation of Critical Gap Based on Raff's Definition
Guo, Rui-jun; Wang, Xiao-jing; Wang, Wan-xiang
2014-01-01
Critical gap is an important parameter used to calculate the capacity and delay of the minor road in the gap acceptance theory of unsignalized intersections. At an unsignalized intersection with two one-way traffic flows, vehicle arrivals in the major stream and in the minor stream are assumed to be independent, and the headways of the major stream follow the M3 distribution. Based on Raff's definition of critical gap, two calculation models are derived, named the M3 definition model and the revised Raff's model; both use the total rejected coefficient. The models are compared by simulation and found to be valid. The M3 definition model is simple and valid. The revised Raff's model strictly obeys Raff's definition of critical gap, has a wider field of application than the original Raff's model, and yields a more accurate result. The M3 definition model and the revised Raff's model yield concordant results. PMID:25574160
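Raff's original definition locates the critical gap where the cumulative count of accepted gaps shorter than t crosses the count of rejected gaps longer than t. A simple scan over hypothetical gap data; this is the basic definition only, not the paper's M3 or revised models:

```python
def raff_critical_gap(accepted, rejected, step=0.05):
    """Scan gap lengths until the number of accepted gaps <= t first
    reaches the number of rejected gaps > t (Raff's crossing point)."""
    t = 0.0
    t_max = max(max(accepted), max(rejected))
    while t < t_max:
        shorter_accepted = sum(1 for g in accepted if g <= t)
        longer_rejected = sum(1 for g in rejected if g > t)
        if shorter_accepted >= longer_rejected:
            return t
        t += step
    return t_max

accepted = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 7.2, 4.4]  # hypothetical gaps, s
rejected = [1.9, 2.5, 3.1, 2.2, 3.6, 2.8, 4.0, 3.3]
print(round(raff_critical_gap(accepted, rejected), 2))
```

For these invented observations the crossing lands near 3.8 s, between the longest rejected and shortest accepted gaps, as expected.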
A Systematic Review of Rural, Theory-based Physical Activity Interventions.
Walsh, Shana M; Meyer, M Renée Umstattd; Gamble, Abigail; Patterson, Megan S; Moore, Justin B
2017-05-01
This systematic review synthesized the scientific literature on theory-based physical activity (PA) interventions in rural populations. PubMed, PsycINFO, and Web of Science databases were searched to identify studies with a rural study sample, PA as a primary outcome, use of a behavioral theory or model, randomized or quasi-experimental research design, and application at the primary and/or secondary level of prevention. Thirty-one studies met our inclusion criteria. The Social Cognitive Theory (N = 14) and Transtheoretical Model (N = 10) were the most frequently identified theories; however, most intervention studies were informed by theory but lacked higher-level theoretical application and testing. Interventions largely took place in schools (N = 10) and with female-only samples (N = 8). Findings demonstrated that theory-based PA interventions are mostly successful at increasing PA in rural populations but require improvement. Future studies should incorporate higher levels of theoretical application, and should explore adapting or developing rural-specific theories. Study designs should employ more rigorous research methods to decrease bias and increase validity of findings. Follow-up assessments to determine behavioral maintenance and/or intervention sustainability are warranted. Finally, funding agencies and journals are encouraged to adopt rural-urban commuting area codes as the standard for defining rural.
Pattern activation/recognition theory of mind
du Castel, Bertrand
2015-01-01
In his 2012 book How to Create a Mind, Ray Kurzweil defines a “Pattern Recognition Theory of Mind” that states that the brain uses millions of pattern recognizers, plus modules to check, organize, and augment them. In this article, I further the theory to go beyond pattern recognition and include also pattern activation, thus encompassing both sensory and motor functions. In addition, I treat checking, organizing, and augmentation as patterns of patterns instead of separate modules, therefore handling them the same as patterns in general. Henceforth I put forward a unified theory I call “Pattern Activation/Recognition Theory of Mind.” While the original theory was based on hierarchical hidden Markov models, this evolution is based on their precursor: stochastic grammars. I demonstrate that a class of self-describing stochastic grammars allows for unifying pattern activation, recognition, organization, consistency checking, metaphor, and learning, into a single theory that expresses patterns throughout. I have implemented the model as a probabilistic programming language specialized in activation/recognition grammatical and neural operations. I use this prototype to compute and present diagrams for each stochastic grammar and corresponding neural circuit. I then discuss the theory as it relates to artificial network developments, common coding, neural reuse, and unity of mind, concluding by proposing potential paths to validation. PMID:26236228
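The stochastic grammars the theory builds on can be illustrated with a toy probabilistic grammar sampler. The symbols and probabilities below are invented for illustration and are not the author's self-describing grammars:

```python
import random

# Each nonterminal maps to (probability, expansion) pairs; expansion mixes
# further nonterminals with terminal tokens (hypothetical vocabulary).
GRAMMAR = {
    "PATTERN": [(0.5, ["SENSE"]), (0.5, ["ACT"])],
    "SENSE":   [(0.7, ["see", "PATTERN"]), (0.3, ["hear"])],
    "ACT":     [(0.6, ["reach"]), (0.4, ["grasp", "PATTERN"])],
}

def expand(symbol, rng, depth=0, max_depth=12):
    """Sample one derivation of `symbol`, truncating runaway recursion."""
    if symbol not in GRAMMAR:
        return [symbol]                       # terminal token
    if depth > max_depth:
        return []                             # depth cutoff
    r, acc = rng.random(), 0.0
    production = GRAMMAR[symbol][-1][1]       # fallback: last production
    for p, prod in GRAMMAR[symbol]:
        acc += p
        if r <= acc:
            production = prod
            break
    out = []
    for s in production:
        out.extend(expand(s, rng, depth + 1, max_depth))
    return out

rng = random.Random(7)
print(" ".join(expand("PATTERN", rng)))
```

Recursion through "PATTERN" inside its own productions is what lets patterns contain patterns, the "patterns of patterns" idea the article uses in place of separate modules.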
A Method of Q-Matrix Validation for the Linear Logistic Test Model
Baghaei, Purya; Hohensinn, Christine
2017-01-01
The linear logistic test model (LLTM) is a well-recognized psychometric model for examining the components of difficulty in cognitive tests and validating construct theories. The plausibility of the construct model, summarized in a matrix of weights, known as the Q-matrix or weight matrix, is tested by (1) comparing the fit of LLTM with the fit of the Rasch model (RM) using the likelihood ratio (LR) test and (2) by examining the correlation between the Rasch model item parameters and LLTM reconstructed item parameters. The problem with the LR test is that it is almost always significant and, consequently, LLTM is rejected. The drawback of examining the correlation coefficient is that there is no cut-off value or lower bound for the magnitude of the correlation coefficient. In this article we suggest a simulation method to set a minimum benchmark for the correlation between item parameters from the Rasch model and those reconstructed by the LLTM. If the cognitive model is valid then the correlation coefficient between the RM-based item parameters and the LLTM-reconstructed item parameters derived from the theoretical weight matrix should be greater than those derived from the simulated matrices. PMID:28611721
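The benchmark idea can be sketched with a permutation stand-in: compare the observed correlation between Rasch and LLTM-reconstructed item parameters against correlations from randomly permuted reconstructions. This simplifies the article's method, which simulates weight matrices; the item parameters are hypothetical:

```python
import random

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x)/n, sum(y)/n
    sxy = sum((a - mx)*(b - my) for a, b in zip(x, y))
    sxx = sum((a - mx)**2 for a in x)
    syy = sum((b - my)**2 for b in y)
    return sxy / (sxx * syy) ** 0.5

rasch = [-1.2, -0.6, -0.1, 0.4, 0.9, 1.5, 0.2, -0.8]  # RM difficulties
lltm  = [-1.0, -0.7, 0.1, 0.3, 1.1, 1.3, 0.0, -0.9]   # LLTM-reconstructed

observed = pearson_r(rasch, lltm)
rng = random.Random(42)
null_rs = []
for _ in range(1000):
    shuffled = lltm[:]
    rng.shuffle(shuffled)                 # theory-free reassignment
    null_rs.append(pearson_r(rasch, shuffled))
benchmark = sorted(null_rs)[int(0.95 * len(null_rs))]  # 95th percentile
print(observed > benchmark)
```

If the theoretical weight matrix were no better than chance, the observed correlation would fall within the simulated null distribution rather than above its upper tail.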
Holographic constraints on Bjorken hydrodynamics at finite coupling
NASA Astrophysics Data System (ADS)
DiNunno, Brandon S.; Grozdanov, Sašo; Pedraza, Juan F.; Young, Steve
2017-10-01
In large-N_c conformal field theories with classical holographic duals, inverse coupling constant corrections are obtained by considering higher-derivative terms in the corresponding gravity theory. In this work, we use type IIB supergravity and bottom-up Gauss-Bonnet gravity to study the dynamics of boost-invariant Bjorken hydrodynamics at finite coupling. We analyze the time-dependent decay properties of non-local observables (scalar two-point functions and Wilson loops) probing the different models of Bjorken flow and show that they can be expressed generically in terms of a few field theory parameters. In addition, our computations provide an analytically quantifiable probe of the coupling-dependent validity of hydrodynamics at early times in a simple model of heavy-ion collisions, which is an observable closely analogous to the hydrodynamization time of a quark-gluon plasma. We find that to third order in the hydrodynamic expansion, the convergence of hydrodynamics is improved and that generically, as expected from field theory considerations and recent holographic results, the applicability of hydrodynamics is delayed as the field theory coupling decreases.
Thermal isomerization of azobenzenes: on the performance of Eyring transition state theory.
Rietze, Clemens; Titov, Evgenii; Lindner, Steven; Saalfrank, Peter
2017-08-09
The thermal cis → trans (back-)isomerization of azobenzenes is a prototypical reaction occurring in molecular switches. It has been studied for decades, yet its kinetics is not fully understood. In this paper, quantum chemical calculations are performed to model the kinetics of an experimental benchmark system, in which a modified azobenzene (AzoBiPyB) is embedded in a metal-organic framework (MOF). The molecule can be switched thermally from cis to trans under solvent-free conditions. We critically test the validity of Eyring transition state theory for this reaction. As previously found for other azobenzenes (albeit in solution), good agreement between theory and experiment emerges for activation energies and activation free energies, already at a comparatively simple level of theory, B3LYP/6-31G* including dispersion corrections. However, theoretical Arrhenius prefactors and activation entropies are in qualitative disagreement with experiment. Several factors are discussed that may influence activation entropies, among them dynamical and geometric constraints (imposed by the MOF). For a simpler model, cis → trans isomerization in azobenzene, a systematic test of quantum chemical methods from both density functional theory and wavefunction theory is carried out in the context of Eyring theory. The effect of anharmonicities on activation entropies is also discussed for this model system. Our work highlights capabilities and shortcomings of Eyring transition state theory and quantum chemical methods when applied to the cis → trans (back-)isomerization of azobenzenes under solvent-free conditions.
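The Eyring expression whose performance the paper examines gives the rate constant as k = (kB*T/h)*exp(-dG/(R*T)). A quick evaluation for an illustrative activation free energy in the range typical of slow thermal azobenzene back-isomerization, not a value from the paper:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(dG_kJmol, T=298.15):
    """Eyring TST rate constant (1/s) for activation free energy in kJ/mol."""
    return (KB * T / H) * math.exp(-dG_kJmol * 1000.0 / (R * T))

k = eyring_rate(100.0)                 # dG = 100 kJ/mol (hypothetical)
half_life_h = math.log(2) / k / 3600.0
print(f"{k:.2e} 1/s, half-life {half_life_h:.1f} h")
```

The exponential sensitivity is the point: a few kJ/mol in dG shifts the half-life by an order of magnitude, which is why prefactor and entropy discrepancies like those the paper reports matter so much.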
Zhou, Hongmei; Romero, Stephanie Ballon; Qin, Xiao
2016-10-01
This paper aimed to examine pedestrians' self-reported intentions to violate road-crossing rules by applying the theory of planned behavior (TPB). We studied behavioral intentions with regard to instrumental attitude, subjective norm, and perceived behavioral control, the three basic components of the TPB, and extended the theory by adding new factors, including descriptive norm, perceived risk and conformity tendency, to evaluate their respective impacts on pedestrians' behavioral intentions. A questionnaire presenting a scenario in which pedestrians cross the road against the pedestrian lights at an intersection was designed, and the survey was conducted in Dalian, China. Based on the 260 complete and valid responses, the reliability and validity of the data for each question were evaluated. The data were then analyzed using structural equation modeling (SEM). The results showed that people had a negative attitude toward violating road-crossing rules; they perceived social influences from their family and friends; and they believed that this kind of risky behavior could harm them in a traffic accident. The results also showed that instrumental attitude and subjective norm were significant in the basic TPB model. After adding descriptive norm, subjective norm was no longer significant. Other models showed that conformity tendency was a strong predictor, indicating that the presence of other pedestrians would influence behavioral intention. The findings could help in designing more effective interventions and safety campaigns, such as changing people's attitudes toward this violation behavior, correcting social norms, and increasing safety awareness, in order to reduce pedestrians' road-crossing violations. Copyright © 2015 Elsevier Ltd. All rights reserved.
Health belief model and reasoned action theory in predicting water saving behaviors in yazd, iran.
Morowatisharifabad, Mohammad Ali; Momayyezi, Mahdieh; Ghaneian, Mohammad Taghi
2012-01-01
People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and the Reasoned Action Theory. The cross-sectional study used random cluster sampling to recruit 200 heads of households. The survey questionnaire was tested for content validity and reliability. Analysis of data included descriptive statistics, simple correlation, and hierarchical multiple regression. Simple correlations between water saving behaviors and the Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variance in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. In designing interventions aimed at water waste prevention, barriers to water saving behaviors should be addressed first, followed by people's attitudes towards water saving. The Health Belief Model, with the exception of perceived severity and benefits, is more powerful than the Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors.
Intercomparison of granular stress and turbulence models for unidirectional sheet flow applications
NASA Astrophysics Data System (ADS)
Chauchat, J.; Cheng, Z.; Hsu, T. J.
2016-12-01
The intergranular stresses are one of the key elements in two-phase sediment transport models. There are two main existing approaches: the kinetic theory of granular flows (Jenkins and Hanes, 1998; Hsu et al., 2004) and phenomenological rheology, such as the one proposed by Bagnold (Hanes and Bowen, 1985) or the μ(I) dense granular flow rheology (Revil-Baudard and Chauchat, 2013). Concerning the turbulent Reynolds stress, mixing length and k-ɛ turbulence models have been validated in previous studies (Revil-Baudard and Chauchat, 2013; Hsu et al., 2004). Recently, sedFoam was developed based on the kinetic theory of granular flows and a k-ɛ turbulence model (Cheng and Hsu, 2014). In this study, we further extended sedFoam by implementing the mixing length model and the dense granular flow rheology following Revil-Baudard and Chauchat (2013). This allows us to objectively compare the different combinations of intergranular stress closures (kinetic theory or the dense granular flow rheology) and turbulence models (mixing length or k-ɛ) under unidirectional sheet flow conditions. We found that the calibrated mixing length and k-ɛ models predict similar velocity and concentration profiles. The differences observed between the kinetic theory and the dense granular flow rheology require further investigation. In particular, we hypothesize that the extended kinetic theory proposed by Berzi (2011) would probably improve the existing combination of the kinetic theory with a simple Coulomb frictional model in sedFoam. A semi-analytical solution proposed by Berzi and Fraccarollo (2013) for sediment transport rate and sheet layer thickness versus the Shields number is compared with the results obtained using the dense granular flow rheology and the mixing length model. The results are similar, which demonstrates that both the extended kinetic theory and the dense granular flow rheology can be used to model intergranular stresses under sheet flow conditions.
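The μ(I) dense granular flow rheology used as one of the stress closures has the standard form μ(I) = μs + (μ2 − μs)/(I0/I + 1), with inertial number I = γ̇ d / sqrt(P/ρp), so the shear stress follows as τ = μ(I) P. A sketch with typical literature parameter values, not those calibrated in this study:

```python
import math

def mu_of_I(gamma_dot, P, d=5e-4, rho_p=2500.0,
            mu_s=0.38, mu_2=0.64, I_0=0.3):
    """Effective friction coefficient from the mu(I) rheology.
    gamma_dot: shear rate (1/s); P: granular pressure (Pa);
    d: grain diameter (m); rho_p: particle density (kg/m^3).
    The friction and I_0 parameters are typical values, not calibrated."""
    I = gamma_dot * d / math.sqrt(P / rho_p)   # inertial number
    return mu_s + (mu_2 - mu_s) / (I_0 / I + 1.0)

P = 100.0  # granular pressure, Pa
for gd in (1.0, 10.0, 100.0):
    tau = mu_of_I(gd, P) * P                   # shear stress, Pa
    print(gd, round(mu_of_I(gd, P), 3), round(tau, 1))
```

The friction coefficient rises monotonically from the quasi-static value μs toward the dynamic limit μ2 as the inertial number increases, which is the rate-strengthening behavior the rheology encodes.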
Polymer Brushes under High Load
Balko, Suzanne M.; Kreer, Torsten; Costanzo, Philip J.; Patten, Tim E.; Johner, Albert; Kuhl, Tonya L.; Marques, Carlos M.
2013-01-01
Polymer coatings are frequently used to provide repulsive forces between surfaces in solution. After 25 years of design and study, a quantitative model to explain and predict repulsion under strong compression is still lacking. Here, we combine experiments, simulations, and theory to study polymer coatings under high loads and demonstrate a validated model for the repulsive forces, proposing that this universal behavior can be predicted from the polymer solution properties. PMID:23516470
Ji, Wenyu; Zhang, Letian; Gao, Ruixue; Zhang, Liming; Xie, Wenfa; Zhang, Hanzhuang; Li, Bin
2008-09-29
White top-emitting organic light-emitting devices (TEOLEDs) with down-conversion phosphors are investigated theoretically and experimentally. The theoretical simulation combines a microcavity model with a down-conversion model. A white TEOLED combining a blue TEOLED with the organic down-conversion phosphor 3-(4-(diphenylamino)phenyl)-1-phenylprop-2-en-1-one was fabricated to validate the simulated results. It is shown that this approach permits the generation of white light in TEOLEDs. The efficiency of the white TEOLED is twice that of the corresponding blue TEOLED. Feasible methods to improve the performance of such white TEOLEDs are discussed.
Adams, Jeffrey M; Natarajan, Sudha
2016-01-01
Acquiring influence, and knowing how to use it, is a required competency for nurse leaders, yet the concept of influence and how it works is not well described in the nursing literature. In this article, the authors examine what is known about influence and present an influence model specific to nurse leaders. The Adams Influence Model was developed through an iterative process and is based on a comprehensive review of the influence literature, expert commentary, multiple pilot studies, evaluation of nursing theories, and validation by an external data source. Rather than defining "how to" influence, the model serves as a guide for personal reflection, helping nurse leaders understand and reflect on the influence process and factors, tactics, and strategies they can use when seeking to influence others.
Validation of the Learning Progression-based Assessment of Modern Genetics in a college context
NASA Astrophysics Data System (ADS)
Todd, Amber; Romine, William L.
2016-07-01
Building upon a methodologically diverse research foundation, we adapted and validated the Learning Progression-based Assessment of Modern Genetics (LPA-MG) for college students' knowledge of the domain. Toward collecting valid learning progression-based measures in a college majors context, we redeveloped and content-validated a majority of a previous version of the LPA-MG, which had been developed for high school students. Using a Rasch model calibrated on 316 students from two sections of majors introductory biology, we demonstrate the validity of this version and describe how college students' ideas of modern genetics are likely to change as they progress from low to high understanding. We then use these findings to build theory around the connections that college students at different levels of understanding make within and across the many ideas of the domain.
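The dichotomous Rasch model used for the calibration gives the probability of a correct response as a logistic function of the difference between person ability and item difficulty. A minimal illustration; the ability and difficulty values (in logits) are invented:

```python
import math

def rasch_p(theta, b):
    """Rasch model: P(correct) = 1 / (1 + exp(-(theta - b))),
    where theta is person ability and b is item difficulty (logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# A low-, mid-, and high-understanding student on the same item (b = 0.5).
for theta in (-1.0, 0.0, 2.0):
    print(theta, round(rasch_p(theta, b=0.5), 2))
```

When ability equals difficulty the probability is exactly 0.5, which is what places items and students on the same logit scale and lets the progression be read directly off the calibrated difficulties.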
Interaction of Theory and Practice to Assess External Validity.
Leviton, Laura C; Trujillo, Mathew D
2016-01-18
Variations in local context bedevil the assessment of external validity: the ability to generalize about effects of treatments. For evaluation, the challenges of assessing external validity are intimately tied to the translation and spread of evidence-based interventions. This makes external validity a question for decision makers, who need to determine whether to endorse, fund, or adopt interventions that were found to be effective and how to ensure high quality once they spread. To present the rationale for using theory to assess external validity and the value of more systematic interaction of theory and practice. We review advances in external validity, program theory, practitioner expertise, and local adaptation. Examples are provided for program theory, its adaptation to diverse contexts, and generalizing to contexts that have not yet been studied. The often critical role of practitioner experience is illustrated in these examples. Work is described that the Robert Wood Johnson Foundation is supporting to study treatment variation and context more systematically. Researchers and developers generally see a limited range of contexts in which the intervention is implemented. Individual practitioners see a different and often a wider range of contexts, albeit not a systematic sample. Organized and taken together, however, practitioner experiences can inform external validity by challenging the developers and researchers to consider a wider range of contexts. Researchers have developed a variety of ways to adapt interventions in light of such challenges. In systematic programs of inquiry, as opposed to individual studies, the problems of context can be better addressed. Evaluators have advocated an interaction of theory and practice for many years, but the process can be made more systematic and useful. 
Systematic interaction can set priorities for assessment of external validity by examining the prevalence and importance of context features and treatment variations. Practitioner interaction with researchers and developers can assist in sharpening program theory, reducing uncertainty about treatment variations that are consistent or inconsistent with the theory, inductively ruling out the ones that are harmful or irrelevant, and helping set priorities for more rigorous study of context and treatment variation. © The Author(s) 2016.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajagopal, K.R.
The mechanics of flowing granular materials such as coal, agricultural products, fertilizers, dry chemicals, and metal ores has received a great deal of attention because of its relevance to several important technological problems. Despite wide interest and more than five decades of experimental and theoretical investigation, most aspects of the behavior of flowing granular materials are still not well understood. Experiments therefore have to be devised that quantify and describe the non-linear behavior of granular materials, and theories developed that can explain the experimentally observed facts. As many models have been suggested for describing the behavior of granular materials, from both continuum and kinetic theory viewpoints, we proposed to investigate the validity and usefulness of representative models from both the continuum and kinetic theory points of view, by determining the predictions of each theory, in a representative flow, with respect to existence, non-existence, multiplicity, and stability of solutions. The continuum model investigated is an outgrowth of a model due to Goodman and Cowin (1971, 1972), and the kinetic theory models are those due to Jenkins and Richman (1985) and Boyle and Massoudi (1989). In this report we present detailed results regarding the same. Interestingly, we find that the predictions of all the theories, in certain regions of the parameter space associated with these models, are qualitatively similar. This of course depends on the values assumed for the various material parameters in the models, which as yet are unknown, since reliable experiments for their determination have not yet been carried out.
Construct validity of the five factor borderline inventory.
DeShong, Hilary L; Lengel, Gregory J; Sauer-Zavala, Shannon E; O'Meara, Madison; Mullins-Sweatt, Stephanie N
2015-06-01
The Five Factor Borderline Inventory (FFBI) is a new self-report measure developed to assess traits of borderline personality disorder (BPD) from the perspective of the Five Factor Model of general personality. The current study sought to first replicate initial validity findings for the FFBI and then to further validate the FFBI with predispositional risk factors of the biosocial theory of BPD and with commonly associated features of BPD (e.g., depression, low self-esteem) utilizing two samples of young adults (N = 87; 85) who have engaged in nonsuicidal self-injury. The FFBI showed strong convergent and discriminant validity across two measures of the Five Factor Model and also correlated strongly with measures of impulsivity, emotion dysregulation, and BPD. The FFBI also related to two measures of early childhood emotional vulnerability and parental invalidation and measures of depression, anxiety, and self-esteem. Overall, the results provide support for the FFBI as a measure of BPD. © The Author(s) 2014.
Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.
Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt
2016-05-06
We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.
A conceptual framework for organismal biology: linking theories, models, and data.
Zamer, William E; Scheiner, Samuel M
2014-11-01
Implicit or subconscious theory is especially common in the biological sciences. Yet, theory plays a variety of roles in scientific inquiry. First and foremost, it determines what does and does not count as a valid or interesting question or line of inquiry. Second, theory determines the background assumptions within which inquiries are pursued. Third, theory provides linkages among disciplines. For these reasons, it is important and useful to develop explicit theories for biology. A general theory of organisms is developed, which includes 10 fundamental principles that apply to all organisms, and 6 that apply to multicellular organisms only. The value of a general theory comes from its utility to help guide the development of more specific theories and models. That process is demonstrated by examining two domains: ecoimmunology and development. For the former, a constitutive theory of ecoimmunology is presented, and used to develop a specific model that explains energetic trade-offs that may result from an immunological response of a host to a pathogen. For the latter, some of the issues involved in trying to devise a constitutive theory that covers all of development are explored, and a more narrow theory of phenotypic novelty is presented. By its very nature, little of a theory of organisms will be new. Rather, the theory presented here is a formal expression of nearly two centuries of conceptual advances and practice in research. Any theory is dynamic and subject to debate and change. Such debate will occur as part of the present, initial formulation, as the ideas presented here are refined. The very process of debating the form of the theory acts to clarify thinking. The overarching goal is to stimulate debate about the role of theory in the study of organisms, and thereby advance our understanding of them. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology 2014. 
This work is written by US Government employees and is in the public domain in the US.
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available to guide an organization in estimating software development, and existing estimation models often underestimate software development efforts by as much as 500 to 600 percent. To address this issue, existing models are usually calibrated using local data with a small sample size, with the resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines the key contributing constructs that are unique to Software Size E-SLOC, Man-hours Spent, and Quality of the Product, these being the constructs with the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for customers and suppliers.
Theoretical contributions of this study include an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.
ERIC Educational Resources Information Center
Acevedo-Gil, Nancy; Solorzano, Daniel G.; Santos, Ryan E.
2014-01-01
This qualitative study examines the experiences of Latinas/os in community college English and math developmental education courses. Critical race theory in education and the theory of validation serve as guiding frameworks. The authors find that institutional agents provide academic validation by emphasizing high expectations, focusing on social…
ERIC Educational Resources Information Center
Acevedo-Gil, Nancy; Santos, Ryan E.; Alonso, LLuliana; Solorzano, Daniel G.
2015-01-01
This qualitative study examines the experiences of Latinas/os in community college English and math developmental education courses. Critical race theory in education and the theory of validation serve as guiding frameworks. The authors find that institutional agents provide academic validation by emphasizing high expectations, focusing on social…
NASA Astrophysics Data System (ADS)
Alloui, Mebarka; Belaidi, Salah; Othmani, Hasna; Jaidane, Nejm-Eddine; Hochlaf, Majdi
2018-03-01
We performed benchmark studies on the molecular geometry, electron properties and vibrational analysis of imidazole using semi-empirical, density functional theory and post Hartree-Fock methods. These studies validated the use of AM1 for the treatment of larger systems. Then, we treated the structural, physical and chemical relationships for a series of imidazole derivatives acting as angiotensin II AT1 receptor blockers using AM1. QSAR studies were done for these imidazole derivatives using a combination of various physicochemical descriptors. A multiple linear regression procedure was used to design the relationships between molecular descriptor and the activity of imidazole derivatives. Results validate the derived QSAR model.
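The multiple-linear-regression step of a QSAR study like the one above can be sketched generically. The descriptor matrix and activity values below are hypothetical placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical descriptor matrix (rows: compounds, columns: physicochemical descriptors)
X = np.array([[1.2, 0.3], [0.8, 0.9], [1.5, 0.1], [0.4, 1.2], [1.0, 0.5]])
y = np.array([2.1, 1.4, 2.6, 0.9, 1.8])  # hypothetical activities

# Fit activity ~ b0 + b1*x1 + b2*x2 by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, residuals, rank, _ = np.linalg.lstsq(A, y, rcond=None)
y_hat = A @ coef

# Coefficient of determination as a simple fit diagnostic.
ss_res = float(np.sum((y - y_hat) ** 2))
ss_tot = float(np.sum((y - y.mean()) ** 2))
r_squared = 1.0 - ss_res / ss_tot
```

In a real QSAR workflow the fitted model would then be validated on held-out compounds, as the authors do for their imidazole series.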
Experimental validation of a linear model for data reduction in chirp-pulse microwave CT.
Miyakawa, M; Orikasa, K; Bertero, M; Boccacci, P; Conte, F; Piana, M
2002-04-01
Chirp-pulse microwave computerized tomography (CP-MCT) is an imaging modality developed at the Department of Biocybernetics, University of Niigata (Niigata, Japan), which intends to reduce the microwave-tomography problem to an X-ray-like situation. We have recently shown that data acquisition in CP-MCT can be described in terms of a linear model derived from scattering theory. In this paper, we validate this model by showing that the theoretically computed response function is in good agreement with the one obtained from a regularized multiple deconvolution of three data sets measured with the prototype of CP-MCT. Furthermore, the reliability of the model as far as image restoration is concerned is tested, in the case of space-invariant conditions, by considering the reconstruction of simple on-axis cylindrical phantoms.
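Regularized deconvolution of the kind the abstract describes can be illustrated with a standard Tikhonov-regularized Fourier deconvolution. This is a generic sketch of the technique, not the CP-MCT processing chain, and the kernel and signal are made up:

```python
import numpy as np

def tikhonov_deconvolve(measured, kernel, alpha=1e-3):
    """Tikhonov-regularized deconvolution in the Fourier domain:
    given y = h * x (circular convolution), estimate
    X = conj(H) * Y / (|H|^2 + alpha), damping frequencies where H is small."""
    n = len(measured)
    H = np.fft.fft(kernel, n)
    Y = np.fft.fft(measured, n)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + alpha)
    return np.real(np.fft.ifft(X))

# Made-up test signal: blur an impulse with a short smoothing kernel, then recover it.
x_true = np.zeros(64)
x_true[10] = 1.0
h = np.array([0.25, 0.5, 0.25])
y = np.convolve(x_true, h)[:64]   # no wrap-around here, so this matches circular conv
x_hat = tikhonov_deconvolve(y, h)
```

The regularization constant alpha trades noise amplification against resolution; without it, division by near-zero values of H blows up.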
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
Dissipative time-dependent quantum transport theory.
Zhang, Yu; Yam, Chi Yung; Chen, GuanHua
2013-04-28
A dissipative time-dependent quantum transport theory is developed to treat the transient current through molecular or nanoscopic devices in the presence of electron-phonon interaction. The dissipation via phonons is taken into account by introducing a self-energy for the electron-phonon coupling in addition to the self-energy caused by the electrodes. Based on this, a numerical method is proposed. For practical implementation, the lowest-order expansion is employed for the weak electron-phonon coupling case, and the wide-band limit approximation is adopted for the coupling between the device and the electrodes. The corresponding hierarchical equation of motion is derived, which leads to an efficient and accurate time-dependent treatment of inelastic effects on transport for weak electron-phonon interaction. The resulting method is applied to a one-level model system and a gold wire described by a tight-binding model to demonstrate its validity and the importance of electron-phonon interaction for quantum transport. As it is based on the effective single-electron model, the method can be readily extended to time-dependent density functional theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Xinfang; White, Ralph E.; Huang, Kevin
With the assumption that the Fermi level (electrochemical potential of electrons) is uniform across the thickness of a mixed ionic and electronic conducting (MIEC) electrode, the charge-transport model in the electrode domain can be reduced to the modified Fick's first law, which includes a thermodynamic factor A. A transient numerical solution of the Nernst-Planck theory was obtained for a symmetric cell with MIEC electrodes to illustrate the validity of the assumption of a uniform Fermi level. Subsequently, an impedance numerical solution based on the modified Fick's first law is compared with that from the Nernst-Planck theory. The results show that the Nernst-Planck charge-transport model is essentially the same as the modified Fick's first law model as long as the MIEC electrodes have a predominant electronic conductivity. However, because the uniform Fermi level assumption is invalid for a MIEC electrolyte with a predominant ionic conductivity, the Nernst-Planck theory is needed to describe the charge-transport behavior in that case.
NASA Astrophysics Data System (ADS)
Melendez, Jordan; Wesolowski, Sarah; Furnstahl, Dick
2017-09-01
Chiral effective field theory (EFT) predictions are necessarily truncated at some order in the EFT expansion, which induces an error that must be quantified for robust statistical comparisons to experiment. A Bayesian model yields posterior probability distribution functions for these errors based on expectations of naturalness encoded in Bayesian priors and the observed order-by-order convergence pattern of the EFT. As a general example of a statistical approach to truncation errors, the model was applied to chiral EFT for neutron-proton scattering using various semi-local potentials of Epelbaum, Krebs, and Meißner (EKM). Here we discuss how our model can learn correlation information from the data and how to perform Bayesian model checking to validate that the EFT is working as advertised. Supported in part by NSF PHY-1614460 and DOE NUCLEI SciDAC DE-SC0008533.
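The idea of estimating a truncation error from an observed order-by-order convergence pattern can be sketched very simply. This is a naive rms-based caricature of the Bayesian model described above, with hypothetical coefficients and expansion parameter:

```python
import math

def truncation_error_estimate(coeffs, Q):
    """Naive truncation-error estimate for an EFT expansion sum_n c_n * Q**n:
    assume 'naturalness' (the dimensionless c_n share a common size c_bar),
    estimate c_bar by the rms of the observed coefficients, and take the
    first omitted term, c_bar * Q**k, as the error scale."""
    c_bar = math.sqrt(sum(c * c for c in coeffs) / len(coeffs))
    k = len(coeffs)  # order of the first omitted term (hypothetical counting)
    return c_bar * Q ** k

# Hypothetical natural-sized coefficients and expansion parameter Q ~ 0.3.
err = truncation_error_estimate([1.0, -0.8, 1.2], Q=0.3)
```

A full Bayesian treatment, as in the work above, replaces this rms point estimate with a posterior for the coefficient scale and propagates it to a distribution for the omitted terms.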
Cultural Consensus Theory: Aggregating Continuous Responses in a Finite Interval
NASA Astrophysics Data System (ADS)
Batchelder, William H.; Strashny, Alex; Romney, A. Kimball
Cultural consensus theory (CCT) consists of cognitive models for aggregating responses of "informants" to test items about some domain of their shared cultural knowledge. This paper develops a CCT model for items requiring bounded numerical responses, e.g. probability estimates, confidence judgments, or similarity judgments. The model assumes that each item generates a latent random representation in each informant, with mean equal to the consensus answer and variance depending jointly on the informant and the location of the consensus answer. The manifest responses may reflect biases of the informants. Markov Chain Monte Carlo (MCMC) methods were used to estimate the model, and simulation studies validated the approach. The model was applied to an existing cross-cultural dataset involving native Japanese and English speakers judging the similarity of emotion terms. The results sharpened earlier studies that showed that both cultures appear to have very similar cognitive representations of emotion terms.
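The aggregation idea behind CCT (consensus answer plus informant-dependent noise) can be illustrated with a precision-weighted average. This is a deliberate simplification of the paper's MCMC model, with made-up responses:

```python
def estimate_consensus(responses):
    """CCT-style aggregation sketch: model each informant's answer as
    consensus + informant-specific noise, and estimate the consensus by
    weighting informants inversely to their residual variance."""
    n_inf = len(responses)
    n_items = len(responses[0])
    # Provisional consensus: plain item means across informants.
    consensus = [sum(r[k] for r in responses) / n_inf for k in range(n_items)]
    # Informant weights: inverse variance of residuals around that consensus.
    weights = []
    for r in responses:
        var = sum((r[k] - consensus[k]) ** 2 for k in range(n_items)) / n_items
        weights.append(1.0 / (var + 1e-6))
    total = sum(weights)
    return [sum(w * r[k] for w, r in zip(weights, responses)) / total
            for k in range(n_items)]

# Made-up data: three consistent informants and one unreliable one.
responses = [[0.21, 0.49, 0.80],
             [0.19, 0.51, 0.79],
             [0.20, 0.50, 0.81],
             [0.90, 0.10, 0.30]]
est = estimate_consensus(responses)
```

The unreliable informant is down-weighted, so the estimate stays close to the answers the consistent informants agree on.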
Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.
Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo
2018-01-01
This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as the outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. This study contributed to the literature by showing that: A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) both the health impairment and motivational processes were associated with work ability, and the results suggested that leadership may impact work ability, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for the transferral of complex survey results and work-life theories to practitioners in the field.
Final Scientific/Technical Report: Correlations and Fluctuations in Weakly Collisional Plasma
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skiff, Frederick
Plasma is a state of matter that exhibits a very rich range of phenomena. To begin with, plasma is both electrical and mechanical, bringing together theories of particle motion and the electromagnetic field. Furthermore, and especially important for this project, a weakly-collisional plasma, such as is found in high-temperature (fusion energy) experiments on earth and the majority of contexts in space and astrophysics, has many moving parts. For example, sitting in earth's atmosphere we are immersed in a mechanical wave field (sound), a possibly turbulent fluid motion (wind), and an electromagnetic vector wave field with two polarizations (light). This is already enough to produce a rich range of possibilities. In plasma, the electromagnetic field is coupled to the mechanical motion of the medium because it is ionized. Furthermore, a weakly-collisional plasma supports an infinite number of mechanically independent fluids. Thus, plasmas support an infinite number of independent electromechanical waves. Much has been done to describe plasmas with "reduced models" of various kinds. The goal of this project was both to explore the validity of reduced plasma models that are in use and to propose and validate new models of plasma motion. The primary means to this end was laboratory experiments employing both electrical probes and laser spectroscopy. Laser spectroscopy enables many techniques which can separate the spectrum of independent fluid motions in the ion phase-space. The choice was to focus on low frequency electrostatic waves because the electron motion is relatively simple, the experiments can be on a spatial scale of a few meters, and all the relevant parameters can be measured with a few laser systems. No study of this kind had previously been undertaken for the study of plasmas. The validation of theories required that the experimental descriptions be compared with theory and simulation in detail.
It was found that even multi-fluid theories leave out a large part of the complexity of plasma motion. Reduced descriptions were found to fail under most circumstances. A new technique was developed that enabled a measurement of the phase-space resolved ion correlation function for the first time. The wide range of plasma dynamics possible became clear through this technique. It was found that collisionless (Vlasov) theory has a large field of application even when the plasma is weakly-collisional. A new approach, the kinetic wave expansion, was proposed, tested and found to be very useful for describing electrostatic ion waves. This project demonstrated a new way of looking at the "degrees-of-freedom" of plasmas and provided significant validation tests of fluid and kinetic plasma descriptions.
Sun, Peishi; Huang, Bing; Huang, Ruohua; Yang, Ping
2002-05-01
For the process of biopurifying waste gas containing low concentrations of VOCs in a biological trickling filter, the related kinetic model and simulation of the new adsorption-biofilm theory were investigated in this study. Using both laboratory and industrial test data, the results of comparison and validation indicated that the model has good applicability for describing the practical bio-purification of VOC waste gas. In the simulation study of the main factors, such as the toluene concentration in the inlet gas, the gas flow rate, and the height of the biofilm packing, good agreement was shown between calculated and test data, with correlation coefficients of 0.80-0.97.
Dynamics of DNA/intercalator complexes
NASA Astrophysics Data System (ADS)
Schurr, J. M.; Wu, Pengguang; Fujimoto, Bryant S.
1990-05-01
Complexes of linear and supercoiled DNAs with different intercalating dyes are studied by time-resolved fluorescence polarization anisotropy using intercalated ethidium as the probe. Existing theory is generalized to take account of excitation transfer between intercalated ethidiums, and Forster theory is shown to be valid in this context. The effects of intercalated ethidium, 9-aminoacridine, and proflavine on the torsional rigidity of linear and supercoiled DNAs are studied up to rather high binding ratios. Evidence is presented that metastable secondary structure persists in dye-relaxed supercoiled DNAs, which contradicts the standard model of supercoiled DNAs.
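The Förster theory invoked above relates the excitation-transfer rate to donor-acceptor distance through a characteristic radius R0. A minimal sketch of the standard textbook relations (not the paper's generalized treatment; the distances are hypothetical and unitless):

```python
def forster_rate(r, r0, tau_d):
    """Foerster energy-transfer rate: k_T = (1 / tau_d) * (R0 / r)**6, where
    tau_d is the donor excited-state lifetime and R0 is the distance at which
    transfer and donor decay are equally likely."""
    return (r0 / r) ** 6 / tau_d

def transfer_efficiency(r, r0):
    """Transfer efficiency E = k_T / (k_T + 1/tau_d) = 1 / (1 + (r / R0)**6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# Hypothetical distances in the same (arbitrary) units as R0.
e_close = transfer_efficiency(r=2.0, r0=5.0)   # well inside R0: near-certain transfer
e_far = transfer_efficiency(r=10.0, r0=5.0)    # well outside R0: transfer unlikely
```

The steep r**-6 dependence is why transfer between nearby intercalated dyes must be accounted for before interpreting the anisotropy decay.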
2014-01-01
Background Research has shown that nursing students find it difficult to translate and apply their theoretical knowledge in a clinical context. Virtual patients (VPs) have been proposed as a learning activity that can support nursing students in their learning of scientific knowledge and help them integrate theory and practice. Although VPs are increasingly used in health care education, they still lack a systematic consistency that would allow their reuse outside of their original context. There is therefore a need to develop a model for the development and implementation of VPs in nursing education. Objective The aim of this study was to develop and evaluate a virtual patient model optimized to the learning and assessment needs in nursing education. Methods The process of modeling started by reviewing theoretical frameworks reported in the literature and used by practitioners when designing learning and assessment activities. The Outcome-Present State Test (OPT) model was chosen as the theoretical framework. The model was then, in an iterative manner, developed and optimized to the affordances of virtual patients. Content validation was performed with faculty both in terms of the relevance of the chosen theories but also its applicability in nursing education. The virtual patient nursing model was then instantiated in two VPs. The students’ perceived usefulness of the VPs was investigated using a questionnaire. The result was analyzed using descriptive statistics. Results A virtual patient Nursing Design Model (vpNDM) composed of three layers was developed. Layer 1 contains the patient story and ways of interacting with the data, Layer 2 includes aspects of the iterative process of clinical reasoning, and finally Layer 3 includes measurable outcomes. A virtual patient Nursing Activity Model (vpNAM) was also developed as a guide when creating VP-centric learning activities. 
The students perceived the global linear VPs as a relevant learning activity for the integration of theory and practice. Conclusions Virtual patients that are adapted to the nursing paradigm can support nursing students’ development of clinical reasoning skills. The proposed virtual patient nursing design and activity models will allow the systematic development of different types of virtual patients from a common model and thereby create opportunities for sharing pedagogical designs across technical solutions. PMID:24727709
The pros and cons of code validation
NASA Technical Reports Server (NTRS)
Bobbitt, Percy J.
1988-01-01
Computational and wind tunnel error sources are examined and quantified using specific calculations of experimental data, and a substantial comparison of theoretical and experimental results, or a code validation, is discussed. Wind tunnel error sources considered include wall interference, sting effects, Reynolds number effects, flow quality and transition, and instrumentation such as strain gage balances, electronically scanned pressure systems, hot film gages, hot wire anemometers, and laser velocimeters. Computational error sources include math model equation sets, the solution algorithm, artificial viscosity/dissipation, boundary conditions, the uniqueness of solutions, grid resolution, turbulence modeling, and Reynolds number effects. It is concluded that, although improvements in theory are being made more quickly than in experiments, wind tunnel research retains the advantage that the transition process in a free-transition test is more realistic than that imposed by a turbulence model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juanes, Ruben
The overarching goal of this project was to develop a new continuum theory of multiphase flow in porous media. The theory follows a phase-field modeling approach, and therefore has a sound thermodynamical basis. It is a phenomenological theory in the sense that its formulation is driven by macroscopic phenomena, such as viscous instabilities during multifluid displacement. The research agenda was organized around a set of hypotheses on hitherto unexplained behavior of multiphase flow. All these hypotheses are nontrivial, and testable. Indeed, a central aspect of the project was testing each hypothesis by means of carefully-designed laboratory experiments, thereby probing the validity of the proposed theory. The proposed research places an emphasis on the fundamentals of flow physics, but is motivated by important energy-driven applications in earth sciences, as well as microfluidic technology.
Predicting fire frequency with chemistry and climate
Richard P. Guyette; Michael C. Stambaugh; Daniel C. Dey; Rose-Marie Muzika
2012-01-01
A predictive equation for estimating fire frequency was developed from theories and data in physical chemistry, ecosystem ecology, and climatology. We refer to this equation as the Physical Chemistry Fire Frequency Model (PC2FM). The equation was calibrated and validated with North American fire data (170 sites) prior to widespread industrial influences (before ...
ERIC Educational Resources Information Center
Colligan, Robert C.; And Others
1994-01-01
Developed bipolar Minnesota Multiphasic Personality Inventory (MMPI) Optimism-Pessimism (PSM) scale based on results on Content Analysis of Verbatim Explanation applied to MMPI. Reliability and validity indices show that PSM scale is highly accurate and consistent with Seligman's theory that pessimistic explanatory style predicts increased…
ERIC Educational Resources Information Center
Gabbard, Clinton E.; And Others
1986-01-01
Adaptive Counseling and Therapy (ACT) is an integrative, metatheoretical model for selecting an appropriate therapeutic style based on the task-relevant development maturity of the client. The Counselor Behavior Analysis (CBA) Scale measures the central explanatory construct of ACT theory: counselor adaptability. Three studies designed to assess…
ERIC Educational Resources Information Center
Fountain, Lily
2011-01-01
This cross-sectional descriptive study of the Model of Domain Learning, which describes learners' progress from acclimation through competence to proficiency through the interplay of knowledge, interest and strategic processing/critical thinking (CT), examined its extension to maternity nursing. Based on the identified need for valid, reliable…
Development and Validation of a Computer Adaptive EFL Test
ERIC Educational Resources Information Center
He, Lianzhen; Min, Shangchao
2017-01-01
The first aim of this study was to develop a computer adaptive EFL test (CALT) that assesses test takers' listening and reading proficiency in English with dichotomous items and polytomous testlets. We reported in detail on the development of the CALT, including item banking, determination of suitable item response theory (IRT) models for item…
The Subjective Well-Being Construct: A Test of Its Convergent, Discriminant, and Factorial Validity
ERIC Educational Resources Information Center
Arthaud-day, Marne L.; Rode, Joseph C.; Mooney, Christine H.; Near, Janet P.
2005-01-01
Using structural equation modeling, we found empirical support for the prevailing theory that subjective well-being consists of three domains: (1) cognitive evaluations of one's life (i.e., life satisfaction or happiness); (2) positive affect; and (3) negative affect. Multiple indicators of satisfaction/happiness were shown to have strong…
Answering the call: a tool that measures functional breast cancer literacy.
Williams, Karen Patricia; Templin, Thomas N; Hines, Resche D
2013-01-01
There is a need for health care providers and health care educators to ensure that the messages they communicate are understood. The purpose of this research was to test the reliability and validity, in a culturally diverse sample of women, of a revised Breast Cancer Literacy Assessment Tool (Breast-CLAT) designed to measure functional understanding of breast cancer in English, Spanish, and Arabic. Community health workers verbally administered the 35-item Breast-CLAT to 543 Black, Latina, and Arab American women. A confirmatory factor analysis using a 2-parameter item response theory model was used to test the proposed 3-factor Breast-CLAT (awareness, screening and knowledge, and prevention and control). The confirmatory factor analysis using a 2-parameter item response theory model had a good fit (TLI = .91, RMSEA = .04) to the proposed 3-factor structure. The total scale reliability ranged from .80 for Black participants to .73 for total culturally diverse sample. The three subscales were differentially predictive of family history of cancer. The revised Breast-CLAT scales demonstrated internal consistency reliability and validity in this multiethnic, community-based sample.
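The 2-parameter item response theory model used in the confirmatory analysis above has a compact closed form. A minimal sketch, with hypothetical ability and item parameters:

```python
import math

def two_pl_prob(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that a respondent
    with ability theta answers correctly an item with discrimination a and
    difficulty b: P = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items: same difficulty, different discrimination.
p_flat = two_pl_prob(theta=1.0, a=0.5, b=0.0)
p_steep = two_pl_prob(theta=1.0, a=2.0, b=0.0)
```

A larger discrimination a makes the item response curve steeper around b, so the item separates respondents near that ability level more sharply; this is what distinguishes the 2PL from the Rasch (one-parameter) model.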
The refined Swampland Distance Conjecture in Calabi-Yau moduli spaces
NASA Astrophysics Data System (ADS)
Blumenhagen, Ralph; Klaewer, Daniel; Schlechter, Lorenz; Wolf, Florian
2018-06-01
The Swampland Distance Conjecture claims that effective theories derived from a consistent theory of quantum gravity have only a finite range of validity. This would imply drastic consequences for string theory model building. The refined version of this conjecture says that this range is of the order of the naturally built-in scale, namely the Planck scale. It is investigated whether the Refined Swampland Distance Conjecture is consistent with proper field distances arising in the well-understood moduli spaces of Calabi-Yau compactifications. Investigating in particular the non-geometric phases of Kähler moduli spaces of dimension h^{1,1} ∈ {1, 2, 101}, we always find proper field distances that are smaller than the Planck length.
Interfacial Ordering and Accompanying Divergent Capacitance at Ionic Liquid-Metal Interfaces.
Limmer, David T
2015-12-18
A theory is constructed for dense ionic solutions near charged planar walls that is valid for strong interionic correlations. This theory predicts a fluctuation-induced, first-order transition and spontaneous charge density ordering at the interface, in the presence of an otherwise disordered bulk solution. The surface ordering is driven by applied voltage and results in an anomalous differential capacitance, in agreement with recent simulation results and consistent with experimental observations of a wide array of systems. Explicit forms for the charge density profile and capacitance are given. The theory is compared with numerical results for the charge frustrated Ising model, which is also found to exhibit a voltage driven first-order transition.
Pressure and wall shear stress in blood hammer - Analytical theory.
Mei, Chiang C; Jing, Haixiao
2016-10-01
We describe an analytical theory of blood hammer in a long and stiffened artery due to sudden blockage. Based on the model of a viscous fluid in laminar flow, we derive explicit expressions of oscillatory pressure and wall shear stress. To examine the effects on local plaque formation we also allow the blood vessel radius to be slightly nonuniform. Without resorting to discrete computation, the asymptotic method of multiple scales is utilized to deal with the sharp contrast of time scales. The effects of plaque and blocking time on blood pressure and wall shear stress are studied. The theory is validated by comparison with existing water hammer experiments. Copyright © 2016. Published by Elsevier Inc.
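The paper's analytical theory is specific to viscous laminar flow in a stiffened artery. For orientation only, the classical inviscid water-hammer estimate of the peak pressure rise after sudden blockage is the Joukowsky relation; the sketch below uses hypothetical blood-like parameter values and is not the authors' model.

```python
def joukowsky_surge(rho, wave_speed, delta_v):
    """Classical Joukowsky water-hammer estimate: the pressure rise caused by
    a sudden velocity change delta_v is rho * c * delta_v, where c is the
    pressure-wave speed in the vessel."""
    return rho * wave_speed * delta_v

# Hypothetical values: blood density ~1060 kg/m^3, arterial pulse-wave
# speed ~10 m/s, arrested flow velocity ~0.5 m/s.
dp_pa = joukowsky_surge(1060.0, 10.0, 0.5)  # pressure rise in pascals
print(dp_pa)            # 5300.0
print(dp_pa / 133.322)  # the same rise expressed in mmHg (roughly 40)
```

The authors' viscous theory refines this kind of estimate with oscillatory pressure and wall shear stress profiles that an inviscid formula cannot capture.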
Theory of the Bloch oscillating transistor
NASA Astrophysics Data System (ADS)
Hassel, J.; Seppä, H.
2005-01-01
The Bloch oscillating transistor (BOT) is a device in which single electron current through a normal tunnel junction enhances Cooper pair current in a mesoscopic Josephson junction, leading to signal amplification. In this article we develop a theory in which the BOT dynamics is described as a two-level system. The theory is used to predict current-voltage characteristics and small-signal response. The transition from stable operation into the hysteretic regime is studied. By identifying the two-level switching noise as the main source of fluctuations, the expressions for equivalent noise sources and the noise temperature are derived. The validity of the model is tested by comparing the results with simulations and experiments.
An intelligent knowledge mining model for kidney cancer using rough set theory.
Durai, M A Saleem; Acharjya, D P; Kannan, A; Iyengar, N Ch Sriman Narayana
2012-01-01
Medical diagnosis processes vary in the degree to which they attempt to deal with different complicating aspects of diagnosis, such as the relative importance of symptoms, varied symptom patterns and the relations between diseases themselves. The rough set approach has two major advantages over other methods. First, it can handle different types of data, such as categorical and numerical data. Second, it does not make any assumption such as a probability distribution function in stochastic modelling or a membership grade function in fuzzy set theory. It involves pattern recognition through logical computational rules rather than approximating them through smooth mathematical functional forms. In this paper we use rough set theory as a data mining tool to derive useful patterns and rules for kidney cancer diagnosis. In particular, historical data from twenty-five research hospitals and medical colleges are used for validation, and the results show the practical viability of the proposed approach.
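The core rough-set construction, lower and upper approximations of a target concept with respect to equivalence classes of indiscernible cases, can be sketched in a few lines. The toy data below are hypothetical and are not the hospital records used in the study.

```python
def rough_approximations(equiv_classes, target):
    """Rough-set approximations of `target` (a set of cases) with respect to a
    partition of the universe into equivalence classes of indiscernible cases.
    Lower approximation: cases certainly in the target (whole class inside it).
    Upper approximation: cases possibly in the target (class overlaps it)."""
    lower, upper = set(), set()
    for cls in equiv_classes:
        if cls <= target:
            lower |= cls
        if cls & target:
            upper |= cls
    return lower, upper

# Hypothetical patients grouped into classes with identical symptom patterns
classes = [{"p1", "p2"}, {"p3", "p4"}, {"p5"}]
diagnosed = {"p1", "p2", "p3"}  # target concept: confirmed diagnosis
low, up = rough_approximations(classes, diagnosed)
print(sorted(low))  # ['p1', 'p2']: certainly diagnosable from symptoms alone
print(sorted(up))   # ['p1', 'p2', 'p3', 'p4']: possibly diagnosable
```

Decision rules are then extracted from the boundary between the two approximations, which is what makes the method assumption-free in the sense described above.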
Event reweighting with the NuWro neutrino interaction generator
NASA Astrophysics Data System (ADS)
Pickering, Luke; Stowell, Patrick; Sobczyk, Jan
2017-09-01
Event reweighting has been implemented in the NuWro neutrino event generator for a number of free theory parameters in the interaction model. Event reweighting is a key analysis technique, used to efficiently study the effect of neutrino interaction model uncertainties. This opens up the possibility for NuWro to be used as a primary event generator by experimental analysis groups. A preliminary model tuning to ANL and BNL data of quasi-elastic and single pion production events was performed to validate the reweighting engine.
Ischemic stroke assessment with near-infrared spectroscopy
NASA Astrophysics Data System (ADS)
Chen, Weiguo; Li, Pengcheng; Zeng, Shaoqun; Luo, Qingming; Hu, Bo
1999-09-01
Many authors have elucidated the theory of oxygenated and deoxygenated hemoglobin absorption in the near-infrared spectrum, and this theory has opened a window for measuring the hemodynamic changes caused by stroke. However, no proper animal model had yet been established to confirm the theory. The aim of this study was to validate near-infrared cerebral topography (NCT) as a practical tool and to trace the focal hemodynamic changes of ischemic stroke. In the present study, a middle cerebral artery occlusion model and a photosensitizer-induced intracranial infarct model were established. NCT and functional magnetic resonance images (fMRI) were obtained pre- and post-operation. The geometric shape and infarct area of the NCT image were compared with the fMRI images and anatomical samples of each rat. The results of the two occlusion models under different intervening factors showed that the NCT infarct focus matched well with the fMRI and the anatomical sample of each rat. The instrument might become a practical tool for short-term prediction of stroke and for predicting rehabilitation after stroke in real time.
A model to evaluate quality and effectiveness of disease management.
Lemmens, K M M; Nieboer, A P; van Schayck, C P; Asin, J D; Huijsman, R
2008-12-01
Disease management has emerged as a new strategy to enhance quality of care for patients suffering from chronic conditions, and to control healthcare costs. So far, however, the effects of this strategy remain unclear. Although current models define the concept of disease management, they do not provide a systematic development or an explanatory theory of how disease management affects the outcomes of care. The objective of this paper is to present a framework for valid evaluation of disease-management initiatives. The evaluation model is built on two pillars of disease management: patient-related and professional-directed interventions. The effectiveness of these interventions is thought to be affected by the organisational design of the healthcare system. Disease management requires a multifaceted approach; hence disease-management programme evaluations should focus on the effects of multiple interventions, namely patient-related, professional-directed and organisational interventions. The framework has been built upon the conceptualisation of these disease-management interventions. Analysis of the underlying mechanisms of these interventions revealed that learning and behavioural theories support the core assumptions of disease management. The evaluation model can be used to identify the components of disease-management programmes and the mechanisms behind them, making valid comparison feasible. In addition, this model links the programme interventions to indicators that can be used to evaluate the disease-management programme. Consistent use of this framework will enable comparisons among disease-management programmes and outcomes in evaluation research.
Angelis, Alessia De; Pancani, Luca; Steca, Patrizia; Colaceci, Sofia; Giusti, Angela; Tibaldi, Laura; Alvaro, Rosaria; Ausili, Davide; Vellone, Ercole
2017-05-01
To test an explanatory model of nurses' intention to report adverse drug reactions in hospital settings, based on the theory of planned behaviour. Under-reporting of adverse drug reactions is an important problem among nurses. A cross-sectional design was used. Data were collected with the adverse drug reporting nurses' questionnaire. Confirmatory factor analysis was performed to test the factor validity of the adverse drug reporting nurses' questionnaire, and structural equation modelling was used to test the explanatory model. The convenience sample comprised 500 Italian hospital nurses (mean age = 43.52). Confirmatory factor analysis supported the factor validity of the adverse drug reporting nurses' questionnaire. The structural equation modelling showed a good fit with the data. Nurses' intention to report adverse drug reactions was significantly predicted by attitudes, subjective norms and perceived behavioural control (R² = 0.16). The theory of planned behaviour effectively explained the mechanisms behind nurses' intention to report adverse drug reactions, showing how several factors come into play. In a scenario of organisational empowerment towards adverse drug reaction reporting, the major predictors of the intention to report are support for the decision to report adverse drug reactions from other health care practitioners, perceptions about the value of adverse drug reaction reporting and nurses' favourable self-assessment of their adverse drug reaction reporting skills. © 2017 John Wiley & Sons Ltd.
De Vito, Francesca; Veytsman, Boris; Painter, Paul; Kokini, Jozef L
2015-03-06
Carbohydrates exhibit either van der Waals and ionic interactions or strong hydrogen bonding interactions. The prominence and large number of hydrogen bonds result in major contributions to phase behavior, so a thermodynamic framework that accounts for hydrogen bonding interactions is necessary. We have developed an extension of the thermodynamic model based on the Veytsman association theory to predict the contribution of hydrogen bonds to the behavior of glucose-water and dextran-water systems, and we have calculated the free energy of mixing and its derivative, leading to the chemical potential and water activity. We compared our calculations with experimental water activity data for glucose and dextran and found excellent agreement, far superior to the Flory-Huggins theory. The validation of our calculations against experimental data demonstrated the validity of the Veytsman model in properly accounting for the hydrogen bonding interactions and successfully predicting the water activity of glucose and dextran. Our calculations of the concentration of hydrogen bonds using the Veytsman model were instrumental in explaining the difference between glucose and dextran and the role that hydrogen bonds play in contributing to these differences. The miscibility predictions showed that the Veytsman model is also able to correctly describe the phase behavior of glucose and dextran. Copyright © 2014 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Lievens, Filip; Chasteen, Christopher S.; Day, Eric Anthony; Christiansen, Neil D.
2006-01-01
This study used trait activation theory as a theoretical framework to conduct a large-scale test of the interactionist explanation of the convergent and discriminant validity findings obtained in assessment centers. Trait activation theory specifies the conditions in which cross-situationally consistent and inconsistent candidate performances are…
The Validation by Measurement Theory of Proposed Object-Oriented Software Metrics
NASA Technical Reports Server (NTRS)
Neal, Ralph D.
1996-01-01
Moving software development into the engineering arena requires controllability, and to control a process, it must be measurable. Measuring the process does no good if the product is not also measured, i.e., being the best at producing an inferior product does not define a quality process. Also, not every number extracted from software development is a valid measurement. A valid measurement only results when we are able to verify that the number is representative of the attribute that we wish to measure. Many proposed software metrics are used by practitioners without these metrics ever having been validated, leading to costly but often useless calculations. Several researchers have bemoaned the lack of scientific precision in much of the published software measurement work and have called for validation of software metrics by measurement theory. This dissertation applies measurement theory to validate fifty proposed object-oriented software metrics.
Exploring the reliability and validity of the social-moral awareness test.
Livesey, Alexandra; Dodd, Karen; Pote, Helen; Marlow, Elizabeth
2012-11-01
The aim of the study was to explore the validity of the social-moral awareness test (SMAT) a measure designed for assessing socio-moral rule knowledge and reasoning in people with learning disabilities. Comparisons between Theory of Mind and socio-moral reasoning allowed the exploration of construct validity of the tool. Factor structure, reliability and discriminant validity were also assessed. Seventy-one participants with mild-moderate learning disabilities completed the two scales of the SMAT and two False Belief Tasks for Theory of Mind. Reliability of the SMAT was very good, and the scales were shown to be uni-dimensional in factor structure. There was a significant positive relationship between Theory of Mind and both SMAT scales. There is early evidence of the construct validity and reliability of the SMAT. Further assessment of the validity of the SMAT will be required. © 2012 Blackwell Publishing Ltd.
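Scale reliability of the kind reported for the SMAT is conventionally indexed by Cronbach's alpha. A minimal pure-Python sketch follows, with made-up item scores rather than the SMAT data.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a multi-item scale.
    item_scores: one list per item, each holding that item's score for every
    respondent (all lists the same length)."""
    k = len(item_scores)

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent sums
    item_var_sum = sum(sample_var(item) for item in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))

# Two perfectly consistent items give alpha = 1.0
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

Values around .7 or higher are the usual benchmark for "very good" internal consistency in measures like the one described above.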
Assessing Construct Validity Using Multidimensional Item Response Theory.
ERIC Educational Resources Information Center
Ackerman, Terry A.
The concept of a user-specified validity sector is discussed. The idea of the validity sector combines the work of M. D. Reckase (1986) and R. Shealy and W. Stout (1991). Reckase developed a methodology to represent an item in a multidimensional latent space as a vector. Item vectors are computed using multidimensional item response theory item…
Jiang, Li; Tetrick, Lois E
2016-09-01
The present study introduced a preliminary measure of employee safety motivation based on the self-determination theory definition used in Fleming's (2012) research, and validated the structure of self-determined safety motivation (SDSM) by surveying 375 employees in a Chinese high-risk organization. First, confirmatory factor analysis (CFA) was used to examine the factor structure of SDSM, and the fit indices of the five-factor CFA model met conventional requirements. Second, a nomological network was examined to provide evidence of the construct validity of SDSM. Beyond construct validity, the analysis also produced some interesting results concerning the relationships between leadership antecedents and safety motivation, and between safety motivation and safety behavior. Autonomous motivation was positively related to transformational leadership, negatively related to abusive supervision, and positively related to safety behavior. Controlled motivation, with the exception of introjected regulation, was negatively related to transformational leadership, positively related to abusive supervision, and negatively related to safety behavior. The unique role of introjected regulation and future research based on self-determination theory were discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Nguyen, Huong Thi Thu; Kitaoka, Kazuyo; Sukigara, Masune; Thai, Anh Lan
2018-03-01
This study aimed to create a Vietnamese version of both the Maslach Burnout Inventory-General Survey (MBI-GS) and the Areas of Worklife Scale (AWS) to assess the burnout state of Vietnamese clinical nurses and to develop a causal model of burnout among clinical nurses. We conducted a descriptive design using a cross-sectional survey. The questionnaire was hand-delivered by nursing departments to 500 clinical nurses in three hospitals. The Vietnamese MBI-GS and AWS were then examined for reliability and validity. We used the revised exhaustion +1 burnout classification to assess burnout state. We performed path analysis to develop a Vietnamese causal model based on the original model from Leiter and Maslach's theory. We found that both scales were reliable and valid for assessing burnout. Among the participating nurses, the rate of severe burnout was 0.7%, the rate of burnout was 15.8%, and 17.2% of nurses were exhausted. The best predictor of burnout was the "on-duty work schedule", under which clinical nurses have to work for 24 hours. In the causal model, we also found pathways both similar to and different from the original model. The Vietnamese MBI-GS and AWS were applicable to research on occupational stress. Nearly one-fifth of Vietnamese clinical nurses were working in a burnout state. The causal model suggested a range of factors resulting in burnout, and it is necessary to consider specific solutions to prevent the burnout problem. Copyright © 2018. Published by Elsevier B.V.
Validation of the kinetic-turbulent-neoclassical theory for edge intrinsic rotation in DIII-D
NASA Astrophysics Data System (ADS)
Ashourvan, Arash; Grierson, B. A.; Battaglia, D. J.; Haskey, S. R.; Stoltzfus-Dueck, T.
2018-05-01
In a recent kinetic model of edge main-ion (deuterium) toroidal velocity, intrinsic rotation results from neoclassical orbits in an inhomogeneous turbulent field [T. Stoltzfus-Dueck, Phys. Rev. Lett. 108, 065002 (2012)]. This model predicts a value for the toroidal velocity that is co-current for a typical inboard X-point plasma at the core-edge boundary (ρ ˜ 0.9). Using this model, the velocity prediction is tested on the DIII-D tokamak for a database of L-mode and H-mode plasmas with nominally low neutral beam torque, including both signs of plasma current. Values for the flux-surface-averaged main-ion rotation velocity in the database are obtained from the impurity carbon rotation by analytically calculating the main-ion—impurity neoclassical offset. The deuterium rotation obtained in this manner has been validated by direct main-ion measurements for a limited number of cases. Key theoretical parameters of ion temperature and turbulent scale length are varied across a wide range in an experimental database of discharges. Using a characteristic electron temperature scale length as a proxy for a turbulent scale length, the predicted main-ion rotation velocity has a general agreement with the experimental measurements for neutral beam injection (NBI) powers in the range PNBI < 4 MW. At higher NBI power, the experimental rotation is observed to saturate and even degrade compared to theory. TRANSP-NUBEAM simulations performed for the database show that for discharges with nominally balanced—but high powered—NBI, the net injected torque through the edge can exceed 1 Nm in the counter-current direction. The theory model has been extended to compute the rotation degradation from this counter-current NBI torque by solving a reduced momentum evolution equation for the edge and found the revised velocity prediction to be in agreement with experiment. 
Using the theory modeled—and now tested—velocity to predict the bulk plasma rotation opens up a path to more confidently projecting the confinement and stability in ITER.
Beliefs about language development: construct validity evidence.
Donahue, Mavis L; Fu, Qiong; Smith, Everett V
2012-01-01
Understanding language development is incomplete without recognizing children's sociocultural environments, including adult beliefs about language development. Yet there is a need for data supporting valid inferences to assess these beliefs. The current study investigated the psychometric properties of data from a survey (MODeL) designed to explore beliefs in the popular culture, and their alignment with more formal theories. Support for the content, substantive, structural, generalizability, and external aspects of construct validity of the data were investigated. Subscales representing Behaviorist, Cognitive, Nativist, and Sociolinguistic models were identified as dimensions of beliefs. More than half of the items showed a high degree of consensus, suggesting culturally-transmitted beliefs. Behaviorist ideas were most popular. Bilingualism and ethnicity were related to Cognitive and Sociolinguistic beliefs. Identifying these beliefs may clarify the nature of child-directed speech, and enable the design of language intervention programs that are congruent with family and cultural expectations.
Numerical Modeling of Turbulence Effects within an Evaporating Droplet in Atomizing Sprays
NASA Technical Reports Server (NTRS)
Balasubramanyam, M. S.; Chen, C. P.; Trinh, H. P.
2006-01-01
A new approach to account for finite thermal conductivity and turbulence effects within atomizing liquid sprays is presented in this paper. The model is an extension of the T-blob and T-TAB atomization/spray model of Trinh and Chen (2005). This finite conductivity model is based on the two-temperature film theory, where the turbulence characteristics of the droplet are used to estimate the effective thermal diffusivity within the droplet phase. Both one-way and two-way coupled calculations were performed to investigate the performance of this model. The current evaporation model is incorporated into the T-blob atomization model of Trinh and Chen (2005) and implemented in an existing CFD Eulerian-Lagrangian two-way coupling numerical scheme. Validation studies were carried out by comparing with available evaporating atomization spray experimental data in terms of jet penetration, temperature field, and droplet SMD distribution within the spray. Validation results indicate the superiority of the finite-conductivity model in low-speed parallel-flow evaporating sprays.
Anderson, P. S. L.; Rayfield, E. J.
2012-01-01
Computational models such as finite-element analysis offer biologists a means of exploring the structural mechanics of biological systems that cannot be directly observed. Validated against experimental data, a model can be manipulated to perform virtual experiments, testing variables that are hard to control in physical experiments. The relationship between tooth form and the ability to break down prey is key to understanding the evolution of dentition. Recent experimental work has quantified how tooth shape promotes fracture in biological materials. We present a validated finite-element model derived from physical compression experiments. The model shows close agreement with strain patterns observed in photoelastic test materials and reaction forces measured during these experiments. We use the model to measure strain energy within the test material when different tooth shapes are used. Results show that notched blades deform materials for less strain energy cost than straight blades, giving insights into the energetic relationship between tooth form and prey materials. We identify a hypothetical ‘optimal’ blade angle that minimizes strain energy costs and test alternative prey materials via virtual experiments. Using experimental data and computational models offers an integrative approach to understand the mechanics of tooth morphology. PMID:22399789
Crins, Martine H. P.; Roorda, Leo D.; Smits, Niels; de Vet, Henrica C. W.; Westhovens, Rene; Cella, David; Cook, Karon F.; Revicki, Dennis; van Leeuwen, Jaap; Boers, Maarten; Dekker, Joost; Terwee, Caroline B.
2015-01-01
The Dutch-Flemish PROMIS Group translated the adult PROMIS Pain Interference item bank into Dutch-Flemish. The aims of the current study were to calibrate the parameters of these items using an item response theory (IRT) model, to evaluate the cross-cultural validity of the Dutch-Flemish translations compared to the original English items, and to evaluate their reliability and construct validity. The 40 items in the bank were completed by 1085 Dutch chronic pain patients. Before calibrating the items, IRT model assumptions were evaluated using confirmatory factor analysis (CFA). Items were calibrated using the graded response model (GRM), an IRT model appropriate for items with more than two response options. To evaluate cross-cultural validity, differential item functioning (DIF) for language (Dutch vs. English) was examined. Reliability was evaluated based on standard errors and Cronbach’s alpha. To evaluate construct validity correlations with scores on legacy instruments (e.g., the Disabilities of the Arm, Shoulder and Hand Questionnaire) were calculated. Unidimensionality of the Dutch-Flemish PROMIS Pain Interference item bank was supported by CFA tests of model fit (CFI = 0.986, TLI = 0.986). Furthermore, the data fit the GRM and showed good coverage across the pain interference continuum (threshold-parameters range: -3.04 to 3.44). The Dutch-Flemish PROMIS Pain Interference item bank has good cross-cultural validity (only two out of 40 items showing DIF), good reliability (Cronbach’s alpha = 0.98), and good construct validity (Pearson correlations between 0.62 and 0.75). A computer adaptive test (CAT) and Dutch-Flemish PROMIS short forms of the Dutch-Flemish PROMIS Pain Interference item bank can now be developed. PMID:26214178
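The graded response model used for calibration assigns each polytomous item a discrimination parameter and ordered threshold parameters; category probabilities fall out as differences of adjacent cumulative logistic curves. A generic sketch follows, with hypothetical parameters rather than the calibrated Dutch-Flemish values.

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Graded response model (Samejima): for an item with discrimination a and
    ordered thresholds b_1 < ... < b_{K-1}, the cumulative probability of
    responding in category k or higher is P*(k) = 1 / (1 + exp(-a*(theta - b_k))).
    The probability of responding exactly in category k is P*(k) - P*(k+1)."""
    cumulative = [1.0]  # P(response >= lowest category) is always 1
    cumulative += [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
    cumulative.append(0.0)  # P(response above the highest category) is 0
    return [cumulative[k] - cumulative[k + 1] for k in range(len(cumulative) - 1)]

# Hypothetical 5-category item (a = 1.5, thresholds spanning the trait range)
probs = grm_category_probs(0.0, 1.5, [-2.0, -0.5, 0.5, 2.0])
print(round(sum(probs), 10))  # 1.0: the category probabilities sum to one
```

With sorted thresholds the cumulative curves are nested, so every category probability is non-negative; calibration estimates a and the b_k for each of the 40 items from the patient responses.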
Hybrid perturbation methods based on statistical time series models
NASA Astrophysics Data System (ADS)
San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario
2016-04-01
In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, because, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order analytical theory and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
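The additive Holt-Winters method named as the statistical component can be sketched as follows. This is the textbook level/trend/seasonal recursion with hypothetical smoothing weights, not the authors' propagator.

```python
def holt_winters_additive(series, m, alpha, beta, gamma, horizon):
    """Additive Holt-Winters exponential smoothing.
    series: observations (length >= 2*m), m: season length,
    alpha, beta, gamma: level/trend/seasonal smoothing weights in (0, 1),
    horizon: number of steps to forecast beyond the series."""
    # Initialize level from the first season and trend from the average
    # per-step change between the first two seasons.
    level = sum(series[:m]) / m
    trend = sum((series[m + i] - series[i]) / m for i in range(m)) / m
    seasonal = [series[i] - level for i in range(m)]
    for t in range(m, len(series)):
        s = seasonal[t % m]
        prev_level = level
        level = alpha * (series[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        seasonal[t % m] = gamma * (series[t] - level) + (1 - gamma) * s
    return [level + (h + 1) * trend + seasonal[(len(series) + h) % m]
            for h in range(horizon)]

# A purely periodic signal with no trend is reproduced exactly
print(holt_winters_additive([0, 2, 0, 2, 0, 2, 0, 2], 2, 0.5, 0.5, 0.5, 2))  # [0.0, 2.0]
```

In the hybrid scheme described above, a recursion of this kind would be fitted not to raw observations but to the residual error of the analytical theory, so that the forecast supplies the dynamics the integration left out.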
Effective Lagrangians and Current Algebra in Three Dimensions
NASA Astrophysics Data System (ADS)
Ferretti, Gabriele
In this thesis we study three-dimensional field theories that arise as effective Lagrangians of quantum chromodynamics in Minkowski space with signature (2,1) (QCD3). In the first chapter, we explain the method of effective Lagrangians and the relevance of current algebra techniques to field theory. We also provide the physical motivations for the study of QCD3 as a toy model for confinement and as a theory of quantum antiferromagnets (QAF). In chapter two, we derive the relevant effective Lagrangian by studying the low energy behavior of QCD3, paying particular attention to how the global symmetries are realized at the quantum level. In chapter three, we show how baryons arise as topological solitons of the effective Lagrangian and also show that their statistics depends on the number of colors, as predicted by the quark model. We calculate mass splittings and magnetic moments of the soliton and find logarithmic corrections to the naive quark model predictions. In chapter four, we derive the current algebra of the theory. We find that the current algebra is a cohomologically non-trivial generalization of Kac-Moody algebras to three dimensions. This fact may provide a new, non-perturbative way to quantize the theory. In chapter five, we discuss the renormalizability of the model in the large-N expansion. We prove the validity of the non-renormalization theorem and compute the critical exponents in a specific limiting case, the CP^{N-1} model with a Chern-Simons term. Finally, chapter six contains some brief concluding remarks.
Continued Development and Validation of Methods for Spheromak Simulation
NASA Astrophysics Data System (ADS)
Benedett, Thomas
2015-11-01
The HIT-SI experiment has demonstrated stable sustainment of spheromaks; determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and provide an intermediate step between theory and future experiments. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (~ 36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. Results from verification of the PSI-TET extended MHD model using the GEM magnetic reconnection challenge will also be presented along with investigation of injector configurations for future SIHI experiments using Taylor state equilibrium calculations. Work supported by DoE.
Devaluation and sequential decisions: linking goal-directed and model-based behavior
Friedel, Eva; Koch, Stefan P.; Wendt, Jean; Heinz, Andreas; Deserno, Lorenz; Schlagenhauf, Florian
2014-01-01
In experimental psychology, different experiments have been developed to assess goal-directed as compared to habitual control over instrumental decisions. As in animal studies, selective devaluation procedures have been used. More recently, sequential decision-making tasks have been designed to assess the degree of goal-directed vs. habitual choice behavior in terms of an influential computational theory of model-based compared to model-free behavioral control. As recently suggested, these different measurements are thought to reflect the same construct. Yet there has been no attempt to directly assess the construct validity of these different measurements. In the present study, we used a devaluation paradigm and a sequential decision-making task to address this question of construct validity in a sample of 18 healthy male human participants. Correlational analysis revealed a positive association between model-based choices during sequential decisions and goal-directed behavior after devaluation, suggesting a single framework underlying both operationalizations and speaking in favor of the construct validity of both measurement approaches. Up to now, this has been merely assumed but never directly tested in humans. PMID:25136310
Le Bihan, Nicolas; Margerin, Ludovic
2009-07-01
In this paper, we present a nonparametric method to estimate the heterogeneity of a random medium from the angular distribution of intensity of waves transmitted through a slab of random material. Our approach is based on the modeling of forward multiple scattering using compound Poisson processes on compact Lie groups. The estimation technique is validated through numerical simulations based on radiative transfer theory.
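The compound-Poisson construction can be illustrated on the simplest compact group, the circle: a transmitted direction is a wrapped sum of a Poisson number of random kicks. This toy sketch (Gaussian kicks, unit slab-crossing time, all parameters illustrative) only mimics the forward-scattering spread, not the full estimation method on SO(3).

```python
import math
import random

random.seed(1)

def transmitted_direction(rate, kick_std):
    """One realization of a compound Poisson process on the circle.

    Events arrive with intensity `rate` over a unit slab-crossing time;
    each event rotates the propagation direction by a Gaussian kick."""
    n, t = 0, random.expovariate(rate)
    while t < 1.0:
        n += 1
        t += random.expovariate(rate)
    theta = sum(random.gauss(0.0, kick_std) for _ in range(n))
    return math.atan2(math.sin(theta), math.cos(theta))  # wrap to (-pi, pi]

def circular_spread(rate, kick_std=0.3, n_trials=2000):
    """1 - mean resultant length: 0 for a collimated beam, toward 1 when isotropic."""
    angles = [transmitted_direction(rate, kick_std) for _ in range(n_trials)]
    c = sum(map(math.cos, angles)) / n_trials
    s = sum(map(math.sin, angles)) / n_trials
    return 1.0 - math.hypot(c, s)
```

A weakly scattering slab leaves the angular distribution sharply forward-peaked, while a strongly scattering one spreads it out; inverting that relationship is, in spirit, what the estimation method does.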
Sun, Xiaodong; Fang, Dawei; Zhang, Dong; Ma, Qingyu
2013-05-01
Different from the theory of acoustic monopole spherical radiation, the acoustic dipole radiation based theory introduces the radiation pattern of Lorentz force induced dipole sources to describe the principle of magnetoacoustic tomography with magnetic induction (MAT-MI). Although two-dimensional (2D) simulations have been studied for cylindrical phantom models, layer effects of the dipole sources within the entire object along the z direction still need to be investigated to evaluate the performance of MAT-MI for different geometric specifications. The purpose of this work is to further verify the validity and generality of the acoustic dipole radiation based theory for MAT-MI with two new models of different shapes, dimensions, and conductivities. Based on the theory of acoustic dipole radiation, the principles of MAT-MI were analyzed with derived analytic formulae. 2D and 3D numerical studies for two new models, an aluminum foil and a cooked egg, were conducted to simulate acoustic pressures and corresponding waveforms, and 2D images of the scanned layers were reconstructed with the simplified back projection algorithm from the waveforms collected around the models. The spatial resolution for conductivity boundary differentiation was also analyzed for different foil thicknesses. For comparison, two experimental measurements were conducted on a cylindrical aluminum foil phantom and a shell-peeled cooked egg. The collected waveforms and the reconstructed images of the scanned layers verify the validity of the acoustic dipole radiation based theory for MAT-MI. Despite the difference between the 2D and 3D simulated pressures, the good consistency of the collected waveforms proves that wave clusters are generated by the abrupt pressure changes with bipolar vibration phases, representing the opposite polarities of the conductivity changes along the measurement direction.
The configuration of the scanned layer can be reconstructed in terms of shape and size, and the conductivity boundaries are displayed as stripes with different contrast and bipolar intensities. Layer effects are demonstrated to have little influence on the collected waveforms and the reconstructed images of the scanned layers for the two new models. The experimental results show good agreement with the numerical simulations, and the reconstructed 2D images provide the conductivity configurations in the scanned layers of the aluminum foil and egg models. It can be concluded that the acoustic pressure of MAT-MI is produced by the divergence of the induced Lorentz force, and that the collected waveforms comprise wave clusters with bipolar vibration phases and different amplitudes, providing the information of conductivity boundaries in the scanned layer. With the simplified back projection algorithm for diffraction sources, the collected waveforms can be used to reconstruct 2D conductivity contrast images, and the conductivity configuration in the scanned layer can be obtained in terms of shape and size, in stripes, with a spatial resolution of the acoustic wavelength. These favorable results further verify the validity and generality of the acoustic dipole radiation based theory and suggest the feasibility of MAT-MI as an effective electrical impedance contrast imaging approach for medical imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vriens, L.; Smeets, A.H.M.
1980-09-01
For electron-induced ionization, excitation, and de-excitation, mainly from excited atomic states, a detailed analysis is presented of the dependence of the cross sections and rate coefficients on electron energy and temperature, and on atomic parameters. A wide energy range is covered, including sudden as well as adiabatic collisions. By combining the available experimental and theoretical information, a set of simple analytical formulas is constructed for the cross sections and rate coefficients of the processes mentioned, for the total depopulation, and for three-body recombination. The formulas account for large deviations from classical and semiclassical scaling, as found for excitation. They agree with experimental data and with the theories in their respective ranges of validity, but have a wider range of validity than the separate theories. The simple analytical form further facilitates the application in plasma modeling.
The Einstein viscosity with fluid elasticity
NASA Astrophysics Data System (ADS)
Einarsson, Jonas; Yang, Mengfei; Shaqfeh, Eric S. G.
2017-11-01
We give the first correction to the suspension viscosity due to fluid elasticity for a dilute suspension of spheres in a viscoelastic medium. Our perturbation theory is valid to O(Wi²) in the Weissenberg number Wi = γ̇λ, where γ̇ is the typical magnitude of the suspension velocity gradient, and λ is the relaxation time of the viscoelastic fluid. For shear flow we find that the suspension shear-thickens due to elastic stretching in strain 'hot spots' near the particle, despite the fact that the stress inside the particles decreases relative to the Newtonian case. We thus argue that it is crucial to correctly model the extensional rheology of the suspending medium to predict the shear rheology of the suspension. For uniaxial extensional flow we correct existing results at O(Wi), and find dramatic strain-rate thickening at O(Wi²). We validate our theory with fully resolved numerical simulations.
Einstein viscosity with fluid elasticity
NASA Astrophysics Data System (ADS)
Einarsson, Jonas; Yang, Mengfei; Shaqfeh, Eric S. G.
2018-01-01
We give the first correction to the suspension viscosity due to fluid elasticity for a dilute suspension of spheres in a viscoelastic medium. Our perturbation theory is valid to O(ϕWi²) in the particle volume fraction ϕ and the Weissenberg number Wi = γ̇λ, where γ̇ is the typical magnitude of the suspension velocity gradient, and λ is the relaxation time of the viscoelastic fluid. For shear flow we find that the suspension shear-thickens due to elastic stretching in strain "hot spots" near the particle, despite the fact that the stress inside the particles decreases relative to the Newtonian case. We thus argue that it is crucial to correctly model the extensional rheology of the suspending medium to predict the shear rheology of the suspension. For uniaxial extensional flow we correct existing results at O(ϕWi), and find dramatic strain-rate thickening at O(ϕWi²). We validate our theory with fully resolved numerical simulations.
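For orientation, the classical Newtonian baseline being corrected is Einstein's dilute-suspension result, η = η0(1 + 2.5ϕ). The sketch below encodes only that baseline and the definition of Wi; the paper's O(ϕWi²) elastic coefficients are not reproduced here.

```python
def einstein_viscosity(eta0, phi):
    """Einstein's classical result for a dilute suspension of rigid spheres
    in a Newtonian fluid: effective viscosity to first order in phi."""
    return eta0 * (1.0 + 2.5 * phi)

def weissenberg(gamma_dot, relax_time):
    """Wi = typical velocity-gradient magnitude times fluid relaxation time."""
    return gamma_dot * relax_time
```

The paper's contribution is the elastic correction on top of this baseline, which shear-thickens the suspension even though the intra-particle stress drops.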
Liao, David; Tlsty, Thea D
2014-08-06
Failure to understand evolutionary dynamics has been hypothesized as limiting our ability to control biological systems. An increasing awareness of similarities between macroscopic ecosystems and cellular tissues has inspired optimism that game theory will provide insights into the progression and control of cancer. To realize this potential, the ability to compare game-theoretic models and experimental measurements of population dynamics should be broadly disseminated. In this tutorial, we present an analysis method that can be used to train parameters in game-theoretic dynamics equations, to validate the resulting equations, and to make predictions that challenge these equations and guide the design of treatment strategies. The data analysis techniques in this tutorial are adapted from the analysis of reaction kinetics using the method of initial rates taught in undergraduate general chemistry courses. Reliance on computer programming is avoided to encourage the adoption of these methods as routine bench activities.
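The initial-rates idea carries over directly from chemistry: measure the early growth rate at several initial compositions, then fit the dynamics equation by linear regression. A minimal sketch for a single-population logistic model (the tutorial itself treats game-theoretic, multi-population versions; the parameter values here are illustrative):

```python
def fit_logistic_from_initial_rates(n0_list, rate_list):
    """Method-of-initial-rates fit of dN/dt = r*N*(1 - N/K).

    The per-capita initial rate g = (dN/dt)/N at t=0 is linear in N0,
    g = r - (r/K)*N0, so ordinary least squares on (N0, g) recovers r and K."""
    g = [rate / n0 for n0, rate in zip(n0_list, rate_list)]
    n = len(n0_list)
    mx = sum(n0_list) / n
    my = sum(g) / n
    sxx = sum((x - mx) ** 2 for x in n0_list)
    sxy = sum((x - mx) * (y - my) for x, y in zip(n0_list, g))
    slope = sxy / sxx
    r = my - slope * mx            # intercept of the regression line
    k = -r / slope                 # slope = -r/K
    return r, k

# Synthetic "experiments": exact initial rates for r = 0.8, K = 1000.
n0s = [50.0, 100.0, 200.0, 400.0]
rates = [0.8 * n0 * (1.0 - n0 / 1000.0) for n0 in n0s]
r_hat, k_hat = fit_logistic_from_initial_rates(n0s, rates)
```

With noisy laboratory counts the same regression applies; only the residuals change. Note that the tutorial deliberately avoids code, so this is a translation of its bench-style procedure, not a transcription.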
ACIRF user's guide: Theory and examples
NASA Astrophysics Data System (ADS)
Dana, Roger A.
1989-12-01
Design and evaluation of radio frequency systems that must operate through ionospheric disturbances resulting from high altitude nuclear detonations requires an accurate channel model. This model must include the effects of high gain antennas that may be used to receive the signals. Such a model can then be used to construct realizations of the received signal for use in digital simulations of trans-ionospheric links or for use in hardware channel simulators. The FORTRAN channel model ACIRF (Antenna Channel Impulse Response Function) generates random realizations of the impulse response function at the outputs of multiple antennas. This user's guide describes ACIRF version 2.0, which generates realizations of channel impulse response functions at the outputs of multiple antennas with arbitrary beamwidths, pointing angles, and relative positions. This channel model is valid under strong scattering conditions when Rayleigh fading statistics apply. Both frozen-in and turbulent models for the temporal fluctuations are included in this version of ACIRF. The theory of the channel model is described and several examples are given.
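The Rayleigh-fading regime mentioned above can be illustrated with a toy tapped-delay-line realization: under strong scattering each tap is zero-mean complex Gaussian, so its envelope is Rayleigh distributed. This is not the ACIRF algorithm; the exponential power delay profile and tap count are illustrative assumptions.

```python
import math
import random

random.seed(7)

def rayleigh_taps(n_taps, decay):
    """One realization of a tapped-delay-line channel impulse response.

    Each tap is zero-mean complex Gaussian, hence a Rayleigh envelope;
    `decay` sets an exponential power delay profile."""
    taps = []
    for k in range(n_taps):
        sigma = math.sqrt(0.5 * math.exp(-decay * k))  # per-component std dev
        taps.append(complex(random.gauss(0.0, sigma), random.gauss(0.0, sigma)))
    return taps

h = rayleigh_taps(n_taps=16, decay=0.3)
envelope = [abs(tap) for tap in h]   # Rayleigh-distributed magnitudes
```

ACIRF additionally correlates such realizations across antennas and in time (frozen-in or turbulent models), which this sketch omits.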
Vispoel, Walter P; Morris, Carrie A; Kilinc, Murat
2018-01-01
We applied a new approach to generalizability theory (G-theory) involving parallel splits and repeated measures to evaluate common uses of the Paulhus Deception Scales based on polytomous and four types of dichotomous scoring. G-theory indices of reliability and validity accounting for specific-factor, transient, and random-response measurement error supported the use of polytomous over dichotomous scores as contamination checks; as control, explanatory, and outcome variables; as aspects of construct validation; and as indexes of environmental effects on socially desirable responding. Polytomous scoring also provided results for flagging faking as dependable as those obtained with dichotomous scoring methods. These findings argue strongly against the nearly exclusive use of dichotomous scoring for the Paulhus Deception Scales in practice and underscore the value of G-theory in demonstrating this. We provide guidelines for applying our G-theory techniques to other objectively scored clinical assessments, for using G-theory to estimate how changes to a measure might improve reliability, and for obtaining software to conduct G-theory analyses free of charge.
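For readers unfamiliar with G-theory indices, a one-facet persons-by-items sketch shows how a generalizability coefficient is assembled from ANOVA variance components. The article's designs are richer (parallel splits, repeated measures, multiple error sources); this is only the textbook skeleton.

```python
def g_coefficient(scores):
    """Relative generalizability coefficient for a persons x items design.

    Variance components come from the random-effects ANOVA expected mean
    squares; a negative person component is clamped to zero, as is customary."""
    n_p, n_i = len(scores), len(scores[0])
    grand = sum(map(sum, scores)) / (n_p * n_i)
    p_means = [sum(row) / n_i for row in scores]
    i_means = [sum(scores[p][i] for p in range(n_p)) / n_p for i in range(n_i)]
    ss_p = n_i * sum((m - grand) ** 2 for m in p_means)
    ss_i = n_p * sum((m - grand) ** 2 for m in i_means)
    ss_tot = sum((scores[p][i] - grand) ** 2
                 for p in range(n_p) for i in range(n_i))
    ms_p = ss_p / (n_p - 1)
    ms_res = (ss_tot - ss_p - ss_i) / ((n_p - 1) * (n_i - 1))
    var_p = max((ms_p - ms_res) / n_i, 0.0)      # person variance component
    return var_p / (var_p + ms_res / n_i)        # relative G coefficient

# Six persons x four items; purely additive scores yield a coefficient near 1.
example = [[float(p) + 0.1 * i for i in range(4)] for p in range(6)]
g_rel = g_coefficient(example)
```

Adding facets (splits, occasions) partitions the residual further, which is exactly how the article separates specific-factor, transient, and random-response error.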
The pipe model theory half a century on: a review.
Lehnebach, Romain; Beyer, Robert; Letort, Véronique; Heuret, Patrick
2018-01-23
More than half a century ago, Shinozaki et al. (Shinozaki K, Yoda K, Hozumi K, Kira T. 1964b. A quantitative analysis of plant form - the pipe model theory. II. Further evidence of the theory and its application in forest ecology. Japanese Journal of Ecology 14: 133-139) proposed an elegant conceptual framework, the pipe model theory (PMT), to interpret the observed linear relationship between the amount of stem tissue and the corresponding supported leaves. The PMT brought a satisfactory answer to two vividly debated problems that were unresolved at the time of its publication: (1) What determines tree form, and which rules drive biomass allocation to the foliar versus stem compartments in plants? (2) How can foliar area or mass be estimated in an individual plant, in a stand, or at even larger scales? Since its initial formulation, the PMT has been reinterpreted and used in applications, and has undoubtedly become an important milestone in the mathematical interpretation of plant form and functioning. This article aims to review the PMT by going back to its initial formulation, stating its explicit and implicit properties and discussing them in the light of current biological knowledge and experimental evidence in order to identify the validity and range of applicability of the theory. We also discuss the use of the theory in tree biomechanics and hydraulics as well as in functional-structural plant modelling. Scrutinizing the PMT in the light of modern biological knowledge revealed that most of its properties are not valid as a general rule. The hydraulic framework derived from the PMT has attracted much more attention than its mechanical counterpart and implies that only the conductive portion of a stem cross-section should be proportional to the supported foliage amount, rather than the whole of it.
The fact that this conductive portion is experimentally difficult to measure and varies with environmental conditions and tree ontogeny might explain the commonly reported non-linear relationships between foliage and stem metrics. Nevertheless, the PMT can still be considered as a portfolio of properties providing a unified framework to integrate and analyse functional-structural relationships. © The Author(s) 2018. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved.
Reconceptualising the external validity of discrete choice experiments.
Lancsar, Emily; Swait, Joffre
2014-10-01
External validity is a crucial but under-researched topic when considering using discrete choice experiment (DCE) results to inform decision making in clinical, commercial or policy contexts. We present the theory and tests traditionally used to explore external validity that focus on a comparison of final outcomes and review how this traditional definition has been empirically tested in health economics and other sectors (such as transport, environment and marketing) in which DCE methods are applied. While an important component, we argue that the investigation of external validity should be much broader than a comparison of final outcomes. In doing so, we introduce a new and more comprehensive conceptualisation of external validity, closely linked to process validity, that moves us from the simple characterisation of a model as being or not being externally valid on the basis of predictive performance, to the concept that external validity should be an objective pursued from the initial conceptualisation and design of any DCE. We discuss how such a broader definition of external validity can be fruitfully used and suggest innovative ways in which it can be explored in practice.
Jochems, Eline C; Mulder, Cornelis L; Duivenvoorden, Hugo J; van der Feltz-Cornelis, Christina M; van Dam, Arno
2014-08-01
Self-determination theory is potentially useful for understanding reasons why individuals with mental illness do or do not engage in psychiatric treatment. The current study examined the psychometric properties of three questionnaires based on self-determination theory-The Treatment Entry Questionnaire (TEQ), Health Care Climate Questionnaire (HCCQ), and the Short Motivation Feedback List (SMFL)-in a sample of 348 Dutch adult outpatients with primary diagnoses of mood, anxiety, psychotic, and personality disorders. Structural equation modeling showed that the empirical factor structures of the TEQ and SMFL were adequately represented by a model with three intercorrelated factors. These were interpreted as identified, introjected, and external motivation. The reliabilities of the Dutch TEQ, HCCQ, and SMFL were found to be acceptable but can be improved on; congeneric estimates ranged from 0.66 to 0.94 depending on the measure and patient subsample. Preliminary support for the construct validities of the questionnaires was found in the form of theoretically expected associations with other scales, including therapist-rated motivation and treatment engagement and with legally mandated treatment. Additionally, the study provides insights into the relations between measures of motivation based on self-determination theory, the transtheoretical model and the integral model of treatment motivation in psychiatric outpatients with severe mental illness. © The Author(s) 2013.
Reliability and validity of advanced theory-of-mind measures in middle childhood and adolescence.
Hayward, Elizabeth O; Homer, Bruce D
2017-09-01
Although theory-of-mind (ToM) development is well documented for early childhood, there is increasing research investigating changes in ToM reasoning in middle childhood and adolescence. However, the psychometric properties of most advanced ToM measures for use with older children and adolescents have not been firmly established. We report on the reliability and validity of widely used, conventional measures of advanced ToM with this age group. Notable issues with both reliability and validity of several of the measures were evident in the findings. With regard to construct validity, results do not reveal a clear empirical commonality between tasks, and, after accounting for comprehension, developmental trends were evident in only one of the tasks investigated. Statement of contribution What is already known on this subject? Second-order false belief tasks have acceptable internal consistency. The Eyes Test has poor internal consistency. Validity of advanced theory-of-mind tasks is often based on the ability to distinguish clinical from typical groups. What does this study add? This study examines internal consistency across six widely used advanced theory-of-mind tasks. It investigates validity of tasks based on comprehension of items by typically developing individuals. It further assesses construct validity, or commonality between tasks. © 2017 The British Psychological Society.
Signatures of nonlinearity in single cell noise-induced oscillations.
Thomas, Philipp; Straube, Arthur V; Timmer, Jens; Fleck, Christian; Grima, Ramon
2013-10-21
A class of theoretical models seeks to explain rhythmic single cell data by postulating that they are generated by intrinsic noise in biochemical systems whose deterministic models exhibit only damped oscillations. The main features of such noise-induced oscillations are quantified by the power spectrum, which measures the dependence of the oscillatory signal's power on frequency. In this paper we derive an approximate closed-form expression for the power spectrum of any monostable biochemical system close to a Hopf bifurcation, where noise-induced oscillations are most pronounced. Unlike the commonly used linear noise approximation, which is valid in the macroscopic limit of large volumes, our theory is valid over a wide range of volumes and hence affords a more suitable description of single cell noise-induced oscillations. Our theory predicts that the spectra have three universal features: (i) a dominant peak at some frequency, (ii) a smaller peak at twice the frequency of the dominant peak and (iii) a peak at zero frequency. Of these, the linear noise approximation predicts only the first feature, while the remaining two stem from the combination of intrinsic noise and nonlinearity in the law of mass action. The theoretical expressions are shown to accurately match the power spectra determined from stochastic simulations of mitotic and circadian oscillators. Furthermore, it is shown that recently acquired single cell rhythmic fibroblast data display all the features predicted by our theory and that the experimental spectrum is well described by our theory but not by the conventional linear noise approximation. © 2013 Elsevier Ltd. All rights reserved.
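Feature (i), the dominant spectral peak, can be reproduced with a linear Langevin caricature: noise keeps a deterministically damped oscillator ringing, which shows up as a periodogram peak near the deterministic frequency. All parameters below are illustrative, and features (ii) and (iii) require the mass-action nonlinearity that this linear sketch deliberately omits.

```python
import math
import random

random.seed(3)

# Euler-Maruyama simulation of the noise-driven damped harmonic oscillator
#   dx = v dt,  dv = (-w0^2 x - g v) dt + s dW,
# a linear caricature of a monostable system whose deterministic dynamics
# only ring down, but which noise keeps oscillating near w0.
w0, g, s = 2.0 * math.pi, 1.0, 1.0      # resonance near 1 Hz
dt, n = 0.01, 1024
x, v, xs = 0.0, 0.0, []
for _ in range(n):
    a = -w0 * w0 * x - g * v
    x += v * dt
    v += a * dt + s * math.sqrt(dt) * random.gauss(0.0, 1.0)
    xs.append(x)

def band_power(f_lo, f_hi):
    """Average periodogram power over a frequency band (direct DFT)."""
    total, count = 0.0, 0
    for k in range(1, n // 2):
        f = k / (n * dt)
        if f_lo <= f <= f_hi:
            re = sum(xs[t] * math.cos(2.0 * math.pi * k * t / n) for t in range(n))
            im = sum(xs[t] * math.sin(2.0 * math.pi * k * t / n) for t in range(n))
            total += re * re + im * im
            count += 1
    return total / count

# Power concentrates near the deterministic frequency of ~1 Hz.
resonant = band_power(0.8, 1.2)
background = band_power(2.3, 2.7)
```

The paper's closed-form spectrum goes beyond this linear picture, capturing the harmonic at twice the peak frequency and the zero-frequency peak seen in the fibroblast data.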
The blood donor identity survey: a multidimensional measure of blood donor motivations.
France, Christopher R; Kowalsky, Jennifer M; France, Janis L; Himawan, Lina K; Kessler, Debra A; Shaz, Beth H
2014-08-01
Evidence indicates that donor identity is an important predictor of donation behavior; however, prior studies have relied on diverse, unidimensional measures with limited psychometric support. The goals of this study were to examine the application of self-determination theory to blood donor motivations and to develop and validate a related multidimensional measure of donor identity. Items were developed and administered electronically to a sample of New York Blood Center (NYBC) donors (n=582) and then to a sample of Ohio University students (n=1005). Following initial confirmatory factor analysis (CFA) on the NYBC sample to identify key items related to self-determination theory's six motivational factors, a revised survey was administered to the university sample to reexamine model fit and to assess survey reliability and validity. Consistent with self-determination theory, for both samples CFAs indicated that the best fit to the data was provided by a six-motivational-factor model, including amotivation, external regulation, introjected regulation, identified regulation, integrated regulation, and intrinsic regulation. The Blood Donor Identity Survey provides a psychometrically sound, multidimensional measure of donor motivations (ranging from unmotivated to donate to increasing levels of autonomous motivation to donate) that is suitable for nondonors as well as donors with varying levels of experience. Future research is needed to examine longitudinal changes in donor identity and its relationship to actual donation behavior. © 2014 AABB.
ERIC Educational Resources Information Center
Maslovaty, Nava; Cohen, Arie; Furman, Sari
2008-01-01
The article presents a multi-faceted theory of "ideal high school student" traits. The trait system, as defined by several theories, is a translation of the teachers' belief system into educational objectives. The study focused on Bloom's taxonomies and the structural validity of its principles, using Similarity Structure Analysis. Aware of the…
NASA Astrophysics Data System (ADS)
Stockert, Sven; Wehr, Matthias; Lohmar, Johannes; Abel, Dirk; Hirt, Gerhard
2017-10-01
In the electrical and medical industries, the trend towards further miniaturization of devices is accompanied by the demand for smaller manufacturing tolerances. Such industries use a multitude of small and narrow cold-rolled metal strips with high thickness accuracy. Conventional rolling mills can hardly achieve further improvement of these tolerances. However, a model-based controller in combination with an additional piezoelectric actuator for highly dynamic roll adjustment is expected to enable the production of the required metal strips with a thickness tolerance of +/-1 µm. The model-based controller has to be based on a rolling theory which describes the rolling process very accurately. Additionally, the required computing time has to be low in order to predict the rolling process in real time. In this work, four rolling theories from the literature with different levels of complexity are tested for their suitability for the predictive controller. The rolling theories of von Kármán, Siebel, Bland & Ford and Alexander are implemented in Matlab and afterwards transferred to the real-time computer used for the controller. The prediction accuracy of these theories is validated using rolling trials with different thickness reductions and a comparison to the calculated results. Furthermore, the required computing time on the real-time computer is measured. Adequate prediction accuracy can be achieved with the rolling theories developed by Bland & Ford and Alexander. A comparison of the computing times of these two theories reveals, however, that the computing time of Alexander's theory exceeds the 1 ms cycle time imposed by the 1 kHz sample rate of the real-time computer.
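To give a flavor of the simpler end of that complexity range, a textbook-style roll-force estimate in the spirit of Siebel's friction-hill approximation can be written in a few lines. The formula and all parameter values are illustrative assumptions, not the implementation validated in the paper.

```python
import math

def siebel_roll_force(width, h_in, h_out, roll_radius, k_fm, mu):
    """Textbook-style flat-rolling force estimate in the spirit of Siebel.

    Mean contact pressure p = k_fm * (1 + mu * l_d / (2 * h_m)), with
    projected contact length l_d = sqrt(R * dh).  Units: mm and MPa give N."""
    dh = h_in - h_out
    h_m = 0.5 * (h_in + h_out)           # mean strip thickness
    l_d = math.sqrt(roll_radius * dh)    # projected contact length
    p_mean = k_fm * (1.0 + mu * l_d / (2.0 * h_m))
    return p_mean * width * l_d          # roll separating force

# Thin strip: 10 mm wide, rolled 0.30 -> 0.24 mm, 50 mm roll radius.
force = siebel_roll_force(width=10.0, h_in=0.30, h_out=0.24,
                          roll_radius=50.0, k_fm=400.0, mu=0.08)
```

Closed-form expressions of this kind evaluate in microseconds, which is why the simpler theories comfortably meet a 1 kHz control loop while more elaborate ones may not.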
Phase transformations at interfaces: Observations from atomistic modeling
Frolov, T.; Asta, M.; Mishin, Y.
2016-10-01
Here, we review the recent progress in theoretical understanding and atomistic computer simulations of phase transformations in materials interfaces, focusing on grain boundaries (GBs) in metallic systems. Recently developed simulation approaches enable the search and structural characterization of GB phases in single-component metals and binary alloys, calculation of thermodynamic properties of individual GB phases, and modeling of the effect of the GB phase transformations on GB kinetics. Atomistic simulations demonstrate that the GB transformations can be induced by varying the temperature, loading the GB with point defects, or varying the amount of solute segregation. The atomic-level understanding obtained from such simulations can provide input for further development of thermodynamic theories and continuum models of interface phase transformations while simultaneously serving as a testing ground for validation of theories and models. They can also help interpret and guide experimental work in this field.
Theory of quantized systems: formal basis for DEVS/HLA distributed simulation environment
NASA Astrophysics Data System (ADS)
Zeigler, Bernard P.; Lee, J. S.
1998-08-01
In the context of a DARPA ASTT project, we are developing an HLA-compliant distributed simulation environment based on the DEVS formalism. This environment will provide a user-friendly, high-level tool-set for developing interoperable discrete and continuous simulation models. One application is the study of contract-based predictive filtering. This paper presents a new approach to predictive filtering based on a process called 'quantization' to reduce state update transmission. Quantization, which generates state updates only at quantum level crossings, abstracts a sender model into a DEVS representation. This affords an alternative, efficient approach to embedding continuous models within distributed discrete event simulations. Applications of quantization to message traffic reduction are discussed. The theory has been validated by DEVSJAVA simulations of test cases. It will be subject to further testing in actual distributed simulations using the DEVS/HLA modeling and simulation environment.
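The quantization idea (publish a state only when it crosses a quantum level) is easy to sketch. This toy integrator illustrates the message-reduction principle, not the DEVS/HLA implementation.

```python
def quantized_updates(f, x0, quantum, dt, t_end):
    """Integrate dx/dt = f(x) with small Euler steps, but emit ("transmit")
    the state only when it crosses a quantum level."""
    x, last_sent = x0, x0
    messages = [(0.0, x0)]
    steps = int(round(t_end / dt))
    for k in range(steps):
        x += f(x) * dt                         # fine-grained local integration
        if abs(x - last_sent) >= quantum:      # quantum level crossing
            last_sent = round(x / quantum) * quantum
            messages.append(((k + 1) * dt, last_sent))
    return messages, steps

# Exponential decay toward 0: updates thin out as the trajectory flattens.
msgs, n_steps = quantized_updates(lambda x: -x, x0=1.0, quantum=0.1,
                                  dt=0.001, t_end=5.0)
```

Here 5000 integration steps collapse to about ten transmitted updates, the kind of state-update reduction the paper exploits for message traffic.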
Wang, Chengwen; Quan, Long; Zhang, Shijie; Meng, Hongjun; Lan, Yuan
2017-03-01
The hydraulic servomechanism is a typical mechanical/hydraulic double-dynamics coupling system with high-stiffness control and mismatched-uncertainty input problems, which hinder direct application of many advanced control approaches in the hydraulic servo field. In this paper, by introducing singular perturbation theory, the original double-dynamics coupling model of the hydraulic servomechanism is reduced to an integral-chain system. As a result, the popular ADRC (active disturbance rejection control) technique can be directly applied to the reduced system, and the high-stiffness control and mismatched-uncertainty input problems are avoided. The validity of the simplified model is analyzed and proven theoretically. The standard linear ADRC algorithm is then developed based on the obtained reduced-order model. Extensive comparative co-simulations and experiments are carried out to illustrate the effectiveness of the proposed method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
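The integral-chain reduction is what makes linear ADRC applicable: an extended state observer (ESO) estimates the lumped disturbance and the control law cancels it. Below is a minimal sketch on a double-integrator plant with the standard bandwidth parameterization; the plant, gains, and constant disturbance are illustrative, not the paper's hydraulic model.

```python
def simulate_adrc(b0=1.0, wc=5.0, wo=20.0, dt=0.001, t_end=5.0, r=1.0, d=2.0):
    """Linear ADRC on the integral-chain plant x'' = b0*u + d, with an
    unknown constant disturbance d and bandwidth-parameterized gains."""
    kp, kd = wc * wc, 2.0 * wc                       # controller gains
    l1, l2, l3 = 3.0 * wo, 3.0 * wo ** 2, wo ** 3    # ESO gains
    x = v = 0.0                                      # plant states
    z1 = z2 = z3 = 0.0                               # ESO: position, velocity, disturbance
    u = 0.0
    for _ in range(int(round(t_end / dt))):
        # Extended state observer, driven by the innovation e = x - z1.
        e = x - z1
        z1 += dt * (z2 + l1 * e)
        z2 += dt * (z3 + b0 * u + l2 * e)
        z3 += dt * (l3 * e)
        # Disturbance-rejecting control law.
        u = (kp * (r - z1) - kd * z2 - z3) / b0
        # Plant, one explicit Euler step.
        x += dt * v
        v += dt * (b0 * u + d)
    return x, z3

x_final, d_hat = simulate_adrc()
```

The third observer state converges to the lumped disturbance, so subtracting it reduces the plant to the nominal integral chain the controller was designed for.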
Generalized Galileons: instabilities of bouncing and Genesis cosmologies and modified Genesis
NASA Astrophysics Data System (ADS)
Libanov, M.; Mironov, S.; Rubakov, V.
2016-08-01
We study spatially flat bouncing cosmologies and models with the early-time Genesis epoch in a popular class of generalized Galileon theories. We ask whether there exist solutions of these types which are free of gradient and ghost instabilities. We find that irrespective of the forms of the Lagrangian functions, the bouncing models either are plagued with these instabilities or have singularities. The same result holds for the original Genesis model and its variants in which the scale factor tends to a constant as t → -∞. The result remains valid in theories with additional matter that obeys the Null Energy Condition and interacts with the Galileon only gravitationally. We propose a modified Genesis model which evades our no-go argument and give an explicit example of healthy cosmology that connects the modified Genesis epoch with kination (the epoch still driven by the Galileon field, which is a conventional massless scalar field at that stage).
The Work Instability Scale for Rheumatoid Arthritis (RA-WIS): Does it work in osteoarthritis?
Tang, Kenneth; Beaton, Dorcas E; Lacaille, Diane; Gignac, Monique A M; Zhang, Wei; Anis, Aslam H; Bombardier, Claire
2010-09-01
To validate the 23-item Work Instability Scale for Rheumatoid Arthritis (RA-WIS) for use in osteoarthritis (OA) using both classical test theory and item response theory approaches. Baseline and 12-month follow-up data were collected from workers with OA recruited from community and clinical settings (n = 130). Fit of RA-WIS data to the Rasch model was evaluated by item- and person-fit statistics (size of residual, chi-sq), assessments of differential item functioning, and tests of unidimensionality and local independence. Internal consistency was assessed by KR-20. Convergent construct validity (Spearman r, known-groups) was evaluated against theoretical constructs that assess impact of health on work. Responsiveness to global indicators of change was assessed by standardized response means (SRM) and area under the receiver operating characteristic curves. Data structure of the RA-WIS showed adequate fit to the Rasch model (chi-sq = 83.2, P = 0.03) after addressing local dependency in three item pairs by creating testlets. High internal consistency (KR-20 = 0.93) and convergent validity with work-oriented constructs (|r| = 0.55-0.77) were evident. The RA-WIS correlated most strongly with the concept of illness intrusiveness (r = 0.77) and was highly responsive to changes (SRM = 1.05 [deterioration]; -0.78 [improvement]). Although developed for RA, the RA-WIS is psychometrically sound for OA and demonstrates interval-level property.
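KR-20, used above for internal consistency, is straightforward to compute from a persons-by-items matrix of dichotomous scores. The data here are a made-up Guttman-style pattern, not the RA-WIS data.

```python
def kr20(item_matrix):
    """Kuder-Richardson formula 20 for dichotomous (0/1) item scores:
    KR-20 = k/(k-1) * (1 - sum(p*q) / variance of total scores)."""
    n = len(item_matrix)                 # persons
    k = len(item_matrix[0])              # items
    totals = [sum(row) for row in item_matrix]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n   # population variance
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in item_matrix) / n       # item difficulty
        pq += p * (1.0 - p)
    return (k / (k - 1)) * (1.0 - pq / var_t)

# Five persons x four items with a clean ability gradient (Guttman pattern).
responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
rel = kr20(responses)
```

KR-20 is the special case of Cronbach's alpha for binary items, which is why it sits naturally alongside the Rasch analyses reported in the abstract.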
NASA Astrophysics Data System (ADS)
Li, Bin
Spatial control behaviors account for a large proportion of human everyday activities, from normal daily tasks, such as reaching for objects, to specialized tasks, such as driving, surgery, or operating equipment. These behaviors involve intensive interactions within internal processes (i.e. cognitive, perceptual, and motor control) and with the physical world. This dissertation builds on the concept of the interaction pattern and a hierarchical functional model. An interaction pattern represents a behavioral synergy by which humans coordinate cognitive, perceptual, and motor control processes. It contributes to the construction of the hierarchical functional model, which delineates human spatial control behaviors as the coordination of three functional subsystems: planning, guidance, and tracking/pursuit. This dissertation formalizes and validates these two theories and extends them to the investigation of human spatial control skills, encompassing development and assessment. Specifically, the dissertation first presents an overview of studies of human spatial control skills, covering definition, characteristics, development, and assessment, to provide theoretical evidence for the concept of the interaction pattern and the hierarchical functional model. Next, the human experiments for collecting motion and gaze data, and the techniques for registering and classifying gaze data, are described. The dissertation then elaborates and mathematically formalizes the hierarchical functional model and the concept of the interaction pattern. These theories enable the construction of a succinct simulation model that can reproduce a variety of human performance with a minimal set of hypotheses, validating the hierarchical functional model as a normative framework for interpreting human spatial control behaviors. The dissertation then investigates human skill development and captures the emergence of interaction patterns.
The final part of the dissertation applies the hierarchical functional model to skill assessment and introduces techniques to capture interaction patterns both from the top down, using their geometric features, and from the bottom up, using their dynamical characteristics. The validity and generality of the skill assessment are illustrated using two experiments: remote-control flight and laparoscopic surgical training.
A validation study of a stochastic model of human interaction
NASA Astrophysics Data System (ADS)
Burchfield, Mitchel Talmadge
The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model, N ∫_{-∞}^{∞} φ(χ,τ) Ψ(τ) dτ, have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data.
The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from a low of 0.955 to a high value of 0.998, indicating that humans behave in psychological space as Fermions behave in momentum space.
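The Fermi-Dirac parameter-estimation step can be sketched as follows. This is a minimal stand-in for the nonlinear regression techniques actually used: an illustrative brute-force least-squares search over candidate parameter values, where mu plays the role of a chemical potential and T of a temperature.

```python
import math

def fermi_dirac(x, mu, T):
    """Mean occupancy 1 / (exp((x - mu) / T) + 1)."""
    return 1.0 / (math.exp((x - mu) / T) + 1.0)

def fit_fermi_dirac(xs, ps, mus, Ts):
    """Least-squares fit over candidate (mu, T) grids; a simple stand-in
    for proper nonlinear regression."""
    best = None
    for mu in mus:
        for T in Ts:
            sse = sum((fermi_dirac(x, mu, T) - p) ** 2 for x, p in zip(xs, ps))
            if best is None or sse < best[0]:
                best = (sse, mu, T)
    return best  # (sum of squared errors, mu, T)
```

In practice one would use a gradient-based nonlinear least-squares routine; the grid search here only illustrates the objective being minimized.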
Dawson, Deborah A; Saha, Tulshi D; Grant, Bridget F
2010-02-01
The relative severities of the 11 DSM-IV alcohol use disorder (AUD) criteria are represented by their severity threshold scores, an item response theory (IRT) model parameter inversely proportional to their prevalence. These scores can be used to create a continuous severity measure comprising the total number of criteria endorsed, each weighted by its relative severity. This paper assesses the validity of the severity ranking of the 11 criteria and the overall severity score with respect to known AUD correlates, including alcohol consumption, psychological functioning, family history, antisociality, and early initiation of drinking, in a representative population sample of U.S. past-year drinkers (n=26,946). The unadjusted mean values for all validating measures increased steadily with the severity threshold score, except that legal problems, the criterion with the highest score, was associated with lower values than expected. After adjusting for the total number of criteria endorsed, this direct relationship was no longer evident. The overall severity score was no more highly correlated with the validating measures than a simple count of criteria endorsed, nor did the two measures yield different risk curves. This reflects both within-criterion variation in severity and the fact that the number of criteria endorsed and their severity are so highly correlated that severity is essentially redundant. Attempts to formulate a scalar measure of AUD will do as well by relying on simple counts of criteria or symptom items as by using scales weighted by IRT measures of severity. Published by Elsevier Ireland Ltd.
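The comparison of a severity-weighted score against a simple symptom count can be sketched as below. The weights and endorsement patterns are invented for illustration; the near-perfect correlation between the two scores mirrors the redundancy the authors report.

```python
def severity_score(endorsed, weights):
    """IRT-weighted severity: sum of severity weights over endorsed criteria."""
    return sum(w for e, w in zip(endorsed, weights) if e)

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

With any plausible set of positive weights, the weighted score and the raw count of endorsed criteria correlate almost perfectly, which is the redundancy described in the abstract.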
Modelling Fine Scale Movement Corridors for the Tricarinate Hill Turtle
NASA Astrophysics Data System (ADS)
Mondal, I.; Kumar, R. S.; Habib, B.; Talukdar, G.
2016-06-01
Habitat loss and the destruction of habitat connectivity can lead to species extinction by isolating populations. Identifying important habitat corridors to enhance habitat connectivity is imperative for species conservation, preserving dispersal patterns and maintaining genetic diversity. Circuit theory is a novel tool for modelling habitat connectivity: it treats habitat as an electronic circuit board and species movement as current flowing through the different resistors in the circuit. Most studies involving circuit theory have been carried out at small cartographic scales on wide-ranging animals such as wolves or pumas, and more recently on tigers. This calls for a study that tests circuit theory at a large cartographic scale to model micro-scale habitat connectivity. The present study on a small South-Asian geoemydid, the Tricarinate Hill-turtle (Melanochelys tricarinata), focuses on habitat connectivity at a very fine scale. The Tricarinate has a small body size (carapace length: 127-175 mm) and home range (8000-15000 m2), with very specific habitat requirements and movement patterns. We used very high resolution Worldview satellite data and extensive field observations to derive a model of landscape permeability at 1 : 2,000 scale to suit the target species. Circuit theory was applied to model potential corridors between core habitat patches for the Tricarinate Hill-turtle. The modelled corridors were validated against extensive ground-tracking data collected using the thread-spool technique and found to be functional. Therefore, circuit theory is a promising tool for accurately identifying corridors to aid habitat studies of small species.
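At its core, circuit-theoretic connectivity modelling reduces to computing effective resistance (equivalently, current flow) between habitat nodes on a resistor network. A minimal sketch, assuming a small hand-built graph rather than the raster landscapes used in practice (dedicated tools such as Circuitscape operate on resistance rasters):

```python
def effective_resistance(n, edges, s, t):
    """Effective resistance between nodes s and t of a resistor network.
    edges: list of (u, v, conductance). Injects unit current at s with t
    grounded and solves the Laplacian system by Gaussian elimination."""
    L = [[0.0] * n for _ in range(n)]          # graph Laplacian
    for u, v, g in edges:
        L[u][u] += g; L[v][v] += g
        L[u][v] -= g; L[v][u] -= g
    idx = [i for i in range(n) if i != t]      # ground node t
    A = [[L[i][j] for j in idx] for i in idx]
    b = [1.0 if i == s else 0.0 for i in idx]  # unit current source at s
    m = len(A)
    for col in range(m):                       # forward elimination
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * m                              # back substitution
    for r in range(m - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, m))) / A[r][r]
    return x[idx.index(s)]                     # voltage at s = R_eff for unit current
```

Low effective resistance between two habitat patches corresponds to many parallel movement pathways; corridor maps visualise where the resulting current is concentrated.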
Setting limits on Effective Field Theories: the case of Dark Matter
NASA Astrophysics Data System (ADS)
Pobbe, Federico; Wulzer, Andrea; Zanetti, Marco
2017-08-01
The usage of Effective Field Theories (EFT) for LHC new physics searches is receiving increasing attention. It is thus important to clarify all the aspects related to the applicability of the EFT formalism in the LHC environment, where the large available energy can produce reactions that overcome the maximal range of validity, i.e. the cutoff, of the theory. We show that this does not preclude setting rigorous limits on the EFT parameter space through a modified version of the ordinary binned likelihood hypothesis test, which we design and validate. Our limit-setting strategy can be carried out in its full-fledged form by the LHC experimental collaborations, or performed externally to the collaborations through the Simplified Likelihood approach, by relying on certain approximations. We apply it to the recent CMS mono-jet analysis and derive limits on a Dark Matter (DM) EFT model. DM is selected as a case study because the limited reach on the DM production EFT Wilson coefficient and the structure of the theory suggest that the cutoff might be dangerously low, well within the LHC reach. However, our strategy can also be applied, if needed, to EFTs parametrising the indirect effects of heavy new physics in the Electroweak and Higgs sectors.
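As a greatly simplified illustration of limit setting with counting data (a single Poisson bin, not the modified binned-likelihood test or Simplified Likelihood machinery the paper develops), the sketch below finds the signal strength excluded at a given confidence level by bisection:

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for N ~ Poisson(lam)."""
    term = math.exp(-lam)
    total = term
    for k in range(1, n + 1):
        term *= lam / k
        total += term
    return total

def upper_limit(n_obs, background, s_per_mu, cl=0.95, hi=100.0):
    """Smallest signal strength mu excluded at the given CL for one counting
    bin: P(N <= n_obs | mu * s_per_mu + background) <= 1 - cl."""
    lo = 0.0
    for _ in range(60):                  # bisection on the monotone CDF
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, mid * s_per_mu + background) > 1.0 - cl:
            lo = mid                     # still compatible with the data
        else:
            hi = mid                     # already excluded
    return hi
```

For zero observed events and zero background this reproduces the textbook result mu < ln(20) ≈ 3.0 expected signal events at 95% CL. The paper's contribution is precisely how to modify such likelihood tests when the EFT prediction is only valid below a cutoff.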
Walczyk, Jeffrey J.; Igou, Frank P.; Dixon, Alexa P.; Tcholakian, Talar
2013-01-01
This article critically reviews techniques and theories relevant to the emerging field of “lie detection by inducing cognitive load selectively on liars.” To help these techniques benefit from past mistakes, we start with a summary of the polygraph-based Controlled Question Technique (CQT) and the major criticisms of it made by the National Research Council (2003), including that it is not based on a validated theory and that its administration procedures have not been standardized. Lessons from the more successful Guilty Knowledge Test are also considered. The critical review that follows starts with the presentation of models and theories offering insights for cognitive lie detection that can theoretically undergird load-inducing approaches. This is followed by an evaluation of specific research-based, load-inducing proposals, especially for their susceptibility to rehearsal and other countermeasures. To help organize these proposals and suggest new directions for innovation and refinement, a theoretical taxonomy is presented based on the type of cognitive load induced in examinees (intrinsic or extraneous) and how open-ended the responses to test items are. Finally, four recommendations are proffered that can help researchers and practitioners avoid repeating the mistakes made with the CQT and yield new, valid cognitive lie detection technologies. PMID:23378840
From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.
Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J
2017-05-01
To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed 17 conversations in which 68 participants (mean age=51years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration=91min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range=0.73-0.89 and Cronbach's alpha range=0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values <0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
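The intra-class correlations reported for the coders can be illustrated with a one-way random-effects ICC. This sketch assumes a complete targets-by-raters score matrix and is not the multilevel nonlinear model used in the paper:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1): n targets each rated by k raters.
    ratings[i][j] is the score rater j gave target i."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # between-target and within-target mean squares from a one-way ANOVA
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Values in the reported 0.73-0.89 range indicate that most score variance lies between conversations rather than between coders.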
Pain assessment in children: theoretical and empirical validity.
Villarruel, A M; Denyes, M J
1991-12-01
Valid assessment of pain in children is foundational for both the nursing practice and research domains, yet few validated methods of pain measurement are currently available for young children. This article describes an innovative research approach used in the development of photographic instruments to measure pain intensity in young African-American and Hispanic children. The instruments were designed to enable children to participate actively in their own care and to do so in ways that are congruent with their developmental and cultural heritage. Conceptualization of the instruments, methodological development, and validation processes grounded in Orem's Self-Care Deficit Theory of Nursing are described. The authors discuss the ways in which the gaps between nursing theory, research, and practice are narrowed when development of instruments to measure clinical nursing phenomena are grounded in nursing theory, validated through research and utilized in practice settings.
Wang, Hongyuan; Zhang, Wei; Dong, Aotuo
2012-11-10
A modeling and validation method of photometric characteristics of the space target was presented in order to track and identify different satellites effectively. The background radiation characteristics models of the target were built based on blackbody radiation theory. The geometry characteristics of the target were illustrated by the surface equations based on its body coordinate system. The material characteristics of the target surface were described by a bidirectional reflectance distribution function model, which considers the character of surface Gauss statistics and microscale self-shadow and is obtained by measurement and modeling in advance. The contributing surfaces of the target to observation system were determined by coordinate transformation according to the relative position of the space-based target, the background radiation sources, and the observation platform. Then a mathematical model on photometric characteristics of the space target was built by summing reflection components of all the surfaces. Photometric characteristics simulation of the space-based target was achieved according to its given geometrical dimensions, physical parameters, and orbital parameters. Experimental validation was made based on the scale model of the satellite. The calculated results fit well with the measured results, which indicates the modeling method of photometric characteristics of the space target is correct.
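The photometric model above sums BRDF reflection components over all contributing surfaces. As a much-simplified, hedged sketch of the same idea, the special case of a Lambertian (purely diffuse) sphere has a closed-form phase function; the function names and parameters are illustrative, not from the paper:

```python
import math

def lambert_phase(phase):
    """Diffuse-sphere phase function, normalised so that F(0) = 1;
    phase is the Sun-target-observer angle in radians."""
    return (math.sin(phase) + (math.pi - phase) * math.cos(phase)) / math.pi

def diffuse_sphere_flux_ratio(albedo, radius_m, range_m, phase):
    """Reflected-to-incident flux ratio for a Lambertian sphere of the
    given radius, seen at the given range and phase angle."""
    return (2.0 / 3.0) * albedo * (radius_m / range_m) ** 2 * lambert_phase(phase)
```

A faceted satellite model replaces the closed form with a sum of per-surface BRDF terms over the facets visible to both the Sun and the observer, as the abstract describes.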
A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.
Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R
2011-10-01
It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including the effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes a proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
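The one-dimensional stepping-stone model underlying the analysis can be illustrated with a minimal simulation. In this sketch (alternating initial alleles, nearest-neighbour resampling, illustrative parameters) local heterozygosity decays over time as spatiogenetic domains coarsen, qualitatively like the sectoring seen at colony edges:

```python
import random

def stepping_stone(L, generations, seed=0):
    """1-D stepping-stone sketch on a ring of L demes: each generation,
    every site copies an allele from itself or a nearest neighbour
    (genetic drift plus local migration). Returns the per-generation
    local heterozygosity (fraction of unlike neighbouring pairs)."""
    rng = random.Random(seed)
    sites = [i % 2 for i in range(L)]      # two alleles, initially alternating
    history = []
    for _ in range(generations):
        sites = [sites[(i + rng.choice((-1, 0, 1))) % L] for i in range(L)]
        h = sum(sites[i] != sites[(i + 1) % L] for i in range(L)) / L
        history.append(h)
    return history
```

The decay of this boundary density is one of the quantitative predictions that the colony experiments test against theory.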
Tan, Christine L.; Hassali, Mohamed A.; Saleem, Fahad; Shafie, Asrul A.; Aljadhey, Hisham; Gan, Vincent B.
2015-01-01
Objective: (i) To develop the Pharmacy Value-Added Services Questionnaire (PVASQ) using emerging themes generated from interviews. (ii) To establish the reliability and validity of the questionnaire instrument. Methods: Using an extended Theory of Planned Behavior as the theoretical model, face-to-face interviews generated salient beliefs about pharmacy value-added services. The PVASQ was constructed initially in English, incorporating important themes, and later translated into the Malay language with forward and backward translation. Intention (INT) to adopt pharmacy value-added services is predicted by attitudes (ATT), subjective norms (SN), perceived behavioral control (PBC), knowledge and expectations. Using a 7-point Likert-type scale and a dichotomous scale, test-retest reliability (N=25) was assessed by administering the questionnaire instrument twice, one week apart. Internal consistency was measured by Cronbach's alpha, and construct validity between the two administrations was assessed using the kappa statistic and the intraclass correlation coefficient (ICC). Confirmatory Factor Analysis, CFA (N=410), was conducted to assess the construct validity of the PVASQ. Results: The kappa coefficients indicate a moderate to almost perfect strength of agreement between test and retest. The ICC for all scales tested for intra-rater (test-retest) reliability was good. The overall Cronbach's alpha (N=25) is 0.912 and 0.908 for the two time points. The result of the CFA (N=410) showed most items loaded strongly and correctly onto the corresponding factors. Only one item was eliminated. Conclusions: This study is the first to develop and establish the reliability and validity of the Pharmacy Value-Added Services Questionnaire instrument using the Theory of Planned Behavior as the theoretical model. The translated Malay-language version of the PVASQ is reliable and valid for predicting Malaysian patients' intention to adopt pharmacy value-added services to collect partial medicine supply.
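The kappa statistic used here for test-retest agreement on dichotomous items can be computed as in the following sketch (Cohen's kappa for two administrations; the data are illustrative):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two ratings lists."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n            # observed agreement
    cats = set(r1) | set(r2)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)
```

By the conventional Landis-Koch guidelines, values of roughly 0.41-0.60 are read as moderate agreement and 0.81-1.00 as almost perfect, the range the authors report.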
PMID:26445622
Nahar, Vinayak K.; Sharma, Manoj; Catalano, Hannah Priest; Ickes, Melinda J.; Johnson, Paul; Ford, M. Allison
2016-01-01
Background: Most college students do not adequately participate in enough physical activity (PA) to attain health benefits. A theory-based approach is critical in developing effective interventions to promote PA. The purpose of this study was to examine the utility of the newly proposed multi-theory model (MTM) of health behavior change in predicting initiation and sustenance of PA among college students. Methods: Using a cross-sectional design, a valid and reliable survey was administered in October 2015 electronically to students enrolled at a large Southern US University. The internal consistency Cronbach alphas of the subscales were acceptable (0.65-0.92). Only those who did not engage in more than 150 minutes of moderate to vigorous intensity aerobic PA during the past week were included in this study. Results: Of the 495 respondents, 190 met the inclusion criteria of which 141 completed the survey. The majority of participants were females (72.3%) and Caucasians (70.9%). Findings of the confirmatory factor analysis (CFA) confirmed construct validity of subscales (initiation model: χ2 = 253.92 [df = 143], P < 0.001, CFI = 0.91, RMSEA = 0.07, SRMR = 0.07; sustenance model: χ2= 19.40 [df = 22], P < 0.001, CFI = 1.00, RMSEA = 0.00, SRMR = 0.03). Multivariate regression analysis showed that 26% of the variance in the PA initiation was explained by advantages outweighing disadvantages, behavioral confidence, work status, and changes in physical environment. Additionally, 29.7% of the variance in PA sustenance was explained by emotional transformation, practice for change, and changes in social environment. Conclusion: Based on this study’s findings, MTM appears to be a robust theoretical framework for predicting PA behavior change. Future research directions and development of suitable intervention strategies are discussed. PMID:27386419
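The reported RMSEA values can be reproduced from the chi-square statistics, degrees of freedom, and sample size with the standard point-estimate formula; assuming the CFA was run on the analysis sample of n = 141, the sketch below recovers the published 0.07 and 0.00:

```python
import math

def rmsea(chi2, df, n):
    """Point estimate of the Root Mean Square Error of Approximation:
    sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
```

For the initiation model (chi2 = 253.92, df = 143) this gives about 0.074, and for the sustenance model (chi2 = 19.40, df = 22, where chi2 < df) it gives exactly 0.0, consistent with the values in the abstract.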
Tagore, Somnath; De, Rajat K.
2013-01-01
Disease systems biology is an area of the life sciences that is not yet well understood. Analyzing infections and their spread through healthy metabolic networks is one focus area in this regard. We propose a theory based on the classical forest fire model for analyzing the path of infection spread in healthy metabolic pathways. The analogy is that when fire erupts in a forest, it spreads as the surrounding trees catch fire; similarly, an infection arising in the metabolites of a network spreads like a fire. We have constructed a simulation model to study infection in metabolic networks from its onset, through its spread, to ultimately combating it. For implementation, we use two approaches: first, quantitative strategies based on ordinary differential equations, and second, graph-theoretic properties. Furthermore, we use probabilistic scores to quantify the harm caused to the network, summarized by a 'critical value' that indicates whether the infection can be cured. We tested our simulation model on metabolic pathways involved in Type I Diabetes mellitus in Homo sapiens. To validate our results biologically, we used both local and global sensitivity analysis, and also examined the role of feedbacks in spreading infection in metabolic pathways. Moreover, information from the literature was used to validate the results. The metabolic network datasets were collected from the Kyoto Encyclopedia of Genes and Genomes (KEGG). PMID:24039701
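The forest-fire-style spread of infection through a metabolic network can be sketched as a simple stochastic process on a graph. This is an illustrative simplification of the paper's ODE- and graph-based implementations; the node names and spread probability are invented:

```python
import random

def spread_infection(adj, source, p_spread, steps, seed=1):
    """Forest-fire-style spread: each step, every infected metabolite
    infects each healthy neighbour with probability p_spread.
    adj: adjacency dict mapping node -> list of neighbouring nodes."""
    rng = random.Random(seed)
    infected = {source}
    for _ in range(steps):
        newly = set()
        for u in infected:
            for v in adj.get(u, ()):
                if v not in infected and rng.random() < p_spread:
                    newly.add(v)
        infected |= newly
    return infected
```

Sweeping p_spread and recording the final infected fraction is one way to locate a threshold akin to the paper's 'critical value', below which the infection stays contained.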
Validation of the Community Integration Questionnaire in the adult burn injury population.
Gerrard, Paul; Kazis, Lewis E; Ryan, Colleen M; Shie, Vivian L; Holavanahalli, Radha; Lee, Austin; Jette, Alan; Fauerbach, James A; Esselman, Peter; Herndon, David; Schneider, Jeffrey C
2015-11-01
With improved survival, long-term effects of burn injuries on quality of life, particularly community integration, are important outcomes. This study aims to assess the Community Integration Questionnaire's psychometric properties in the adult burn population. Data were obtained from a multicenter longitudinal data set of burn survivors. The psychometric properties of the Community Integration Questionnaire (n = 492) were examined. The questionnaire items were evaluated for clinical and substantive relevance; validation procedures were conducted on different samples of the population; construct validity was assessed using exploratory factor analysis; internal consistency reliability was examined using Cronbach's α statistics; and item response theory was applied to the final models. The CIQ-15 was reduced by two questions to form the CIQ-13, with a two-factor structure, interpreted as self/family care and social integration. Item response theory testing suggests that Factor 2 captures a wider range of community integration levels. Cronbach's α was 0.80 for Factor 1, 0.77 for Factor 2, and 0.79 for the test as a whole. The CIQ-13 demonstrates validity and reliability in the adult burn survivor population addressing issues of self/family care and social integration. This instrument is useful in future research of community reintegration outcomes in the burn population.
NASA Astrophysics Data System (ADS)
Daniele, Vito G.; Lombardi, Guido; Zich, Rodolfo S.
2017-12-01
Complex scattering problems often involve composite structures in which wedges and penetrable substrates interact in the near field. In this paper (Part 1), together with its companion paper (Part 2), we study the canonical problem constituted by a Perfectly Electrically Conducting (PEC) wedge lying on a grounded dielectric slab, with a comprehensive mathematical model based on the application of the Generalized Wiener-Hopf Technique (GWHT), with the help of equivalent circuital representations for linear homogeneous regions (angular and layered regions). The proposed procedure is valid in the general case, and the papers focus on E-polarization. The solution is obtained using analytical and semianalytical approaches that reduce the Wiener-Hopf factorization to integral equations. Several numerical test cases validate the proposed method. The scope of Part 1 is to present the method and its validation as applied to the problem. The companion paper, Part 2, focuses on the properties of the solution and presents physical and engineering insights such as Geometrical Theory of Diffraction (GTD)/Uniform Theory of Diffraction (UTD) coefficients, total far fields, modal fields, and the excitation of surface and leaky waves for different kinds of source. The structure is of interest in antenna technologies and electromagnetic compatibility (a tip on a substrate with guiding and antenna properties).
Making ecological models adequate
Getz, Wayne M.; Marshall, Charles R.; Carlson, Colin J.; Giuggioli, Luca; Ryan, Sadie J.; Romañach, Stephanie; Boettiger, Carl; Chamberlain, Samuel D.; Larsen, Laurel; D'Odorico, Paolo; O'Sullivan, David
2018-01-01
Critical evaluation of the adequacy of ecological models is urgently needed to enhance their utility in developing theory and enabling environmental managers and policymakers to make informed decisions. Poorly supported management can have detrimental, costly or irreversible impacts on the environment and society. Here, we examine common issues in ecological modelling and suggest criteria for improving modelling frameworks. An appropriate level of process description is crucial to constructing the best possible model, given the available data and understanding of ecological structures. Model details unsupported by data typically lead to over-parameterisation and poor model performance. Conversely, a lack of mechanistic detail may limit a model's ability to predict ecological systems' responses to management. Ecological studies that employ models should follow a set of model adequacy assessment protocols that include asking a series of critical questions regarding state and control variable selection, the determinacy of data, and the sensitivity and validity of analyses. We also need to improve model elaboration, refinement and coarse-graining procedures to better understand the relevancy and adequacy of our models and the role they play in advancing theory, improving hindcasting and forecasting, and enabling problem solving and management.
Molecular Treatment of Nano-Kaolinite Generations.
Táborosi, Attila; Szilagyi, Robert K; Zsirka, Balázs; Fónagy, Orsolya; Horváth, Erzsébet; Kristóf, János
2018-06-18
A procedure is developed for defining a compositionally and structurally realistic, atomic-scale description of exfoliated clay nanoparticles from the kaolinite family of phylloaluminosilicates. By use of coordination chemical principles, chemical environments within a nanoparticle can be separated into inner, outer, and peripheral spheres. The edges of the molecular models of nanoparticles were protonated in a validated manner to achieve charge neutrality. Structural optimizations using semiempirical methods (NDDO Hamiltonians and DFTB formalism) and ab initio density functionals with a saturated basis set revealed previously overlooked molecular origins of morphological changes as a result of exfoliation. While the use of semiempirical methods is desirable for the treatment of nanoparticles composed of tens of thousands of atoms, the structural accuracy is rather modest in comparison to DFT methods. We report a comparative survey of our infrared data for untreated crystalline and various exfoliated states of kaolinite and halloysite. Given the limited availability of experimental techniques for providing direct structural information about nano-kaolinite, the vibrational spectra can be considered as an essential tool for validating structural models. The comparison of experimental and calculated stretching and bending frequencies further justified the use of the preferred level of theory. Overall, an optimal molecular model of the defect-free, ideal nano-kaolinite can be composed with respect to stationary structure and curvature of the potential energy surface using the PW91/SVP level of theory with empirical dispersion correction (PW91+D) and polarizable continuum solvation model (PCM) without the need for a scaled quantum chemical force field. This validated theoretical approach is essential in order to follow the formation of exfoliated clays and their surface reactivity that is experimentally unattainable.
On Maximizing Item Information and Matching Difficulty with Ability.
ERIC Educational Resources Information Center
Bickel, Peter; Buyske, Steven; Chang, Huahua; Ying, Zhiliang
2001-01-01
Examined the assumption that matching difficulty levels of test items with an examinee's ability makes a test more efficient and challenged this assumption through a class of one-parameter item response theory models. Found the validity of the fundamental assumption to be closely related to the van Zwet tail ordering of symmetric distributions (W.…
A Preliminary Study for a New Model of Sense of Community
ERIC Educational Resources Information Center
Tartaglia, Stefano
2006-01-01
Although Sense of Community (SOC) is usually defined as a multidimensional construct, most SOC scales are unidimensional. To reduce the split between theory and empirical research, the present work identifies a multifactor structure for the Italian Sense of Community Scale (ISCS) that has already been validated as a unitary index of SOC. This…
Application of a Method of Estimating DIF for Polytomous Test Items.
ERIC Educational Resources Information Center
Camilli, Gregory; Congdon, Peter
1999-01-01
Demonstrates a method for studying differential item functioning (DIF) that can be used with dichotomous or polytomous items and that is valid for data that follow a partial credit Item Response Theory model. A simulation study shows that positively biased Type I error rates are in accord with results from previous studies. (SLD)
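As a brief sketch of the measurement model involved: under the partial credit model, the probability of each score category on a polytomous item is a function of examinee ability and the item's step difficulties. The step-difficulty values below are illustrative, not estimates from the study.

```python
import math

def pcm_probs(theta, deltas):
    """Category response probabilities under the partial credit model.

    theta:  examinee ability
    deltas: step difficulties delta_1..delta_m (delta_0 = 0 implicitly)
    Returns a list of m+1 probabilities, one per score category.
    """
    # cumulative sums of (theta - delta_j), with the 0th term fixed at 0
    cum = [0.0]
    for d in deltas:
        cum.append(cum[-1] + (theta - d))
    expcum = [math.exp(c) for c in cum]
    total = sum(expcum)
    return [e / total for e in expcum]
```

DIF analyses of the kind described compare such category probabilities across groups matched on ability.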
ERIC Educational Resources Information Center
Chang, Hsin Hsin; Fu, Chen Su; Huang, Ching Ying
2017-01-01
Adopting self-determination theory and the perceived characteristics of innovation as the theoretical background, this study investigates the school teachers' willingness to adopt and reuse an e-learning system. Three hundred and eighty-eight valid questionnaires were collected for analysis using structural equation modelling. The results…
ERIC Educational Resources Information Center
Haigh, Emily A. P.; Moore, Michael T.; Kashdan, Todd B.; Fresco, David M.
2011-01-01
Langer's theory of mindfulness proposes that a mindful person seeks out and produces novelty, is attentive to context, and is flexible in thought and behavior. In three independent studies, the factor structure of the Langer Mindfulness/Mindlessness Scale was examined. Confirmatory factor analysis failed to replicate the four-factor model and a…
ERIC Educational Resources Information Center
Ghanizadeh, Afsaneh; Ghonsooly, Behzad
2015-01-01
Causal attributions constitute one of the most universal forms of analyzing reality, since they fulfill basic functions in motivation for action. As a theory of causal explanations for success and failure, attribution research has found a natural context in the academic domain. Despite this, it appears that teacher attribution, in particular…
Can the Simple View Deal with the Complexities of Reading?
ERIC Educational Resources Information Center
Kirby, John R.; Savage, Robert S.
2008-01-01
We review the Simple View of Reading (SVR) model and examine its nature, applicability and validity. We describe the SVR as an abstract framework for understanding the relationship between global linguistic comprehension and word-reading abilities in reading comprehension (RC). We argue that the SVR is neither a full theory of reading nor a…
ERIC Educational Resources Information Center
Moran, Mark; Hawkes, Mark; El Gayar, Omar
2010-01-01
Many educational institutions have implemented ubiquitous or required laptop, notebook, or tablet personal computing programs for their students. Yet, limited evidence exists to validate integration and acceptance of the technology among student populations. This research examines student acceptance of mobile computing devices using a modification…
NASA Astrophysics Data System (ADS)
Lim, Yeunhwan; Holt, Jeremy W.
2017-06-01
We investigate the structure of neutron star crusts, including the crust-core boundary, based on new Skyrme mean field models constrained by the bulk-matter equation of state from chiral effective field theory and the ground-state energies of doubly-magic nuclei. Nuclear pasta phases are studied using both the liquid drop model and the Thomas-Fermi approximation. We compare the energy per nucleon for each geometry (spherical nuclei, cylindrical nuclei, nuclear slabs, cylindrical holes, and spherical holes) to obtain the ground state phase as a function of density. We find that the size of the Wigner-Seitz cell depends strongly on the model parameters, especially the coefficients of the density gradient interaction terms. We also employ the thermodynamic instability method to check the validity of the numerical solutions based on energy comparisons.
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of the engineering design process. Critical design decisions are routinely made based on simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design process. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors, and the uncertainties contained in the mathematical models, so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
On holographic Rényi entropy in some modified theories of gravity
NASA Astrophysics Data System (ADS)
Dey, Anshuman; Roy, Pratim; Sarkar, Tapobrata
2018-04-01
We perform a detailed analysis of holographic entanglement Rényi entropy in some modified theories of gravity with four dimensional conformal field theory duals. First, we construct perturbative black hole solutions in a recently proposed model of Einsteinian cubic gravity in five dimensions, and compute the Rényi entropy as well as the scaling dimension of the twist operators in the dual field theory. Consistency of these results is verified from the AdS/CFT correspondence, via a corresponding computation of the Weyl anomaly on the gravity side. Similar analyses are then carried out for three other examples of modified gravity in five dimensions that include a chemical potential, namely Born-Infeld gravity, charged quasi-topological gravity and a class of Weyl corrected gravity theories with a gauge field, with the last example being treated perturbatively. Some interesting bounds on the dual conformal field theory parameters in quasi-topological gravity are pointed out. We also provide arguments on the validity of our perturbative analysis, whenever applicable.
Bao, Junwei Lucas; Zhang, Xin; Truhlar, Donald G
2016-01-01
Bond dissociation is a fundamental chemical reaction, and the first principles modeling of the kinetics of dissociation reactions with a monotonically increasing potential energy along the dissociation coordinate presents a challenge not only for modern electronic structure methods but also for kinetics theory. In this work, we use multifaceted variable-reaction-coordinate variational transition-state theory (VRC-VTST) to compute the high-pressure limit dissociation rate constant of tetrafluoroethylene (C2F4), in which the potential energies are computed by direct dynamics with the M08-HX exchange correlation functional. To treat the pressure dependence of the unimolecular rate constants, we use the recently developed system-specific quantum Rice–Ramsperger–Kassel theory. The calculations are carried out by direct dynamics using an exchange correlation functional validated against calculations that go beyond coupled-cluster theory with single, double, and triple excitations. Our computed dissociation rate constants agree well with the recent experimental measurements. PMID:27834727
NASA Astrophysics Data System (ADS)
Lukeš, Petr; Rautiainen, Miina; Stenberg, Pauline; Malenovský, Zbyněk
2011-08-01
The spectral invariants theory presents an alternative approach for modeling canopy scattering in remote sensing applications. The theory is particularly appealing in the case of coniferous forests, which typically display grouped structures and require computationally intensive calculation to account for the geometric arrangement of their canopies. However, the validity of the spectral invariants theory should be tested with empirical data sets from different vegetation types. In this paper, we evaluate a method to retrieve two canopy spectral invariants, the recollision probability and the escape factor, for a coniferous forest using imaging spectroscopy data from multiangular CHRIS PROBA and NADIR-view AISA Eagle sensors. Our results indicated that in coniferous canopies the spectral invariants theory performs well in the near infrared spectral range. In the visible range, on the other hand, the spectral invariants theory may not be useful. Secondly, our study suggested that retrieval of the escape factor could be used as a new method to describe the BRDF of a canopy.
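For context, retrieval of the recollision probability typically rests on the spectral-invariant relation W(λ) = ω(λ)(1 − p)/(1 − pω(λ)) between leaf single-scattering albedo ω and canopy scattering W, which rearranges to the straight line W/ω = pW + (1 − p). A minimal estimation sketch; the albedo values in the usage test are synthetic, not CHRIS PROBA or AISA Eagle data.

```python
def estimate_recollision_p(omega, W):
    """Estimate the recollision probability p from paired samples of
    leaf single-scattering albedo omega(lambda) and canopy scattering
    W(lambda), via the spectral-invariant relation
    W = omega*(1-p) / (1 - p*omega), i.e. W/omega = p*W + (1-p):
    p is the least-squares slope of W/omega regressed on W.
    """
    xs = list(W)
    ys = [w / o for w, o in zip(W, omega)]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

With noise-free synthetic data generated from the relation itself, the regression recovers p exactly; with real spectra the fit quality itself indicates where the theory holds (e.g., near infrared) and where it breaks down (visible range).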
NASA Technical Reports Server (NTRS)
Manro, M. E.; Manning, K. J. R.; Hallstaff, T. H.; Rogers, J. T.
1975-01-01
A wind tunnel test of an arrow-wing-body configuration consisting of flat and twisted wings, as well as a variety of leading- and trailing-edge control surface deflections, was conducted at Mach numbers from 0.4 to 1.1 to provide an experimental pressure data base for comparison with theoretical methods. Theory-to-experiment comparisons of detailed pressure distributions were made using current state-of-the-art attached and separated flow methods. The purpose of these comparisons was to delineate conditions under which these theories are valid for both flat and twisted wings and to explore the use of empirical methods to correct the theoretical methods where theory is deficient.
NASA Astrophysics Data System (ADS)
Taousser, Fatima; Defoort, Michael; Djemai, Mohamed
2016-01-01
This paper investigates the consensus problem for a linear multi-agent system with fixed communication topology in the presence of intermittent communication, using time-scale theory. Since each agent can only obtain relative local information intermittently, the proposed consensus algorithm is based on a discontinuous local interaction rule. The interaction among agents happens on a disjoint set of continuous-time intervals. The closed-loop multi-agent system can be represented using mixed linear continuous-time and linear discrete-time models due to intermittent information transmissions. Time-scale theory provides a powerful tool for combining the continuous-time and discrete-time cases and studying the consensus protocol under a unified framework. Using this theory, some conditions are derived to achieve exponential consensus under intermittent information transmissions. Simulations are performed to validate the theoretical results.
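A minimal numerical illustration of the idea (two agents and illustrative window lengths, not the paper's actual protocol): the agents interact only during "on" windows, yet the disagreement still decays exponentially across cycles, as the time-scale analysis predicts.

```python
def intermittent_consensus(x0, t_on=0.5, cycles=5, dt=1e-3):
    """Two-agent consensus with communication restricted to 'on'
    windows, in the spirit of a mixed continuous/discrete (time-scale)
    model. Off-windows leave the states unchanged, so only the
    on-windows are integrated. Returns |x1 - x2| at the end of each
    cycle; exponential decay of this gap indicates consensus.
    """
    x1, x2 = x0
    gaps = []
    for _ in range(cycles):
        t = 0.0
        while t < t_on:            # continuous-time local interaction
            d = x2 - x1
            x1 += dt * d
            x2 -= dt * d
            t += dt
        gaps.append(abs(x1 - x2))  # sampled after the silent interval
    return gaps
```

Each on-window shrinks the disagreement by roughly a factor exp(-2 * t_on), so five cycles with t_on = 0.5 reduce an initial gap of 1 to about exp(-5).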
The theory of planned behaviour: reactions and reflections.
Ajzen, Icek
2011-09-01
The seven articles in this issue, and the accompanying meta-analysis in Health Psychology Review [McEachan, R.R.C., Conner, M., Taylor, N., & Lawton, R.J. (2011). Prospective prediction of health-related behaviors with the theory of planned behavior: A meta-analysis. Health Psychology Review, 5, 97-144], illustrate the wide application of the theory of planned behaviour [Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211] in the health domain. In this editorial, Ajzen reflects on some of the issues raised by the different authors. Among the topics addressed are the nature of intentions and the limits of predictive validity; rationality, affect and emotions; past behaviour and habit; the prototype/willingness model; and the role of such background factors as the big five personality traits and social comparison tendency.
Mean-field theory of differential rotation in density stratified turbulent convection
NASA Astrophysics Data System (ADS)
Rogachevskii, I.
2018-04-01
A mean-field theory of differential rotation in density stratified turbulent convection has been developed. This theory is based on the combined effects of the turbulent heat flux and the anisotropy of turbulent convection on the Reynolds stress. A coupled system of dynamical budget equations, consisting of the equations for the Reynolds stress, the entropy fluctuations and the turbulent heat flux, has been solved. To close this system of equations, the spectral approach, which is valid for large Reynolds and Péclet numbers, has been applied. The adopted model of the background turbulent convection takes into account an increase of the turbulence anisotropy and a decrease of the turbulent correlation time with the rotation rate. This theory yields a radial profile of the differential rotation that is in agreement with that of the solar differential rotation.
Construct Validation Theory Applied to the Study of Personality Dysfunction
Zapolski, Tamika C. B.; Guller, Leila; Smith, Gregory T.
2013-01-01
The authors review theory validation and construct validation principles as related to the study of personality dysfunction. Historically, personality disorders have been understood to be syndromes of heterogeneous symptoms. The authors argue that the syndrome approach to description results in diagnoses of unclear meaning and constrained validity. The alternative approach of describing personality dysfunction in terms of homogeneous dimensions of functioning avoids the problems of the syndromal approach and has been shown to provide more valid description and diagnosis. The authors further argue that description based on homogeneous dimensions of personality function/dysfunction is more useful, because it provides direct connections to validated treatments. PMID:22321263
NASA Astrophysics Data System (ADS)
Fitkov-Norris, Elena; Yeghiazarian, Ara
2016-11-01
The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages as well as the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular systems dynamics models that can be used in social science research.
Microstructural Modeling of Brittle Materials for Enhanced Performance and Reliability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teague, Melissa Christine; Rodgers, Theron
Brittle failure is often influenced by difficult-to-measure and variable microstructure-scale stresses. Recent advances in photoluminescence spectroscopy (PLS), including improved confocal laser measurement and rapid spectroscopic data collection, have established the potential to map stresses with microscale spatial resolution (<2 microns). Advanced PLS was successfully used to investigate both residual and externally applied stresses in polycrystalline alumina at the microstructure scale. The measured average stresses matched those estimated from beam theory to within one standard deviation, validating the technique. Modeling the residual stresses within the microstructure produced general agreement with the experimentally measured results. Microstructure-scale modeling is primed to take advantage of advanced PLS to enable its refinement and validation, eventually enabling microstructure modeling to become a predictive tool for brittle materials.
Item Response Theory Modeling of the Philadelphia Naming Test.
Fergadiotis, Gerasimos; Kellough, Stacey; Hula, William D
2015-06-01
In this study, we investigated the fit of the Philadelphia Naming Test (PNT; Roach, Schwartz, Martin, Grewal, & Brecher, 1996) to an item-response-theory measurement model, estimated the precision of the resulting scores and item parameters, and provided a theoretical rationale for the interpretation of PNT overall scores by relating explanatory variables to item difficulty. This article describes the statistical model underlying the computer adaptive PNT presented in a companion article (Hula, Kellough, & Fergadiotis, 2015). Using archival data, we evaluated the fit of the PNT to 1- and 2-parameter logistic models and examined the precision of the resulting parameter estimates. We regressed the item difficulty estimates on three predictor variables: word length, age of acquisition, and contextual diversity. The 2-parameter logistic model demonstrated marginally better fit, but the fit of the 1-parameter logistic model was adequate. Precision was excellent for both person ability and item difficulty estimates. Word length, age of acquisition, and contextual diversity all independently contributed to variance in item difficulty. Item-response-theory methods can be productively used to analyze and quantify anomia severity in aphasia. Regression of item difficulty on lexical variables supported the validity of the PNT and interpretation of anomia severity scores in the context of current word-finding models.
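The 1- and 2-parameter logistic models compared above share the same item response function; a self-contained sketch (the parameter values in the test are illustrative, not PNT estimates):

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct naming
    response given ability theta, item discrimination a, and item
    difficulty b. Fixing a = 1 for every item reduces this to the
    1PL (Rasch-type) model found adequate for the PNT.
    """
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

In this framework, regressing the estimated b values on word length, age of acquisition, and contextual diversity is what links item difficulty back to lexical theory.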
NASA Astrophysics Data System (ADS)
Starr, Francis; Douglas, Jack; Sastry, Srikanth
2013-03-01
We examine measures of dynamical heterogeneity for a bead-spring polymer melt and test how these scales compare with the scales hypothesized by the Adam and Gibbs (AG) and random first-order transition (RFOT) theories. We show that the time scale of the high-mobility clusters and strings is associated with a diffusive time scale, while the low-mobility particles' time scale relates to a structural relaxation time. The difference of the characteristic times naturally explains the decoupling of diffusion and structural relaxation time scales. We examine the appropriateness of identifying the size scales of mobile particle clusters or strings with the size of cooperatively rearranging regions (CRR) in the AG and RFOT theories. We find that the string size appears to be the most consistent measure of CRR for both the AG and RFOT models. Identifying strings or clusters with the "mosaic" length of the RFOT model relaxes the conventional assumption that the "entropic droplets" are compact. We also confirm the validity of the entropy formulation of the AG theory, constraining the exponent values of the RFOT theory. This constraint, together with the analysis of size scales, enables us to estimate the characteristic exponents of RFOT.
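The entropy formulation of the AG theory referred to above relates the structural relaxation time to the configurational entropy; a one-line sketch with illustrative constants (tau0 and A are placeholders, not fitted values from the study):

```python
import math

def adam_gibbs_tau(T, s_conf, tau0=1.0, A=1.0):
    """Adam-Gibbs relation: tau = tau0 * exp(A / (T * s_conf)).
    Relaxation time grows as the configurational entropy s_conf
    shrinks on cooling; tau0 and A are illustrative constants.
    """
    return tau0 * math.exp(A / (T * s_conf))
```

Testing this functional form against simulation data is what constrains the exponent values of the RFOT theory in the analysis described.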
Scarinci, Isabel C; Bandura, Lisa; Hidalgo, Bertha; Cherrington, Andrea
2012-01-01
The development of efficacious theory-based, culturally relevant interventions to promote cervical cancer prevention among underserved populations is crucial to the elimination of cancer disparities. The purpose of this article is to describe the development of a theory-based, culturally relevant intervention focusing on primary (sexual risk reduction) and secondary (Pap smear) prevention of cervical cancer among Latina immigrants using intervention mapping (IM). The PEN-3 and Health Belief Model provided theoretical guidance for the intervention development and implementation. IM provides a logical five-step framework in intervention development: delineating proximal program objectives, selecting theory-based intervention methods and strategies, developing a program plan, planning for adoption in implementation, and creating evaluation plans and instruments. We first conducted an extensive literature review and qualitatively examined the sociocultural factors associated with primary and secondary prevention of cervical cancer. We then proceeded to quantitatively validate the qualitative findings, which led to development matrices linking the theoretical constructs with intervention objectives and strategies as well as evaluation. IM was a helpful tool in the development of a theory-based, culturally relevant intervention addressing primary and secondary prevention among Latina immigrants.
PDF-based heterogeneous multiscale filtration model.
Gong, Jian; Rutland, Christopher J
2015-04-21
Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
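The core HMF idea, overall filtration efficiency as a pore-size-PDF-weighted sum of single-collector contributions, can be sketched as a Monte Carlo average. The lognormal parameters and the collector-efficiency function used in the test are assumptions for illustration, not values from the paper.

```python
import math
import random

def overall_efficiency(eta, mu=math.log(15e-6), sigma=0.5, n=20000, seed=0):
    """PDF-weighted filtration efficiency, Monte Carlo style.

    eta(d) -- assumed single-collector efficiency for pore size d.
    Pore sizes are drawn from a lognormal PDF with parameters
    (mu, sigma); both are illustrative assumptions. As sigma -> 0
    this collapses to a classic mean-collector-size model.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d = rng.lognormvariate(mu, sigma)
        total += eta(d)
    return total / n
```

This mirrors the model's sensitivity result: with vanishing pore size variance, the average over the PDF reduces to evaluating eta at the mean collector size.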
Development and validation of a patient-reported outcome measure for stroke patients.
Luo, Yanhong; Yang, Jie; Zhang, Yanbo
2015-05-08
Family support and patient satisfaction with treatment are crucial for aiding recovery from stroke. However, current validated stroke-specific questionnaires may not adequately capture the impact of these two variables on patients undergoing clinical trials of new drugs. Therefore, the aim of this study was to develop and evaluate a new stroke patient-reported outcome measure (Stroke-PROM) instrument for capturing more comprehensive effects of stroke on patients participating in clinical trials of new drugs. A conceptual framework and a pool of items for the preliminary Stroke-PROM were generated by consulting the relevant literature and other questionnaires created in China and other countries, and by interviewing 20 patients and 4 experts to ensure that all germane parameters were included. During the first item-selection phase, classical test theory and item response theory were applied to an initial scale completed by 133 patients with stroke. During the item re-evaluation phase, classical test theory and item response theory were used again, this time with 475 patients with stroke and 104 healthy participants. During the scale assessment phase, confirmatory factor analysis was applied to the final scale of the Stroke-PROM using the same study population as in the second item-selection phase. Reliability, validity, responsiveness and feasibility of the final scale were tested. The final scale of the Stroke-PROM contained 46 items describing four domains (physiology, psychology, society and treatment). These four domains were subdivided into 10 subdomains. Cronbach's α coefficients for the four domains ranged from 0.861 to 0.908. Confirmatory factor analysis supported the validity of the final scale, and the model fit index satisfied the criterion. Differences in the Stroke-PROM mean scores were significant between patients with stroke and healthy participants in nine subdomains (P < 0.001), indicating that the scale showed good responsiveness.
The Stroke-PROM is a patient-reported outcome multidimensional questionnaire developed especially for clinical trials of new drugs and is focused on issues of family support and patient satisfaction with treatment. Extensive data analyses supported the validity, reliability and responsiveness of the Stroke-PROM.
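Cronbach's α, reported per domain above, follows directly from classical test theory: the ratio of summed item variances to the variance of the total score, rescaled by the number of items. A self-contained sketch:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for a scale.

    scores: list of respondents, each a list of k item scores.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)),
    using sample (n-1) variances throughout.
    """
    k = len(scores[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Perfectly parallel items give α = 1; values in the 0.861-0.908 range reported for the Stroke-PROM domains indicate good internal consistency.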
An Integrative Behavioral Model of Information Security Policy Compliance
Kim, Sang Hoon; Yang, Kyung Hoon; Park, Sunyoung
2014-01-01
The authors identified the behavioral factors that influence organization members' compliance with the information security policy in organizations, on the basis of neutralization theory, the theory of planned behavior, and protection motivation theory. Based on the theory of planned behavior, members' attitudes towards compliance, as well as normative belief and self-efficacy, were believed to determine the intention to comply with the information security policy. Neutralization theory, a prominent theory in criminology, could be expected to provide an explanation for information system security policy violations. Based on protection motivation theory, it was inferred that the expected efficacy could have an impact on intentions to comply. By this reasoning, an integrative behavioral model and eight hypotheses were derived. Data were collected by survey; 194 of 207 questionnaires were usable. The causal model was tested with PLS. The reliability, validity, and model fit were found to be statistically significant. The results of the hypothesis tests showed that seven of the eight hypotheses were supported. The theoretical implications of this study are as follows: (1) the study is expected to serve as a baseline for future research on organization members' compliance with the information security policy, (2) the study attempted an interdisciplinary approach by combining psychology and information system security research, and (3) the study suggested concrete operational definitions of influencing factors for information security policy compliance through a comprehensive theoretical review. The study also has practical implications. First, it can provide a guideline to support the successful establishment and implementation of information system security policies in organizations. 
Second, it demonstrates the need to emphasize education and training programs that suppress members' neutralization of intentions to violate the information security policy. PMID:24971373
Bringing loyalty to e-Health: theory validation using three internet-delivered interventions.
Crutzen, Rik; Cyr, Dianne; de Vries, Nanne K
2011-09-24
Internet-delivered interventions can effectively change health risk behaviors, but the actual use of these interventions by the target group once they access the website is often very low (high attrition, low adherence). Therefore, it is relevant and necessary to focus on factors related to use of an intervention once people arrive at the intervention website. We focused on user perceptions resulting in e-loyalty (ie, intention to visit an intervention again and to recommend it to others). A background theory for e-loyalty, however, is still lacking for Internet-delivered interventions. The objective of our study was to propose and validate a conceptual model regarding user perceptions and e-loyalty within the field of eHealth. We presented at random 3 primary prevention interventions aimed at the general public and, subsequently, participants completed validated measures regarding user perceptions and e-loyalty. Time on each intervention website was assessed by means of server registrations. Of the 592 people who were invited to participate, 397 initiated the study (response rate: 67%) and 351 (48% female, mean age 43 years, varying in educational level) finished the study (retention rate: 88%). Internal consistency of all measures was high (Cronbach alpha > .87). The findings demonstrate that the user perceptions regarding effectiveness (beta(range) .21-.41) and enjoyment (beta(range) .14-.24) both had a positive effect on e-loyalty, which was mediated by active trust (beta(range) .27-.60). User perceptions and e-loyalty had low correlations with time on the website (r(range) .04-.18). The consistent pattern of findings speaks in favor of their robustness and contributes to theory validation regarding e-loyalty. The importance of a theory-driven solution to a practice-based problem (ie, low actual use) needs to be stressed in view of the importance of the Internet in terms of intervention development. 
Longitudinal studies are needed to investigate whether people will actually revisit intervention websites and whether this leads to changes in health risk behaviors.
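The mediation result above (user perceptions acting on e-loyalty through active trust) can be illustrated with a minimal Baron-and-Kenny-style sketch. This is not the study's analysis or data: the variable names and the synthetic scores are assumptions for illustration, and the indirect effect is estimated as the product of two simple OLS slopes.

```python
# Minimal sketch of a mediated effect (X -> M -> Y), illustrative only.
# Variables (effectiveness, trust, loyalty) and data are assumed, not the
# study's; the paper used validated multi-item measures and path estimates.

def slope(x, y):
    """Ordinary least-squares slope of y on x (single predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Synthetic scores: trust tracks perceived effectiveness, loyalty tracks trust.
effectiveness = [1, 2, 3, 4, 5, 6]
trust = [1.2, 1.9, 3.1, 4.2, 4.8, 6.1]
loyalty = [1.0, 2.2, 2.9, 4.1, 5.2, 5.9]

a = slope(effectiveness, trust)   # path X -> M
b = slope(trust, loyalty)         # path M -> Y (ignoring X for brevity)
indirect = a * b                  # indirect (mediated) effect estimate
print(round(indirect, 2))
```

A positive product `a * b` is the signature of the mediated pathway the abstract reports; a full analysis would also estimate the direct path controlling for the mediator.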
Experimental and analytical investigation of a modified ring cusp NSTAR engine
NASA Technical Reports Server (NTRS)
Sengupta, Anita
2005-01-01
A series of experimental measurements on a modified laboratory NSTAR engine were used to validate a zero-dimensional analytical discharge performance model of a ring cusp ion thruster. The model predicts the discharge performance of a ring cusp NSTAR thruster as a function of the magnetic field configuration, thruster geometry, and throttle level. Analytical formalisms for electron and ion confinement are used to predict the ionization efficiency for a given thruster design. Explicit determination of discharge loss and volume-averaged plasma parameters are also obtained. The model was used to predict the performance of the nominal and modified three- and four-ring cusp 30-cm ion thruster configurations operating at the full power (2.3 kW) NSTAR throttle level. Experimental measurements of the modified engine configuration discharge loss compare well with the predicted value for propellant utilizations from 80 to 95%. The theory, as validated by experiment, indicates that increasing the magnetic field strength of the minimum closed contour reduces Maxwellian electron diffusion and electrostatically confines the ion population, reducing subsequent loss to the anode wall. The theory also indicates that increasing the cusp strength and minimizing the cusp area improves primary electron confinement, increasing the probability of an ionization collision prior to loss at the cusp.
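The figures of merit discussed above (discharge loss, propellant utilization) can be sketched with back-of-envelope formulas. This is not the paper's zero-dimensional model; the operating numbers below are illustrative assumptions, chosen to be roughly in the range of full-power NSTAR operation.

```python
# Hedged sketch: ion thruster discharge-chamber figures of merit.
# Not the paper's analytical model; all operating values are assumed.

E_CHARGE = 1.602e-19   # electron charge, C
M_XENON = 2.18e-25     # xenon atom mass, kg

def equivalent_current(mass_flow):
    """Neutral propellant flow (kg/s) expressed as an equivalent current (A)."""
    return mass_flow / M_XENON * E_CHARGE

def propellant_utilization(beam_current, mass_flow):
    """Fraction of injected propellant extracted as beam ions."""
    return beam_current / equivalent_current(mass_flow)

def discharge_loss(discharge_voltage, discharge_current, beam_current):
    """Discharge power invested per beam ampere, in eV/ion (equivalently W/A)."""
    return discharge_voltage * discharge_current / beam_current

I_beam = 1.76          # beam current, A (assumed)
m_dot = 2.7e-6         # xenon flow, kg/s (assumed)
eta_ud = propellant_utilization(I_beam, m_dot)
loss = discharge_loss(25.0, 13.0, I_beam)   # assumed 25 V, 13 A discharge
print(round(eta_ud, 2), round(loss, 1))
```

With these assumed values the sketch lands near 89% utilization and roughly 185 eV/ion, i.e. inside the 80-95% utilization band the abstract reports comparisons over.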
Development and evaluation of social cognitive measures related to adolescent physical activity.
Dewar, Deborah L; Lubans, David Revalds; Morgan, Philip James; Plotnikoff, Ronald C
2013-05-01
This study aimed to develop and evaluate the construct validity and reliability of modernized social cognitive measures relating to physical activity behaviors in adolescents. An instrument was developed based on constructs from Bandura's Social Cognitive Theory and included the following scales: self-efficacy, situation (perceived physical environment), social support, behavioral strategies, and outcome expectations and expectancies. The questionnaire was administered in a sample of 171 adolescents (age = 13.6 ± 1.2 years, females = 61%). Confirmatory factor analysis was employed to examine model-fit for each scale using multiple indices, including chi-square index, comparative-fit index (CFI), goodness-of-fit index (GFI), and the root mean square error of approximation (RMSEA). Reliability properties were also examined (ICC and Cronbach's alpha). Each scale represented a statistically sound measure: fit indices indicated each model to be an adequate-to-exact fit to the data; internal consistency was acceptable to good (α = 0.63-0.79); rank order repeatability was strong (ICC = 0.82-0.91). Results support the validity and reliability of social cognitive scales relating to physical activity among adolescents. As such, the developed scales have utility for the identification of potential social cognitive correlates of youth physical activity, mediators of physical activity behavior changes and the testing of theoretical models based on Social Cognitive Theory.
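The internal-consistency statistic reported above, Cronbach's alpha, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch follows; the 4-item response matrix is made up for illustration and is not the study's data.

```python
# Hedged sketch: Cronbach's alpha for a k-item scale.
# Rows = respondents, columns = items; the data below are invented.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    n = len(xs)
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / (n - 1)

def cronbach_alpha(rows):
    k = len(rows[0])                       # number of items
    items = list(zip(*rows))               # column-wise item scores
    totals = [sum(r) for r in rows]        # each respondent's scale total
    item_var = sum(variance(col) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

responses = [
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [3, 3, 3, 3],
    [5, 4, 5, 5],
]
print(round(cronbach_alpha(responses), 2))
```

Values in the 0.63-0.79 range reported by the study are conventionally read as acceptable to good for short scales.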
Inhibitory mechanism of the matching heuristic in syllogistic reasoning.
Tse, Ping Ping; Moreno Ríos, Sergio; García-Madruga, Juan Antonio; Bajo Molina, María Teresa
2014-11-01
A number of heuristic-based hypotheses have been proposed to explain how people solve syllogisms with automatic processes. In particular, the matching heuristic employs the congruency of the quantifiers in a syllogism—by matching the quantifier of the conclusion with those of the two premises. When the heuristic leads to an invalid conclusion, successful solving of these conflict problems requires the inhibition of automatic heuristic processing. Accordingly, if the automatic processing were based on processing the set of quantifiers, no semantic contents would be inhibited. The mental model theory, however, suggests that people reason using mental models, which always involves semantic processing. Therefore, whatever inhibition occurs in the processing implies the inhibition of the semantic contents. We manipulated the validity of the syllogism and the congruency of the quantifier of its conclusion with those of the two premises according to the matching heuristic. A subsequent lexical decision task (LDT) with related words in the conclusion was used to test any inhibition of the semantic contents after each syllogistic evaluation trial. In the LDT, the facilitation effect of semantic priming diminished after correctly solved conflict syllogisms (match-invalid or mismatch-valid), but was intact after no-conflict syllogisms. The results suggest the involvement of an inhibitory mechanism of semantic contents in syllogistic reasoning when there is a conflict between the output of the syntactic heuristic and actual validity. Our results do not support a uniquely syntactic process of syllogistic reasoning but fit with the predictions based on mental model theory. Copyright © 2014 Elsevier B.V. All rights reserved.
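The matching heuristic described above can be sketched as a small decision rule. One common formulation (due to Wetherick and Gilhooly) endorses a conclusion whose quantifier matches the most conservative quantifier in the premises; treating the heuristic this way, and the conservativeness ordering used, are assumptions for illustration rather than the paper's exact operationalization.

```python
# Hedged sketch of the matching heuristic: endorse a conclusion when its
# quantifier matches the most conservative premise quantifier. The ordering
# below is an assumed Wetherick-and-Gilhooly-style ranking.

CONSERVATIVENESS = {"no": 3, "some...not": 2, "some": 1, "all": 0}

def matching_heuristic(premise_quantifiers, conclusion_quantifier):
    """Return True if the heuristic would endorse the conclusion."""
    most_conservative = max(premise_quantifiers, key=CONSERVATIVENESS.get)
    return conclusion_quantifier == most_conservative

# A conflict syllogism arises when this endorsement disagrees with actual
# logical validity (match-invalid or mismatch-valid trials in the study).
print(matching_heuristic(["all", "some"], "some"))
```

Correctly rejecting a matching-but-invalid conclusion is exactly the case where, on the study's account, the heuristic output (and, per mental model theory, its semantic content) must be inhibited.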
External Validity: From Do-calculus to Transportability Across Populations
Pearl, Judea; Bareinboim, Elias
2012-05-01
…experiments with a different version. Their analysis is a special case of the theory developed in this paper (Petersen, 2011). A related application is…described below. …expected effect of a given intervention. Auxiliary to C, a causal model should also yield an estimand Qi(P…and used to confirm or disconfirm the model against the data. The structure of this inferential exercise is shown schematically in Figure 1.