ERIC Educational Resources Information Center
Wang, Lihua
2012-01-01
A new method is introduced for teaching group theory analysis of the infrared spectra of organometallic compounds using molecular modeling. The main focus of this method is to enhance student understanding of the symmetry properties of vibrational modes and of the group theory analysis of infrared (IR) spectra by using visual aids provided by…
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory (CTT) and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed to assess and compare the performance of CTT and the Rasch model in terms of bias, type I error control, and power of the test of time effect. The type I error was controlled for both CTT and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than CTT. The Rasch model performed better than the CTT approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
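To make the simulation design concrete, here is a minimal sketch (numpy only; all parameter values hypothetical) of the kind of data-generating process the study describes: Rasch responses at two visits with a latent time effect, item skipping that is more likely for patients with worse latent status, and a naive CTT mean score over observed items. The bias and power differences arise when such scores, rather than an item-level Rasch likelihood, feed the test of time effect.

```python
import numpy as np

rng = np.random.default_rng(1)
n_patients, n_items = 200, 5
time_effect = 0.3                              # hypothetical latent change
item_difficulty = np.linspace(-1.0, 1.0, n_items)

# Latent trait at two visits (Rasch: one trait, item difficulties b_j)
theta0 = rng.normal(0.0, 1.0, n_patients)
theta = np.stack([theta0,
                  theta0 + time_effect + rng.normal(0.0, 0.3, n_patients)], axis=1)

# Rasch response probabilities: P(X_ij = 1) = logistic(theta_i - b_j)
prob = 1.0 / (1.0 + np.exp(-(theta[..., None] - item_difficulty)))
resp = (rng.random(prob.shape) < prob).astype(float)

# Informative intermittent missingness: worse latent status -> more skipping
p_miss = 1.0 / (1.0 + np.exp(theta[..., None] + 1.5))
resp[rng.random(resp.shape) < p_miss] = np.nan

# CTT works from mean observed-item scores, ignoring why items are missing;
# a Rasch analysis would instead model the item-level likelihood.
ctt = np.nanmean(resp, axis=2)
print("mean CTT change between visits:", np.nanmean(ctt[:, 1] - ctt[:, 0]).round(3))
```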
NASA Astrophysics Data System (ADS)
Greenwald, Jared
Any good physical theory must account for current experimental data as well as offer predictions for potential searches in the future. The Standard Model of particle physics, Grand Unified Theories, Minimal Supersymmetric Models and Supergravity are all attempts to provide such a framework. However, they all lack the ability to predict many of the parameters that each of the theories utilize. String theory may yield a solution to this naturalness (or self-predictiveness) problem as well as offer a unified theory of gravity. Studies in particle physics phenomenology based on perturbative low energy analysis of various string theories can help determine the candidacy of such models. After a review of principles and problems leading up to our current understanding of the universe, we will discuss some of the best particle physics model building techniques that have been developed using string theory. This will culminate in the introduction of a novel approach to a computational, systematic analysis of the various physical phenomena that arise from these string models. We focus on the necessary assumptions, complexity and open questions that arise while making a fully-automated flat direction analysis program.
Floquet stability analysis of the longitudinal dynamics of two hovering model insects
Wu, Jiang Hao; Sun, Mao
2012-01-01
Because of the periodically varying aerodynamic and inertial forces of the flapping wings, a hovering or constant-speed flying insect is a cyclically forcing system, and, generally, the flight is not in a fixed-point equilibrium, but in a cyclic-motion equilibrium. Current stability theory of insect flight is based on the averaged model and treats the flight as a fixed-point equilibrium. In the present study, we treated the flight as a cyclic-motion equilibrium and used the Floquet theory to analyse the longitudinal stability of insect flight. Two hovering model insects were considered—a dronefly and a hawkmoth. The former had relatively high wingbeat frequency and small wing-mass to body-mass ratio, and hence very small amplitude of body oscillation; while the latter had relatively low wingbeat frequency and large wing-mass to body-mass ratio, and hence relatively large amplitude of body oscillation. For comparison, analysis using the averaged-model theory (fixed-point stability analysis) was also made. Results of both the cyclic-motion stability analysis and the fixed-point stability analysis were tested by numerical simulation using complete equations of motion coupled with the Navier–Stokes equations. The Floquet theory (cyclic-motion stability analysis) agreed well with the simulation for both the model dronefly and the model hawkmoth; but the averaged-model theory gave good results only for the dronefly. Thus, for an insect with relatively large body oscillation at wingbeat frequency, cyclic-motion stability analysis is required, and for their control analysis, the existing well-developed control theories for systems of fixed-point equilibrium are no longer applicable and new methods that take the cyclic variation of the flight dynamics into account are needed. PMID:22491980
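The Floquet machinery itself is compact: integrate the linearized periodic dynamics over one wingbeat period to build the monodromy matrix, then check whether its eigenvalues (the Floquet multipliers) lie inside the unit circle. The sketch below illustrates this on a damped Mathieu oscillator as a stand-in for the insect's periodic flight dynamics; the system and all parameters are hypothetical, not the paper's equations of motion.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Periodic linear system dx/dt = A(t) x: damped Mathieu oscillator.
delta, eps, c, T = 1.0, 0.4, 0.05, 2.0 * np.pi   # hypothetical parameters

def rhs(t, x):
    return [x[1], -(delta + eps * np.cos(t)) * x[0] - c * x[1]]

# Monodromy matrix: propagate each unit initial condition over one period.
M = np.column_stack([
    solve_ivp(rhs, (0.0, T), col, rtol=1e-10, atol=1e-12).y[:, -1]
    for col in np.eye(2)
])

multipliers = np.linalg.eigvals(M)
print("Floquet multipliers:", multipliers)
print("cyclic-motion equilibrium stable:", np.all(np.abs(multipliers) < 1.0))
```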
ERIC Educational Resources Information Center
Leventhal, Brian C.; Stone, Clement A.
2018-01-01
Interest in Bayesian analysis of item response theory (IRT) models has grown tremendously due to the appeal of the paradigm among psychometricians, advantages of these methods when analyzing complex models, and availability of general-purpose software. Possible models include models which reflect multidimensionality due to designed test structure,…
The Integration of Psycholinguistic and Discourse Processing Theories of Reading Comprehension.
ERIC Educational Resources Information Center
Beebe, Mona J.
To assess the compatibility of miscue analysis and recall analysis as independent elements in a theory of reading comprehension, a study was performed that operationalized each theory and separated its components into measurable units to allow empirical testing. A cueing strategy model was estimated, but the discourse processing model was broken…
Decision-Making in National Security Affairs: Toward a Typology.
1985-06-07
decisional model, and thus provide the necessary linkage between observation and application of theory in explaining and/or predicting policy decisions. ...examines theories and models of decision-making processes from an interdisciplinary perspective, with a view toward deriving means by which the behavior of...processes, game theory, linear programming, network and graph theory, time series analysis, and the like. The discipline of decision analysis is a relatively
Nonstandard Methods in Lie Theory
ERIC Educational Resources Information Center
Goldbring, Isaac Martin
2009-01-01
In this thesis, we apply model theory to Lie theory and geometric group theory. These applications of model theory come via nonstandard analysis. In Lie theory, we use nonstandard methods to prove two results. First, we give a positive solution to the local form of Hilbert's Fifth Problem, which asks whether every locally Euclidean local…
Behavior Analysis in Distance Education: A Systems Approach.
ERIC Educational Resources Information Center
Coldeway, Dan O.
1987-01-01
Describes a model of instructional theory relevant to individualized distance education that is based on Keller's Personalized System of Instruction (PSI), behavior analysis, and the instructional systems development model (ISD). Systems theory is emphasized, and ISD and behavior analysis are discussed as cybernetic processes. (LRW)
Reliability analysis in interdependent smart grid systems
NASA Astrophysics Data System (ADS)
Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong
2018-06-01
Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, the underlying network model, the interactions and relationships among components, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the effect of cascading failures and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of the proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
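As an illustration of this percolation-style analysis, the sketch below couples two toy Erdős–Rényi layers one-to-one (power and communication stand-ins), removes a fraction of nodes, and iterates the cascade in which a node survives only if it lies in its own layer's giant component and its interdependent partner also survives. All sizes and connectivities are hypothetical; the paper's model is more detailed.

```python
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)
n = 1000
A = nx.erdos_renyi_graph(n, 4.0 / n, seed=1)   # toy power layer
B = nx.erdos_renyi_graph(n, 4.0 / n, seed=2)   # toy communication layer
# One-to-one interdependence: node i in A depends on node i in B.

def giant_fraction(G):
    # Fraction of the *original* n nodes in the largest component.
    if G.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(G)) / n

def cascade(p_fail):
    a, b = A.copy(), B.copy()
    a.remove_nodes_from(rng.choice(n, int(p_fail * n), replace=False))
    while True:
        ga = max(nx.connected_components(a), key=len) if a.number_of_nodes() else set()
        b.remove_nodes_from([v for v in list(b) if v not in ga])
        gb = max(nx.connected_components(b), key=len) if b.number_of_nodes() else set()
        dead = [v for v in list(a) if v not in gb]
        if not dead:                     # cascade has converged
            return giant_fraction(a)
        a.remove_nodes_from(dead)

for p in (0.1, 0.3, 0.5):
    print(f"initial failure {p:.1f} -> surviving giant fraction {cascade(p):.3f}")
```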
Applying circular economy innovation theory in business process modeling and analysis
NASA Astrophysics Data System (ADS)
Popa, V.; Popa, L.
2017-08-01
The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis, using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.
Dual-process models of health-related behaviour and cognition: a review of theory.
Houlihan, S
2018-03-01
The aim of this review was to synthesise a spectrum of theories incorporating dual-process models of health-related behaviour. The design was a review of theory, adapted loosely from Cochrane-style systematic review methodology. Inclusion criteria were specified to identify all relevant dual-process models that explain decision-making in the context of decisions made about human health. Data analysis took the form of iterative template analysis (adapted from the conceptual synthesis framework used in other reviews of theory); in this way, theories were synthesised on the basis of shared theoretical constructs and causal pathways. Analysis and synthesis proceeded in turn, instead of moving uni-directionally from analysis of individual theories to synthesis of multiple theories: the reviewer considered and reconsidered individual theories and theoretical components in generating the main findings of the narrative synthesis. Drawing on systematic review methodology, 11 electronic databases were searched for relevant dual-process theories. After de-duplication, 12,198 records remained. Screening of title and abstract led to the exclusion of 12,036 records, after which 162 full-text records were assessed. Of those, 21 records were included in the review. Moving back and forth between analysis of individual theories and the synthesis of theories grouped by theme or focus yielded additional insights into the orientation of a theory to an individual. Theories could be grouped in part by their treatment of the individual as an irrational actor, a social actor, an actor in a physical environment, or a self-regulated actor. Synthesising the identified theories into a general dual-process model of health-related behaviour indicated that such behaviour results from both propositional and unconscious reasoning, driven by an individual's response to internal cues (such as heuristics, attitude and affect), physical cues (social and physical environmental stimuli), and regulating factors (such as habit) that mediate between them. Copyright © 2017. Published by Elsevier Ltd.
Haberman, Shelby J; Sinharay, Sandip; Chon, Kyong Hee
2013-07-01
Residual analysis (e.g. Hambleton & Swaminathan, Item response theory: principles and applications, Kluwer Academic, Boston, 1985; Hambleton, Swaminathan, & Rogers, Fundamentals of item response theory, Sage, Newbury Park, 1991) is a popular method to assess fit of item response theory (IRT) models. We suggest a form of residual analysis that may be applied to assess item fit for unidimensional IRT models. The residual analysis consists of a comparison of the maximum-likelihood estimate of the item characteristic curve with an alternative ratio estimate of the item characteristic curve. The large-sample distribution of the residual is proved to be standard normal when the IRT model fits the data. We compare the performance of our suggested residual to the standardized residual of Hambleton et al. (Fundamentals of item response theory, Sage, Newbury Park, 1991) in a detailed simulation study. We then calculate our suggested residuals using data from an operational test. The residuals appear to be useful in assessing the item fit for unidimensional IRT models.
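A simple version of the comparison can be sketched as follows: group examinees by ability, take the observed proportion correct in each group as a ratio estimate of the item characteristic curve, and standardize its deviation from the model-implied 2PL curve. This is a schematic reconstruction, not the authors' exact residual; the binning rule, group counts and item parameters are illustrative.

```python
import numpy as np

def item_fit_residuals(theta, x, a, b, n_groups=10):
    """Standardized residuals: observed proportion correct per ability group
    (ratio estimate of the ICC) vs. the model-implied 2PL ICC."""
    edges = np.quantile(theta, np.linspace(0, 1, n_groups + 1))
    idx = np.clip(np.searchsorted(edges, theta, side="right") - 1, 0, n_groups - 1)
    resid = []
    for g in range(n_groups):
        m = idx == g
        n = m.sum()
        p_obs = x[m].mean()                                   # ratio estimate
        p_mod = 1 / (1 + np.exp(-a * (theta[m].mean() - b)))  # ICC at group mean
        se = np.sqrt(p_mod * (1 - p_mod) / n)
        resid.append((p_obs - p_mod) / se)                    # ~N(0,1) under fit
    return np.array(resid)

rng = np.random.default_rng(3)
theta = rng.normal(size=5000)
x = (rng.random(5000) < 1 / (1 + np.exp(-1.2 * (theta - 0.3)))).astype(int)
print(item_fit_residuals(theta, x, a=1.2, b=0.3).round(2))
```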
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions and thus shape the distinct body of the profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession, its existence or experience. Theory evaluation is a systematic criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's hierarchy of needs theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
The Use of Modelling for Theory Building in Qualitative Analysis
ERIC Educational Resources Information Center
Briggs, Ann R. J.
2007-01-01
The purpose of this article is to exemplify and enhance the place of modelling as a qualitative process in educational research. Modelling is widely used in quantitative research as a tool for analysis, theory building and prediction. Statistical data lend themselves to graphical representation of values, interrelationships and operational…
Structural reliability analysis under evidence theory using the active learning kriging model
NASA Astrophysics Data System (ADS)
Yang, Xufeng; Liu, Yongshou; Ma, Panke
2017-11-01
Structural reliability analysis under evidence theory is investigated. It is rigorously proved that a surrogate model providing only correct sign prediction of the performance function can meet the accuracy requirement of evidence-theory-based reliability analysis. Accordingly, a method based on the active learning kriging model which only correctly predicts the sign of the performance function is proposed. Interval Monte Carlo simulation and a modified optimization method based on Karush-Kuhn-Tucker conditions are introduced to make the method more efficient in estimating the bounds of failure probability based on the kriging model. Four examples are investigated to demonstrate the efficiency and accuracy of the proposed method.
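The sign-oriented active learning loop can be sketched with an off-the-shelf Gaussian process as the kriging surrogate: the learning function U(x) = |mu(x)|/sigma(x) flags the candidate point whose sign prediction is least reliable, and training stops once all signs are trusted. This is a generic AK-style sketch with a toy performance function, not the paper's algorithm or its interval Monte Carlo / Karush-Kuhn-Tucker refinements.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy performance function; failure when g(x) < 0 (hypothetical).
def g(x):
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

rng = np.random.default_rng(0)
pool = rng.uniform(-3, 3, size=(2000, 2))       # candidate sample pool
train = pool[rng.choice(len(pool), 12, replace=False)]

for _ in range(30):
    gp = GaussianProcessRegressor(kernel=RBF(1.0), normalize_y=True)
    gp.fit(train, g(train))
    mu, sd = gp.predict(pool, return_std=True)
    U = np.abs(mu) / np.maximum(sd, 1e-12)      # low U = sign least reliable
    if U.min() > 2.0:                           # all sign predictions trusted
        break
    train = np.vstack([train, pool[np.argmin(U)]])   # enrich design of experiments

print("predicted failure fraction on pool:", (mu < 0).mean())
```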
Chaos Modeling: An Introduction and Research Application.
ERIC Educational Resources Information Center
Newman, Isadore; And Others
1993-01-01
Introduces the basic concepts of chaos theory and chaos modeling. Relates chaos theory to qualitative research and factor analysis. Describes some current research in education and psychology using chaos theory. Claims that the philosophical implications of chaos theory have been misapplied in practical terms. (KS)
NASA Astrophysics Data System (ADS)
Reinisch, Bianca; Krüger, Dirk
2018-02-01
In research on the nature of science, there is a need to investigate the role and status of different scientific knowledge forms. Theories and models are two of the most important knowledge forms within biology and are the focus of this study. During interviews, preservice biology teachers (N = 10) were asked about their understanding of theories and models. They were requested to give reasons why they see theories and models as either tentative or certain constructs. Their conceptions were then compared to philosophers' positions (e.g., Popper, Giere). A category system was developed from the qualitative content analysis of the interviews. These categories include 16 conceptions for theories (n_tentative = 11; n_certain = 5) and 18 conceptions for models (n_tentative = 10; n_certain = 8). The analysis of the interviews showed that the preservice teachers gave reasons for the tentativeness or certainty of theories and models either due to their understanding of the terms or due to their understanding of the generation or evaluation of theories and models. Therefore, a variety of different terminology, from different sources, should be used in learning-teaching situations. Additionally, an understanding of which processes lead to the generation, evaluation, and refinement or rejection of theories and models should be discussed with preservice teachers. Within philosophy of science, there has been a shift from theories to models. This should be transferred to educational contexts by firstly highlighting the role of models and also their connections to theories.
NASA Astrophysics Data System (ADS)
Yin, Shengwen; Yu, Dejie; Yin, Hui; Lü, Hui; Xia, Baizhan
2017-09-01
Considering the epistemic uncertainties within the hybrid Finite Element/Statistical Energy Analysis (FE/SEA) model when it is used for the response analysis of built-up systems in the mid-frequency range, the hybrid Evidence Theory-based Finite Element/Statistical Energy Analysis (ETFE/SEA) model is established by introducing evidence theory. Based on the hybrid ETFE/SEA model and the sub-interval perturbation technique, the hybrid Sub-interval Perturbation and Evidence Theory-based Finite Element/Statistical Energy Analysis (SIP-ETFE/SEA) approach is proposed. In the hybrid ETFE/SEA model, the uncertainty in the SEA subsystem is modeled by a non-parametric ensemble, while the uncertainty in the FE subsystem is described by focal elements and basic probability assignments (BPAs) and handled with evidence theory. Within the hybrid SIP-ETFE/SEA approach, the mid-frequency response of interest, such as the ensemble average of the energy response and the cross-spectrum response, is calculated analytically by using the conventional hybrid FE/SEA method. Inspired by probability theory, the intervals of the mean value, variance and cumulative distribution are used to describe the distribution characteristics of mid-frequency responses of built-up systems with epistemic uncertainties. In order to alleviate the computational burden of the extreme value analysis, the sub-interval perturbation technique based on the first-order Taylor series expansion is used in the ETFE/SEA model to acquire the lower and upper bounds of the mid-frequency responses over each focal element. Three numerical examples are given to illustrate the feasibility and effectiveness of the proposed method.
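The sub-interval perturbation idea reduces to a first-order Taylor bound evaluated cell by cell: split each uncertain input interval, and on each cell bound the response by |gradient| · radius around the midpoint. The sketch below does this for a hypothetical smooth response function; the paper applies the same expansion to the FE/SEA mid-frequency responses over each focal element.

```python
import numpy as np
from itertools import product

def subinterval_bounds(f, lo, hi, n_sub=4, h=1e-6):
    """First-order Taylor (perturbation) bounds of f over the box [lo, hi],
    tightened by splitting each input interval into n_sub sub-intervals."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    edges = [np.linspace(l, u, n_sub + 1) for l, u in zip(lo, hi)]
    y_lo, y_hi = np.inf, -np.inf
    for cell in product(*(list(zip(e[:-1], e[1:])) for e in edges)):
        c = np.array([(a + b) / 2 for a, b in cell])     # cell midpoint
        r = np.array([(b - a) / 2 for a, b in cell])     # cell radius
        grad = np.array([(f(c + h * e) - f(c - h * e)) / (2 * h)
                         for e in np.eye(len(c))])       # central differences
        half_width = np.abs(grad) @ r                    # |g| . r bounds deviation
        y_lo = min(y_lo, f(c) - half_width)
        y_hi = max(y_hi, f(c) + half_width)
    return y_lo, y_hi

# Hypothetical smooth response standing in for a mid-frequency quantity.
f = lambda x: np.sin(x[0]) * (1 + 0.1 * x[1] ** 2)
print(subinterval_bounds(f, lo=[0.8, -1.0], hi=[1.2, 1.0]))
```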
ERIC Educational Resources Information Center
Fukuhara, Hirotaka; Kamata, Akihito
2011-01-01
A differential item functioning (DIF) detection method for testlet-based data was proposed and evaluated in this study. The proposed DIF model is an extension of a bifactor multidimensional item response theory (MIRT) model for testlets. Unlike traditional item response theory (IRT) DIF models, the proposed model takes testlet effects into…
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
Comparing the Fit of Item Response Theory and Factor Analysis Models
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Cai, Li; Hernandez, Adolfo
2011-01-01
Linear factor analysis (FA) models can be reliably tested using test statistics based on residual covariances. We show that the same statistics can be used to reliably test the fit of item response theory (IRT) models for ordinal data (under some conditions). Hence, the fit of an FA model and of an IRT model to the same data set can now be…
Bayesian Structural Equation Modeling: A More Flexible Representation of Substantive Theory
ERIC Educational Resources Information Center
Muthen, Bengt; Asparouhov, Tihomir
2012-01-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed…
Erde, E L
1997-01-01
Persons concerned with medical education have sometimes argued that medical students need no formal education in ethics. They contended that if admissions were restricted to persons of good character and those students were exposed to good role models, the ethics of medicine would take care of itself. However, little philosophic attention seems to have been given to the ideas of model or role model. In this essay, I undertake such an analysis and add an analysis of role. I show the weakness of relying on role models exclusively and draw implications from this for appeals to virtue theory. Furthermore, I indicate some of the problems with how virtue theory is invoked as the ethical theory most closely associated with role-model rhetoric, and consider some of the problems with virtue theory itself. Although Socrates was interested in the character of the (young) persons with whom he spoke, Socratic education is much more than what role modeling and virtue theory endorse. It (that is, philosophy) is invaluable for ethics education.
Evaluation of the chondral modeling theory using FE simulation and numeric shape optimization
Plochocki, Jeffrey H; Ward, Carol V; Smith, Douglas E
2009-01-01
The chondral modeling theory proposes that hydrostatic pressure within articular cartilage regulates joint size, shape, and congruence through regional variations in rates of tissue proliferation. The purpose of this study is to develop a computational model using a nonlinear two-dimensional finite element analysis in conjunction with numeric shape optimization to evaluate the chondral modeling theory. The model employed in this analysis is generated from an MR image of the medial portion of the tibiofemoral joint in a subadult male. Stress-regulated morphological changes are simulated until skeletal maturity and evaluated against the chondral modeling theory. The computed results are found to support the chondral modeling theory. The shape-optimized model exhibits increased joint congruence, broader stress distributions in articular cartilage, and a relative decrease in joint diameter. The results for the computational model correspond well with experimental data and provide valuable insights into the mechanical determinants of joint growth. The model also provides a crucial first step toward developing a comprehensive model that can be employed to test the influence of mechanical variables on joint conformation. PMID:19438771
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
Bonetti, Debbie; Johnston, Marie; Clarkson, Jan E; Grimshaw, Jeremy; Pitts, Nigel B; Eccles, Martin; Steen, Nick; Thomas, Ruth; Maclennan, Graeme; Glidewell, Liz; Walker, Anne
2010-04-08
Psychological models are used to understand and predict behaviour in a wide range of settings, but have not been consistently applied to health professional behaviours, and the contribution of differing theories is not clear. This study explored the usefulness of a range of models to predict an evidence-based behaviour -- the placing of fissure sealants. Measures were collected by postal questionnaire from a random sample of general dental practitioners (GDPs) in Scotland. Outcomes were behavioural simulation (scenario decision-making), and behavioural intention. Predictor variables were from the Theory of Planned Behaviour (TPB), Social Cognitive Theory (SCT), Common Sense Self-regulation Model (CS-SRM), Operant Learning Theory (OLT), Implementation Intention (II), Stage Model, and knowledge (a non-theoretical construct). Multiple regression analysis was used to examine the predictive value of each theoretical model individually. Significant constructs from all theories were then entered into a 'cross theory' stepwise regression analysis to investigate their combined predictive value. Behavioural simulation - theory level variance explained was: TPB 31%; SCT 29%; II 7%; OLT 30%. Neither CS-SRM nor stage explained significant variance. In the cross theory analysis, habit (OLT), timeline acute (CS-SRM), and outcome expectancy (SCT) entered the equation, together explaining 38% of the variance. Behavioural intention - theory level variance explained was: TPB 30%; SCT 24%; OLT 58%, CS-SRM 27%. GDPs in the action stage had significantly higher intention to place fissure sealants. In the cross theory analysis, habit (OLT) and attitude (TPB) entered the equation, together explaining 68% of the variance in intention. The study provides evidence that psychological models can be useful in understanding and predicting clinical behaviour. Taking a theory-based approach enables the creation of a replicable methodology for identifying factors that may predict clinical behaviour and so provide possible targets for knowledge translation interventions. Results suggest that more evidence-based behaviour may be achieved by influencing beliefs about the positive outcomes of placing fissure sealants and building a habit of placing them as part of patient management. However a number of conceptual and methodological challenges remain.
Information compression in the context model
NASA Technical Reports Server (NTRS)
Gebhardt, Joerg; Kruse, Rudolf; Nauck, Detlef
1992-01-01
The Context Model provides a formal framework for the representation, interpretation, and analysis of vague and uncertain data. The clear semantics of the underlying concepts make it feasible to compare well-known approaches to the modeling of imperfect knowledge like that given in Bayes Theory, Shafer's Evidence Theory, the Transferable Belief Model, and Possibility Theory. In this paper we present the basic ideas of the Context Model and show its applicability as an alternative foundation of Possibility Theory and the epistemic view of fuzzy sets.
ERIC Educational Resources Information Center
Buraphadeja, Vasa; Dawson, Kara
2008-01-01
This article reviews content analysis studies aimed to assess critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…
An analysis of the Petri net based model of the human body iron homeostasis process.
Sackmann, Andrea; Formanowicz, Dorota; Formanowicz, Piotr; Koch, Ina; Blazewicz, Jacek
2007-02-01
In this paper, a Petri net based model of human body iron homeostasis is presented and analyzed. Body iron homeostasis is an important but not fully understood complex process. The modeling of the process presented in the paper is expressed in the language of Petri net theory. An application of this theory to the description of biological processes allows for very precise analysis of the resulting models. Here, such an analysis of the body iron homeostasis model from a mathematical point of view is given.
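The execution semantics of a place/transition net are easy to state in code: a transition is enabled when every input place holds enough tokens, and firing it moves tokens according to the pre- and post-incidence matrices. The toy net below (structure and marking invented for illustration) shows the mechanics that, at much larger scale, underlie the iron homeostasis model.

```python
import numpy as np

# Minimal place/transition Petri net simulator.
# pre[p, t]  = tokens place p must provide for transition t to fire
# post[p, t] = tokens delivered to place p when transition t fires
pre = np.array([[1, 0],
                [1, 0],
                [0, 1]])
post = np.array([[0, 1],
                 [0, 0],
                 [1, 0]])
marking = np.array([1, 2, 0])           # initial tokens per place

def enabled(m):
    return [t for t in range(pre.shape[1]) if np.all(m >= pre[:, t])]

rng = np.random.default_rng(0)
for step in range(10):
    ts = enabled(marking)
    if not ts:                          # deadlock: nothing can fire
        break
    t = rng.choice(ts)                  # fire one enabled transition at random
    marking = marking - pre[:, t] + post[:, t]
    print(f"step {step}: fired t{t}, marking {marking}")
```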
A New Higher-Order Composite Theory for Analysis and Design of High Speed Tilt-Rotor Blades
NASA Technical Reports Server (NTRS)
McCarthy, Thomas Robert
1996-01-01
A higher-order theory is developed to model composite box beams with arbitrary wall thicknesses. The theory, based on a refined displacement field, represents a three-dimensional model which approximates the elasticity solution. Therefore, the cross-sectional properties are not reduced to one-dimensional beam parameters. Both inplane and out-of-plane warping are automatically included in the formulation. The model accurately captures the transverse shear stresses through the thickness of each wall while satisfying all stress-free boundary conditions. Several numerical results are presented to validate the present theory. The developed theory is then used to model the load carrying member of a tilt-rotor blade which has thick-walled sections. The composite structural analysis is coupled with an aerodynamic analysis to compute the aeroelastic stability of the blade. Finally, a multidisciplinary optimization procedure is developed to improve the aerodynamic, structural and aeroelastic performance of the tilt-rotor aircraft. The Kreisselmeier-Steinhauser function is used to formulate the multiobjective function problem and a hybrid approximate analysis is used to reduce the computational effort. The optimum results are compared with the baseline values and show significant improvements in the overall performance of the tilt-rotor blade.
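The Kreisselmeier-Steinhauser function mentioned here aggregates many constraint or objective values into one smooth, conservative envelope, which is what makes the multiobjective formulation tractable. A minimal, numerically stable implementation:

```python
import numpy as np

def kreisselmeier_steinhauser(g, rho=50.0):
    """Smooth aggregate of constraint values g_i (g_i <= 0 feasible).
    KS >= max(g) and KS -> max(g) as rho -> infinity; shifting by
    g_max keeps the exponentials from overflowing."""
    g = np.asarray(g, float)
    g_max = g.max()
    return g_max + np.log(np.sum(np.exp(rho * (g - g_max)))) / rho

print(kreisselmeier_steinhauser([-0.2, 0.05, -1.0]))   # slightly above 0.05
```

Larger rho tightens the envelope toward the true maximum at the cost of a stiffer optimization landscape; moderate values are the usual compromise.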
Liou, Shwu-Ru
2009-01-01
To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.
NASA Astrophysics Data System (ADS)
Matsypura, Dmytro
In this dissertation, I develop a new theoretical framework for the modeling, pricing analysis, and computation of solutions to electric power supply chains with power generators, suppliers, transmission service providers, and the inclusion of consumer demands. In particular, I advocate the application of finite-dimensional variational inequality theory, projected dynamical systems theory, game theory, network theory, and other tools that have been recently proposed for the modeling and analysis of supply chain networks (cf. Nagurney (2006)) to electric power markets. This dissertation contributes to the extant literature on the modeling, analysis, and solution of supply chain networks, including global supply chains, in general, and electric power supply chains, in particular, in the following ways. It develops a theoretical framework for modeling, pricing analysis, and computation of electric power flows/transactions in electric power systems using the rationale for supply chain analysis. The models developed include both static and dynamic ones. The dissertation also adds a new dimension to the methodology of the theory of projected dynamical systems by proving that, irrespective of the speeds of adjustment, the equilibrium of the system remains the same. Finally, I include alternative fuel suppliers, along with their behavior into the supply chain modeling and analysis framework. This dissertation has strong practical implications. In an era in which technology and globalization, coupled with increasing risk and uncertainty, complicate electricity demand and supply within and between nations, the successful management of electric power systems and pricing become increasingly pressing topics with relevance not only for economic prosperity but also national security. This dissertation addresses such related topics by providing models, pricing tools, and algorithms for decentralized electric power supply chains. This dissertation is based heavily on the following coauthored papers: Nagurney, Cruz, and Matsypura (2003), Nagurney and Matsypura (2004, 2005, 2006), Matsypura and Nagurney (2005), Matsypura, Nagurney, and Liu (2006).
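The computational core of the projected-dynamical-systems approach invoked here is a projection iteration: step against the network flow/price mapping F, then project back onto the feasible set K; the fixed point solves the variational inequality and is the equilibrium of the projected dynamical system. A minimal sketch for a box-constrained problem (the mapping and bounds are hypothetical, not the dissertation's electric power model):

```python
import numpy as np

def solve_vi(F, lo, hi, tau=0.05, tol=1e-10, max_iter=100000):
    """Projected Euler iteration x <- P_K(x - tau * F(x)) for VI(F, K)
    on the box K = [lo, hi]."""
    x = (lo + hi) / 2.0
    for _ in range(max_iter):
        x_new = np.clip(x - tau * F(x), lo, hi)   # projection onto K
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Toy two-node market: F(x) = excess of marginal cost over price (invented).
F = lambda x: np.array([2.0 * x[0] + 0.5 * x[1] - 10.0,
                        0.5 * x[0] + 3.0 * x[1] - 12.0])
print(solve_vi(F, lo=np.zeros(2), hi=np.full(2, 10.0)))
```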
ERIC Educational Resources Information Center
Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie
2013-01-01
Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…
Vasil'ev, G F
2013-01-01
Owing to methodological shortcomings, control theory has not yet realized its potential for the analysis of biological systems. To get the full benefit of the method, a parametric model of control is proposed for use alongside the algorithmic model of control (to date the only model used in control theory), and the reasoning behind it is explained. The suggested approach makes the full potential of modern control theory available for the analysis of biological systems. The cybernetic approach is illustrated using a system governing the rise of glucose concentration in blood as an example.
Constraints and stability in vector theories with spontaneous Lorentz violation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bluhm, Robert; Gagne, Nolan L.; Potting, Robertus
2008-06-15
Vector theories with spontaneous Lorentz violation, known as bumblebee models, are examined in flat spacetime using a Hamiltonian constraint analysis. In some of these models, Nambu-Goldstone modes appear with properties similar to photons in electromagnetism. However, depending on the form of the theory, additional modes and constraints can appear that have no counterparts in electromagnetism. An examination of these constraints and additional degrees of freedom, including their nonlinear effects, is made for a variety of models with different kinetic and potential terms, and the results are compared with electromagnetism. The Hamiltonian constraint analysis also permits an investigation of the stability of these models. For certain bumblebee theories with a timelike vector, suitable restrictions of the initial-value solutions are identified that yield ghost-free models with a positive Hamiltonian. In each case, the restricted phase space is found to match that of electromagnetism in a nonlinear gauge.
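For orientation, a representative bumblebee action (the notation is standard in this literature, though the specific kinetic and potential terms vary across the models analyzed) is

```latex
\mathcal{L}_B = -\tfrac{1}{4} B_{\mu\nu} B^{\mu\nu}
                - V\!\left(B_\mu B^\mu \pm b^2\right)
                + B_\mu J^\mu ,
\qquad
V(x) = \tfrac{\lambda}{2}\, x^2 ,
\qquad
B_{\mu\nu} \equiv \partial_\mu B_\nu - \partial_\nu B_\mu .
```

The potential is minimized at B_mu B^mu = -(+/- b^2), so the vector field acquires a vacuum value and Lorentz symmetry is spontaneously broken; the Nambu-Goldstone excitations about that vacuum are the photon-like modes discussed above.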
Ali, Ashehad A.; Medlyn, Belinda E.; Aubier, Thomas G.; ...
2015-10-06
Differential species responses to atmospheric CO_2 concentration (C_a) could lead to quantitative changes in competition among species and community composition, with flow-on effects for ecosystem function. However, there has been little theoretical analysis of how elevated C_a (eC_a) will affect plant competition, or how the composition of plant communities might change. Such theoretical analysis is needed for developing testable hypotheses to frame experimental research. Here, we investigated theoretically how plant competition might change under eC_a by implementing two alternative competition theories, resource use theory and resource capture theory, in a plant carbon and nitrogen cycling model. The model makes several novel predictions for the impact of eC_a on plant community composition. Using resource use theory, the model predicts that eC_a is unlikely to change species dominance in competition, but is likely to increase coexistence among species. Using resource capture theory, the model predicts that eC_a may increase community evenness. Collectively, both theories suggest that eC_a will favor coexistence and hence that species diversity should increase with eC_a. Our theoretical analysis leads to a novel hypothesis for the impact of eC_a on plant community composition. The hypothesis has potential to help guide the design and interpretation of eC_a experiments.
A High Precision Prediction Model Using Hybrid Grey Dynamic Model
ERIC Educational Resources Information Center
Li, Guo-Dong; Yamaguchi, Daisuke; Nagai, Masatake; Masuda, Shiro
2008-01-01
In this paper, we propose a new prediction analysis model which combines the first order one variable Grey differential equation Model (abbreviated as GM(1,1) model) from grey system theory and time series Autoregressive Integrated Moving Average (ARIMA) model from statistics theory. We abbreviate the combined GM(1,1) ARIMA model as ARGM(1,1)…
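GM(1,1) itself is only a few lines: accumulate the series, fit the two grey parameters by least squares against the background values, solve the resulting first-order equation, and difference back. This sketch (hypothetical data; no ARIMA residual correction, which is the part the hybrid ARGM(1,1) model adds) shows the grey half of the proposed model:

```python
import numpy as np

def gm11_forecast(x0, n_ahead=3):
    """Grey GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series
    x1 = cumsum(x0), then difference the fitted x1 back to x0 forecasts."""
    x0 = np.asarray(x0, float)
    x1 = np.cumsum(x0)
    z = 0.5 * (x1[1:] + x1[:-1])                  # background values
    B = np.column_stack([-z, np.ones(len(z))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    return np.concatenate([[x1_hat[0]], np.diff(x1_hat)])   # back to x0 scale

series = [112.0, 119.0, 127.0, 136.0, 144.0]      # hypothetical data
print(gm11_forecast(series).round(1))             # fit + 3-step forecast
```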
Khoshnood, Zohreh; Rayyani, Masoud; Tirgari, Batool
2018-01-13
Background: Analysis of nursing theoretical works and their role in knowledge development is presented as an essential process of critical reflection. The health promotion model (HPM) focuses on helping people achieve higher levels of well-being and identifies background factors that influence health behaviors. Objectives: This paper aims to evaluate and critique the HPM using Barnum's criteria. Methods: The present study reviewed books and articles retrieved from the ProQuest, PubMed and Blackwell databases. The method of evaluation for this model is based on Barnum's criteria for analysis, application and evaluation of nursing theories. The criteria selected by Barnum embrace both internal and external criticism. Internal criticism deals with how theory components fit with each other (the internal construction of the theory); external criticism deals with the way in which theory relates to the extended world (considering theory in its relationships to human beings, nursing, and health). Results: The electronic database search yielded 27,717 titles and abstracts. Following removal of duplicates, 18,963 titles and abstracts were screened using the inclusion criteria and 1278 manuscripts were retrieved. Of these, 80 were specific to the HPM and 23 to the analysis of nursing theory relating to the aim of this article. After final selection using the inclusion criteria for this review, 28 manuscripts were identified as examining the factors contributing to theory analysis. Evaluation of health promotion theory showed that the philosophical claims and their content are consistent and clear. The HPM has a logical structure and has been applied to diverse age groups from differing cultures with varying health concerns. Conclusion: Among the strategies for theory critique, the Barnum approach is structured and accurate, and considers theory in its relationship to human beings, community psychiatric nursing, and health. According to Pender, nursing assessment, diagnosis and interventions are utilized to operationalize the HPM through practical application and research.
Item response theory - A first approach
NASA Astrophysics Data System (ADS)
Nunes, Sandra; Oliveira, Teresa; Oliveira, Amílcar
2017-07-01
The Item Response Theory (IRT) has become one of the most popular scoring frameworks for measurement data, frequently used in computerized adaptive testing, cognitively diagnostic assessment and test equating. According to Andrade et al. (2000), IRT can be defined as a set of mathematical models (Item Response Models - IRM) constructed to represent the probability of an individual giving the right answer to an item of a particular test. The number of Item Response Models available for measurement analysis has increased considerably in the last fifteen years due to increasing computer power and due to a demand for accuracy and more meaningful inferences grounded in complex data. The developments in modeling with Item Response Theory were related to developments in estimation theory, most remarkably Bayesian estimation with Markov chain Monte Carlo algorithms (Patz & Junker, 1999). The popularity of Item Response Theory has also implied numerous overviews in books and journals, and many connections between IRT and other statistical estimation procedures, such as factor analysis and structural equation modeling, have been made repeatedly (van der Linden & Hambleton, 1997). As stated before, Item Response Theory covers a variety of measurement models, ranging from basic one-dimensional models for dichotomously and polytomously scored items and their multidimensional analogues to models that incorporate information about cognitive sub-processes which influence the overall item response process. The aim of this work is to introduce the main concepts associated with one-dimensional models of Item Response Theory, to specify the logistic models with one, two and three parameters, to discuss some properties of these models and to present the main estimation procedures.
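For reference, the three-parameter logistic model referred to above specifies the probability of a correct response to item i as

```latex
P_i(\theta) = c_i + \frac{1 - c_i}{1 + \exp\!\left[-a_i\,(\theta - b_i)\right]} ,
```

where a_i is the discrimination, b_i the difficulty, and c_i the pseudo-guessing lower asymptote; fixing c_i = 0 gives the two-parameter model, and additionally fixing a_i = 1 gives the one-parameter (Rasch) model.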
Strategic analysis for safeguards systems: a feasibility study. Volume 2. Appendix
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldman, A J
1984-12-01
This appendix provides detailed information regarding game theory (strategic analysis) and its potential role in safeguards, to supplement the main body of this report. In particular, it includes an extensive, though not comprehensive, review of literature on game theory and on other topics that relate to the formulation of a game-theoretic model (e.g., the payoff functions). The appendix describes the basic form and components of game theory models, and the solvability of various models. It then discusses three basic issues related to the use of strategic analysis in material accounting: (1) its understandability; (2) its viability in regulatory settings; and (3) difficulties in the use of mixed strategies. Each of the components of a game-theoretic model is then discussed and related to the present context.
Zu, Xianghuan; Yang, Chuanlei; Wang, Hechun; Wang, Yinyan
2018-01-01
Exhaust gas recirculation (EGR) is one of the main methods of reducing NOX emissions and has been widely used in marine diesel engines. This paper proposes an optimized comprehensive assessment method based on multi-objective grey situation decision theory, grey relation theory and grey entropy analysis to evaluate the performance and optimize rate determination of EGR, which currently lack clear theoretical guidance. First, multi-objective grey situation decision theory is used to establish the initial decision-making model according to the main EGR parameters. The optimal compromise between diesel engine combustion and emission performance is transformed into a decision-making target weight problem. After establishing the initial model and considering the characteristics of EGR under different conditions, an optimized target weight algorithm based on grey relation theory and grey entropy analysis is applied to generate the comprehensive evaluation and decision-making model. Finally, the proposed method is successfully applied to a TBD234V12 turbocharged diesel engine, and the results clearly illustrate the feasibility of the proposed method for providing theoretical support and a reference for further EGR optimization. PMID:29377956
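The grey relational core of such an assessment is brief: normalize the decision matrix, measure each alternative's distance from an ideal reference sequence, and average the resulting relational coefficients. The sketch below uses equal criterion weights and invented EGR-rate data purely for illustration; the paper's optimized algorithm derives the weights via grey entropy analysis instead.

```python
import numpy as np

def grey_relational_grades(data, rho=0.5):
    """Deng-style grey relational analysis: rank alternatives (rows)
    on criteria (columns) against an ideal reference sequence."""
    X = np.asarray(data, float)
    X = (X - X.min(0)) / (X.max(0) - X.min(0))      # normalize, larger = better
    ref = X.max(0)                                   # ideal alternative
    delta = np.abs(X - ref)
    xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return xi.mean(axis=1)                           # equal criterion weights

# Rows: candidate EGR rates; columns: NOx reduction, fuel economy,
# combustion stability (all scaled so larger is better; values invented).
scores = grey_relational_grades([[0.9, 0.6, 0.4],
                                 [0.7, 0.8, 0.6],
                                 [0.5, 0.9, 0.9]])
print(scores.round(3), "-> best alternative:", int(np.argmax(scores)))
```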
ERIC Educational Resources Information Center
Marcoulides, Katerina M.
2018-01-01
This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of Bayesian analysis methods was examined. Overall results showed that…
Spanish Velar-Insertion and Analogy: A Usage-Based Diachronic Analysis
ERIC Educational Resources Information Center
Fondow, Steven Richard
2010-01-01
The theory of Analogical and Exemplar Modeling (AEM) suggests renewed discussion of the formalization of analogy and its possible incorporation in linguistic theory. AEM is a usage-based model founded upon Exemplar Modeling (Bybee 2007, Pierrehumbert 2001) that utilizes several principles of the Analogical Modeling of Language (Skousen 1992, 1995,…
Self construction in schizophrenia: a discourse analysis.
Meehan, Trudy; MacLachlan, Malcolm
2008-06-01
Lysaker and Lysaker (Theory and Psychology, 12(2), 207-220, 2002) employ a dialogical theory of self in their writings on self disruption in schizophrenia. It is argued here that this theory could be enriched by incorporating a discursive and social constructionist model of self. Harré's model enables researchers to use subject positions to identify self construction in people with a diagnosis of schizophrenia that the dialogical model, using analysis of narrative, does not as easily recognize. The paper presents a discourse analysis of self construction in eight participants with a diagnosis of schizophrenia. Transcripts from semi-structured interviews are analysed, wherein focus falls on how participants construct self in talk through the use of subject positioning. The findings indicate that Harré's theory of self and the implied method of discourse analysis enable more subtle and nuanced constructions of self to be identified than those highlighted by Lysaker and Lysaker (Theory and Psychology, 12(2), 207-220, 2002). The analysis of subject positions revealed that participants constructed self in the form of Harré's (The singular self: An introduction to the psychology of personhood, 1998, London: Sage) self1, self2, and self3. The findings suggest that there may be constructions of self used by people diagnosed with schizophrenia that are not recognized by the current research methods focusing on narrative. The paper argues for the recognition of these constructions and, by implication, a model of self that takes into account different levels of visibility of self construction in talk.
Evaluation of a Theory of Instructional Sequences for Physics Instruction
NASA Astrophysics Data System (ADS)
Wackermann, Rainer; Trendel, Georg; Fischer, Hans E.
2010-05-01
The background of the study is the theory of basis models of teaching and learning, a comprehensive set of models of learning processes which includes, for example, learning through experience and problem-solving. The combined use of different models of learning processes has not been fully investigated and it is frequently not clear under what circumstances a particular model should be used by teachers. In contrast, the theory under investigation here gives guidelines for choosing a particular model and provides instructional sequences for each model. The aim is to investigate the implementation of the theory applied to physics instruction and to show if possible effects for the students may be attributed to the use of the theory. Therefore, a theory-oriented education programme for 18 physics teachers was developed and implemented in the 2005/06 school year. The main features of the intervention consisted of coaching physics lessons and video analysis according to the theory. The study follows a pre-treatment-post design with non-equivalent control group. Findings of repeated-measures ANOVAs show large effects for teachers' subjective beliefs, large effects for classroom actions, and small to medium effects for student outcomes such as perceived instructional quality and student emotions. The teachers/classes that applied the theory especially well according to video analysis showed the larger effects. The results showed that differentiating between different models of learning processes improves physics instruction. Effects can be followed through to student outcomes. The education programme effect was clearer for classroom actions and students' outcomes than for teachers' beliefs.
ERIC Educational Resources Information Center
Sharif, Rukhsar
2017-01-01
This conceptual paper serves to create a model of creativity and innovation at different organizational levels. It draws on John Holland's Theory of Vocational Choice (1973) as the basis for its structure by incorporating the six different personality types from his theory: conventional, enterprising, realistic, social, investigative, and…
Some problems with social cognition models: a pragmatic and conceptual analysis.
Ogden, Jane
2003-07-01
Empirical articles published between 1997 and 2001 from 4 health psychology journals that tested or applied 1 or more social cognition models (theory of reasoned action, theory of planned behavior, health belief model, and protection motivation theory; N = 47) were scrutinized for their pragmatic and conceptual basis. In terms of their pragmatic basis, these 4 models were useful for guiding research. The analysis of their conceptual basis was less positive. First, these models do not enable the generation of hypotheses because their constructs are unspecific; they therefore cannot be tested. Second, they focus on analytic truths rather than synthetic ones, and the conclusions resulting from their application are often true by definition rather than by observation. Finally, they may create and change both cognitions and behavior rather than describe them.
From theory to experimental design-Quantifying a trait-based theory of predator-prey dynamics.
Laubmeier, A N; Wootton, Kate; Banks, J E; Bommarco, Riccardo; Curtsdotter, Alva; Jonsson, Tomas; Roslin, Tomas; Banks, H T
2018-01-01
Successfully applying theoretical models to natural communities and predicting ecosystem behavior under changing conditions is the backbone of predictive ecology. However, the experiments required to test these models are dictated by practical constraints, and models are often opportunistically validated against data for which they were never intended. Alternatively, we can inform and improve experimental design by an in-depth pre-experimental analysis of the model, generating experiments better targeted at testing the validity of a theory. Here, we describe this process for a specific experiment. Starting from food web ecological theory, we formulate a model and design an experiment to optimally test the validity of the theory, supplementing traditional design considerations with model analysis. The experiment itself will be run and described in a separate paper. The theory we test is that trophic population dynamics are dictated by species traits, and we study this in a community of terrestrial arthropods. We start from the Allometric Trophic Network (ATN) model and hypothesize that including habitat use, in addition to body mass, is necessary to better model trophic interactions. We therefore formulate new terms which account for micro-habitat use as well as intra- and interspecific interference in the ATN model. We design an experiment and an effective sampling regime to test this model and the underlying assumptions about the traits dominating trophic interactions. We arrive at a detailed sampling protocol to maximize information content in the empirical data obtained from the experiment and, relying on theoretical analysis of the proposed model, explore potential shortcomings of our design. Consequently, since this is a "pre-experimental" exercise aimed at improving the links between hypothesis formulation, model construction, experimental design and data collection, we hasten to publish our findings before analyzing data from the actual experiment, thus setting the stage for strong inference.
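For concreteness, the ATN backbone being extended can be sketched as a Yodzis-Innes style consumer-resource system in which growth and metabolic rates scale allometrically with body mass; the paper's new habitat-use and interference terms would modify the functional response F. All parameter values below are generic textbook-style choices, not those of the arthropod community.

```python
import numpy as np
from scipy.integrate import solve_ivp

# One resource R and one consumer C; rates scale as (body mass)**-0.25.
mR, mC = 1e-4, 1e-2                    # hypothetical body masses (kg)
r = mR ** -0.25                        # resource intrinsic growth rate
x = 0.3 * mC ** -0.25                  # consumer mass-specific metabolic rate
y, B0, e, K, q = 8.0, 0.5, 0.85, 1.0, 1.2

def F(R):                              # Hill-type functional response
    return R ** q / (B0 ** q + R ** q)

def rhs(t, B):
    R, C = B
    return [r * R * (1 - R / K) - x * y * C * F(R) / e,   # resource biomass
            -x * C + x * y * C * F(R)]                    # consumer biomass

sol = solve_ivp(rhs, (0.0, 200.0), [0.8, 0.2])
print("final biomasses (R, C):", sol.y[:, -1].round(3))
```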
1994-03-01
asked whether the planned structure considered (a) all objectives, (b) all functions, (c) all relevant units of analysis such as the plant, the...literature and provides an integrative model of design for high performing organizations. The model is based on an analysis of current theories of...important midrange theories underlie much of the work on organizational analysis. • Systems Approaches. These approaches emphasize the rational, goal
Information Security Analysis Using Game Theory and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schlicher, Bob G; Abercrombie, Robert K
Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified against the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to overcome limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks, that the state transition probabilities are fixed before the game begins, and that the players' actions are always synchronous; moreover, most models do not scale with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.
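The game-theoretic core against which such simulations are verified can be illustrated with a toy zero-sum attacker-defender matrix game solved by fictitious play, a simple learning scheme whose empirical strategy mixtures converge to the mixed-strategy equilibrium in zero-sum games. The payoffs below are hypothetical; the ABM experiments in the report involve far richer state, detection, and asynchrony.

```python
import numpy as np

# Zero-sum game over two assets: A[i, j] is the attacker's payoff when
# attacking asset i while the defender guards asset j (invented values:
# an unguarded attack pays off, a guarded one mostly fails).
A = np.array([[0.0, 4.0],
              [6.0, 1.0]])

counts_a, counts_d = np.ones(2), np.ones(2)    # fictitious-play counters
for _ in range(20000):
    q = counts_d / counts_d.sum()              # defender's empirical mix
    counts_a[np.argmax(A @ q)] += 1            # attacker best response
    p = counts_a / counts_a.sum()              # attacker's empirical mix
    counts_d[np.argmin(p @ A)] += 1            # defender best response
print("attacker mix:", (counts_a / counts_a.sum()).round(3))
print("defender mix:", (counts_d / counts_d.sum()).round(3))
```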
Phase structure of completely asymptotically free SU(Nc) models with quarks and scalar quarks
NASA Astrophysics Data System (ADS)
Hansen, F. F.; Janowski, T.; Langæble, K.; Mann, R. B.; Sannino, F.; Steele, T. G.; Wang, Z. W.
2018-03-01
We determine the phase diagram of completely asymptotically free SU(Nc) gauge theories featuring Ns complex scalars and Nf Dirac quarks transforming according to the fundamental representation of the gauge group. The analysis is performed at the maximum known order in perturbation theory. We unveil very rich dynamics and an associated phase structure. Intriguingly, we discover that the completely asymptotically free conditions guarantee that the infrared dynamics displays long-distance conformality, in a regime where perturbation theory is applicable. We conclude our analysis by determining the quantum-corrected potential of the model and summarizing the possible patterns of radiative symmetry breaking. These models are of potential phenomenological interest as either elementary or composite ultraviolet-finite extensions of the standard model.
Modeling Human-Computer Decision Making with Covariance Structure Analysis.
ERIC Educational Resources Information Center
Coovert, Michael D.; And Others
Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…
Defense Acquisition Research Journal. Volume 21, Number 1, Issue 68
2014-01-01
Harrison’s game theory model of competition examines the bidding behavior of two equal competitors, but it does not address characteristics that...analysis examines a series of outcomes in both competitive and sole-source acquisition programs, using a statistical model that builds on a game theory...modeling, within a game theory framework developed by Todd Harrison, to show that the DoD may actually incur increased costs from competition
Decision analysis with cumulative prospect theory.
Bayoumi, A M; Redelmeier, D A
2000-01-01
Individuals sometimes express preferences that do not follow expected utility theory. Cumulative prospect theory adjusts for some phenomena by using decision weights rather than probabilities when analyzing a decision tree. The authors examined how probability transformations from cumulative prospect theory might alter a decision analysis of a prophylactic therapy in AIDS, eliciting utilities from patients with HIV infection (n = 75) and calculating expected outcomes using an established Markov model. They next focused on transformations of three sets of probabilities: 1) the probabilities used in calculating standard-gamble utility scores; 2) the probabilities of being in discrete Markov states; 3) the probabilities of transitioning between Markov states. The same prophylaxis strategy yielded the highest quality-adjusted survival under all transformations. For the average patient, prophylaxis appeared relatively less advantageous when standard-gamble utilities were transformed. Prophylaxis appeared relatively more advantageous when state probabilities were transformed and relatively less advantageous when transition probabilities were transformed. Transforming standard-gamble and transition probabilities simultaneously decreased the gain from prophylaxis by almost half. Sensitivity analysis indicated that even near-linear probability weighting transformations could substantially alter quality-adjusted survival estimates. The magnitude of benefit estimated in a decision-analytic model can change significantly after using cumulative prospect theory. Incorporating cumulative prospect theory into decision analysis can provide a form of sensitivity analysis and may help describe when people deviate from expected utility theory.
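For readers unfamiliar with the transformation involved, here is a sketch of the Tversky-Kahneman (1992) one-parameter weighting function that cumulative prospect theory applies to cumulative probabilities; gamma = 0.61 is their published gain-domain estimate, not a value elicited in this study:

```python
import numpy as np

def w(p, gamma=0.61):
    """Cumulative prospect theory probability weighting (TK 1992 form)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

probs = np.array([0.01, 0.10, 0.50, 0.90, 0.99])
# small probabilities are over-weighted, large ones under-weighted
print(np.round(w(probs), 3))
```

In a decision tree, each branch probability (or, in the Markov setting above, each state or transition probability) would be replaced by its weighted counterpart before expected outcomes are computed.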
ERIC Educational Resources Information Center
Dimitrov, Dimiter M.
2007-01-01
The validation of cognitive attributes required for correct answers on binary test items or tasks has been addressed in previous research through the integration of cognitive psychology and psychometric models using parametric or nonparametric item response theory, latent class modeling, and Bayesian modeling. All previous models, each with their…
ERIC Educational Resources Information Center
Hannah, David R.; Venkatachary, Ranga
2010-01-01
In this article, the authors present a retrospective analysis of an instructor's multiyear redesign of a course on organization theory into what is called a hybrid Classroom-as-Organization model. It is suggested that this new course design served to apprentice students to function in quasi-real organizational structures. The authors further argue…
ERIC Educational Resources Information Center
Schulze, Corina; Bryan, Valerie
2017-01-01
Through the framework of power-control theory (PCT), we provide a model of juvenile offending that places the gendered-raced treatment of juveniles central to the analysis. We test the theory using a unique sample that is predominately African American, poor, and composed entirely of juvenile offenders. Multivariate models compare the predictive…
Innovating Method of Existing Mechanical Product Based on TRIZ Theory
NASA Astrophysics Data System (ADS)
Zhao, Cunyou; Shi, Dongyan; Wu, Han
The main way of product development is adaptive design and variant design based on existing products. In this paper, a conceptual design framework and its flow model for product innovation are put forward by combining conceptual design methods with TRIZ theory. A process system model of innovative design is constructed, comprising requirement analysis, total function analysis and decomposition, engineering problem analysis, solution finding for engineering problems, and preliminary design; this establishes the basis for the innovative design of existing products.
Geometrically nonlinear analysis of laminated elastic structures
NASA Technical Reports Server (NTRS)
Reddy, J. N.
1984-01-01
Laminated composite plates and shells that can be used to model automobile bodies, aircraft wings and fuselages, and pressure vessels, among many other structures, were analyzed. The finite element method, a numerical technique for the engineering analysis of structures, is used to model the geometry and approximate the solution. Various alternative formulations for analyzing laminated plates and shells are developed, and their finite element models are tested for accuracy and economy in computation. These include the shear deformation laminate theory and the degenerated 3-D elasticity theory for laminates.
Introduction to Multilevel Item Response Theory Analysis: Descriptive and Explanatory Models
ERIC Educational Resources Information Center
Sulis, Isabella; Toland, Michael D.
2017-01-01
Item response theory (IRT) models are the main psychometric approach for the development, evaluation, and refinement of multi-item instruments and scaling of latent traits, whereas multilevel models are the primary statistical method when considering the dependence between person responses when primary units (e.g., students) are nested within…
Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz
An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…
ERIC Educational Resources Information Center
Ferguson, William D.
2011-01-01
Undergraduate economics lags behind cutting-edge economic theory. The author briefly reviews six related advances that profoundly extend and deepen economic analysis: game-theoretic modeling, collective-action problems, information economics and contracting, social preference theory, conceptualizing rationality, and institutional theory. He offers…
Analysis of neutral beam driven impurity flow reversal in PLT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malik, M.A.; Stacey, W.M. Jr.; Thomas, C.E.
1986-10-01
The Stacey-Sigmar impurity transport theory for tokamak plasmas is applied to the analysis of experimental data from the PLT tokamak with a tungsten limiter. The drag term, which is a central piece in the theory, is evaluated from the recently developed gyroviscous theory for radial momentum transfer. An effort is made to base the modeling of the experiment on measured quantities. Where measured data is not available, recourse is made to extrapolation or numerical modeling. The theoretical and the experimental tungsten fluxes are shown to agree very closely within the uncertainties of the experimental data.
Meta-analyses of Theory use in Medication Adherence Intervention Research
Conn, Vicki S.; Enriquez, Maithe; Ruppar, Todd M.; Chan, Keith C.
2016-01-01
Objective This systematic review applied meta-analytic procedures to integrate primary research that examined theory- or model-linked medication adherence interventions. Methods Extensive literature searching strategies were used to locate trials testing interventions with medication adherence behavior outcomes measured by electronic event monitoring, pharmacy refills, pill counts, and self-reports. Random-effects model analysis was used to calculate standardized mean difference effect sizes for medication adherence outcomes. Results Codable data were extracted from 146 comparisons with 19,348 participants. The most common theories and models were social cognitive theory and motivational interviewing. The overall weighted effect size for all interventions comparing treatment and control participants was 0.294. The effect size for interventions based on single-theories was 0.323 and for multiple-theory interventions was 0.214. Effect sizes for individual theories and models ranged from 0.041 to 0.447. The largest effect sizes were for interventions based on the health belief model (0.477) and adult learning theory (0.443). The smallest effect sizes were for interventions based on PRECEDE (0.041) and self-regulation (0.118). Conclusion These findings suggest that theory- and model-linked interventions have a significant but modest effect on medication adherence outcomes. PMID:26931748
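A sketch of the standard random-effects pooling machinery (DerSimonian-Laird) behind effect sizes like those reported above; the five effect sizes and variances are invented for illustration, not data from the 146 comparisons:

```python
import numpy as np

# Random-effects pooling of standardized mean differences (SMDs)
# via DerSimonian-Laird; d and v below are made-up example studies.

d = np.array([0.48, 0.30, 0.04, 0.44, 0.12])   # per-study SMDs
v = np.array([0.02, 0.05, 0.03, 0.04, 0.06])   # their sampling variances

w = 1 / v                                       # fixed-effect weights
fixed = np.sum(w * d) / w.sum()
q = np.sum(w * (d - fixed) ** 2)                # heterogeneity statistic
tau2 = max(0.0, (q - (len(d) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_star = 1 / (v + tau2)                         # random-effects weights
pooled = np.sum(w_star * d) / w_star.sum()
se = np.sqrt(1 / w_star.sum())
print(f"pooled SMD = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```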
Network Data: Statistical Theory and New Models
2016-02-17
During this period of review, Bin Yu worked on many thrusts of high-dimensional statistical theory and methodologies. Her...research covered a wide range of topics in statistics including analysis and methods for spectral clustering for sparse and structured networks...[2,7,8,21], sparse modeling (e.g. Lasso) [4,10,11,17,18,19], statistical guarantees for the EM algorithm [3], statistical analysis of algorithm leveraging
A Workbench for Discovering Task-Specific Theories of Learning
1989-03-03
mind (the cognitive architecture) will not be of much use to educators who wish to perform a cognitive task analysis of their subject matter before...analysis packages that can be added to a cognitive architecture, thus creating a ’workbench’ for performing cognitive task analysis. Such tools become...learning theories have been. Keywords: Cognitive task analysis, Instructional design, Cognitive modelling, Learning.
Montgomery Jr., Erwin B.
2016-01-01
Theories impact the movement disorders clinic, not only affecting the development of new therapies but determining how current therapies are used. Models are theories that are procedural rather than declarative. Theories and models are important because, as argued by Kant, one cannot know the thing-in-itself (das Ding an sich) and only a model is knowable. Further, biological variability forces a higher level of abstraction relevant for all variants. That abstraction is the raison d’être of theories and models. Theories “connect the dots” to move from correlation to causation. The necessity of theory makes theories either helpful or counterproductive. Theories and models of the pathophysiology and physiology of the basal ganglia–thalamic–cortical system do not spontaneously arise but have a history and consequently are legacies. Over the last 40 years, numerous theories and models of the basal ganglia have been proposed only to be forgotten or dismissed, rarely critiqued. It is not harsh to say that current popular theories positing increased neuronal activities in the Globus Pallidus Interna (GPi), excessive beta oscillations, and increased synchronization not only fail to provide an adequate explication but are inconsistent with many observations. It is likely that their shared intellectual and epistemic inheritance plays a part in their shared failures. These issues are critically examined. How one might derive better theories and models is explored as well. PMID:27708569
Depicting the logic of three evaluation theories.
Hansen, Mark; Alkin, Marvin C; Wallace, Tanner Lebaron
2013-06-01
Here, we describe the development of logic models depicting three theories of evaluation practice: Practical Participatory (Cousins & Whitmore, 1998), Values-engaged (Greene, 2005a, 2005b), and Emergent Realist (Mark et al., 1998). We begin with a discussion of evaluation theory and the particular theories that were chosen for our analysis. We then outline the steps involved in constructing the models. The theoretical prescriptions and claims represented here follow a logic model template developed at the University of Wisconsin-Extension (Taylor-Powell & Henert, 2008), which also closely aligns with Mark's (2008) framework for research on evaluation. Copyright © 2012 Elsevier Ltd. All rights reserved.
1997-09-30
modeled as either an effective fluid, effective viscoelastic solid, or a saturated poroelastic medium. The analysis included only the breathing mode...separated for each model. Finally, if a sediment is modeled by Biot theory, which describes wave propagation in a saturated poroelastic medium, then two...theory to sediment acoustics. The predicted resonance behavior under each model is distinct, so an optical extinction measurement may provide an
PESTAN: Pesticide Analytical Model Version 4.0 User's Guide
The principal objective of this User's Guide is to provide essential information on aspects such as model conceptualization, model theory, assumptions and limitations, determination of input parameters, analysis of results, and sensitivity analysis.
Testing students' e-learning via Facebook through Bayesian structural equation modeling.
Salarzadeh Jenatabadi, Hashem; Moghavvemi, Sedigheh; Wan Mohamed Radzi, Che Wan Jasimah Bt; Babashamsi, Parastoo; Arashi, Mohammad
2017-01-01
Learning is an intentional activity, with several factors affecting students' intention to use new learning technology. Researchers have investigated technology acceptance in different contexts by developing various theories/models and testing them by a number of means. Although most theories/models developed have been examined through regression or structural equation modeling, Bayesian analysis offers more accurate data analysis results. To address this gap, the unified theory of acceptance and technology use in the context of e-learning via Facebook are re-examined in this study using Bayesian analysis. The data (S1 Data) were collected from 170 students enrolled in a business statistics course at University of Malaya, Malaysia, and tested with the maximum likelihood and Bayesian approaches. The difference between the two methods' results indicates that performance expectancy and hedonic motivation are the strongest factors influencing the intention to use e-learning via Facebook. The Bayesian estimation model exhibited better data fit than the maximum likelihood estimator model. The results of the Bayesian and maximum likelihood estimator approaches are compared and the reasons for the result discrepancy are deliberated.
Towards a Theory of Organisational Culture.
ERIC Educational Resources Information Center
Owens, Robert G.; Steinhoff, Carl R.
1989-01-01
The development of the paper-and-pencil instrument called the Organizational Culture Assessment Inventory (OCAI) is based on the theory of organizational culture. Recent literature and organizational analysis are combined with Schein's model of organizational culture to provide the background for metaphorical analysis of organizational culture…
Hagerty, Thomas A; Samuels, William; Norcini-Pala, Andrea; Gigliotti, Eileen
2017-04-01
A confirmatory factor analysis of data from the responses of 12,436 patients to 16 items on the Consumer Assessment of Healthcare Providers and Systems-Hospital survey was used to test a latent factor structure based on Peplau's middle-range theory of interpersonal relations. A two-factor model based on Peplau's theory fit these data well, whereas a three-factor model also based on Peplau's theory fit them excellently and provided a suitable alternate factor structure for the data. Though neither the two- nor three-factor model fit as well as the original factor structure, these results support using Peplau's theory to demonstrate nursing's extensive contribution to the experiences of hospitalized patients.
Similarity Theory of Withdrawn Water Temperature Experiment
2015-01-01
Selective withdrawal from a thermally stratified reservoir has been widely utilized in managing reservoir water withdrawal. Besides theoretical analysis and numerical simulation, model tests are also necessary in studying the temperature of withdrawn water. However, information on the similarity theory of the withdrawn water temperature model remains lacking. Considering the flow features of selective withdrawal, the similarity theory of the withdrawn water temperature model was analyzed theoretically based on the modification of the governing equations, the Boussinesq approximation, and some simplifications. The similarity conditions between the model and the prototype were suggested. The conversion of withdrawn water temperature between the model and the prototype was proposed. Meanwhile, the fundamental theory of temperature distribution conversion was first proposed, which can significantly improve experiment efficiency when the basic temperature of the model differs from that of the prototype. Based on the similarity theory, an experiment was performed on the withdrawn water temperature, which was verified by a numerical method. PMID:26065020
Koch, Ina; Junker, Björn H; Heiner, Monika
2005-04-01
Because of the complexity of metabolic networks and their regulation, formal modelling is a useful method to improve the understanding of these systems. An essential step in network modelling is to validate the network model. Petri net theory provides algorithms and methods that can be applied directly to metabolic network modelling and analysis in order to validate the model. The metabolism between sucrose and starch in the potato tuber is of great research interest. Even though this metabolism is one of the best studied in sink organs, it is not yet fully understood. We provide an approach for model validation of metabolic networks using Petri net theory, which we demonstrate for the sucrose breakdown pathway in the potato tuber. We start with hierarchical modelling of the metabolic network as a Petri net and continue with the analysis of qualitative properties of the network. The results characterize the net structure and give insights into the complex net behaviour.
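One concrete validation step consistent with this approach is computing P-invariants, weighted sets of places whose token count is conserved; here is a sketch on a made-up two-transition net (not the sucrose-breakdown model itself):

```python
import sympy as sp

# Incidence matrix C of a toy Petri net: rows = places, columns =
# transitions; entries give net token change when a transition fires.
C = sp.Matrix([
    [-1,  1],   # place p1: consumed by t1, produced by t2
    [ 1, -1],   # place p2: produced by t1, consumed by t2
    [ 0,  0],   # place p3: untouched by either transition
])

# y is a P-invariant iff y^T C = 0, i.e. y lies in the null space of C^T.
for y in C.T.nullspace():
    print(y.T)  # each basis vector is a weighted token-conservation law
```

For this toy net the invariants say that p1 + p2 is conserved and p3 is trivially conserved; in a metabolic model such laws correspond to conserved moiety pools and are checked against biochemical knowledge.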
Recent developments in rotary-wing aerodynamic theory
NASA Technical Reports Server (NTRS)
Johnson, W.
1986-01-01
Current progress in the computational analysis of rotary-wing flowfields is surveyed, and some typical results are presented in graphs. Topics examined include potential theory, rotating coordinate systems, lifting-surface theory (moving singularity, fixed wing, and rotary wing), panel methods (surface singularity representations, integral equations, and compressible flows), transonic theory (the small-disturbance equation), wake analysis (hovering rotor-wake models and transonic blade-vortex interaction), limitations on computational aerodynamics, and viscous-flow methods (dynamic-stall theories and lifting-line theory). It is suggested that the present algorithms and advanced computers make it possible to begin working toward the ultimate goal of turbulent Navier-Stokes calculations for an entire rotorcraft.
ERIC Educational Resources Information Center
Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.
Current interest in the assessment of measurement equivalence emphasizes two methods of analysis, linear, and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item function or IRT-Based…
Interest Rates and Coupon Bonds in Quantum Finance
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.
2009-09-01
1. Synopsis
2. Interest rates and coupon bonds
3. Options and option theory
4. Interest rate and coupon bond options
5. Quantum field theory of bond forward interest rates
6. Libor Market Model of interest rates
7. Empirical analysis of forward interest rates
8. Libor Market Model of interest rate options
9. Numeraires for bond forward interest rates
10. Empirical analysis of interest rate caps
11. Coupon bond European and Asian options
12. Empirical analysis of interest rate swaptions
13. Correlation of coupon bond options
14. Hedging interest rate options
15. Interest rate Hamiltonian and option theory
16. American options for coupon bonds and interest rates
17. Hamiltonian derivation of coupon bond options
Appendixes; Glossaries; List of symbols; Reference; Index.
NASA Astrophysics Data System (ADS)
Grover, D.; Seth, R. K.
2018-05-01
Analysis and numerical results are presented for the thermoelastic dissipation of a homogeneous, isotropic, thermally conducting, Kelvin-Voigt-type circular micro-plate based on Kirchhoff's Love plate theory, utilizing the generalized viscothermoelasticity theory of the dual-phase-lagging model. Analytical expressions for the thermoelastic damping of vibration and the frequency shift are obtained for the generalized dual-phase-lagging model and coupled viscothermoelastic plates. The scaled thermoelastic damping is illustrated for a circular plate and an axisymmetric circular plate of fixed aspect ratio under clamped and simply supported boundary conditions. It is observed that the damping of vibrations depends significantly on the time delay and mechanical relaxation times, in addition to the thermo-mechanical coupling, in a circular plate under resonance conditions, as well as on the plate dimensions.
ERIC Educational Resources Information Center
Beekhoven, S.; De Jong, U.; Van Hout, H.
2002-01-01
Compared elements of rational choice theory and integration theory on the basis of their power to explain variance in academic progress. Asserts that the concepts should be combined, and the distinction between social and academic integration abandoned. Empirical analysis showed that an extended model, comprising both integration and rational…
A Biblical-Theological Model of Cognitive Dissonance Theory: Relevance for Christian Educators
ERIC Educational Resources Information Center
Bowen, Danny Ray
2012-01-01
The purpose of this content analysis research was to develop a biblical-theological model of Cognitive Dissonance Theory applicable to pedagogy. Evidence of cognitive dissonance found in Scripture was used to infer a purpose for the innate drive toward consonance. This inferred purpose was incorporated into a model that improves the descriptive…
ERIC Educational Resources Information Center
Wiio, Osmo A.
A more unified approach to communication theory can evolve through systems modeling of information theory, communication modes, and mass media operations. Such systematic analysis proposes, as is the case here, that information models be based upon combinations of energy changes and exchanges and changes in receiver systems. The mass media is…
ERIC Educational Resources Information Center
Kelderman, Henk
1992-01-01
Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
ERIC Educational Resources Information Center
Becher, Ayelet; Orland-Barak, Lily
2016-01-01
This study suggests an integrative qualitative methodological framework for capturing complexity in mentoring activity. Specifically, the model examines how historical developments of a discipline direct mentors' mediation of professional knowledge through the language that they use. The model integrates social activity theory and a framework of…
Model Choice and Sample Size in Item Response Theory Analysis of Aphasia Tests
ERIC Educational Resources Information Center
Hula, William D.; Fergadiotis, Gerasimos; Martin, Nadine
2012-01-01
Purpose: The purpose of this study was to identify the most appropriate item response theory (IRT) measurement model for aphasia tests requiring 2-choice responses and to determine whether small samples are adequate for estimating such models. Method: Pyramids and Palm Trees (Howard & Patterson, 1992) test data that had been collected from…
Two Paradoxes in Linear Regression Analysis.
Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong
2016-12-25
Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.
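A simulation sketch of one pitfall of informal model selection, assuming the paradox in question concerns univariate pre-screening (a common wrong procedure): a predictor whose marginal correlation with the outcome is near zero can carry a large coefficient once its correlated companion is adjusted for, so screening by univariate p-values would discard it:

```python
import numpy as np

# Suppressor-variable demonstration: x2 looks useless marginally but
# is strongly predictive in the joint model. All parameters invented.

rng = np.random.default_rng(0)
n = 5000
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + np.sqrt(1 - 0.9**2) * rng.normal(size=n)  # corr(x1,x2) ~ 0.9
y = x1 - 0.9 * x2 + 0.5 * rng.normal(size=n)              # x2 matters jointly

print("marginal corr(x2, y):", round(np.corrcoef(x2, y)[0, 1], 3))  # near 0

X = np.column_stack([np.ones(n), x1, x2])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print("multivariable coefficient on x2:", round(beta[2], 3))        # near -0.9
```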
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.
2000-01-01
A research program is in progress to develop strain rate dependent deformation and failure models for the analysis of polymer matrix composites subject to impact loads. Previously, strain rate dependent inelastic constitutive equations developed to model the polymer matrix were implemented into a mechanics of materials based micromechanics method. In the current work, the computation of the effective inelastic strain in the micromechanics model was modified to fully incorporate the Poisson effect. The micromechanics equations were also combined with classical laminate theory to enable the analysis of symmetric multilayered laminates subject to in-plane loading. A quasi-incremental trapezoidal integration method was implemented to integrate the constitutive equations within the laminate theory. Verification studies were conducted using an AS4/PEEK composite using a variety of laminate configurations and strain rates. The predicted results compared well with experimentally obtained values.
Traffic Flow Density Distribution Based on FEM
NASA Astrophysics Data System (ADS)
Ma, Jing; Cui, Jianming
In the analysis of normal traffic flow, static or dynamic models based on fluid mechanics are usually employed for numerical analysis. However, this approach entails extensive modeling and data handling, and its accuracy is not high. The Finite Element Method (FEM) is a product of the combination of modern mathematics, mechanics, and computer technology, and it has been widely applied in various domains such as engineering. Based on the existing theory of traffic flow, ITS, and the development of FEM, a simulation theory of FEM that solves the problems existing in traffic flow analysis is put forward. Based on this theory, and using existing Finite Element Analysis (FEA) software, traffic flow is simulated and analyzed with fluid mechanics and dynamics. The problem of massive data processing in manual modeling and numerical analysis is thereby resolved, and the authenticity of the simulation is enhanced.
Acknowledging the Infrasystem: A Critical Feminist Analysis of Systems Theory.
ERIC Educational Resources Information Center
Creedon, Pamela J.
1993-01-01
Examines the absence of a critical feminist perspective in the application of systems theory as a unifying model for public relations. Describes an unacknowledged third system, the infrasystem, that constructs both suprasystem and subsystem interactions. Concludes with a case analysis of sport as illustration. (HB)
Identifying Key Actors in Heterogeneous Networks
2017-11-29
analysis (SNA) and game theory (GT) to improve accuracy for detecting significant or “powerful” actors within a total actor space when both resource...coalesce in order to achieve a desired outcome. Cooperative game theory (CGT) models of coalition formation are based on two limiting assumptions: that...demonstration of a new approach for synthesizing social network analysis and game theory. The ultimate goal of this research agenda is to generalize
Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferson, Scott; Nelsen, Roger B.; Hajagos, Janos
2015-05-01
This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.
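As one example of the techniques reviewed, correlated variates for a given correlation measure can be simulated with a Gaussian copula; the margins and the 0.7 correlation below are illustrative assumptions, not values from the report:

```python
import numpy as np
from scipy.stats import norm

# Gaussian-copula sampling: draw correlated normals, map to uniforms,
# then push the uniforms through any desired marginal distributions.

rng = np.random.default_rng(42)
rho = 0.7
cov = [[1.0, rho], [rho, 1.0]]
z = rng.multivariate_normal([0.0, 0.0], cov, size=10_000)

u = norm.cdf(z)                          # correlated uniforms on [0,1]^2
x = -np.log(1 - u[:, 0])                 # exponential(1) margin
y = 2.0 + 3.0 * u[:, 1]                  # uniform(2, 5) margin

# the copula fixes the rank dependence regardless of the margins chosen
print("correlation of the uniforms:", round(np.corrcoef(u.T)[0, 1], 2))
```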
Trauma and Psychotherapy: Implications from a Behavior Analysis Perspective
ERIC Educational Resources Information Center
Prather, Walter
2007-01-01
Attachment theory provides a useful conceptual framework for understanding trauma and the treatment of abuse in children. This article examines attachment theory and traditional models of family therapy from the perspective of behavior analysis, and provides a rationale for a behavioral treatment approach for abused children and their foster or…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
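A minimal sketch of a sampling-based propagation of this kind, assuming a one-dimensional input whose uncertainty is given by interval focal elements with masses: each focal element is sampled to bound the model output, and belief/plausibility of an output event are accumulated from the masses (the model, intervals, and threshold are illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(7)
focal = [((0.0, 1.0), 0.5), ((0.5, 2.0), 0.3), ((1.5, 3.0), 0.2)]  # (interval, mass)
f = lambda x: x**2 + 1.0                                           # the "model"
threshold = 3.0                                                    # event: f(x) <= 3

belief = plaus = 0.0
for (lo, hi), m in focal:
    ys = f(rng.uniform(lo, hi, size=1000))  # sample within the focal element
    if ys.max() <= threshold:   # whole focal element inside the event
        belief += m
    if ys.min() <= threshold:   # focal element merely intersects the event
        plaus += m

print(f"Bel(f(x) <= {threshold}) = {belief:.2f}, Pl = {plaus:.2f}")
```

The gap between belief and plausibility is exactly the epistemic slack that a single probability distribution cannot express; for expensive models, the inner sampling is where the computational cost concentrates.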
ERIC Educational Resources Information Center
Kimball, Ezekiel W.
2012-01-01
Recent literature suggests a problematic connection between theory and practice in higher education scholarship generally and the study of student learning and development specifically (e.g. Bensimon, 2007; Kezar, 2000; Love, 2012). Much of this disconnect stems from a lack of differentiation between various types of theory used in student affairs…
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam)
In evolutionary computing (EC), population size is one of the critical parameters that a researcher has to deal with. Hence, it was no surprise that the pioneers of EC, such as De Jong (1975) and Holland (1975), had already studied population sizing from the very beginning of EC. What is perhaps surprising is that more than three decades later, we still largely depend on experience or an ad-hoc trial-and-error approach to set the population size. For example, in a recent monograph, Eiben and Smith (2003) indicated: "In almost all EC applications, the population size is constant and does not change during the evolutionary search." Despite enormous research on this issue in recent years, we still lack a well-accepted theory for population sizing. In this paper, I propose to develop a population dynamics theory for EC with inspiration from the population dynamics theory of biological populations in nature. Essentially, the EC population is considered as a dynamic system over time (generations) and space (search space or fitness landscape), similar to the spatial and temporal dynamics of biological populations in nature. With this conceptual mapping, I propose to 'transplant' the biological population dynamics theory to EC via three steps: (i) experimentally test the feasibility—whether or not emulating natural population dynamics improves the EC performance; (ii) comparatively study the underlying mechanisms—why there are improvements, primarily via statistical modeling analysis; (iii) conduct theoretical analysis with theoretical models such as percolation theory and extended evolutionary game theory that are generally applicable to both EC and natural populations. This article is a summary of a series of studies we have performed to achieve the general goal [27][30]-[32]. In the following, I start with an extremely brief introduction on the theory and models of natural population dynamics (Sections 1 & 2). In Sections 4 to 6, I briefly discuss three categories of population dynamics models: deterministic modeling with the Logistic chaos map as an example, stochastic modeling with spatial distribution patterns as an example, as well as survival analysis and extended evolutionary game theory (EEGT) modeling. Sample experimental results with genetic algorithms (GA) are presented to demonstrate the applications of these models. The proposed EC population dynamics approach also makes survival selection largely unnecessary or much simplified, since the individuals are naturally selected (controlled) by the mathematical models for EC population dynamics.
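A sketch of the deterministic ingredient named above, assuming the logistic chaos map is used to set a generation-dependent population size; the size bounds and r = 3.9 are illustrative, and the GA generation itself is elided:

```python
import numpy as np

# Logistic map x <- r*x*(1-x) in its chaotic regime (r = 3.9) drives
# the population size each generation instead of keeping it constant.

r, x = 3.9, 0.3
min_pop, max_pop = 20, 200      # illustrative population bounds

sizes = []
for gen in range(50):
    x = r * x * (1.0 - x)       # chaotic trajectory stays in (0, 1)
    sizes.append(int(min_pop + x * (max_pop - min_pop)))
    # ... run one GA generation (selection, crossover, mutation) with the
    #     population truncated or expanded to sizes[-1] individuals ...

print(sizes[:10])
```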
An imprecise probability approach for squeal instability analysis based on evidence theory
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-01-01
An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters are usually involved with imprecise data such as incomplete information and conflict information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to the squeal problems without too many investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
NASA Astrophysics Data System (ADS)
Cocco, Alex P.; Nakajo, Arata; Chiu, Wilson K. S.
2017-12-01
We present a fully analytical, heuristic model - the "Analytical Transport Network Model" - for steady-state, diffusive, potential flow through a 3-D network. Employing a combination of graph theory, linear algebra, and geometry, the model explicitly relates a microstructural network's topology and the morphology of its channels to an effective material transport coefficient (a general term meant to encompass, e.g., conductivity or diffusion coefficient). The model's transport coefficient predictions agree well with those from electrochemical fin (ECF) theory and finite element analysis (FEA), but are computed 0.5-1.5 and 5-6 orders of magnitude faster, respectively. In addition, the theory explicitly relates a number of morphological and topological parameters directly to the transport coefficient, whereby the distributions that characterize the structure are readily available for further analysis. Furthermore, ATN's explicit development provides insight into the nature of the tortuosity factor and offers the potential to apply theory from network science and to consider the optimization of a network's effective resistance in a mathematically rigorous manner. The ATN model's speed and relative ease-of-use offer the potential to aid in accelerating the design (with respect to transport), and thus reducing the cost, of energy materials.
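The linear-algebra core suggested by the abstract can be sketched as follows, computing an effective conductance between two terminals of a small made-up channel network from its weighted graph Laplacian (a generic resistor-network calculation, not the ATN code itself):

```python
import numpy as np

# Each edge (i, j, g) is a channel with conductance g; the effective
# transport coefficient between terminals s and t follows from solving
# the Laplacian system with unit flux injected at s and node t grounded.

edges = [(0, 1, 2.0), (1, 3, 2.0), (0, 2, 1.0), (2, 3, 1.0), (1, 2, 0.5)]
n, s, t = 4, 0, 3

L = np.zeros((n, n))            # weighted graph Laplacian
for i, j, g in edges:
    L[i, i] += g; L[j, j] += g
    L[i, j] -= g; L[j, i] -= g

b = np.zeros(n); b[s] = 1.0     # unit flux in at s; ground absorbs it at t
keep = [k for k in range(n) if k != t]
phi = np.zeros(n)
phi[keep] = np.linalg.solve(L[np.ix_(keep, keep)], b[keep])

print("effective conductance s->t:", round(1.0 / (phi[s] - phi[t]), 3))
```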
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbero, E.J.
1989-01-01
In this study, a computational model for the accurate analysis of composite laminates and laminates with delaminated interfaces is developed. An accurate prediction of stress distributions, including interlaminar stresses, is obtained by using the Generalized Laminate Plate Theory of Reddy, in which a layer-wise linear approximation of the displacements through the thickness is used. Analytical as well as finite-element solutions of the theory are developed for bending and vibrations of laminated composite plates for the linear theory. Geometrical nonlinearity, including buckling and postbuckling, is included and used to perform stress analysis of laminated plates. A general two-dimensional theory of laminated cylindrical shells is also developed in this study. Geometrical nonlinearity and transverse compressibility are included. Delaminations between layers of composite plates are modelled by jump discontinuity conditions at the interfaces. The theory includes multiple delaminations through the thickness. Geometric nonlinearity is included to capture layer buckling. The strain energy release rate distribution along the boundary of delaminations is computed by a novel algorithm. The computational models presented herein are accurate for global behavior and particularly appropriate for the study of local effects.
Cultural Geography Model Validation
2010-03-01
the Cultural Geography Model (CGM), a government owned, open source multi-agent system utilizing Bayesian networks, queuing systems, the Theory of...referent determined either from theory or SME opinion. 4. CGM Overview The CGM is a government-owned, open source, data driven multi-agent social...HSCB, validation, social network analysis. In the current warfighting environment, the military needs robust modeling and simulation (M&S
Hybrid modeling and empirical analysis of automobile supply chain network
NASA Astrophysics Data System (ADS)
Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying
2017-05-01
Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for the dynamic evolutionary process of a consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in a GIS-based map. First, the model's soundness is verified by analyzing the consistency of sales and of changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate characteristic parameters such as mean distance, mean clustering coefficient, and degree distribution. By doing so, it is verified that the model is a typical scale-free and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of the complex network of an automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, constructing and simulating the model by combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as a theoretical basis and practical experience for the supply chain analysis of auto companies.
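The network-level checks described above can be sketched with standard tools; here a Barabasi-Albert graph stands in for the simulated supply chain's agent network, which is not reproduced here:

```python
import networkx as nx

# Stand-in graph: preferential attachment gives the heavy-tailed degree
# distribution that is the scale-free signature the abstract mentions.
G = nx.barabasi_albert_graph(n=500, m=2, seed=1)

print("mean distance:", round(nx.average_shortest_path_length(G), 2))
print("mean clustering coefficient:", round(nx.average_clustering(G), 3))

# degree distribution: many low-degree agents, a few highly connected hubs
hist = nx.degree_histogram(G)
for k, count in enumerate(hist[:10]):
    print(f"degree {k}: {count} nodes")
```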
Lau-Walker, Margaret
2006-02-01
This paper analyses two prominent psychological theories of patient response--illness representation and self-efficacy--and explores the possibility of developing a conceptual individualized care model that makes use of both theories. Analysis of the literature established common themes that were used as the basis for a conceptual framework intended to assist in the joint application of these theories to therapeutic settings. Both theories emphasize personal experience, pre-construction of self, and individual response to illness and treatment, and both hold that patients' beliefs are more influential in their recovery than the severity of the illness. Where the theories are most divergent is in their application to therapeutic interventions, which reflects the different sources of influence that each theory emphasizes. Based on their similarities and differences, it is possible to integrate the two theories into a conceptual care model. The Interactive Care Model combines both theories of patient response and provides an explicit framework for further research into the design of effective therapeutic interventions in rehabilitation care.
Modeling of composite beams and plates for static and dynamic analysis
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Atilgan, Ali R.; Lee, Bok Woo
1990-01-01
A rigorous theory and corresponding computational algorithms were developed for a variety of problems regarding the analysis of composite beams and plates. The modeling approach is intended to be applicable to both static and dynamic analysis of generally anisotropic, nonhomogeneous beams and plates. Development of a theory for the analysis of the local deformation of plates was the major focus. Some work was performed on the global deformation of beams. Because of the strong parallel between beams and plates, the two were treated together as thin bodies, especially in cases where this clarifies the meaning of certain terminology and the motivation behind certain mathematical operations.
An integrative feminist model: the evolving feminist perspective on intimate partner violence.
McPhail, Beverly A; Busch, Noël Bridget; Kulkarni, Shanti; Rice, Gail
2007-08-01
The feminist perspective on intimate partner violence is a predominant model in the field, although not immune to criticism. In this research, frontline workers in the violence against women movement responded to critiques of the feminist model. The project used a focus group and a modified grounded theory analysis. Participants agreed with some criticisms, including an overreliance on a punitive criminal justice system, but reported skepticism toward proposed alternatives. Findings led to the development of the Integrative Feminist Model, which expands the feminist perspective in response to critiques, new research, and alternative theories while retaining a gendered analysis of violence.
A refined shear deformation theory for the analysis of laminated plates
NASA Technical Reports Server (NTRS)
Reddy, J. N.
1986-01-01
A refined, third-order plate theory that accounts for the transverse shear strains is presented, the Navier solutions are derived for certain simply supported cross-ply and antisymmetric angle-ply laminates, and finite-element models are developed for general laminates. The new theory does not require the shear correction factors of the first-order theory (i.e., the Reissner-Mindlin plate theory) because the transverse shear stresses are represented parabolically in the present theory. A mixed finite-element model that uses independent approximations of the generalized displacements and generalized moments, and a displacement model that uses only the generalized displacements as degrees of freedom are developed. The displacement model requires C sup 1-continuity of the transverse deflection across the inter-element boundaries, whereas the mixed model requires a C sup 0-element. Also, the mixed model does not require continuous approximations (between elements) of the bending moments. Numerical results are presented to show the accuracy of the present theory in predicting the transverse stresses. Numerical results are also presented for the nonlinear bending of plates, and the results compare well with the experimental results available in the literature.
Johnson, Victoria A; Ronan, Kevin R; Johnston, David M; Peace, Robin
2016-11-01
A main weakness in the evaluation of disaster education programs for children is evaluators' propensity to judge program effectiveness based on changes in children's knowledge. Few studies have articulated an explicit program theory of how children's education would achieve desired outcomes and impacts related to disaster risk reduction in households and communities. This article describes the advantages of constructing program theory models for the purpose of evaluating disaster education programs for children. Following a review of some potential frameworks for program theory development, including the logic model, the program theory matrix, and the stage step model, the article provides working examples of these frameworks. The first example is the development of a program theory matrix used in an evaluation of ShakeOut, an earthquake drill practiced in two Washington State school districts. The model illustrates a theory of action; specifically, the effectiveness of school earthquake drills in preventing injuries and deaths during disasters. The second example is the development of a stage step model used for a process evaluation of What's the Plan Stan?, a voluntary teaching resource distributed to all New Zealand primary schools for curricular integration of disaster education. The model illustrates a theory of use; specifically, expanding the reach of disaster education for children through increased promotion of the resource. The process of developing the program theory models for the purpose of evaluation planning is discussed, as well as the advantages and shortcomings of the theory-based approaches. © 2015 Society for Risk Analysis.
Bayesian structural equation modeling: a more flexible representation of substantive theory.
Muthén, Bengt; Asparouhov, Tihomir
2012-09-01
This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed Bayesian approach is particularly beneficial in applications where parameters are added to a conventional model such that a nonidentified model is obtained if maximum-likelihood estimation is applied. This approach is useful for measurement aspects of latent variable modeling, such as with confirmatory factor analysis, and the measurement part of structural equation modeling. Two application areas are studied, cross-loadings and residual correlations in confirmatory factor analysis. An example using a full structural equation model is also presented, showing an efficient way to find model misspecification. The approach encompasses 3 elements: model testing using posterior predictive checking, model estimation, and model modification. Monte Carlo simulations and real data are analyzed using Mplus. The real-data analyses use data from Holzinger and Swineford's (1939) classic mental abilities study, Big Five personality factor data from a British survey, and science achievement data from the National Educational Longitudinal Study of 1988.
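The key substitution can be written compactly; the following is a sketch in generic notation, where the small prior variance (0.01 is the commonly cited illustrative value) replaces the conventional exact-zero constraint on a cross-loading:

```latex
% exact-zero constraint in conventional CFA:
%   \lambda_{ij} = 0 for every non-target loading
% replaced by an informative small-variance prior:
\lambda_{ij} \sim \mathcal{N}\!\left(0,\ \sigma_0^2\right),
\qquad \sigma_0^2 \approx 0.01,
% so each item still loads essentially on its target factor,
% while tiny cross-loadings are tolerated rather than forced to zero.
```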
Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wells, James
The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, to discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective lagrangian, new physics theories will be developed that explain the anomaly and put it into a more unified framework beyond the Standard Model.
NASA Technical Reports Server (NTRS)
Nguyen, Nhan; Ting, Eric; Chaparro, Daniel
2017-01-01
This paper investigates the effect of nonlinear large deflection bending on the aerodynamic performance of a high aspect ratio flexible wing. A set of nonlinear static aeroelastic equations are derived for the large bending deflection of a high aspect ratio wing structure. An analysis is conducted to compare the nonlinear bending theory with the linear bending theory. The results show that the nonlinear bending theory is length-preserving, whereas the linear bending theory causes a non-physical effect of lengthening the wing structure under the no axial load condition. A modified lifting line theory is developed to compute the lift and drag coefficients of a wing structure undergoing a large bending deflection. The lift and drag coefficients are more accurately estimated by the nonlinear bending theory due to its length-preserving property. The nonlinear bending theory yields lower lift and span efficiency than the linear bending theory. A coupled aerodynamic-nonlinear finite element model is developed to implement the nonlinear bending theory for a Common Research Model (CRM) flexible wing wind tunnel model to be tested in the University of Washington Aeronautical Laboratory (UWAL). The structural stiffness of the model is designed to give about 10% wing tip deflection, which is large enough that the nonlinear deflection effect could become significant. The computational results show that the nonlinear bending theory yields slightly less lift than the linear bending theory for this wind tunnel model. As a result, the linear bending theory is deemed adequate for the CRM wind tunnel model.
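The length-preservation point can be illustrated numerically; here is a sketch assuming a cantilever-like cubic deflection shape and the 10% tip deflection mentioned above: under linear theory the projected span stays fixed, so the deformed axis comes out longer than the undeformed wing:

```python
import numpy as np

# Arc length of a deflected wing axis under linear bending assumptions.
# The cubic shape w(x) and 10% tip deflection are illustrative.

L_span, tip = 1.0, 0.10                 # unit semi-span, 10% tip deflection
x = np.linspace(0.0, L_span, 2001)
w = tip * (x / L_span)**3               # cantilever-like bending shape
dwdx = np.gradient(w, x)

f = np.sqrt(1.0 + dwdx**2)              # arc-length integrand
arc = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(x))   # trapezoidal rule
print(f"arc length under linear bending: {arc:.4f} (undeformed length 1.0000)")
# a length-preserving (nonlinear) theory would instead shorten the
# projected span so that the arc length stays exactly 1.0
```

Even at 10% tip deflection the spurious lengthening is under 1%, which is consistent with the paper's conclusion that linear theory remains adequate for this model.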
Analysis of high vacuum systems using SINDA'85
NASA Technical Reports Server (NTRS)
Spivey, R. A.; Clanton, S. E.; Moore, J. D.
1993-01-01
The theory, algorithms, and test data correlation analysis of a math model developed to predict the performance of the Space Station Freedom Vacuum Exhaust System are presented. The theory used to predict the flow characteristics of viscous, transition, and molecular flow is presented in detail. The development of user subroutines that predict these flow characteristics in conjunction with the SINDA'85/FLUINT analysis software is discussed. The resistance-capacitance network approach is demonstrated with application to vacuum system analysis, and results from the model are correlated with test data. Although the model was developed to predict the performance of the Space Station Freedom Vacuum Exhaust System, the unique use of the user subroutines written into the SINDA'85/FLUINT thermal analysis model provides a powerful tool for predicting the transient performance of vacuum systems and gas flow in tubes of virtually any geometry. This can be accomplished using a resistance-capacitance (R-C) method very similar to the methods used to perform thermal analyses.
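As a rough illustration of the R-C analogy described above (a generic two-chamber sketch, not the SINDA'85/FLUINT user subroutines; volumes and conductances are hypothetical, and the conductances are held constant even though real vacuum conductances vary with flow regime), chamber pressures can be integrated like node voltages in an R-C circuit:

```python
# Hedged sketch of the resistance-capacitance (R-C) analogy: chamber
# pressures play the role of node voltages, chamber volumes act as
# capacitances, and tube conductances as 1/R.
import numpy as np
from scipy.integrate import solve_ivp

V = np.array([0.5, 0.2])   # chamber volumes [m^3] (capacitances)
C12 = 1e-3                 # tube conductance, chamber 1 -> chamber 2 [m^3/s]
C2v = 5e-3                 # conductance, chamber 2 -> vacuum vent

def dPdt(t, P):
    # throughput balance expressed directly in pressure units:
    # V_i * dP_i/dt = sum over tubes of conductance * (P_j - P_i),
    # with the vent held at ~0 Pa
    q12 = C12 * (P[0] - P[1])
    qv = C2v * (P[1] - 0.0)
    return [-q12 / V[0], (q12 - qv) / V[1]]

sol = solve_ivp(dPdt, (0.0, 2000.0), [101325.0, 101325.0])
print("pressures after 2000 s:", sol.y[:, -1])
```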
A Proposed Model of Jazz Theory Knowledge Acquisition
ERIC Educational Resources Information Center
Ciorba, Charles R.; Russell, Brian E.
2014-01-01
The purpose of this study was to test a hypothesized model that proposes a causal relationship between motivation and academic achievement on the acquisition of jazz theory knowledge. A reliability analysis of the latent variables ranged from 0.92 to 0.94. Confirmatory factor analyses of the motivation (standardized root mean square residual…
ERIC Educational Resources Information Center
Holster, Trevor A.; Lake, J.
2016-01-01
Stewart questioned Beglar's use of Rasch analysis of the Vocabulary Size Test (VST) and advocated the use of 3-parameter logistic item response theory (3PLIRT) on the basis that it models a non-zero lower asymptote for items, often called a "guessing" parameter. In support of this theory, Stewart presented fit statistics derived from…
Data Visualization of Item-Total Correlation by Median Smoothing
ERIC Educational Resources Information Center
Yu, Chong Ho; Douglas, Samantha; Lee, Anna; An, Min
2016-01-01
This paper aims to illustrate how data visualization could be utilized to identify errors prior to modeling, using an example with multi-dimensional item response theory (MIRT). MIRT combines item response theory and factor analysis to identify a psychometric model that investigates two or more latent traits. While it may seem convenient to…
Principals' Leadership and Teachers' Motivation: Self-Determination Theory Analysis
ERIC Educational Resources Information Center
Eyal, Ori; Roth, Guy
2011-01-01
Purpose: The purpose of this paper is to investigate the relationship between educational leadership and teachers' motivation. The research described here was anchored in the convergence of two fundamental theories of leadership and motivation: the full range model of leadership and self-determination theory. The central hypotheses were that…
Underwood, Peter; Waterson, Patrick
2014-07-01
The Swiss Cheese Model (SCM) is the most popular accident causation model and is widely used throughout various industries. A debate exists in the research literature over whether the SCM remains a viable tool for accident analysis. Critics of the model suggest that it provides a sequential, oversimplified view of accidents. Conversely, proponents suggest that it embodies the concepts of systems theory, as per the contemporary systemic analysis techniques. The aim of this paper was to consider whether the SCM can provide a systems thinking approach and remain a viable option for accident analysis. To achieve this, the train derailment at Grayrigg was analysed with an SCM-based model (the ATSB accident investigation model) and two systemic accident analysis methods (AcciMap and STAMP). The analysis outputs and usage of the techniques were compared. The findings of the study showed that each model applied the systems thinking approach. However, the ATSB model and AcciMap graphically presented their findings in a more succinct manner, whereas STAMP more clearly embodied the concepts of systems theory. The study suggests that, whilst the selection of an analysis method is subject to trade-offs that practitioners and researchers must make, the SCM remains a viable model for accident analysis. Copyright © 2013 Elsevier Ltd. All rights reserved.
Hagger, Martin S.; Gucciardi, Daniel F.; Chatzisarantis, Nikos L. D.
2017-01-01
Tests of social cognitive theories provide informative data on the factors that relate to health behavior, and the processes and mechanisms involved. In the present article, we contend that tests of social cognitive theories should adhere to the principles of nomological validity, defined as the degree to which predictions in a formal theoretical network are confirmed. We highlight the importance of nomological validity tests to ensure theory predictions can be disconfirmed through observation. We argue that researchers should be explicit on the conditions that lead to theory disconfirmation, and identify any auxiliary assumptions on which theory effects may be conditional. We contend that few researchers formally test the nomological validity of theories, or outline conditions that lead to model rejection and the auxiliary assumptions that may explain findings that run counter to hypotheses, raising potential for ‘falsification evasion.’ We present a brief analysis of studies (k = 122) testing four key social cognitive theories in health behavior to illustrate deficiencies in reporting theory tests and evaluations of nomological validity. Our analysis revealed that few articles report explicit statements suggesting that their findings support or reject the hypotheses of the theories tested, even when findings point to rejection. We illustrate the importance of explicit a priori specification of fundamental theory hypotheses and associated auxiliary assumptions, and identification of the conditions which would lead to rejection of theory predictions. We also demonstrate the value of confirmatory analytic techniques, meta-analytic structural equation modeling, and Bayesian analyses in providing robust converging evidence for nomological validity. We provide a set of guidelines for researchers on how to adopt and apply the nomological validity approach to testing health behavior models. PMID:29163307
The scientific theory profile: A philosophy of science model for science teachers
NASA Astrophysics Data System (ADS)
Loving, Cathleen C.
A model called the Scientific Theory Profile was developed for use with preservice and inservice science teachers or with graduate students interested in the various ways scientific theories are perceived. Early indications - from a survey of institutions with science education programs and a survey of current science methods texts - are that too little emphasis is placed on what contemporary writings reveal about the nature and importance of scientific theories. This prompted the development of the Profile. The Profile consists of a grid, with the x-axis representing methods for judging theories (rational vs. natural), and the y-axis representing views on reigning scientific theories as being the Truth versus models of what works best (realism vs. anti-realism). Three well-known philosophers of science who were selected for detailed analysis and who form the keystone positions on the Profile are Thomas Kuhn, Carl Hempel, and Sir Karl Popper. The hypothesis was that an analysis of the writings of respected individuals in philosophy and history of science who have different perspectives on theories (as well as overarching areas of agreement) could be translated into relative coordinates on a graph; and that this visual model might be helpful to science teachers in developing a balanced philosophy of science and a deeper understanding of the power of reigning theories. Nine other contemporary philosophers, all influenced by the three originals, are included in brief analyses, with their positions on the grid being relative to the keystones. The Scientific Theory Profile then forms the basis for a course, now in the planning stages, in perspectives on the nature of science, primarily for science teachers, with some objectives and activities suggested.
NASA Astrophysics Data System (ADS)
Arai, Shun; Nishizawa, Atsushi
2018-05-01
Gravitational waves (GWs) are generally affected by modification of a gravity theory during propagation over cosmological distances. We perform a quantitative numerical analysis of Horndeski theory on cosmological scales to constrain it with GW observations in a model-independent way. We formulate a parametrization for a numerical simulation based on the Monte Carlo method and obtain a classification of the models that agree with cosmic accelerating expansion within the observational errors of the Hubble parameter. As a result, we find that a large group of models in the Horndeski theory that mimic the cosmic expansion of the ΛCDM model can be excluded by the simultaneous detection of a GW and its electromagnetic transient counterpart. Based on our result and the latest detection of GW170817 and GRB170817A, we conclude that the subclass of Horndeski theory including the arbitrary functions G4 and G5 can hardly explain cosmic accelerating expansion without fine-tuning.
NASA Astrophysics Data System (ADS)
Dasgupta, Sambarta
Transient stability and sensitivity analysis of power systems are problems of enormous academic and practical interest. These classical problems have received renewed interest because of the advancement in sensor technology in the form of phasor measurement units (PMUs). This advancement has provided a unique opportunity for the development of real-time stability monitoring and sensitivity analysis tools. The transient stability problem in power systems is inherently a problem of stability analysis of non-equilibrium dynamics, because for a short time period following a fault or disturbance the system trajectory moves away from the equilibrium point. The real-time stability decision has to be made over this short time period. However, the existing stability definitions, and hence the analysis tools for transient stability, are asymptotic in nature. In this thesis, we develop theoretical foundations for the short-term transient stability analysis of power systems, based on the theory of normally hyperbolic invariant manifolds and finite-time Lyapunov exponents, adopted from the geometric theory of dynamical systems. The theory of normally hyperbolic surfaces allows us to characterize the rate of expansion and contraction of co-dimension one material surfaces in the phase space. The expansion and contraction rates of these material surfaces can be computed in finite time. We prove that the expansion and contraction rates can be used as finite-time transient stability certificates. Furthermore, material surfaces with maximum expansion and contraction rates are identified with the stability boundaries. These stability boundaries are used for computation of the stability margin. We have used this theoretical framework for the development of model-based and model-free real-time stability monitoring methods. Both the model-based and model-free approaches rely on the availability of high-resolution time series data from the PMUs for stability prediction. The problem of sensitivity analysis of power systems subjected to changes or uncertainty in load parameters and network topology is also studied using the theory of normally hyperbolic manifolds. The sensitivity analysis is used for the identification and rank ordering of the critical interactions and parameters in the power network. The sensitivity analysis is carried out both in finite time and asymptotically. One of the distinguishing features of the asymptotic sensitivity analysis is that the asymptotic dynamics of the system is assumed to be a periodic orbit. For the asymptotic sensitivity analysis we employ a combination of tools from ergodic theory and the geometric theory of dynamical systems.
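A minimal sketch of the finite-time ingredient of this approach (not the thesis code; the swing-equation parameters are illustrative) estimates a finite-time Lyapunov exponent from the flow-map Jacobian of a single-machine swing equation:

```python
# Hedged sketch of a finite-time Lyapunov exponent (FTLE) estimate for the
# classic swing equation,
#   delta'' = (Pm - Pmax*sin(delta) - D*delta') / M,
# using the flow-map Jacobian approximated by finite differences of nearby
# trajectories. Parameter values are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

M, D, Pm, Pmax = 0.1, 0.05, 0.8, 1.0

def swing(t, x):
    delta, omega = x
    return [omega, (Pm - Pmax * np.sin(delta) - D * omega) / M]

def flow(x0, T):
    return solve_ivp(swing, (0.0, T), x0, rtol=1e-9, atol=1e-12).y[:, -1]

def ftle(x0, T=1.0, eps=1e-6):
    x0 = np.asarray(x0, dtype=float)
    J = np.empty((2, 2))                 # finite-difference flow-map Jacobian
    for j in range(2):
        dx = np.zeros(2); dx[j] = eps
        J[:, j] = (flow(x0 + dx, T) - flow(x0 - dx, T)) / (2 * eps)
    # largest singular value gives the maximal finite-time stretching rate
    sigma_max = np.linalg.svd(J, compute_uv=False)[0]
    return np.log(sigma_max) / T

print("FTLE near stable equilibrium :", ftle([np.arcsin(Pm / Pmax), 0.0]))
print("FTLE near a disturbed state  :", ftle([2.0, 2.0]))
```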
Econometrics of exhaustible resource supply: a theory and an application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epple, D.
1983-01-01
This report takes a major step toward developing a fruitful approach to empirical analysis of resource supply. It is the first empirical application of resource theory that has successfully integrated the effects of depletion of nonrenewable resources with the effects of uncertainty about future costs and prices on supply behavior. Thus, the model is a major improvement over traditional engineering-optimization models that assume complete certainty, and over traditional econometric models that are only implicitly related to the theory of resource supply. The model is used to test hypotheses about the interdependence of oil and natural gas discoveries, depletion, ultimate recovery, and the role of price expectations. This paper demonstrates the feasibility of using exhaustible resource theory in the development of empirically testable models. 19 refs., 1 fig., 5 tabs.
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
NASA Technical Reports Server (NTRS)
Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.
2005-01-01
When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions primarily due to instrumental limitations, yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection must draw on this knowledge and ensure that all relevant scales and effects can be resolved by mission measurements. The SMART mission has responded to this need through a tight integration between instrument and theory and modeling teams. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored to SMART needs during mission development and science analysis. In this presentation, we will present an overview of SMART theory and modeling team activities. In particular, we will provide examples of science objectives derived from state-of-the-art models, and of recent research results that continue to be utilized in SMART mission development.
Do violations of the axioms of expected utility theory threaten decision analysis?
Nease, R F
1996-01-01
Research demonstrates that people violate the independence principle of expected utility theory, raising the question of whether expected utility theory is normative for medical decision making. The author provides three arguments that violations of the independence principle are less problematic than they might first appear. First, the independence principle follows from other more fundamental axioms whose appeal may be more readily apparent than that of the independence principle. Second, the axioms need not be descriptive to be normative, and they need not be attractive to all decision makers for expected utility theory to be useful for some. Finally, by providing a metaphor of decision analysis as a conversation between the actual decision maker and a model decision maker, the author argues that expected utility theory need not be purely normative for decision analysis to be useful. In short, violations of the independence principle do not necessarily represent direct violations of the axioms of expected utility theory; behavioral violations of the axioms of expected utility theory do not necessarily imply that decision analysis is not normative; and full normativeness is not necessary for decision analysis to generate valuable insights.
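A small numeric illustration of the independence principle (constructed for this summary, not taken from the article) uses the classic Allais lotteries: for any utility function, expected utility theory forces the A-versus-B and C-versus-D comparisons to agree, so the common human pattern of preferring A and D violates independence:

```python
# Under expected utility theory, preferring lottery A over B forces a
# preference of C over D in the Allais setup, because the two pairs differ
# by the same common component. The utility function here is an assumed,
# strongly concave example.
import numpy as np

def eu(prizes, probs, u=lambda m: m ** 0.05):
    return float(np.dot([u(m) for m in prizes], probs))

# Allais lotteries (prizes in $1000s)
A = ([1000], [1.00])                       # sure $1M
B = ([5000, 1000, 0], [0.10, 0.89, 0.01])
C = ([1000, 0], [0.11, 0.89])
D = ([5000, 0], [0.10, 0.90])

print("EU(A) > EU(B):", eu(*A) > eu(*B))
print("EU(C) > EU(D):", eu(*C) > eu(*D))
# For ANY utility u, EU(A)-EU(B) = EU(C)-EU(D)
#   = 0.11*u(1000) - 0.10*u(5000) - 0.01*u(0),
# so the two comparisons must agree; the common human pattern of A over B
# but D over C therefore violates the independence principle.
```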
Titman, Andrew C; Lancaster, Gillian A; Colver, Allan F
2016-10-01
Both item response theory and structural equation models are useful in the analysis of ordered categorical responses from health assessment questionnaires. We highlight the advantages and disadvantages of the item response theory and structural equation modelling approaches to modelling ordinal data, from within a community health setting. Using data from the SPARCLE project focussing on children with cerebral palsy, this paper investigates the relationship between two ordinal rating scales: the KIDSCREEN, which measures quality-of-life, and the Life-H, which measures participation. Practical issues relating to fitting models, such as non-positive definite observed or fitted correlation matrices, and approaches to assessing model fit are discussed. Item response theory models allow properties such as the conditional independence of particular domains of a measurement instrument to be assessed. When, as with the SPARCLE data, the latent traits are multidimensional, structural equation models generally provide a much more convenient modelling framework. © The Author(s) 2013.
Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H
2017-10-01
Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which is in our case study an overestimation of a factor two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
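The flavor of the comparison can be sketched as follows (illustrative probabilities, losses, and parameters only; this is not the article's calibrated agent-based model), valuing a loss-reducing measure under expected utility and under prospect theory with Tversky-Kahneman weighting:

```python
# Hedged sketch: value a loss-reducing flood investment under
# (1) expected utility and (2) prospect theory with probability weighting.
import numpy as np

p_flood = 0.01        # annual flood probability (assumed)
loss = 50_000.0       # flood damage without measures [EUR] (assumed)
reduction = 0.4       # fraction of damage avoided by the measure
cost = 300.0          # annualized cost of the measure [EUR]

def eu_gain(u=lambda w: np.log(w)):
    w0 = 100_000.0    # assumed baseline wealth
    eu_no = (1 - p_flood) * u(w0) + p_flood * u(w0 - loss)
    eu_yes = ((1 - p_flood) * u(w0 - cost)
              + p_flood * u(w0 - cost - (1 - reduction) * loss))
    return eu_yes - eu_no

def pt_gain(alpha=0.88, lam=2.25, gamma=0.69):
    # Tversky-Kahneman value and weighting functions, losses framed
    # relative to the status quo
    v = lambda x: -lam * (-x) ** alpha if x < 0 else x ** alpha
    w = lambda p: p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)
    pt_no = w(p_flood) * v(-loss)
    pt_yes = v(-cost) + w(p_flood) * v(-(1 - reduction) * loss)
    return pt_yes - pt_no

print("invest under expected utility?", eu_gain() > 0)
print("invest under prospect theory? ", pt_gain() > 0)
```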
Kozinszky, Zoltan; Töreki, Annamária; Hompoth, Emőke A; Dudas, Robert B; Németh, Gábor
2017-04-01
We endeavoured to analyze the factor structure of the Edinburgh Postnatal Depression Scale (EPDS) during a screening programme in Hungary, using exploratory (EFA) and confirmatory factor analysis (CFA), testing both previously published models and newly developed theory-driven ones, after a critical analysis of the literature. Between April 2011 and January 2015, a sample of 2967 pregnant women (between the 12th and 30th weeks of gestation) and 714 women 6 weeks after delivery completed the Hungarian version of the EPDS in South-East Hungary. EFAs suggested unidimensionality in both samples. Of the 42 previously published models, 33 showed good and 6 acceptable fit with our antepartum data in CFAs, whilst 10 showed good and 28 acceptable fit in our postpartum sample. Using multiple fit indices, our theory-driven anhedonia (items 1, 2) - anxiety (items 4, 5) - low mood (items 8, 9) model provided the best fit in the antepartum sample. In the postpartum sample, our theory-driven models were again among the best performing models, including an anhedonia and an anxiety factor together with either a low mood or a suicidal risk factor (items 3, 6, 10). The EPDS showed moderate within- and between-culture invariability, although this would also need to be re-examined with a theory-driven approach. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.
Latent Transition Analysis with a Mixture Item Response Theory Measurement Model
ERIC Educational Resources Information Center
Cho, Sun-Joo; Cohen, Allan S.; Kim, Seock-Ho; Bottge, Brian
2010-01-01
A latent transition analysis (LTA) model was described with a mixture Rasch model (MRM) as the measurement model. Unlike the LTA, which was developed with a latent class measurement model, the LTA-MRM permits within-class variability on the latent variable, making it more useful for measuring treatment effects within latent classes. A simulation…
ERIC Educational Resources Information Center
Li, Feifei
2017-01-01
An information-correction method for testlet-based tests is introduced. This method takes advantage of both generalizability theory (GT) and item response theory (IRT). The measurement error for the examinee proficiency parameter is often underestimated when a unidimensional conditional-independence IRT model is specified for a testlet dataset. By…
NASA Astrophysics Data System (ADS)
Chanprathak, Anusorn; Worakham, Paisan; Suikraduang, Arun
2018-01-01
This study aimed to analyze, synthesize, and develop a model for promoting the moral and ethical characteristics of science teachers. A scoping study of the relevant concepts, theories, and research was carried out using document analysis of research papers and related research articles together with interviews with 9 experts, and the material was examined by content analysis. From this, guidelines for promoting teachers' moral and ethical attributes were developed. The suitability and feasibility of the model and its manual were then checked through Multiple-Attribute Consensus Reaching (MACR) with 12 educational experts, who evaluated the promotion model, the guide to its use, and the first edition of the manual; mean ratings were used to assess appropriateness and feasibility. The results showed that the data on promoting teachers' moral and ethical characteristics fell into two groups, priests and scholars. In both groups, the promotion of attributes focused on teachers' intrinsic morality, aiming to change attitudes from within: students engaged with real experiences and analyzed and synthesized authentic learning environments so that cognitive skills could develop toward self-realization. The promotion model specified moral principles together with the importance, objectives, and evaluation methods of its activities. Its core concepts drew on learning theory and social cognitive theory, and integrated learning experiences comprising five stages and four processes: intention, memory storage, action, and motivation. The experts rated the appropriateness and feasibility of the model and of the form scales at a high level, and likewise rated the appropriateness and possibility of the manual guide at a high level.
Simulations of Stagewise Development with a Symbolic Architecture
NASA Astrophysics Data System (ADS)
Gobet, Fernand
This chapter compares Piaget's theory of development with Feigenbaum & Simon's (1962; 1984) EPAM theory. An attempt is made to map the concepts of assimilation and accommodation in Piaget's theory onto the concepts of familiarisation and accommodation in EPAM. An EPAM-like model of the balance scale task is then presented, with a discussion of preliminary results showing how it accounts for children's discontinuous, stage-like development. The analysis focuses on the transition between rules, using catastrophe flags (Gilmore, 1981) as criteria. It is argued that some symbolic models may be described as dynamical systems, in the same way as some non-symbolic models.
Unidimensional and Multidimensional Models for Item Response Theory.
ERIC Educational Resources Information Center
McDonald, Roderick P.
This paper provides an up-to-date review of the relationship between item response theory (IRT) and (nonlinear) common factor theory and draws out of this relationship some implications for current and future research in IRT. Nonlinear common factor analysis yields a natural embodiment of the weak principle of local independence in appropriate…
Cosmological reconstruction and Om diagnostic analysis of Einstein-Aether theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pasqua, Antonio; Chattopadhyay, Surajit; Momeni, Davood
In this paper, we analyze cosmological models in Einstein-Aether gravity, a modified theory of gravity in which a time-like vector field breaks the Lorentz symmetry. We use this formalism to analyze different cosmological models with different behaviors of the scale factor. In this analysis, we use a certain functional dependence of the Dark Energy (DE) on the Hubble parameter H. It is demonstrated that the Aether vector field has a non-trivial effect on these cosmological models. We also perform the Om diagnostic in Einstein-Aether gravity and fit the parameters of the cosmological models using recent observational data.
Development of a Higher Order Laminate Theory for Modeling Composites with Induced Strain Actuators
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Seeley, Charles E.
1996-01-01
A refined higher order plate theory is developed to investigate the actuation mechanism of piezoelectric materials surface bonded or embedded in composite laminates. The current analysis uses a displacement field which accurately accounts for transverse shear stresses. Some higher order terms are identified by using the conditions that shear stresses vanish at all free surfaces. Therefore, all boundary conditions for displacements and stresses are satisfied in the present theory. The analysis is implemented using the finite element method which provides a convenient means to construct a numerical solution due to the discrete nature of the actuators. The higher order theory is computationally less expensive than a full three dimensional analysis. The theory is also shown to agree well with published experimental results. Numerical examples are presented for composite plates with thicknesses ranging from thin to very thick.
Models versus theories as a primary carrier of nursing knowledge: A philosophical argument.
Bender, Miriam
2018-01-01
Theories and models are not equivalent. I argue that an orientation towards models as a primary carrier of nursing knowledge overcomes many ongoing challenges in philosophy of nursing science, including the theory-practice divide and the paradoxical pursuit of predictive theories in a discipline that is defined by process and a commitment to the non-reducibility of the health/care experience. Scientific models describe and explain the dynamics of specific phenomena. This is distinct from theory, which is traditionally defined as propositions that explain and/or predict the world. The philosophical case has been made against theoretical universalism, showing that a theory can be true in its domain, but that no domain is universal. Subsequently, philosophers focused on scientific models argued that they do the work of defining the boundary conditions, the domain(s), of a theory. Further analysis has shown the ways models can be constructed and function independent of theory, meaning models can comprise distinct, autonomous "carriers of scientific knowledge." Models are viewed as representations of the active dynamics, or mechanisms, of a phenomenon. Mechanisms are entities and activities organized such that they are productive of regular changes. Importantly, mechanisms are by definition not static: change may alter the mechanism and thereby alter or create entirely new phenomena. Orienting away from theory, and towards models, focuses scholarly activity on dynamics and change. This makes models arguably critical to nursing science, enabling the production of actionable knowledge about the dynamics of process and change in health/care. I briefly explore the implications for nursing (and health/care) knowledge and practice. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Ansari, R.; Torabi, J.; Norouzzadeh, A.
2018-04-01
Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of elastic foundation and to compare the predictions of integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.
ODE/IM correspondence and the Argyres-Douglas theory
NASA Astrophysics Data System (ADS)
Ito, Katsushi; Shu, Hongfei
2017-08-01
We study the quantum spectral curve of the Argyres-Douglas theories in the Nekrasov-Shatashvili limit of the Omega-background. Using the ODE/IM correspondence we investigate the quantum integrable model corresponding to the quantum spectral curve. We show that the models for the $A_{2N}$-type theories are the non-unitary coset models $(A_1)_1 \times (A_1)_L / (A_1)_{L+1}$ at the fractional level $L = \frac{2}{2N+1} - 2$, which appear in the study of the 4d/2d correspondence of $\mathcal{N}=2$ superconformal field theories. Based on the WKB analysis, we clarify the relation between the Y-functions and the quantum periods and study the exact Bohr-Sommerfeld quantization condition for the quantum periods. We also discuss the quantum spectral curves for the D- and E-type theories.
Analysis of Asymmetry by a Slide-Vector.
ERIC Educational Resources Information Center
Zielman, Berrie; Heiser, Willem J.
1993-01-01
An algorithm based on the majorization theory of J. de Leeuw and W. J. Heiser is presented for fitting the slide-vector model. It views the model as a constrained version of the unfolding model. A three-way variant is proposed, and two examples from market structure analysis are presented. (SLD)
Optimization of life support systems and their systems reliability
NASA Technical Reports Server (NTRS)
Fan, L. T.; Hwang, C. L.; Erickson, L. E.
1971-01-01
The identification, analysis, and optimization of life support systems and subsystems have been investigated. For each system or subsystem considered, the procedure involves the establishment of a set of system equations (or mathematical model) based on theory and experimental evidence; the analysis and simulation of the model; the optimization of the operation, control, and reliability; analysis of the sensitivity of the system based on the model; and, if possible, experimental verification of the theoretical and computational results. Research activities include: (1) modeling of air flow in a confined space; (2) review of several different gas-liquid contactors utilizing centrifugal force; (3) review of carbon dioxide reduction contactors in space vehicles and other enclosed structures; (4) application of modern optimal control theory to environmental control of confined spaces; (5) optimal control of a class of nonlinear diffusional distributed parameter systems; (6) optimization of system reliability of life support systems and subsystems; (7) modeling, simulation, and optimal control of the human thermal system; and (8) analysis and optimization of the water-vapor electrolysis cell.
Stability and Bifurcation Analysis of a Three-Species Food Chain Model with Delay
NASA Astrophysics Data System (ADS)
Pal, Nikhil; Samanta, Sudip; Biswas, Santanu; Alquran, Marwan; Al-Khaled, Kamel; Chattopadhyay, Joydev
In the present paper, we study the effect of gestation delay on a tri-trophic food chain model with Holling type-II functional response. The essential mathematical features of the proposed model are analyzed with the help of equilibrium analysis, stability analysis, and bifurcation theory. Considering time-delay as the bifurcation parameter, the Hopf-bifurcation analysis is carried out around the coexisting equilibrium. The direction of Hopf-bifurcation and the stability of the bifurcating periodic solutions are determined by applying the normal form theory and center manifold theorem. We observe that if the magnitude of the delay is increased, the system loses stability and shows limit cycle oscillations through Hopf-bifurcation. The system also shows the chaotic dynamics via period-doubling bifurcation for further enhancement of time-delay. Our analytical findings are illustrated through numerical simulations.
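A minimal simulation sketch of such a system (illustrative parameters and functional forms, not the paper's; the Hopf threshold depends entirely on the chosen values) integrates a tri-trophic Holling type-II chain with a gestation delay tau using fixed-step Euler and a history buffer:

```python
# Hedged sketch: simulate a delayed tri-trophic food chain and compare the
# late-time prey range for two delay values; a wider range after transients
# suggests limit-cycle oscillations rather than a steady coexistence state.
import numpy as np

def simulate(tau, T=500.0, dt=0.01):
    n_hist = int(round(tau / dt))
    steps = int(T / dt)
    X = np.empty((steps + 1, 3))
    X[0] = [0.8, 0.2, 0.1]                  # prey, predator, top predator
    a1, b1, a2, b2 = 5.0, 3.0, 0.1, 2.0     # Holling II parameters (assumed)
    d1, d2 = 0.4, 0.01                      # death rates (assumed)
    hist = [X[0].copy()] * (n_hist + 1)     # constant pre-history on [-tau, 0]
    for k in range(steps):
        x, y, z = X[k]
        xl, yl, zl = hist[0]                # delayed state at t - tau
        f1 = a1 * x / (1.0 + b1 * x)        # functional responses
        f2 = a2 * y / (1.0 + b2 * y)
        f1l = a1 * xl / (1.0 + b1 * xl)     # delayed numerical responses
        f2l = a2 * yl / (1.0 + b2 * yl)
        dx = x * (1.0 - x) - f1 * y
        dy = f1l * yl - f2 * z - d1 * y
        dz = f2l * zl - d2 * z
        # clip at zero to guard against explicit-Euler overshoot
        X[k + 1] = np.maximum(X[k] + dt * np.array([dx, dy, dz]), 0.0)
        hist.append(X[k + 1].copy())
        hist.pop(0)
    return X

for tau in (0.0, 0.5):
    tail = simulate(tau)[-20000:]           # last 200 time units
    print(f"tau={tau}: late-time prey range "
          f"[{tail[:, 0].min():.3f}, {tail[:, 0].max():.3f}]")
```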
A Study of Crisis Management Based on Stakeholders Analysis Model
NASA Astrophysics Data System (ADS)
Qingchun, Yue
2017-11-01
From the perspective of stakeholder theory, enterprises should not only provide services to shareholders but also take care of the demands of stakeholders. The stakeholders of an enterprise crisis are the organizations and individuals that cause the crisis, respond to it, and are affected by it. This paper first traces the development of stakeholder theory systematically; second, with the help of the enterprise crisis stakeholder analysis model, it analyzes the concept and membership of enterprise crisis stakeholders, using the Shuanghui Group as a further example; finally, it puts forward proposals for handling enterprise crises from the view of stakeholders.
A higher-order theory for geometrically nonlinear analysis of composite laminates
NASA Technical Reports Server (NTRS)
Reddy, J. N.; Liu, C. F.
1987-01-01
A third-order shear deformation theory of laminated composite plates and shells is developed, the Navier solutions are derived, and its finite element models are developed. The theory allows parabolic description of the transverse shear stresses, and therefore the shear correction factors of the usual shear deformation theory are not required in the present theory. The theory also accounts for the von Karman nonlinear strains. Closed-form solutions of the theory for rectangular cross-ply and angle-ply plates and cross-ply shells are developed. The finite element model is based on independent approximations of the displacements and bending moments (i.e., a mixed finite element model), and therefore only C^0 approximation is required. The finite element model is used to analyze cross-ply and angle-ply laminated plates and shells for bending and natural vibration. Many of the numerical results presented here should serve as references for future investigations. Three major conclusions resulted from the research: First, for thick laminates, shear deformation theories predict deflections, stresses and vibration frequencies significantly different from those predicted by classical theories. Second, even for thin laminates, shear deformation effects are significant in dynamic and geometrically nonlinear analyses. Third, the present third-order theory is more accurate than the classical and first-order theories in predicting the static and dynamic response of laminated plates and shells made of high-modulus composite materials.
Modeling and analysis of the TF30-P-3 compressor system with inlet pressure distortion
NASA Technical Reports Server (NTRS)
Mazzawy, R. S.; Banks, G. A.
1976-01-01
Circumferential inlet distortion testing of a TF30-P-3 afterburning turbofan engine was conducted at NASA-Lewis Research Center. Pratt and Whitney Aircraft analyzed the data using its multiple segment parallel compressor model and classical compressor theory. Distortion attenuation analysis resulted in a detailed flow field calculation with good agreement between multiple segment model predictions and the test data. Sensitivity of the engine stall line to circumferential inlet distortion was calculated on the basis of parallel compressor theory to be more severe than indicated by the data. However, the calculated stall site location was in agreement with high response instrumentation measurements.
Hagger, Martin S; Chan, Derwin K C; Protogerou, Cleo; Chatzisarantis, Nikos L D
2016-08-01
Synthesizing research on social cognitive theories applied to health behavior is an important step in the development of an evidence base of psychological factors as targets for effective behavioral interventions. However, few meta-analyses of research on social cognitive theories in health contexts have conducted simultaneous tests of theoretically-stipulated pattern effects using path analysis. We argue that conducting path analyses of meta-analytic effects among constructs from social cognitive theories is important to test nomological validity, account for mediation effects, and evaluate unique effects of theory constructs independent of past behavior. We illustrate our points by conducting new analyses of two meta-analyses of a popular theory applied to health behaviors, the theory of planned behavior. We conducted meta-analytic path analyses of the theory in two behavioral contexts (alcohol and dietary behaviors) using data from the primary studies included in the original meta-analyses augmented to include intercorrelations among constructs and relations with past behavior missing from the original analysis. Findings supported the nomological validity of the theory and its hypotheses for both behaviors, confirmed important model processes through mediation analysis, demonstrated the attenuating effect of past behavior on theory relations, and provided estimates of the unique effects of theory constructs independent of past behavior. Our analysis illustrates the importance of conducting a simultaneous test of theory-stipulated effects in meta-analyses of social cognitive theories applied to health behavior. We recommend researchers adopt this analytic procedure when synthesizing evidence across primary tests of social cognitive theories in health. Copyright © 2016 Elsevier Inc. All rights reserved.
An introduction to Item Response Theory and Rasch Analysis of the Eating Assessment Tool (EAT-10).
Kean, Jacob; Brodke, Darrel S; Biber, Joshua; Gross, Paul
2018-03-01
Item response theory has its origins in educational measurement and is now commonly applied in health-related measurement of latent traits, such as function and symptoms. This application is due in large part to gains in the precision of measurement attributable to item response theory and corresponding decreases in response burden, study costs, and study duration. The purpose of this paper is twofold: introduce basic concepts of item response theory and demonstrate this analytic approach in a worked example, a Rasch model (1PL) analysis of the Eating Assessment Tool (EAT-10), a commonly used measure for oropharyngeal dysphagia. The results of the analysis were largely concordant with previous studies of the EAT-10 and illustrate for brain impairment clinicians and researchers how IRT analysis can yield greater precision of measurement.
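For readers new to the approach, a toy Rasch (1PL) fit on simulated dichotomous responses (a didactic sketch, not the paper's EAT-10 analysis) can be written with joint maximum likelihood and plain gradient ascent:

```python
# Toy Rasch (1PL) fit: P(correct) = logistic(theta_person - b_item),
# estimated by joint maximum likelihood with gradient ascent on simulated
# data. Difficulties are identified up to the location fixed by centering
# theta each iteration.
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 500, 10
theta_true = rng.normal(0, 1, n_persons)
b_true = np.linspace(-1.5, 1.5, n_items)
P = 1 / (1 + np.exp(-(theta_true[:, None] - b_true[None, :])))
X = (rng.random((n_persons, n_items)) < P).astype(float)

theta = np.zeros(n_persons)
b = np.zeros(n_items)
lr = 0.05
for _ in range(2000):
    E = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))  # expected scores
    R = X - E                                             # residuals
    theta += lr * R.sum(axis=1)          # log-likelihood gradient ascent
    b -= lr * R.sum(axis=0) * (n_items / n_persons)       # rescaled step
    theta -= theta.mean()                # fix the scale location

print("true vs estimated item difficulties:")
for bt, be in zip(b_true, b):
    print(f"  {bt:+.2f}  {be:+.2f}")
```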
NASA Astrophysics Data System (ADS)
Najafi, M. N.
2018-04-01
The couplings of the $c=-2$, $c=\frac{1}{2}$ and $c=0$ conformal field theories are numerically considered in this paper. As the prototypes of the couplings $(c_1=-2)\oplus(c_2=0)$ and $(c_1=-2)\oplus(c_2=\frac{1}{2})$, we consider the Bak–Tang–Wiesenfeld (BTW) model on the 2D square critical site-percolation lattice and the BTW model on Ising-correlated percolation lattices, respectively. Some geometrical techniques are used to characterize the presumable conformal symmetry of the resultant systems. Based on the numerical analysis of the diffusivity parameter ($\kappa$) in the Schramm–Loewner evolution (SLE) theory, we propose that the algebra of the central charges of the coupled models is closed. This result rests on the conformal loop ensemble (CLE) analysis. The diffusivity parameter in each case is obtained by calculating the fractal dimension of loops (and the corresponding exponent of the mean-square root distance), the direct SLE mapping method, the left passage probability, and the winding angle analysis. More precisely, we numerically show that the coupling $(c_1=-2)\oplus(c_2=\frac{1}{2})$ results in the 2D self-avoiding walk (SAW) fixed point corresponding to the $c=0$ conformal field theory, whereas the coupling $(c_1=-2)\oplus(c_2=0)$ results in the 2D critical Ising fixed point corresponding to the $c=\frac{1}{2}$ conformal field theory.
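One of the geometrical probes mentioned above, the fractal dimension of critical curves, can be illustrated with a hedged sketch (not the paper's code; lattice size, walk length, and box scales are arbitrary choices): box-count a loop-erased random walk, whose scaling limit is the SLE_2 trace with fractal dimension d_f = 1 + kappa/8 = 5/4:

```python
# Estimate the fractal dimension of a loop-erased random walk (LERW) by
# box counting and read off kappa through the SLE relation d_f = 1 + kappa/8.
# The estimate is rough at these modest sizes.
import numpy as np

rng = np.random.default_rng(1)

def loop_erased_walk(n_target=1500, bound=600):
    # simple random walk on Z^2 with chronological loop erasure
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    path = [(0, 0)]
    index = {(0, 0): 0}
    while len(path) < n_target:
        x, y = path[-1]
        dx, dy = steps[rng.integers(4)]
        p = (x + dx, y + dy)
        if max(abs(p[0]), abs(p[1])) > bound:
            continue                        # stay inside a finite box
        if p in index:                      # erase the loop just closed
            cut = index[p]
            for q in path[cut + 1:]:
                del index[q]
            del path[cut + 1:]
        else:
            index[p] = len(path)
            path.append(p)
    return np.array(path)

pts = loop_erased_walk()
sizes = np.array([2, 4, 8, 16, 32])
counts = [len({tuple(p // s) for p in pts}) for s in sizes]
d_f = -np.polyfit(np.log(sizes), np.log(counts), 1)[0]
print(f"box-counting dimension ~ {d_f:.2f} (SLE_2 trace: 5/4)")
print(f"implied kappa ~ {8 * (d_f - 1):.2f} (LERW: kappa = 2)")
```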
Differentiating between precursor and control variables when analyzing reasoned action theories.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin; Brown, Larry; Diclemente, Ralph; Romer, Daniel; Valois, Robert; Vanable, Peter A; Carey, Michael P; Salazar, Laura
2010-02-01
This paper highlights the distinction between precursor and control variables in the context of reasoned action theory. Here the theory is combined with structural equation modeling to demonstrate how age and past sexual behavior should be situated in a reasoned action analysis. A two wave longitudinal survey sample of African-American adolescents is analyzed where the target behavior is having vaginal sex. Results differ when age and past behavior are used as control variables and when they are correctly used as precursors. Because control variables do not appear in any form of reasoned action theory, this approach to including background variables is not correct when analyzing data sets based on the theoretical axioms of the Theory of Reasoned Action, the Theory of Planned Behavior, or the Integrative Model.
NASA Astrophysics Data System (ADS)
Ebrahimi, Farzad; Barati, Mohammad Reza
2017-12-01
This paper develops a higher-order refined beam model with a parabolic shear strain function for the vibration analysis of porous nanocrystalline nanobeams based on nonlocal couple stress theory. The nanocrystalline nanobeam is composed of three phases: nano-grains, nano-voids, and an interface phase. Nano-voids, or porosities, inside the material have a stiffness-softening impact on the nanobeam. Eringen's nonlocal elasticity theory is applied in the analysis of nanocrystalline nanobeams for the first time. Also, modified couple stress theory is employed to capture the rigid rotations of grains. The governing equations obtained from Hamilton's principle are solved with an analytical approach that satisfies various boundary conditions. The reliability of the present approach is verified by comparing the obtained results with those provided in the literature. Finally, the influences of the nonlocal parameter, couple stress, grain size, porosities, and shear deformation on the vibration characteristics of nanocrystalline nanobeams are explored.
Comparing cosmic web classifiers using information theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin
We introduce a decision scheme for optimally choosing a classifier, which segments the cosmic web into different structure types (voids, sheets, filaments, and clusters). Our framework, based on information theory, accounts for the design aims of different classes of possible applications: (i) parameter inference, (ii) model selection, and (iii) prediction of new observations. As an illustration, we use cosmographic maps of web-types in the Sloan Digital Sky Survey to assess the relative performance of the classifiers T-WEB, DIVA and ORIGAMI for: (i) analyzing the morphology of the cosmic web, (ii) discriminating dark energy models, and (iii) predicting galaxy colors. Our study substantiates a data-supported connection between cosmic web analysis and information theory, and paves the path towards principled design of analysis procedures for the next generation of galaxy surveys. We have made the cosmic web maps, galaxy catalog, and analysis scripts used in this work publicly available.
Theory-based interventions for contraception.
Lopez, Laureen M; Tolley, Elizabeth E; Grimes, David A; Chen-Mok, Mario
2009-01-21
The explicit use of theory in research helps expand the knowledge base. Theories and models have been used extensively in HIV-prevention research and in interventions for preventing sexually transmitted infections (STIs). The health behavior field uses many theories or models of change. However, educational interventions addressing contraception often have no stated theoretical base. Our objective was to review randomized controlled trials that tested a theoretical approach to inform contraceptive choice, encourage contraceptive use, or promote adherence to, or continuation of, a contraceptive regimen. We searched computerized databases for trials that tested a theory-based intervention for improving contraceptive use (MEDLINE, POPLINE, CENTRAL, PsycINFO, EMBASE, ClinicalTrials.gov, and ICTRP). We also wrote to researchers to find other trials. Eligible trials tested a theory-based intervention for improving contraceptive use; we excluded trials focused on high-risk groups. Interventions addressed the use of one or more contraceptive methods, and the reports provided evidence that the intervention was based on a specific theory or model. The primary outcomes were pregnancy, contraceptive choice, initiating or changing contraceptive use, contraceptive regimen adherence, and contraception continuation. The primary author evaluated abstracts for eligibility, and two authors extracted data from included studies. We calculated the odds ratio for dichotomous outcomes and the mean difference for continuous data. No meta-analysis was conducted due to intervention differences. Of 26 trials, 12 interventions addressed contraception (other than condoms), while 14 focused on condom use for preventing HIV or STIs. In 2 of 10 trials with pregnancy or birth data, a theory-based group showed better results. Four of nine trials with contraceptive use (other than condoms) showed better outcomes in an experimental group. For condom use, a theory-based group had favorable results in 14 of 20 trials, but the number was halved in a subgroup analysis. Social Cognitive Theory was the main theoretical basis for 12 trials, and 10 showed positive results. Of the other 14 trials, favorable results were shown for other social cognition models (N=2), motivational interviewing (N=5), and the AIDS Risk Reduction Model (N=2). No major patterns were detected by type of theory, intervention, or target population. Family planning researchers and practitioners could apply the relevant theories and effective interventions from HIV and STI prevention. More thorough use of single theories would help inform the field about what works. Better reporting is needed on research design and intervention implementation.
A Theory of Bayesian Data Analysis
1989-10-10
Froese, Tom; Iizuka, Hiroyuki; Ikegami, Takashi
2013-08-01
Synthetic approaches to social interaction support the development of a second-person neuroscience. Agent-based models and psychological experiments can be related in a mutually informing manner. Models have the advantage of making the nonlinear brain-body-environment-body-brain system as a whole accessible to analysis by dynamical systems theory. We highlight some general principles of how social interaction can partially constitute an individual's behavior.
ERIC Educational Resources Information Center
Finch, Holmes
2010-01-01
The accuracy of item parameter estimates in the multidimensional item response theory (MIRT) model context is one that has not been researched in great detail. This study examines the ability of two confirmatory factor analysis models specifically for dichotomous data to properly estimate item parameters using common formulae for converting factor…
Basile, Kathleen C; Hall, Jeffrey E; Walters, Mikel L
2013-07-01
This study tested resource and feminist-informed theories to explain physical, sexual, psychological, and stalking intimate partner violence (IPV) perpetrated by court-mandated men. Data were obtained from 340 men arrested for physical assault of a partner before their court-ordered treatment. Using path analysis, findings provided partial support for each model. Ineffective arguing and substance-use problems were moderators of resources and perpetration. Dominance mediated early exposures and perpetration in the feminist-informed model. In both models, predictors of stalking were different than those for other types of perpetration. Future studies should replicate this research and determine the utility of combining models.
Nonlinear analysis of 0-3 polarized PLZT microplate based on the new modified couple stress theory
NASA Astrophysics Data System (ADS)
Wang, Liming; Zheng, Shijie
2018-02-01
In this study, based on the new modified couple stress theory, the size-dependent model for nonlinear bending analysis of a pure 0-3 polarized PLZT plate is developed for the first time. The equilibrium equations are derived from a variational formulation based on the potential energy principle and the new modified couple stress theory. The Galerkin method is adopted to derive the nonlinear algebraic equations from the governing differential equations, and these nonlinear algebraic equations are then solved using the Newton-Raphson method. After simplification, the new model includes only a material length scale parameter. In addition, numerical examples are carried out to study the effect of the material length scale parameter on the nonlinear bending of a simply supported pure 0-3 polarized PLZT plate subjected to light illumination and a uniformly distributed load. The results indicate that the new model is able to capture the size effect and geometric nonlinearity.
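The solution strategy described above reduces, after the Galerkin projection, to a root-finding problem in the modal amplitudes. A minimal stand-in sketch (made-up 2-DOF coefficients, not the PLZT plate equations) shows the Newton-Raphson iteration on a cubic-hardening system:

```python
# Newton-Raphson on R(a) = K a + n(a) - f = 0, a generic stand-in for the
# nonlinear algebraic system produced by a Galerkin projection.
import numpy as np

K = np.array([[4.0, -1.0], [-1.0, 3.0]])   # linear (bending) stiffness
f = np.array([1.0, 2.0])                   # load vector

def residual(a):
    # cubic hardening terms mimic von Karman-type geometric nonlinearity
    return K @ a + 0.5 * a ** 3 - f

def jacobian(a):
    # tangent stiffness: linear part plus derivative of the cubic terms
    return K + np.diag(1.5 * a ** 2)

a = np.zeros(2)
for it in range(20):
    r = residual(a)
    if np.linalg.norm(r) < 1e-12:
        break
    a -= np.linalg.solve(jacobian(a), r)

print(f"converged in {it} iterations, a = {a}")
```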
ERIC Educational Resources Information Center
Algesheimer, René; Bagozzi, Richard P.; Dholakia, Utpal M.
2018-01-01
We offer a new conceptualization and measurement models for constructs at the group-level of analysis in small group research. The conceptualization starts with classical notions of group behavior proposed by Tönnies, Simmel, and Weber and then draws upon plural subject theory by philosophers Gilbert and Tuomela to frame a new perspective…
A Bayesian Approach to Person Fit Analysis in Item Response Theory Models. Research Report.
ERIC Educational Resources Information Center
Glas, Cees A. W.; Meijer, Rob R.
A Bayesian approach to the evaluation of person fit in item response theory (IRT) models is presented. In a posterior predictive check, the observed value on a discrepancy variable is positioned in its posterior distribution. In a Bayesian framework, a Markov Chain Monte Carlo procedure can be used to generate samples of the posterior distribution…
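The posterior predictive check outlined above can be sketched as follows: for each MCMC draw of the parameters, generate a replicate response vector, compute a discrepancy for both observed and replicated data, and report the fraction of draws in which the replicated discrepancy meets or exceeds the observed one. The 2PL response function and the simple squared-residual discrepancy below are illustrative assumptions, not the report's exact statistics:

```python
import numpy as np

rng = np.random.default_rng(0)

def irf_2pl(theta, a, b):
    """Two-parameter logistic item response function."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def ppp_value(x_obs, draws, discrepancy):
    """Posterior predictive p-value: position of the observed
    discrepancy in its posterior predictive distribution."""
    count = 0
    for theta, a, b in draws:            # MCMC draws of person/item parameters
        p = irf_2pl(theta, a, b)
        x_rep = rng.random(p.shape) < p  # replicate response vector
        if discrepancy(x_rep, p) >= discrepancy(x_obs, p):
            count += 1
    return count / len(draws)

# Illustrative data: one person, 5 items, fake posterior draws.
x_obs = np.array([1, 0, 1, 1, 0])
draws = [(rng.normal(), rng.uniform(0.8, 2.0, 5), rng.normal(0, 1, 5))
         for _ in range(500)]
disc = lambda x, p: np.sum((x - p) ** 2)  # simple residual discrepancy
print(ppp_value(x_obs, draws, disc))
```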
On-Orbit System Identification
NASA Technical Reports Server (NTRS)
Mettler, E.; Milman, M. H.; Bayard, D.; Eldred, D. B.
1987-01-01
Information derived from accelerometer readings benefits important engineering and control functions. Report discusses methodology for detection, identification, and analysis of motions within space station. Techniques of vibration and rotation analyses, control theory, statistics, filter theory, and transform methods integrated to form system for generating models and model parameters that characterize total motion of complicated space station, with respect to both control-induced and random mechanical disturbances.
ERIC Educational Resources Information Center
Christensen, Eben J.; Redd, Steven B.
2004-01-01
The bureaucratic politics model and the poliheuristic theory are used to examine how political advice presented in various contexts influences choice. Organizational advisers who offer endogenous political advice are compared with situations in which the decision maker is offered advice by a separate, or exogenous, political adviser. Results show…
A Model and a Metric for the Analysis of Status Attainment Processes. Discussion Paper No. 492-78.
ERIC Educational Resources Information Center
Sorensen, Aage B.
This paper proposes a theory of the status attainment process, and specifies it in a mathematical model. The theory justifies a transformation of the conventional status scores to a metric that produces an exponential distribution of attainments, and a transformation of educational attainments to a metric that reflects the competitive advantage…
ERIC Educational Resources Information Center
Chan, Jacob Yui Chung; Chan, Fong; Ditchman, Nicole; Phillips, Brian; Chou, Chih-Chin
2013-01-01
Objective: To evaluate Snyder's (2002) hope theory as a motivational model of community participation and life satisfaction. Setting: Manitoba chapter of the Canadian Paraplegic Association. Participants: One-hundred and sixteen participants with spinal cord injuries who were members of the Manitoba chapter of the Canadian Paraplegic Association.…
ERIC Educational Resources Information Center
McCollum, Daniel L.; Kajs, Lawrence T.
2009-01-01
The goal orientation theory of motivation posits sets of beliefs people hold regarding their goals. The 2 x 2 model of goal orientations has received almost no attention in the domain of educational leadership. The present researchers used a confirmatory factor analysis to test a measure based on the hypothesized 2 x 2 model in educational…
Bergeron, Kim; Abdi, Samiya; DeCorby, Kara; Mensah, Gloria; Rempel, Benjamin; Manson, Heather
2017-11-28
There is limited research on capacity building interventions that include theoretical foundations. The purpose of this systematic review is to identify underlying theories, models and frameworks used to support capacity building interventions relevant to public health practice. The aim is to inform and improve capacity building practices and services offered by public health organizations. Four search strategies were used: 1) electronic database searching; 2) reference lists of included papers; 3) key informant consultation; and 4) grey literature searching. Inclusion and exclusion criteria are outlined, with included papers focusing on capacity building, learning plans, or professional development plans in combination with tools, resources, processes, procedures, steps, models, frameworks, or guidelines; described in a public health or healthcare setting, or in non-government, government, or community organizations as they relate to healthcare; and explicitly or implicitly mentioning a theory, model and/or framework that grounds the type of capacity building approach developed. Quality assessment was performed on all included articles. Data analysis included a process for synthesizing, analyzing and presenting descriptive summaries, and for categorizing theoretical foundations according to which theory, model and/or framework was used and whether the theory, model or framework was implied or explicitly identified. Nineteen articles were included in this review. A total of 28 theories, models and frameworks were identified. Of this number, two theories (Diffusion of Innovations and Transformational Learning), two models (Ecological and Interactive Systems Framework for Dissemination and Implementation) and one framework (Bloom's Taxonomy of Learning) were identified as the most frequently cited. This review identifies specific theories, models and frameworks that support capacity building interventions relevant to public health organizations. It provides public health practitioners with a menu of potentially usable theories, models and frameworks to support capacity building efforts. The findings also support the need for the use of theories, models or frameworks to be intentional, explicitly identified and referenced, and for a clear account of how they were applied to the capacity building intervention.
Linguistics, Computers, and the Language Teacher. A Communicative Approach.
ERIC Educational Resources Information Center
Underwood, John H.
This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…
ERIC Educational Resources Information Center
Bernard, Robert M.; Abrami, Philip C.; Wade, Anne; Borokhovski, Evgueni; Lou, Yiping
2004-01-01
Simonson, Schlosser and Hanson (1999) argue that a new theory called "equivalency theory" is needed to account for the unique features of the "teleconferencing" (synchronous) model of DE that is prevalent in many North American universities. Based on a comprehensive meta-analysis of the comparative literature of DE (Bernard,…
From Myths to Models: The (Re)Production of World Culture in Comparative Education
ERIC Educational Resources Information Center
Silova, Iveta; Brehm, William C.
2015-01-01
This article traces the emergence of the world culture theory in comparative education using critical discourse analysis. By chronicling the emergence and expansion of world culture theory over the past four decades, we highlight the (unintended) limitations and exclusive regimes of thought that have resulted. We argue that the theory's…
Fit for Practice: Analysis and Evaluation of Watson's Theory of Human Caring.
Pajnkihar, Majda; McKenna, Hugh P; Štiglic, Gregor; Vrbnjak, Dominika
2017-07-01
The aim of the authors of this paper is to analyze Watson's theory of human caring for its usefulness and worth in education, practice, and research. The reason for undertaking this analysis is to evaluate whether Watson's theory would be useful for nursing in those countries where such theories were not an established part of the nursing curriculum. Furthermore, in some European countries, their political past or cultural influences led to an unquestioned adoption of the biomedical model. As their political culture changes, many social structures have had to be revisited; for nursing, this has meant the introduction of theoretical reasoning, teaching, and practice.
Analysis of Modeling Processes
ERIC Educational Resources Information Center
Bandura, Albert
1975-01-01
Traditional learning theories stress that people are either conditioned through reward and punishment or by close association with neutral or evocative stimuli. These direct experience theories do not account for people's learning complex behavior through observation. Attentional, retention, motoric reproduction, reinforcement, and motivational…
NASA Astrophysics Data System (ADS)
Norouzzadeh, A.; Ansari, R.; Rouhi, H.
2017-05-01
Differential form of Eringen's nonlocal elasticity theory is widely employed to capture the small-scale effects on the behavior of nanostructures. However, paradoxical results are obtained via the differential nonlocal constitutive relations in some cases such as in the vibration and bending analysis of cantilevers, and recourse must be made to the integral (original) form of Eringen's theory. Motivated by this consideration, a novel nonlocal formulation is developed herein based on the original formulation of Eringen's theory to study the buckling behavior of nanobeams. The governing equations are derived according to the Timoshenko beam theory, and are represented in a suitable vector-matrix form which is applicable to the finite-element analysis. In addition, an isogeometric analysis (IGA) is conducted for the solution of buckling problem. Construction of exact geometry using non-uniform rational B-splines and easy implementation of geometry refinement tools are the main advantages of IGA. A comparison study is performed between the predictions of integral and differential nonlocal models for nanobeams under different kinds of end conditions.
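Once the governing equations are cast in the vector-matrix form mentioned above, the critical buckling load follows from a generalized eigenvalue problem of the type K φ = λ K_g φ, with λ scaling the geometric stiffness. A minimal sketch of that final solution step with small placeholder matrices (the paper's integral-nonlocal IGA operators are not reproduced here):

```python
import numpy as np
from scipy.linalg import eigh

# Illustrative elastic stiffness K and geometric stiffness Kg for a
# coarse beam discretization (placeholder values, not the paper's
# integral-nonlocal operators).
K = np.array([[12.0, -6.0,  0.0],
              [-6.0,  8.0, -2.0],
              [ 0.0, -2.0,  4.0]])
Kg = np.array([[ 1.2, -0.1,  0.0],
               [-0.1,  0.8, -0.1],
               [ 0.0, -0.1,  0.5]])

# Buckling: K v = lambda * Kg v; the smallest eigenvalue is the
# critical load multiplier.
vals, vecs = eigh(K, Kg)
print("critical load multiplier:", vals[0])
```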
Empowerment theory: clarifying the nature of higher-order multidimensional constructs.
Peterson, N Andrew
2014-03-01
Development of empowerment theory has focused on defining the construct at different levels of analysis, presenting new frameworks or dimensions, and explaining relationships between empowerment-related processes and outcomes. Less studied, and less conceptually developed, is the nature of empowerment as a higher-order multidimensional construct. One critical issue is whether empowerment is conceptualized as a superordinate construct (i.e., empowerment is manifested by its dimensions), an aggregate construct (i.e., empowerment is formed by its dimensions), or rather as a set of distinct constructs. To date, researchers have presented superordinate models without careful consideration of the relationships between dimensions and the higher-order construct of empowerment. Empirical studies can yield very different results, however, depending on the conceptualization of a construct. This paper represents the first attempt to address this issue systematically in empowerment theory. It is argued that superordinate models of empowerment are misspecified and research that tests alternative models at different levels of analysis is needed to advance theory, research, and practice in this area. Recommendations for future work are discussed.
Statistical correlation analysis for comparing vibration data from test and analysis
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.
1986-01-01
A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The theory was developed for three general cases of dimensional symmetry: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and an analytical symmetric portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is established with small classical structures.
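A common way to put analytical and experimental mode sets in comparative form is a cross-correlation matrix such as the modal assurance criterion (MAC). The report's exact statistic is not specified here, so the MAC below is an illustrative stand-in, with hypothetical mode shapes:

```python
import numpy as np

def mac(phi_a, phi_e):
    """Modal assurance criterion matrix between analytical mode shapes
    phi_a (dof x ma) and experimental shapes phi_e (dof x me)."""
    num = np.abs(phi_a.T @ phi_e) ** 2
    den = np.outer(np.sum(phi_a * phi_a, axis=0),
                   np.sum(phi_e * phi_e, axis=0))
    return num / den

# Illustrative shapes: 4 dof, 3 analytical and 2 experimental modes.
phi_a = np.array([[1.0,  1.0,  1.0],
                  [2.0,  1.0, -1.0],
                  [3.0, -1.0, -1.0],
                  [4.0, -2.0,  1.0]])
phi_e = phi_a[:, :2] + 0.05  # "measurements" close to the first two modes
print(np.round(mac(phi_a, phi_e), 3))  # near-1 diagonal = good pairing
```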
NASA Astrophysics Data System (ADS)
Borovkov, Alexei I.; Avdeev, Ilya V.; Artemyev, A.
1999-05-01
In the present work, stress, vibration and buckling finite element analyses of laminated beams are performed. A review of the equivalent single-layer (ESL) laminate theories is given. Finite element algorithms and procedures, integrated into the original FEA program system and based on the classical laminated plate theory (CLPT), first-order shear deformation theory (FSDT), third-order theory of Reddy (TSDT-R) and third-order theory of Kant (TSDT-K), with the Lanczos method used for solving the eigenproblem, are developed. Several numerical tests and examples of bending, free vibration and buckling of multilayered and sandwich beams with various material and geometric properties and boundary conditions are solved. A new, effective higher-order hierarchical element for the accurate calculation of transverse shear stress is proposed. A comparative analysis of results obtained by the considered models and solutions of 2D problems of heterogeneous anisotropic elasticity is carried out.
The hidden flat like universe. Starobinsky-like inflation induced by f(T) gravity
NASA Astrophysics Data System (ADS)
El Hanafy, W.; Nashed, G. G. L.
2015-06-01
We study a single-fluid component in a flat like universe (FLU) governed by f(T) gravity theories, where T is the teleparallel torsion scalar. The FLU model, regardless of the value of the spatial curvature k, identifies a special class of f(T) gravity theories. Remarkably, FLU f(T) gravity does not reduce to teleparallel gravity theory. In large Hubble spacetime the theory is consistent with the inflationary universe scenario and respects the conservation principle. The equation of state evolves similarly in all models. We study the case when the torsion tensor consists of a scalar field, which makes it possible to derive a quintessence potential from the obtained f(T) gravity theory. The potential produces a Starobinsky-like model naturally, without using a conformal transformation, with higher orders continuously interpolating between the Starobinsky and quadratic inflation models. The slow-roll analysis shows double solutions, so that for a single value of the scalar tilt (spectral index) the theory can predict double tensor-to-scalar ratios r of E-mode and B-mode polarizations.
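The slow-roll analysis referred to above maps a potential V(φ) to the spectral index and tensor-to-scalar ratio through the standard single-field slow-roll parameters, ε_V = (V'/V)²/2, η_V = V''/V, n_s = 1 − 6ε_V + 2η_V, r = 16ε_V (in reduced Planck units). A sketch using a Starobinsky-like potential as an illustrative stand-in for the paper's f(T)-derived one (the potential form and field value are assumptions):

```python
import sympy as sp

phi, V0 = sp.symbols('phi V0', positive=True)
# Starobinsky-like potential, used here only as an illustrative example.
V = V0 * (1 - sp.exp(-sp.sqrt(sp.Rational(2, 3)) * phi))**2

eps = sp.simplify((sp.diff(V, phi) / V)**2 / 2)  # epsilon_V
eta = sp.simplify(sp.diff(V, phi, 2) / V)        # eta_V
n_s = 1 - 6 * eps + 2 * eta                      # scalar tilt
r = 16 * eps                                     # tensor-to-scalar ratio

# Evaluate at a field value typical of ~55 e-folds (illustrative).
vals = {phi: 5.45, V0: 1}
print(sp.N(n_s.subs(vals)), sp.N(r.subs(vals)))  # n_s ~ 0.97, r ~ 0.003
```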
Application of ply level analysis to flexural wave propagation
NASA Astrophysics Data System (ADS)
Valisetty, R. R.; Rehfield, L. W.
1988-10-01
A brief survey is presented of the shear deformation theories of laminated plates. It indicates that there are certain non-classical influences that affect bending-related behavior in the same way as do the transverse shear stresses. They include bending- and stretching-related section warping, the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply level analysis can be used to model such disparate shear deformations. Here, equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as a ply level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.
NASA Astrophysics Data System (ADS)
Ghadiri, Majid; Safarpour, Hamed
2016-09-01
In this paper, the size-dependent free vibration behavior of an embedded magneto-electro-elastic (MEE) nanoshell subjected to thermo-electro-magnetic loadings is investigated. The surrounding elastic medium is modeled as a Winkler foundation characterized by a spring constant. The size-dependent MEE nanoshell is investigated on the basis of the modified couple stress theory. Using the first-order shear deformation theory (FSDT), the nanoshell is modeled and its equations of motion are derived from the principle of minimum potential energy. The accuracy of the presented model is validated against several cases in the literature. Finally, using the Navier-type method, an analytical solution of the governing equations for the vibration behavior of a simply supported MEE cylindrical nanoshell under combined loadings is presented, and the effects of the material length scale parameter, temperature changes, external electric potential, external magnetic potential, circumferential wave numbers, spring constant, shear correction factor and length-to-radius ratio of the nanoshell on the natural frequency are identified. Since there has been no research on the size-dependent analysis of MEE cylindrical nanoshells under combined loadings based on FSDT, numerical results are presented to serve as benchmarks for future analysis of MEE nanoshells using the modified couple stress theory.
Flutter analysis of composite box beams
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.; Greenman, Matthew
1995-01-01
The dynamic aeroelastic instability of flutter is an important factor in the design of modern high-speed, flexible aircraft. The current trend is toward the creative use of composites to delay flutter. To obtain an optimum design, we need an accurate as well as efficient model. As a first step towards this goal, flutter analysis is carried out for an unswept composite box beam using a linear structural model and Theodorsen's unsteady aerodynamic theory. Structurally, the wing was modeled as a thin-walled box-beam of rectangular cross section. Theodorsen's theory was used to get 2-D unsteady aerodynamic forces, which were integrated over the span. A free-vibration analysis is carried out. These fundamental modes are used to get the flutter solution using the V-g method. Future work is intended to build on this foundation.
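Theodorsen's theory supplies the 2-D unsteady aerodynamic forces through the lift-deficiency function C(k) of the reduced frequency k, which then enters the V-g flutter solution. The Hankel-function form below is the standard expression; using it as a standalone sketch with these sample k values is illustrative, not the report's code:

```python
import numpy as np
from scipy.special import hankel2

def theodorsen(k):
    """Theodorsen's function C(k) = F(k) + iG(k) for reduced frequency k:
    C(k) = H1^(2)(k) / (H1^(2)(k) + i H0^(2)(k))."""
    return hankel2(1, k) / (hankel2(1, k) + 1j * hankel2(0, k))

for k in (0.1, 0.5, 1.0):
    C = theodorsen(k)
    print(f"k={k}: F={C.real:.4f}, G={C.imag:.4f}")
```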
Analysis of a Teacher's Pedagogical Arguments Using Toulmin's Model and Argumentation Schemes
ERIC Educational Resources Information Center
Metaxas, N.; Potari, D.; Zachariades, T.
2016-01-01
In this article, we elaborate methodologies to study the argumentation speech of a teacher involved in argumentative activities. The standard tool of analysis of teachers' argumentation concerning pedagogical matters is Toulmin's model. The theory of argumentation schemes offers an alternative perspective on the analysis of arguments. We propose…
Modeling of composite beams and plates for static and dynamic analysis
NASA Technical Reports Server (NTRS)
Hodges, Dewey H.
1992-01-01
A rigorous theory and the corresponding computational algorithms were developed for through-the-thickness analysis of composite plates. This type of analysis is needed in order to find the elastic stiffness constants of a plate. Additionally, the analysis is used to post-process the resulting plate solution in order to find approximate three-dimensional displacement, strain, and stress distributions throughout the plate. It was decided that the variational-asymptotical method (VAM) would serve as a suitable framework in which to solve these types of problems. Work during this reporting period has progressed along two lines: (1) further evaluation of neo-classical plate theory (NCPT) as applied to shear-coupled laminates; and (2) continued modeling of plates with nonuniform thickness.
Nash Equilibria in Theory of Reasoned Action
NASA Astrophysics Data System (ADS)
Almeida, Leando; Cruz, José; Ferreira, Helena; Pinto, Alberto Adrego
2009-08-01
Game theory and Decision Theory have been applied to many different areas such as Physics, Economics, Biology, etc. In its application to Psychology, we introduce, in the literature, a Game Theoretical Model of Planned Behavior or Reasoned Action by establishing an analogy between two specific theories. In this study we take into account that individual decision-making is an outcome of a process where group decisions can determine individual probabilistic behavior. Using Game Theory concepts, we describe how intentions can be transformed into behavior and, according to the Nash Equilibrium, this process will correspond to the best individual decision/response taking into account the collective response. This analysis can be extended to several examples based on the Game Theoretical Model of Planned Behavior or Reasoned Action.
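As a concrete illustration of the Nash-equilibrium logic invoked above, the sketch below enumerates pure-strategy equilibria of a small bimatrix game by checking mutual best responses; the coordination-style payoff matrices are hypothetical and not taken from the paper:

```python
import numpy as np

def pure_nash_2x2(A, B):
    """Pure-strategy Nash equilibria of a bimatrix game.
    A[i, j]: row player's payoff, B[i, j]: column player's payoff."""
    eqs = []
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            # (i, j) is an equilibrium if neither player can gain
            # by deviating unilaterally.
            if A[i, j] >= A[:, j].max() and B[i, j] >= B[i, :].max():
                eqs.append((i, j))
    return eqs

# Hypothetical "intention vs. behavior" coordination payoffs:
# both players prefer matching the group's choice.
A = np.array([[3, 0], [1, 2]])
B = np.array([[3, 1], [0, 2]])
print(pure_nash_2x2(A, B))  # [(0, 0), (1, 1)]
```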
ERIC Educational Resources Information Center
Merrill, Samuel, III; Enelow, James M.
This document consists of two modules. The first studies a variety of multicandidate voting systems, including approval, Borda, and cumulative voting, using a model which takes account of a voter's intensity of preference for candidates. The voter's optimal strategy is investigated for each voting system using decision criteria under uncertainty…
A thermostatted kinetic theory model for event-driven pedestrian dynamics
NASA Astrophysics Data System (ADS)
Bianca, Carlo; Mogno, Caterina
2018-06-01
This paper is devoted to the modeling of pedestrian dynamics by means of the thermostatted kinetic theory. Specifically, the microscopic interactions among pedestrians and an external force field are modeled for simulating the evacuation of pedestrians from a metro station. The fundamentals of stochastic game theory and the thermostatted kinetic theory are coupled for the derivation of a specific mathematical model which depicts the time evolution of the distribution of pedestrians at different exits of a metro station. Perturbation theory is employed in order to establish the stability analysis of the nonequilibrium stationary states in the case of a metro station consisting of two exits. A general sensitivity analysis on the initial conditions, the magnitude of the external force field and the number of exits is presented by means of numerical simulations which, in particular, show how the asymptotic distribution and the convergence time are affected by the presence of an external force field. The results show how, in evacuation conditions, the interaction dynamics among pedestrians can be negligible with respect to the external force. The important role of the thermostat term in allowing the nonequilibrium stationary state to be reached is stressed. Research perspectives are outlined at the end of the paper, in particular concerning the derivation of frameworks that take into account the definition of local external actions and the introduction of space and velocity dynamics.
PROC IRT: A SAS Procedure for Item Response Theory
Matlock Cole, Ki; Paek, Insu
2017-01-01
This article reviews the item response theory procedure (PROC IRT) in SAS/STAT 14.1 for conducting item response theory (IRT) analyses of dichotomous and polytomous datasets that are unidimensional or multidimensional. The review provides an overview of available features, including models, estimation procedures, interfacing, input, and output files. A small-scale simulation study evaluates the IRT model parameter recovery of the PROC IRT procedure. The IRT procedure in Statistical Analysis Software (SAS) may be useful for researchers who frequently utilize SAS for analyses, research, and teaching.
NASA Astrophysics Data System (ADS)
Bellomo, Nicola; Elaiw, Ahmed; Alghamdi, Mohamed Ali
2016-03-01
The paper by Burini, De Lillo, and Gibelli [8] presents an overview and critical analysis of the literature on the modeling of learning dynamics. The first reference is the celebrated paper by Cucker and Smale [9]. Then, the authors also propose their own approach, based on suitable development of methods of the kinetic theory [6] and theoretical tools of evolutionary game theory [12,13], recently developed on graphs [2].
Using Persuasion Models to Identify Givers.
ERIC Educational Resources Information Center
Ferguson, Mary Ann; And Others
1986-01-01
Assesses the feasibility of and suggests using W. J. McGuire's information processing theory and cognitive response analysis theory in research studies to identify "givers"--those who are likely to contribute money and resources to charities or volunteer to aid philanthropic organizations. (SRT)
Using health education theories to explain behavior change: a cross-country analysis. 2000-2001.
Murray-Johnson, Lisa; Witte, Kim; Boulay, Marc; Figueroa, Maria Elena; Storey, Douglas; Tweedie, Ian
Scholars within the fields of public health, health education, health promotion, and health communication look to specific theories to explain health behavior change. The purpose of this article is to critically compare four health theories and key variables within them with regard to behavior change in the area of reproductive health. Using cross-country analyses of Ghana, Nepal, and Nicaragua (data sets provided by the Center for Communication Programs, Johns Hopkins University), the authors looked at the Health Belief Model, Theory of Reasoned Action, Extended Parallel Process Model, and Social Cognitive Theory for these two defined objectives. Results show that all four theories provide an excellent fit to the data, but that certain variables within them may have particular value for understanding specific aspects of behavior change. Recommendations for the selection of theories to use as guidelines in the design and evaluation of reproductive health programs are provided.
Modeling Composite Assessment Data Using Item Response Theory
Ueckert, Sebastian
2018-01-01
Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
Estimating outflow facility through pressure dependent pathways of the human eye
Gardiner, Bruce S.
2017-01-01
We develop and test a new theory for pressure dependent outflow from the eye. The theory comprises three main parameters: (i) a constant hydraulic conductivity, (ii) an exponential decay constant and (iii) a no-flow intraocular pressure, from which the total pressure dependent outflow, average outflow facilities and local outflow facilities for the whole eye may be evaluated. We use a new notation to specify precisely the meaning of model parameters and so model outputs. Drawing on a range of published data, we apply the theory to animal eyes, enucleated eyes and in vivo human eyes, and demonstrate how to evaluate model parameters. It is shown that the theory can fit high quality experimental data remarkably well. The new theory predicts that outflow facilities and total pressure dependent outflow for the whole eye are more than twice as large as estimates based on the Goldman equation and fluorometric analysis of anterior aqueous outflow. It appears likely that this discrepancy can be largely explained by pseudofacility and aqueous flow through the retinal pigmented epithelium, while any residual discrepancy may be due to pathological processes in aged eyes. The model predicts that if the hydraulic conductivity is too small, or the exponential decay constant is too large, then intraocular eye pressure may become unstable when subjected to normal circadian changes in aqueous production. The model also predicts relationships between variables that may be helpful when planning future experiments, and the model generates many novel testable hypotheses. With additional research, the analysis described here may find application in the differential diagnosis, prognosis and monitoring of glaucoma. PMID:29261696
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al., in: Proceedings of the 44th Annual ACM Symposium on Theory of Computing (STOC 2012), 2012; Barmpalias et al., in: 55th Annual IEEE Symposium on Foundations of Computer Science, Philadelphia, 2014; J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
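For readers unfamiliar with this family of models, a minimal grid-based Schelling simulation is sketched below: agents whose fraction of like-type neighbours falls below a tolerance τ relocate to a random empty cell. This generic relocation variant is for orientation only; the specific unperturbed 1-, 2- and 3-dimensional dynamics analysed in these papers differ in detail:

```python
import numpy as np

rng = np.random.default_rng(1)

def step(grid, tau):
    """One round: every unhappy agent moves to a random empty cell
    (0 = empty, 1/2 = agent types)."""
    unhappy = []
    empty = list(zip(*np.where(grid == 0)))
    for x, y in zip(*np.where(grid > 0)):
        nbrs = grid[max(0, x - 1):x + 2, max(0, y - 1):y + 2]
        same = np.sum(nbrs == grid[x, y]) - 1      # exclude self
        other = np.sum((nbrs > 0) & (nbrs != grid[x, y]))
        if same + other > 0 and same / (same + other) < tau:
            unhappy.append((x, y))
    rng.shuffle(unhappy)
    for x, y in unhappy:
        if empty:
            ex, ey = empty.pop(rng.integers(len(empty)))
            grid[ex, ey], grid[x, y] = grid[x, y], 0
            empty.append((x, y))
    return grid

grid = rng.choice([0, 1, 2], size=(30, 30), p=[0.1, 0.45, 0.45])
for _ in range(50):
    grid = step(grid, tau=0.5)   # segregated clusters emerge
```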
Relativistic tests with lunar laser ranging
NASA Astrophysics Data System (ADS)
Hofmann, F.; Müller, J.
2018-02-01
This paper presents the recent version of the lunar laser ranging (LLR) analysis model at the Institut für Erdmessung (IfE), Leibniz Universität Hannover and highlights a few tests of Einstein's theory of gravitation using LLR data. Investigations related to a possible temporal variation of the gravitational constant, the equivalence principle, the PPN parameters β and γ as well as the geodetic precession were carried out. The LLR analysis model was updated by gravitational effects of the Sun and planets with the Moon as extended body. The higher-order gravitational interaction between Earth and Moon as well as effects of the solid Earth tides on the lunar motion were refined. The basis for the modeled lunar rotation is now a 2-layer core/mantle model according to the DE430 ephemeris. The validity of Einstein's theory was studied using this updated analysis model and an LLR data set from 1970 to January 2015. Within the estimated accuracies, no deviations from Einstein's theory are detected. A relative temporal variation of the gravitational constant is estimated as Ġ/G_0 = (7.1 ± 7.6) × 10^-14 yr^-1, the test of the equivalence principle gives Δ(m_g/m_i)_EM = (-3 ± 5) × 10^-14, and the Nordtvedt parameter…
Eastwood, John G; Kemp, Lynn A; Jalaludin, Bin B
2016-01-01
We have recently described a protocol for a study that aims to build a theory of neighbourhood context and postnatal depression. That protocol proposed a critical realist Explanatory Theory Building Method comprising: (1) an emergent phase, (2) a construction phase, and (3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design was described. The protocol also described in detail the Theory Construction Phase, which is presented here. The Theory Construction Phase includes: (1) defining stratified levels; (2) analytic resolution; (3) abductive reasoning; (4) comparative analysis (triangulation); (5) retroduction; (6) postulate and proposition development; (7) comparison and assessment of theories; and (8) conceptual frameworks and model development. The stratified levels of analysis in this study were predominantly social and psychological. The abductive analysis used the theoretical frames of Stress Process, Social Isolation, Social Exclusion, Social Services, Social Capital, Acculturation Theory and global-economic level mechanisms. Realist propositions are presented for each analysis of triangulated data. Inference to the best explanation is used to assess and compare theories. A conceptual framework of maternal depression, stress and context is presented that includes examples of mechanisms at psychological, social, cultural and global-economic levels. Stress was identified as a necessary mechanism that has the tendency to cause several outcomes including depression, anxiety, and health-harming behaviours. The conceptual framework subsequently included conditional mechanisms identified through the retroduction, including the stressors of isolation and expectations and the buffers of social support and trust. The meta-theory of critical realism is used here to generate and construct social epidemiological theory using stratified ontology and both abductive and retroductive analysis. The findings will be applied to the development of a middle range theory and a subsequent programme theory for local perinatal child and family interventions.
Sweet, Shane N.; Fortier, Michelle S.; Strachan, Shaelyn M.; Blanchard, Chris M.; Boulay, Pierre
2014-01-01
Self-determination theory and self-efficacy theory are prominent theories in the physical activity literature, and studies have begun integrating their concepts. Sweet, Fortier, Strachan and Blanchard (2012) have integrated these two theories in a cross-sectional study. Therefore, this study sought to test a longitudinal integrated model to predict physical activity at the end of a 4-month cardiac rehabilitation program based on theory, research and Sweet et al.’s cross-sectional model. Participants from two cardiac rehabilitation programs (N=109) answered validated self-report questionnaires at baseline, two and four months. Data were analyzed using Amos to assess the path analysis and model fit. Prior to integration, perceived competence and self-efficacy were combined, and labeled as confidence. After controlling for 2-month physical activity and cardiac rehabilitation site, no motivational variables significantly predicted residual change in 4-month physical activity. Although confidence at two months did not predict residual change in 4-month physical activity, it had a strong positive relationship with 2-month physical activity (β=0.30, P<0.001). The overall model retained good fit indices. In conclusion, results diverged from theoretical predictions of physical activity, but self-determination and self-efficacy theory were still partially supported. Because the model had good fit, this study demonstrated that theoretical integration is feasible. PMID:26973926
From bed to bench: bridging from informatics practice to theory: an exploratory analysis.
Haux, R; Lehmann, C U
2014-01-01
In 2009, Applied Clinical Informatics (ACI)--focused on applications in clinical informatics--was launched as a companion journal to Methods of Information in Medicine (MIM). Both journals are official journals of the International Medical Informatics Association. The objective was to explore which congruencies and interdependencies exist in publications from theory to practice and from practice to theory, and to determine existing gaps. Major topics discussed in ACI and MIM were analyzed. We explored whether the intention of publishing companion journals to provide an information bridge from informatics theory to informatics practice and vice versa could be supported by this model. In this manuscript we report on congruencies and interdependences from practice to theory and on major topics in MIM. Retrospective, prolective observational study on recent publications of ACI and MIM. All publications of the years 2012 and 2013 were indexed and analyzed. One hundred and ninety-six publications were analyzed (ACI 87, MIM 109). In MIM publications, modelling aspects as well as methodological and evaluation approaches for the analysis of data, information, and knowledge in biomedicine and health care were frequently raised--and often discussed from an interdisciplinary point of view. Important themes were ambient-assisted living, anatomic spatial relations, biomedical informatics as a scientific discipline, boosting, coding, computerized physician order entry, data analysis, grid and cloud computing, health care systems and services, health-enabling technologies, health information search, health information systems, imaging, knowledge-based decision support, patient records, signal analysis, and web science. Congruencies between the journals could be found in themes, but with a different focus on content. Interdependencies from practice to theory found in these publications were limited. Bridging from informatics theory to practice and vice versa remains a major component of successful research and practice as well as a major challenge.
Ejeta, Luche Tadesse; Ardalan, Ali; Paton, Douglas
2015-01-01
Background: Preparedness for disasters and emergencies at individual, community and organizational levels could be a more effective tool in mitigating the growing incidence of disaster risk and ameliorating its impacts, that is, in playing a more significant role in disaster risk reduction (DRR). Preparedness efforts focus on changing human behaviors in ways that reduce people's risk and increase their ability to cope with hazard consequences. While preparedness initiatives have used behavioral theories to facilitate DRR, many theories have been used, and little is known about which behavioral theories are more commonly used, where they have been used, and why they have been preferred over alternative behavioral theories. Given that theories differ with respect to the variables used and the relationships between them, a systematic analysis is an essential first step to answering questions about the relative utility of theories and providing a more robust evidence base for preparedness components of DRR strategies. The goal of this systematic review was to search and summarize evidence by assessing the application of behavioral theories to disaster and emergency health preparedness across the world. Methods: The protocol was prepared in which the study objectives, questions, inclusion and exclusion criteria, and sensitive search strategies were developed and pilot-tested at the beginning of the study. Using selected keywords, articles were searched mainly in PubMed, Scopus, Mosby's Index (Nursing Index) and Safetylit databases. Articles were assessed based on their titles, abstracts, and their full texts. The data were extracted from selected articles and results were presented using qualitative and quantitative methods. Results: In total, 2040 titles, 450 abstracts and 62 full texts of articles were assessed for eligibility criteria, while five articles were retrieved from other sources; finally, 33 articles were selected. The Health Belief Model (HBM), Extended Parallel Process Model (EPPM), Theory of Planned Behavior (TPB) and Social Cognitive Theories were most commonly applied to influenza (H1N1 and H5N1), floods, and earthquake hazards. Studies were predominantly conducted in the USA (13 studies). In Asia, where the annual number of disasters and victims exceeds those in other continents, only three studies were identified. Overall, the main constructs of the HBM (perceived susceptibility, severity, benefits, and barriers), EPPM (higher threat and higher efficacy), TPB (attitude and subjective norm), and the majority of the constructs utilized in Social Cognitive Theories were associated with preparedness for diverse hazards. However, while all the theories described above describe the relationships between constituent variables, with the exception of research on Social Cognitive Theories, few studies of other theories and models used path analysis to identify the interdependence relationships between the constructs described in the respective theories/models. Similarly, few identified how other mediating variables could influence disaster and emergency preparedness. Conclusions: The existing evidence on the application of behavioral theories and models to disaster and emergency preparedness is chiefly from developed countries. This raises issues regarding their utility in countries, particularly in Asia and the Middle East, where cultural characteristics are very different from those prevailing in the Western countries in which the theories have been developed and tested.
The theories and models discussed here have been applied predominantly to disease outbreaks and natural hazards, and information on their utility as guides to preparedness for man-made hazards is lacking. Hence, future studies related to behavioral theories and models addressing preparedness need to target developing countries, where disaster risk and the consequent need for preparedness are high. A need for additional work on demonstrating the relationships of variables and constructs, including more clearly articulating roles for mediating effects, was also identified in this analysis. PMID:26203400
ERIC Educational Resources Information Center
Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan
2016-01-01
This study investigated the multiple-choice test of understanding of vectors (TUV), by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the PARSCALE program. The TUV ability is an ability parameter, here estimated assuming…
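For reference, the three-parameter logistic model named above gives the probability of a correct response as P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))), with discrimination a, difficulty b and guessing parameter c. A small sketch (the item parameter values are hypothetical, not TUV estimates):

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """Three-parameter logistic model: probability of a correct response
    given ability theta, discrimination a, difficulty b, and guessing
    (lower asymptote) c."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical TUV-like item: moderate discrimination, some guessing.
theta = np.linspace(-3, 3, 7)
print(np.round(p_3pl(theta, a=1.2, b=0.3, c=0.2), 3))
```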
The concept of shared mental models in healthcare collaboration.
McComb, Sara; Simpson, Vicki
2014-07-01
To report an analysis of the concept of shared mental models in health care. Shared mental models have been described as facilitators of effective teamwork. The complexity and criticality of the current healthcare system requires shared mental models to enhance safe and effective patient/client care. Yet, the current concept definition in the healthcare literature is vague and, therefore, difficult to apply consistently in research and practice. Concept analysis. Literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed and MEDLINE (EBSCO Interface), for the years 1997-2013. Walker and Avant's approach to concept analysis was employed and, following Paley's guidance, embedded in extant theory from the team literature. Although teamwork and collaboration are discussed frequently in healthcare literature, the concept of shared mental models in that context is not as commonly found but is increasing in appearance. Our concept analysis defines shared mental models as individually held knowledge structures that help team members function collaboratively in their environments and are comprised of the attributes of content, similarity, accuracy and dynamics. This theoretically grounded concept analysis provides a foundation for a middle-range descriptive theory of shared mental models in nursing and health care. Further research concerning the impact of shared mental models in the healthcare setting can result in development and refinement of shared mental models to support effective teamwork and collaboration. © 2013 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Uchidate, M.
2018-09-01
In this study, with the aim of establishing systematic knowledge on the impact of summit extraction methods and stochastic model selection in rough contact analysis, the contact area ratio (A_r/A_a) obtained by statistical contact models with different summit extraction methods was compared with a direct simulation using the boundary element method (BEM). Fifty areal topography datasets with different autocorrelation functions in terms of the power index and correlation length were used for the investigation. The non-causal 2D auto-regressive model, which can generate datasets with specified parameters, was employed in this research. Three summit extraction methods, Nayak's theory, 8-point analysis and watershed segmentation, were examined. With regard to the stochastic model, Bhushan's model and the BGT (Bush-Gibson-Thomas) model were applied. The values of A_r/A_a from the stochastic models tended to be smaller than those from BEM. The discrepancy between Bhushan's model with the 8-point analysis and BEM was slightly smaller than with Nayak's theory. The results with the watershed segmentation were similar to those with the 8-point analysis. The impact of the Wolf pruning on the discrepancy between the stochastic analysis and BEM was not very clear. In the case of the BGT model, which employs surface gradients, good quantitative agreement with BEM was obtained when Nayak's bandwidth parameter was large.
Mangling Expertise Using Post-Coding Analysis to Complexify Teacher Learning
ERIC Educational Resources Information Center
Mills, Tammy
2017-01-01
A recent movement in teacher education research encompasses working with and through theory. In response to the call from Jackson and Mazzei (2013) to use theory to think with data and use data to think with theory, the author hopes to portray the complexities of teacher learning by avoiding models of teacher learning and development that tend to…
ERIC Educational Resources Information Center
Barth-Cohen, Lauren A.; Wittmann, Michael C.
2017-01-01
This article presents an empirical analysis of conceptual difficulties encountered and ways students made progress in learning at both individual and group levels in a classroom environment in which the students used an embodied modeling activity to make sense of a specific scientific scenario. The theoretical framework, coordination class theory,…
Foundations of reusable and interoperable facet models using category theory
2016-01-01
Faceted browsing has become ubiquitous with modern digital libraries and online search engines, yet the process is still difficult to abstractly model in a manner that supports the development of interoperable and reusable interfaces. We propose category theory as a theoretical foundation for faceted browsing and demonstrate how the interactive process can be mathematically abstracted. Existing efforts in facet modeling are based upon set theory, formal concept analysis, and light-weight ontologies, but in many regards, they are implementations of faceted browsing rather than a specification of the basic, underlying structures and interactions. We will demonstrate that category theory allows us to specify faceted objects and study the relationships and interactions within a faceted browsing system. Resulting implementations can then be constructed through a category-theoretic lens using these models, allowing abstract comparison and communication that naturally support interoperability and reuse. PMID:27942248
Functional renormalization group analysis of tensorial group field theories on R^d
NASA Astrophysics Data System (ADS)
Geloun, Joseph Ben; Martini, Riccardo; Oriti, Daniele
2016-07-01
Rank-d tensorial group field theories are quantum field theories (QFTs) defined on a group manifold G^×d, which represent a nonlocal generalization of standard QFT and a candidate formalism for quantum gravity, since, when endowed with appropriate data, they can be interpreted as defining a field theoretic description of the fundamental building blocks of quantum spacetime. Their renormalization analysis is crucial both for establishing their consistency as quantum field theories and for studying the emergence of continuum spacetime and geometry from them. In this paper, we study the renormalization group flow of two simple classes of tensorial group field theories (TGFTs), defined for the group G = R for arbitrary rank, both without and with gauge invariance conditions, by means of functional renormalization group techniques. The issue of IR divergences is tackled by the definition of a proper thermodynamic limit for TGFTs. We map the phase diagram of such models, in a simple truncation, and identify both UV and IR fixed points of the RG flow. Encouragingly, for all the models we study, we find evidence for the existence of a phase transition of condensation type.
Bottom, William P; Kong, Dejun Tony
2012-01-01
Reflecting on his wartime government service, Walter Lippmann (1922) developed a theory of policy formulation and error. Introducing the constructs of stereotype, mental model, blind spots, and the process of manufacturing consent, his theory prescribed interdisciplinary social science as a tool for enhancing policy making in business and government. Lippmann used his influence with the Rockefeller foundations, business leaders, Harvard and the University of Chicago to gain support for this program. Citation analysis of references to "stereotype" and Lippmann reveals the rapid spread of the concept across the social sciences and in public discourse paralleled by obliteration by incorporation of the wider theory in behavioral science. "Stereotype" is increasingly invoked in anthropology, economics, and sociology though Lippmann and his wider theory ceased being cited decades ago. In psychology, citations are increasing but content analysis revealed blind spots and misconceptions about the theory and prescription. Studies of heuristics, biases, and organizational decision substantiate Lippmann's theory of judgment and choice. But his model for social science failed to consider the bounded rationality and blind spots of its practitioners. Policy formulation today is supported by research from narrow disciplinary silos not interdisciplinary science that reflects an awareness of history. © 2012 Wiley Periodicals, Inc.
Uncertainties in Forecasting Streamflow using Entropy Theory
NASA Astrophysics Data System (ADS)
Cui, H.; Singh, V. P.
2017-12-01
Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast; they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecasted results are measured separately using entropy. With information theory, how these uncertainties are transported and aggregated during these processes is described.
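Two of the ingredients named above can be sketched directly: a periodogram to identify streamflow periodicity, and the Shannon entropy of a forecast-error distribution as an uncertainty measure. The synthetic seasonal series, the error distribution and the binning below are illustrative assumptions, not the study's data or exact estimators:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic monthly streamflow: annual cycle plus noise (illustrative).
t = np.arange(240)
q = 100 + 30 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, t.size)

# Periodogram: the dominant frequency reveals the 12-month period.
spec = np.abs(np.fft.rfft(q - q.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0)
print("dominant period (months):", 1.0 / freqs[np.argmax(spec)])

# Shannon entropy of binned forecast errors as an uncertainty measure.
errors = rng.normal(0, 5, 1000)          # stand-in forecast errors
counts, _ = np.histogram(errors, bins=20)
p = counts[counts > 0] / counts.sum()
print("entropy (nats):", -(p * np.log(p)).sum())
```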
NASA Astrophysics Data System (ADS)
Le Foll, S.; André, F.; Delmas, A.; Bouilly, J. M.; Aspa, Y.
2012-06-01
A backward Monte Carlo method for modelling the spectral directional emittance of fibrous media has been developed. It uses Mie theory to calculate the radiative properties of single fibres, modelled as infinite cylinders, and the complex refractive index is computed by a Drude-Lorenz model for the dielectric function. The absorption and scattering coefficients are homogenised over several fibres, but the scattering phase function of a single fibre is used to determine the scattering direction of energy inside the medium. Sensitivity analysis based on several Monte Carlo results has been performed to estimate coefficients for a Multi-Linear Model (MLM) specifically developed for inverse analysis of experimental data. This model agrees well with the Monte Carlo method and is highly computationally efficient. In contrast, the surface emissivity model, which assumes an opaque medium, shows poor agreement with the reference Monte Carlo calculations.
Using IBMs to Investigate Spatially-dependent Processes in Landscape Genetics Theory
Much of landscape and conservation genetics theory has been derived using non-spatial mathematical models. Here, we use a mechanistic, spatially-explicit, eco-evolutionary IBM to examine the utility of this theoretical framework in landscapes with spatial structure. Our analysis...
Applying STAMP in Accident Analysis
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Daouk, Mirna; Dulac, Nicolas; Marais, Karen
2003-01-01
Accident models play a critical role in accident investigation and analysis. Most traditional models are based on an underlying chain of events. These models, however, have serious limitations when used for complex, socio-technical systems. Previously, Leveson proposed a new accident model (STAMP) based on system theory. In STAMP, the basic concept is not an event but a constraint. This paper shows how STAMP can be applied to accident analysis using three different views or models of the accident process and proposes a notation for describing this process.
Discrete-Layer Piezoelectric Plate and Shell Models for Active Tip-Clearance Control
NASA Technical Reports Server (NTRS)
Heyliger, P. R.; Ramirez, G.; Pei, K. C.
1994-01-01
The objectives of this work were to develop computational tools for the analysis of active-sensory composite structures with added or embedded piezoelectric layers. The targeted application for this class of smart composite laminates and the analytical development is the accomplishment of active tip-clearance control in turbomachinery components. Two distinct theories and analytical models were developed and explored under this contract: (1) a discrete-layer plate theory and corresponding computational models, and (2) a three dimensional general discrete-layer element generated in curvilinear coordinates for modeling laminated composite piezoelectric shells. Both models were developed from the complete electromechanical constitutive relations of piezoelectric materials, and incorporate both displacements and potentials as state variables. This report describes the development and results of these models. The discrete-layer theories imply that the displacement field and electrostatic potential through-the-thickness of the laminate are described over an individual layer rather than as a smeared function over the thickness of the entire plate or shell thickness. This is especially crucial for composites with embedded piezoelectric layers, as the actuating and sensing elements within these layers are poorly represented by effective or smeared properties. Linear Lagrange interpolation polynomials were used to describe the through-thickness laminate behavior. Both analytic and finite element approximations were used in the plane or surface of the structure. In this context, theoretical developments are presented for the discrete-layer plate theory, the discrete-layer shell theory, and the formulation of an exact solution for simply-supported piezoelectric plates. Finally, evaluations and results from a number of separate examples are presented for the static and dynamic analysis of the plate geometry. Comparisons between the different approaches are provided when possible, and initial conclusions regarding the accuracy and limitations of these models are given.
Global Constraints on Anomalous Triple Gauge Couplings in the Effective Field Theory Approach.
Falkowski, Adam; González-Alonso, Martín; Greljo, Admir; Marzocca, David
2016-01-08
We present a combined analysis of LHC Higgs data (signal strengths) together with LEP-2 WW production measurements. To characterize possible deviations from the standard model (SM) predictions, we employ the framework of an effective field theory (EFT) where the SM is extended by higher-dimensional operators suppressed by the mass scale of new physics Λ. The analysis is performed consistently at order Λ^-2 in the EFT expansion, keeping all the relevant operators. While the two data sets suffer from flat directions, together they impose stringent model-independent constraints on the anomalous triple gauge couplings.
Equilibrium paths analysis of materials with rheological properties by using the chaos theory
NASA Astrophysics Data System (ADS)
Bednarek, Paweł; Rządkowski, Jan
2018-01-01
Numerical equilibrium-path analysis of a material with random rheological properties using standard procedures and specialist computer programs was not successful. A proper solution for the analysed heuristic model of the material was obtained using elements of chaos theory and neural networks. The paper discusses the mathematical underpinnings of the computer programs used and elaborates the properties of the attractor used in the analysis. Results of the numerical analysis are presented in both numerical and graphical form for the procedures used.
A Markov Chain Monte Carlo Approach to Confirmatory Item Factor Analysis
ERIC Educational Resources Information Center
Edwards, Michael C.
2010-01-01
Item factor analysis has a rich tradition in both the structural equation modeling and item response theory frameworks. The goal of this paper is to demonstrate a novel combination of various Markov chain Monte Carlo (MCMC) estimation routines to estimate parameters of a wide variety of confirmatory item factor analysis models. Further, I show…
Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.
ERIC Educational Resources Information Center
Muraki, Eiji
1999-01-01
Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…
Piezoelectric transformer structural modeling--a review.
Yang, Jiashi
2007-06-01
A review on piezoelectric transformer structural modeling is presented. The operating principle and the basic behavior of piezoelectric transformers as governed by the linear theory of piezoelectricity are shown by a simple theoretical analysis of a Rosen transformer based on extensional modes of a nonhomogeneous ceramic rod. Various transformers are classified according to their structural shapes, operating modes, and voltage transforming capability. Theoretical and numerical modeling results from the theory of piezoelectricity are reviewed. More advanced modeling of thermal and nonlinear effects is also discussed. The article contains 167 references.
NASA Astrophysics Data System (ADS)
Mashayekhi, Mohammad Jalali; Behdinan, Kamran
2017-10-01
The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper, an analytical transfer path analysis based on four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
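To make the four-pole idea concrete, here is a minimal sketch of the classical mechanical four-pole (two-port) convention, in which a subsystem maps output force and velocity to input force and velocity and cascaded subsystems combine by matrix multiplication. The element values are hypothetical, and the paper's bond-graph-based derivation is not reproduced here.

```python
import numpy as np

# Minimal sketch of four-pole (two-port) cascading in the classical
# mechanical convention [F1, v1]^T = T @ [F2, v2]^T. Element values are
# hypothetical; the paper's bond-graph algorithm is not shown.

def cascade(*poles):
    """Equivalent four-pole matrix of serially connected subsystems."""
    T = np.eye(2, dtype=complex)
    for t in poles:
        T = T @ t
    return T

w = 2 * np.pi * 50.0                       # excitation frequency, rad/s
m, k = 0.5, 2.0e4                          # mass (kg), stiffness (N/m)

T_mass = np.array([[1.0, 1j * w * m],
                   [0.0, 1.0]])            # rigid mass element
T_spring = np.array([[1.0, 0.0],
                     [1j * w / k, 1.0]])   # massless spring element

T_total = cascade(T_mass, T_spring)
print(T_total)   # e.g. T_total[0, 0] is F1/F2 with the output blocked (v2 = 0)
```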
NASA Astrophysics Data System (ADS)
Virella, Juan C.; Prato, Carlos A.; Godoy, Luis A.
2008-05-01
The influence of nonlinear wave theory on the sloshing natural periods and their modal pressure distributions is investigated for rectangular tanks under the assumption of two-dimensional behavior. Natural periods and mode shapes are computed and compared for both linear wave theory (LWT) and nonlinear wave theory (NLWT) models, using the finite element package ABAQUS. Linear wave theory is implemented in an acoustic model, whereas a plane strain problem with large displacements is used in NLWT. Pressure distributions acting on the tank walls are obtained for the first three sloshing modes using both linear and nonlinear wave theory. It is found that the nonlinearity does not have significant effects on the natural sloshing periods. For the sloshing pressures on the tank walls, different distributions were found using linear and nonlinear wave theory models. However, in all cases studied, the linear wave theory conservatively estimated the magnitude of the pressure distribution, whereas larger pressure resultant heights were obtained when using the nonlinear theory. It is concluded that the nonlinearity of the surface wave does not have major effects on the pressure distribution on the walls of rectangular tanks.
Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory.
Budescu, David V; Bo, Yuanchao
2015-12-01
We investigate the implications of penalizing incorrect answers to multiple-choice tests, from the perspective of both test-takers and test-makers. To do so, we use a model that combines a well-known item response theory model with prospect theory (Kahneman and Tversky, Prospect theory: An analysis of decision under risk, Econometrica 47:263-91, 1979). Our results reveal that when test-takers are fully informed of the scoring rule, the use of any penalty has detrimental effects for both test-takers (they are always penalized in excess, particularly those who are risk averse and loss averse) and test-makers (the bias of the estimated scores, as well as the variance and skewness of their distribution, increase as a function of the severity of the penalty).
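As an illustration of the mechanism the authors study, the following sketch evaluates the prospect-theory value of guessing on a multiple-choice item under a generic formula-scoring rule (+1 correct, a fixed penalty if incorrect, 0 for omitting). The value and weighting functions and their parameters are the standard Tversky and Kahneman (1992) estimates; the item scenario is hypothetical, not the authors' exact model.

```python
# Sketch: prospect-theory value of guessing on a multiple-choice item
# under formula scoring (+1 correct, -penalty incorrect, 0 for omitting).
# Functional forms and parameters follow Tversky & Kahneman (1992);
# the scoring scenario is a generic illustration.

def v(x, alpha=0.88, beta=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def w(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def prospect_of_guessing(p_correct, penalty):
    """Subjective value of answering; omitting is worth v(0) = 0."""
    return w(p_correct) * v(1.0) + w(1 - p_correct) * v(-penalty)

# Test-taker who can rule out two of five options (p = 1/3), penalty 1/4:
print(prospect_of_guessing(1 / 3, 0.25))
# ~ -0.004 < 0: guessing feels unattractive even though its expected score,
# 1/3 - (2/3)(1/4) = +1/6, is positive; this is the "penalized in excess"
# effect the paper describes.
```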
Statistical sensitivity analysis of a simple nuclear waste repository model
NASA Astrophysics Data System (ADS)
Ronen, Y.; Lucius, J. L.; Blow, E. M.
1980-06-01
This work is a preliminary step in a comprehensive sensitivity analysis of the modeling of a nuclear waste repository. The purpose of the complete analysis is to determine which modeling parameters and physical data are most important in determining key design performance criteria, and then to obtain the uncertainty in the design for safety considerations. The theory for a statistical screening design methodology is developed for later use in the overall program. The theory was applied to a test case: determining the relative importance of the sensitivity of the near-field temperature distribution in a single-level salt repository to modeling parameters. The exact values of the sensitivities to these physical and modeling parameters were then obtained using direct methods of recalculation. The sensitivity coefficients found to be important for the sample problem were the thermal loading, the distance between the spent fuel canisters, and the canister radius. Other important parameters were those related to salt properties at a point of interest in the repository.
NASA Astrophysics Data System (ADS)
Outada, Nisrine
2016-09-01
I have read with great interest the paper [5], where the authors present an overview and critical analysis of the literature on the modeling of crowd dynamics, with special attention to evacuation dynamics. The approach developed is based on a suitable development of methods of the kinetic theory. Interactions, which lead to the decision choice, are modeled by theoretical tools of stochastic evolutionary game theory [11,12]. The paper [5] provides not only a survey focused on topics of great interest for our society, but also looks ahead to a variety of interesting and challenging mathematical problems. Specifically, I am interested in the derivation of macroscopic (hydrodynamic) models from the underlying description given by the kinetic theory approach, more specifically by the kinetic theory for active particles [8]. A general reference on crowd modeling is the recently published book [10].
Why involve families in acute mental healthcare? A collaborative conceptual review
Sandhu, Sima; Giacco, Domenico; Barrett, Katherine; Bennison, Gerry; Collinson, Sue; Priebe, Stefan
2017-01-01
Objectives: Family involvement is strongly recommended in clinical guidelines but suffers from poor implementation. To explore this topic at a conceptual level, a multidisciplinary review team including academics, clinicians and individuals with lived experience undertook a review to explore the theoretical background of family involvement models in acute mental health treatment and how this relates to their delivery. Design: A conceptual review was undertaken, including a systematic search and narrative synthesis. Included family models were mapped onto the most commonly referenced underlying theories: the diathesis–stress model, systems theories and postmodern theories of mental health. Common components of the models were summarised and compared. Lastly, a thematic analysis was undertaken to explore the role of patients and families in the delivery of the approaches. Setting: General adult acute mental health treatment. Results: Six distinct family involvement models were identified: Calgary Family Assessment and Intervention Models, ERIC (Equipe Rapide d’Intervention de Crise), Family Psychoeducation Models, Family Systems Approach, Open Dialogue and the Somerset Model. Findings indicated that despite wide variation in the theoretical models underlying family involvement models, there were many commonalities in their components, such as a focus on communication, language use and joint decision-making. Thematic analysis of the role of patients and families identified several issues for implementation. This included potential harms that could emerge during delivery of the models, such as imposing linear ‘patient–carer’ relationships and the risk of perceived coercion. Conclusions: We conclude that future staff training may benefit from discussing the chosen family involvement model within the context of other theories of mental health. This may help to clarify the underlying purpose of family involvement and address the diverse needs and world views of patients, families and professionals in acute settings. PMID:28963308
An Ontology of Power: Perception and Reality in Conflict
2016-12-01
synthetic model was developed as the constant comparative analysis was resumed through the application of selected theory toward the original source...The synthetic model represents a series of maxims for the analysis of a complex social system, developed through a study of contemporary national...and categories. A model of strategic agency is proposed as an alternative framework for developing security strategy. The strategic agency model draws
On numerical integration and computer implementation of viscoplastic models
NASA Technical Reports Server (NTRS)
Chang, T. Y.; Chang, J. P.; Thompson, R. L.
1985-01-01
Due to the stringent design requirements for aerospace and nuclear structural components, considerable research interest has been generated in the development of constitutive models for representing the inelastic behavior of metals at elevated temperatures. In particular, a class of unified theories (or viscoplastic constitutive models) has been proposed to simulate material responses such as cyclic plasticity, rate sensitivity, creep deformation, and strain hardening or softening. This approach differs from conventional creep and plasticity theory in that both the creep and plastic deformations are treated as unified time-dependent quantities. Although most viscoplastic models give a better representation of material behavior, the associated constitutive differential equations have stiff regimes which present numerical difficulties in time-dependent analysis. In this connection, an appropriate solution algorithm must be developed for viscoplastic analysis via the finite element method.
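To illustrate why implicit integration is the usual remedy for this stiffness, here is a minimal sketch: a one-dimensional Perzyna-type overstress model integrated with backward Euler and a scalar Newton correction. The model form and material constants are hypothetical stand-ins, not the constitutive models examined in the report.

```python
# Sketch: backward-Euler integration of a one-dimensional Perzyna-type
# viscoplastic model under strain-controlled tension, illustrating the
# implicit treatment that stiff viscoplastic equations typically require.
# Model form and material constants are hypothetical, not the report's.

E, sigma_y, eta, n = 200e3, 250.0, 1.0e3, 3.0   # MPa, MPa, MPa, -

def flow(sigma):
    """Perzyna overstress flow rate <(sigma - sigma_y)/eta>^n."""
    over = max(sigma - sigma_y, 0.0) / eta
    return over ** n

def dflow(sigma):
    over = max(sigma - sigma_y, 0.0) / eta
    return 0.0 if over == 0.0 else n * over ** (n - 1) / eta

eps_rate, dt, steps = 1e-3, 0.05, 400            # strain rate 1/s, step s
eps_vp = 0.0
for i in range(1, steps + 1):
    eps = eps_rate * dt * i
    sigma = E * (eps - eps_vp)                   # elastic predictor
    for _ in range(25):                          # Newton corrector on sigma
        r = sigma - E * (eps - eps_vp - dt * flow(sigma))
        if abs(r) < 1e-10:
            break
        sigma -= r / (1.0 + E * dt * dflow(sigma))
    eps_vp += dt * flow(sigma)

print(sigma)   # saturates near 350 MPa, the rate-dependent flow stress
```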
Einstein’s gravity from a polynomial affine model
NASA Astrophysics Data System (ADS)
Castillo-Felisola, Oscar; Skirzewski, Aureliano
2018-03-01
We show that the effective field equations for a recently formulated polynomial affine model of gravity, in the sector of a torsion-free connection, accept general Einstein manifolds—with or without cosmological constant—as solutions. Moreover, the effective field equations are partially those obtained from a gravitational Yang–Mills theory known as Stephenson–Kilmister–Yang theory. Additionally, we find a generalization of a minimally coupled massless scalar field in General Relativity within a ‘minimally’ coupled scalar field in this affine model. Finally, we present a brief (perturbative) analysis of the propagators of the gravitational theory, and count the degrees of freedom. For completeness, we prove that a Birkhoff-like theorem is valid for the analyzed sector.
Dynamical Systems in Circuit Designer's Eyes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odyniec, M.
Examples of nonlinear circuit design are given. The focus of the design process is on theory and engineering methods (as opposed to numerical analysis). Modeling is related to measurements. It is seen that the phase plane is still very useful with proper models. Harmonic balance/describing function methods offer powerful insight (via the combination of simulation with circuit and ODE theory). Measurement and simulation capabilities have increased, especially harmonics measurements (since sinusoids are easy to generate).
ERIC Educational Resources Information Center
Hsieh, Chueh-An; von Eye, Alexander A.; Maier, Kimberly S.
2010-01-01
The application of multidimensional item response theory models to repeated observations has demonstrated great promise in developmental research. It allows researchers to take into consideration both the characteristics of item response and measurement error in longitudinal trajectory analysis, which improves the reliability and validity of the…
ERIC Educational Resources Information Center
Duroisin, Natacha; Demeuse, Marc
2015-01-01
One possible way of evaluating set curricula is to examine the consistency of study programmes with students' psycho-cognitive development. Three theories were used to evaluate matching between developmental theories and content proposed in the mathematics programmes (geometry section) for primary and the beginning of secondary education. These…
An analysis of possible applications of fuzzy set theory to the actuarial credibility theory
NASA Technical Reports Server (NTRS)
Ostaszewski, Krzysztof; Karwowski, Waldemar
1992-01-01
In this work, we review the basic concepts of actuarial credibility theory from the point of view of introducing applications of the fuzzy set-theoretic method. We show how the concept of actuarial credibility can be modeled through the fuzzy set membership functions and how fuzzy set methods, especially fuzzy pattern recognition, can provide an alternative tool for estimating credibility.
ERIC Educational Resources Information Center
Semlak, William D.; And Others
A study used M. Cuffe and J. F. Cragan's three-dimensional model for understanding corporate culture within an organization to describe the managerial styles of chairpersons at Illinois State University. Case studies were completed for 18 chairpersons, who then sorted 60 statements on leadership style on a forced choice continuum from most…
McSherry, Wilfred
2006-07-01
The aim of this study was to generate a deeper understanding of the factors and forces that may inhibit or advance the concepts of spirituality and spiritual care within both nursing and health care. This manuscript presents a model that emerged from a qualitative study using grounded theory. Implementation and use of this model may assist all health care practitioners and organizations to advance the concepts of spirituality and spiritual care within their own sphere of practice. The model has been termed the principal components model because participants identified six components as being crucial to the advancement of spiritual health care. Grounded theory was used, meaning that data collection and analysis were concurrent. Theoretical sampling was used to develop the emerging theory. These processes, along with data analysis and open, axial and theoretical coding, led to the identification of a core category and the construction of the principal components model. Fifty-three participants (24 men and 29 women) were recruited and all consented to be interviewed. The sample included nurses (n=24), chaplains (n=7), a social worker (n=1), an occupational therapist (n=1), physiotherapists (n=2), patients (n=14) and the public (n=4). The investigation was conducted in three phases to substantiate the emerging theory and the development of the model. The principal components model contained six components: individuality, inclusivity, integrated, inter/intra-disciplinary, innate and institution. A great deal has been written on the concepts of spirituality and spiritual care. However, rhetoric alone will not remove some of the intrinsic and extrinsic barriers that are inhibiting the advancement of the spiritual dimension in terms of theory and practice. An awareness of and adherence to the principal components model may assist nurses and health care professionals to engage with and overcome some of the structural, organizational, political and social variables that are impacting upon spiritual care.
NASA Technical Reports Server (NTRS)
Laxmanan, V.
1985-01-01
A critical review of the present dendritic growth theories and models is presented. Mathematically rigorous solutions to dendritic growth are found to rely on an ad hoc assumption that dendrites grow at the maximum possible growth rate. This hypothesis is found to be in error and is replaced by stability criteria which consider the conditions under which a dendrite tip advances in a stable fashion in a liquid. The important elements of a satisfactory model for dendritic solidification are summarized, and a theoretically consistent model for dendritic growth under an imposed thermal gradient is proposed and described. The model is based on the modification of an analysis due to Burden and Hunt (1974) and correctly predicts, in all respects, the transition from a dendritic to a planar interface at both very low and very large growth rates.
A New Methodology of Spatial Cross-Correlation Analysis
Chen, Yanguang
2015-01-01
Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120
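As a rough illustration of the quadratic-form idea described above, the sketch below computes a bivariate Moran-type global cross-correlation coefficient and its local terms for a toy four-region system. The row-standardized normalization follows the common bivariate-Moran convention and may differ in detail from the paper's definitions; all data are made up.

```python
import numpy as np

# Sketch of a Moran-type global/local spatial cross-correlation between
# two variables x and y, expressed as a quadratic form z_x' W z_y.
# Normalization follows the common bivariate-Moran convention and may
# differ in detail from Chen (2015). Toy data only.

def spatial_cross_correlation(x, y, W):
    n = len(x)
    zx = (x - x.mean()) / x.std()
    zy = (y - y.mean()) / y.std()
    Wr = W / W.sum(axis=1, keepdims=True)   # row-standardized weights
    local = zx * (Wr @ zy)                  # local cross-correlation terms
    return local.sum() / n, local

# Four regions on a line, binary adjacency weights:
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 2.0, 3.0, 4.0])      # e.g. urbanization level
y = np.array([1.5, 1.9, 3.2, 4.1])      # e.g. economic output

global_R, local_R = spatial_cross_correlation(x, y, W)
print(global_R, local_R)
```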
A Unified Framework for Monetary Theory and Policy Analysis.
ERIC Educational Resources Information Center
Lagos, Ricardo; Wright, Randall
2005-01-01
Search-theoretic models of monetary exchange are based on explicit descriptions of the frictions that make money essential. However, tractable versions of these models typically make strong assumptions that render them ill suited for monetary policy analysis. We propose a new framework, based on explicit micro foundations, within which macro…
NASA Technical Reports Server (NTRS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.
2008-01-01
Predicting failure in a composite can be done with ply level mechanisms and/or micro level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
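Of the four criteria compared, the Tsai-Wu theory is the most compact to state; a plane-stress sketch follows. The strength values are hypothetical (roughly carbon/epoxy-like) rather than WWFE data, and the interaction term F12 uses a common default choice.

```python
import math

# Sketch: plane-stress Tsai-Wu failure index for a unidirectional ply,
# one of the four criteria compared in the paper. Strengths below are
# hypothetical (roughly carbon/epoxy-like), not WWFE values.

Xt, Xc = 1500.0, 1200.0   # longitudinal tensile/compressive strength, MPa
Yt, Yc = 50.0, 200.0      # transverse strengths, MPa
S = 70.0                  # in-plane shear strength, MPa

F1, F2 = 1 / Xt - 1 / Xc, 1 / Yt - 1 / Yc
F11, F22, F66 = 1 / (Xt * Xc), 1 / (Yt * Yc), 1 / S**2
F12 = -0.5 * math.sqrt(F11 * F22)   # common default interaction term

def tsai_wu(s1, s2, t12):
    """Failure index: >= 1 indicates ply failure."""
    return (F1 * s1 + F2 * s2 + F11 * s1**2 + F22 * s2**2
            + F66 * t12**2 + 2 * F12 * s1 * s2)

print(tsai_wu(800.0, 20.0, 30.0))   # ply-axis stresses in MPa; < 1: no failure
```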
NASA Astrophysics Data System (ADS)
Sarout, Joël.
2012-04-01
For the first time, a comprehensive and quantitative analysis of the domains of validity of popular wave propagation theories for porous/cracked media is provided. The case of a simple, yet versatile rock microstructure is detailed. The microstructural parameters controlling the applicability of the scattering theories, the effective medium theories, and the quasi-static (Gassmann limit) and dynamic (inertial) poroelasticity are analysed in terms of pore/crack characteristic size, geometry and connectivity. To this end, a new permeability model is devised combining the hydraulic radius and percolation concepts. The predictions of this model are compared to published micromechanical models of permeability for the limiting cases of capillary tubes and penny-shaped cracks. It is also compared to published experimental data on natural rocks in these limiting cases. It explicitly accounts for pore space topology around the percolation threshold and far above it. Thanks to this permeability model, the scattering, squirt-flow and Biot cut-off frequencies are quantitatively compared. This comparison leads to an explicit mapping of the domains of validity of these wave propagation theories as a function of the rock's actual microstructure. How this mapping impacts seismic, geophysical and ultrasonic wave velocity data interpretation is discussed. The methodology demonstrated here and the outcomes of this analysis are meant to constitute a quantitative guide for the selection of the most suitable modelling strategy to be employed for prediction and/or interpretation of rock elastic properties in laboratory- or field-scale applications when information regarding the rock's microstructure is available.
Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.
Trepel, Christopher; Fox, Craig R; Poldrack, Russell A
2005-04-01
Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
NASA Astrophysics Data System (ADS)
Ansari, R.; Sahmani, S.
2012-04-01
The free vibration response of single-walled carbon nanotubes (SWCNTs) is investigated in this work using various nonlocal beam theories. To this end, the nonlocal elasticity equations of Eringen are incorporated into various classical beam theories, namely the Euler-Bernoulli beam theory (EBT), the Timoshenko beam theory (TBT), and the Reddy beam theory (RBT), to capture size effects in the vibration analysis of SWCNTs. The generalized differential quadrature (GDQ) method is employed to discretize the governing differential equations of each nonlocal beam theory for four commonly used boundary conditions. Molecular dynamics (MD) simulations are then performed to obtain the fundamental frequencies of nanotubes with different chiralities and aspect ratios, for comparison with the results of the nonlocal beam models. By fitting the two series of numerical results, appropriate values of the nonlocal parameter are derived for each type of chirality, nonlocal beam model, and boundary condition. It is found that, in contrast to chirality, the type of nonlocal beam model and the boundary conditions do make a difference in the calibrated values of the nonlocal parameter.
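For reference, the standard differential form of Eringen's nonlocal constitutive relation and the resulting nonlocal Euler-Bernoulli free-vibration equation are sketched below in common notation, with μ = (e₀a)² the nonlocal parameter being calibrated; the TBT and RBT counterparts add shear terms along the same lines. This is the textbook form, not necessarily the paper's exact notation.

```latex
% Eringen's differential constitutive relation (standard form):
\left(1-\mu\nabla^{2}\right)\sigma_{ij}=C_{ijkl}\,\varepsilon_{kl},
\qquad \mu=(e_{0}a)^{2}.
% Resulting nonlocal Euler--Bernoulli equation for free vibration:
EI\,\frac{\partial^{4}w}{\partial x^{4}}
 +\rho A\left(1-\mu\frac{\partial^{2}}{\partial x^{2}}\right)
 \frac{\partial^{2}w}{\partial t^{2}}=0.
```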
Ultrasound beam transmission using a discretely orthogonal Gaussian aperture basis
NASA Astrophysics Data System (ADS)
Roberts, R. A.
2018-04-01
Work is reported on the development of a computational model for ultrasound beam transmission at a transmission interface of arbitrary geometry for generally anisotropic materials. The work addresses problems encountered when the fundamental assumptions of ray theory do not hold, thereby introducing errors into ray-theory-based transmission models. Specifically, problems occur when the asymptotic integral analysis underlying ray theory encounters multiple stationary phase points in close proximity, due to focusing caused by concavity of either the entry surface or a material slowness surface. The approach presented here projects integrands over both the transducer aperture and the entry-surface beam footprint onto a Gaussian-derived basis set, thereby distributing the integral over a summation of second-order phase integrals which are amenable to single-stationary-phase-point analysis. Significantly, convergence is assured provided a sufficiently fine distribution of basis functions is used.
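The core idea, expanding an aperture function in shifted Gaussians so each term admits a simple second-order phase analysis, can be illustrated with a plain least-squares fit. The sketch below is a stand-in for the paper's discretely orthogonal construction; the grid spacing and Gaussian width are arbitrary choices.

```python
import numpy as np

# Illustration: represent an aperture function as a superposition of
# shifted Gaussians, so each term admits a simple (second-order) phase
# analysis. This least-squares fit stands in for the paper's discretely
# orthogonal construction; spacing and width are arbitrary here.

x = np.linspace(-1.0, 1.0, 400)
aperture = (np.abs(x) <= 0.5).astype(float)      # uniform piston aperture

centers = np.linspace(-0.8, 0.8, 33)             # Gaussian center grid
width = 0.9 * (centers[1] - centers[0])          # overlap-controlling width
G = np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

coeffs, *_ = np.linalg.lstsq(G, aperture, rcond=None)
residual = np.max(np.abs(G @ coeffs - aperture))
print(residual)   # small except near the hard aperture edges
```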
Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P
2003-06-01
Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories and models to identify conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concerns, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEED, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not now include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis, and our analysis was aimed at determining whether the IT implementation approaches in the published literature considered at least two levels of IT usage determinants. We found that while 61% of studies mentioned or referred to theory, none considered two or more levels. In other words, although the researchers employ behavioral theory, they omit two fundamental propositions: (1) IT usage is influenced by multiple factors and (2) interventions must be multi-dimensional. Our literature synthesis may provide additional insight into the reason for the high failure rates associated with underutilized systems, and underscores the need to move beyond the current dominant approach that employs a single model to guide IT implementation plans aimed at addressing factors associated with IT acceptance and subsequent positive use behavior.
NASA Astrophysics Data System (ADS)
Cunningham, Jessica D.
Newton's Universe (NU), an innovative teacher training program, strives to obtain measures from rural, middle school science teachers and their students to determine the impact of its distance learning course on understanding of temperature. No consensus exists on the most appropriate and useful method of analysis to measure change in psychological constructs over time. Several item response theory (IRT) models have been deemed useful in measuring change, which makes the choice of an IRT model not obvious. The appropriateness and utility of each model, including a comparison to a traditional analysis of variance approach, was investigated using middle school science student performance on an assessment over an instructional period. Predetermined criteria were outlined to guide model selection based on several factors including research questions, data properties, and meaningful interpretations to determine the most appropriate model for this study. All methods employed in this study reiterated one common interpretation of the data -- specifically, that the students of teachers with any NU course experience had significantly greater gains in performance over the instructional period. However, clear distinctions were made between an analysis of variance and the racked and stacked analysis using the Rasch model. Although limited research exists examining the usefulness of the Rasch model in measuring change in understanding over time, this study applied these methods and detailed plausible implications for data-driven decisions based upon results for NU and others. Being mindful of the advantages and usefulness of each method of analysis may help others make informed decisions about choosing an appropriate model to depict changes to evaluate other programs. Results may encourage other researchers to consider the meaningfulness of using IRT for this purpose. Results have implications for data-driven decisions for future professional development courses, in science education and other disciplines. KEYWORDS: Item Response Theory, Rasch Model, Racking and Stacking, Measuring Change in Student Performance, Newton's Universe teacher training
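For readers unfamiliar with the models compared here, the dichotomous Rasch model at the core of the racked and stacked analyses is compact enough to sketch. The abilities and difficulties below are hypothetical values in logits; the racking and stacking designs differ only in how pre/post records are arranged for calibration.

```python
import numpy as np

# Sketch: the dichotomous Rasch model underlying the racked/stacked
# analyses. "Stacking" treats pre- and post-test records as separate
# rows on a common item calibration, so person-measure gains are
# directly comparable. Values below are hypothetical logits.

def rasch_p(theta, b):
    """P(correct) for ability theta on an item of difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

items = np.array([-1.0, -0.2, 0.4, 1.1])   # item difficulties
theta_pre, theta_post = -0.3, 0.5          # one student's measures

print(rasch_p(theta_pre, items))    # expected pre-test response profile
print(rasch_p(theta_post, items))   # post-test profile; gain = 0.8 logits
```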
Nanomechanical properties of phospholipid microbubbles.
Buchner Santos, Evelyn; Morris, Julia K; Glynos, Emmanouil; Sboros, Vassilis; Koutsos, Vasileios
2012-04-03
This study uses atomic force microscopy (AFM) force-deformation (F-Δ) curves to investigate for the first time the Young's modulus of a phospholipid microbubble (MB) ultrasound contrast agent. The stiffness of the MBs was calculated from the gradient of the F-Δ curves, and the Young's modulus of the MB shell was calculated by employing two different mechanical models based on the Reissner and elastic membrane theories. We found that the relatively soft phospholipid-based MBs behave inherently differently to stiffer, polymer-based MBs [Glynos, E.; Koutsos, V.; McDicken, W. N.; Moran, C. M.; Pye, S. D.; Ross, J. A.; Sboros, V. Langmuir 2009, 25 (13), 7514-7522] and that elastic membrane theory is the most appropriate of the models tested for evaluating the Young's modulus of the phospholipid shell, agreeing with values available for living cell membranes, supported lipid bilayers, and synthetic phospholipid vesicles. Furthermore, we show that AFM F-Δ curves in combination with a suitable mechanical model can assess the shell properties of phospholipid MBs. The "effective" Young's modulus of the whole bubble was also calculated by analysis using Hertz theory. This analysis yielded values which are in agreement with results from studies which used Hertz theory to analyze similar systems such as cells.
Darwinism and the Behavioral Theory of Sociocultural Evolution: An Analysis.
ERIC Educational Resources Information Center
Langdon, John
1979-01-01
Challenges the view that the social sciences are theoretically impoverished disciplines when compared with the natural sciences. Demonstrates that the synthesis of an abstract Darwinian model of systemic adaptation and the behavioral principles of social learning produces a logical theory of sociocultural evolution. (DB)
Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.
Saccenti, Edoardo; Timmerman, Marieke E
2017-03-01
Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
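A minimal Monte Carlo sketch of Horn's parallel analysis may help fix ideas: sample eigenvalues are compared against reference eigenvalues from random data of the same dimensions, retaining components sequentially while they exceed the reference. This version works on the correlation matrix and uses the 95th-percentile rule, one common convention; the paper's covariance-matrix setting and the Tracy-Widom test are not reproduced here.

```python
import numpy as np

# Minimal sketch of Horn's parallel analysis for principal components:
# retain leading components whose sample eigenvalues exceed the 95th
# percentile of eigenvalues from random normal data of the same size.
# Correlation-matrix variant; toy data only.

def parallel_analysis(X, n_sims=200, quantile=95, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    obs = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    ref = np.empty((n_sims, p))
    for s in range(n_sims):
        R = np.corrcoef(rng.standard_normal((n, p)), rowvar=False)
        ref[s] = np.linalg.eigvalsh(R)[::-1]
    thresholds = np.percentile(ref, quantile, axis=0)
    k = 0                                   # sequential retention rule
    while k < p and obs[k] > thresholds[k]:
        k += 1
    return k, obs, thresholds

# Toy data: two correlated blocks of five variables -> expect 2 components.
rng = np.random.default_rng(1)
f = rng.standard_normal((300, 2))
X = np.hstack([f[:, [0]] + 0.5 * rng.standard_normal((300, 5)),
               f[:, [1]] + 0.5 * rng.standard_normal((300, 5))])
k, _, _ = parallel_analysis(X)
print(k)   # typically 2
```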
ERIC Educational Resources Information Center
Ferrando, Pere J.
2008-01-01
This paper develops results and procedures for obtaining linear composites of factor scores that maximize: (a) test information, and (b) validity with respect to external variables in the multiple factor analysis (FA) model. I treat FA as a multidimensional item response theory model, and use Ackerman's multidimensional information approach based…
A new ply model for interlaminar stress analysis
NASA Technical Reports Server (NTRS)
Rao Valisetty, R.; Rehfield, L. W.
1985-01-01
An accurate estimate of interlaminar stresses is crucial to understanding, as well as predicting, many delamination-related failures in composite materials. A new model for ply-level sublaminate analysis is presented and applied. The homogeneous plate theory developed earlier by the authors (Valisetty and Rehfield, 1983) is further refined, and the equations are reduced appropriately for the classical finite-width free-edge laminate elasticity problem and a related delamination crack growth problem. It is applied to the laminate on a ply-by-ply basis. This theory incorporates all the essential physical effects and appears to be an adequate model for predicting the behavior of individual layers in equilibrium. On the basis of the number of equations and boundary conditions required for the implementation of layer equilibrium, this theory also appears to be the simplest of its kind presented so far. The stress induced in the free-edge region of a (0, 90, 90, 0) laminate in uniform extension and the energy release rates for the delamination between the -30 deg and 90 deg plies of a (+30, -30, +30, -30, 90, 90)s laminate are computed using the new analysis. The results are in excellent agreement with the existing numerical solutions. The new ply behavioral model appears to be very promising; it yields stresses and displacements that are statically and kinematically compatible at interlaminar surfaces.
An Efficient Soft Set-Based Approach for Conflict Analysis
Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut
2016-01-01
Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared with rough set theory. PMID:26928627
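To give a feel for the co-occurrence idea the approach rests on, here is a minimal sketch over a Boolean tabular soft set with a hypothetical voting table; the paper's actual certainty, coverage, and strength measures are not reproduced here.

```python
import numpy as np

# Sketch of parameter co-occurrence in a Boolean tabular soft set:
# rows are objects (e.g. parliament members), columns are parameters
# (e.g. "supports issue k"). Co-occurrence of two parameters is the
# number of objects possessing both. The voting table is hypothetical;
# the paper's conflict measures are defined on top of this idea.

table = np.array([[1, 1, 0],
                  [1, 0, 0],
                  [0, 1, 1],
                  [1, 1, 1]], dtype=int)

cooccurrence = table.T @ table          # C[i, j] = |{objects with e_i and e_j}|
support = cooccurrence / len(table)     # normalized co-occurrence
print(cooccurrence)
print(support)
```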
Traffic Games: Modeling Freeway Traffic with Game Theory
Cortés-Berrueco, Luis E.; Gershenson, Carlos; Stephens, Christopher R.
2016-01-01
We apply game theory to a vehicular traffic model to study the effect of driver strategies on traffic flow. The resulting model inherits the realistic dynamics achieved by a two-lane traffic model and aims to incorporate phenomena caused by driver-driver interactions. To achieve this goal, a game-theoretic description of driver interaction was developed. This game-theoretic formalization allows one to model different lane-changing behaviors and to keep track of mobility performance. We simulate the evolution of cooperation, traffic flow, and mobility performance for different modeled behaviors. The analysis of these results indicates a mobility optimization process achieved by drivers’ interactions. PMID:27855176
The application of single particle hydrodynamics in continuum models of multiphase flow
NASA Technical Reports Server (NTRS)
Decker, Rand
1988-01-01
A review of the application of single-particle hydrodynamics in models for the exchange of interphase momentum in continuum models of multiphase flow is presented. Considered are the equations of motion for a laminar, mechanical two-phase flow. Inherent to this theory is a model for the interphase exchange of momentum due to drag between the dispersed particulate and continuous fluid phases. In addition, applications of two-phase flow theory to de-mixing flows require the modeling of interphase momentum exchange due to lift forces. The applications of single-particle analysis in deriving models for drag and lift are examined.
The Analyst's "Use" of Theory or Theories: The Play of Theory.
Cooper, Steven H
2017-10-01
Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.
Wilson loops and chiral correlators on squashed spheres
NASA Astrophysics Data System (ADS)
Fucito, F.; Morales, J. F.; Poghossian, R.
2015-11-01
We study chiral deformations of N=2 and N=4 supersymmetric gauge theories obtained by turning on τ_J tr Φ^J interactions, with Φ the N=2 superfield. Using localization, we compute the deformed gauge theory partition function Z(τ⃗|q) and the expectation value of circular Wilson loops W on a squashed four-sphere. In the case of the deformed N=4 theory, exact formulas for Z and W are derived in terms of an underlying U(N) interacting matrix model replacing the free Gaussian model describing the N=4 theory. Using the AGT correspondence, the τ_J-deformations are related to the insertions of commuting integrals of motion in the four-point CFT correlator, and chiral correlators are expressed as τ-derivatives of the gauge theory partition function on a finite Ω-background. In the so-called Nekrasov-Shatashvili limit, the entire ring of chiral relations is extracted from the ε-deformed Seiberg-Witten curve. As a byproduct of our analysis we show that SU(2) gauge theories on rational Ω-backgrounds are dual to CFT minimal models.
Nonlinear Poisson Equation for Heterogeneous Media
Hu, Langhua; Wei, Guo-Wei
2012-01-01
The Poisson equation is a widely accepted model for electrostatic analysis. However, the Poisson equation is derived based on electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to take into consideration hyperpolarization effects due to intensive charges and possibly nonlinear, anisotropic, and heterogeneous media. A variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by the electrostatic analysis of the Kirkwood model and a set of 20 proteins, and by the solvation analysis of a set of 17 small molecules whose experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. A good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. PMID:22947937
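The variational route described here can be sketched schematically: with a field-dependent permittivity, minimizing the electrostatic energy functional yields a nonlinear Poisson equation. The notation below is generic, not the paper's exact functional.

```latex
% Schematic variational derivation with a field-dependent permittivity
% eps(s), s = |grad phi|; minimizing E[phi] gives a nonlinear Poisson
% equation with effective permittivity eps~(s) = eps(s) + (s/2) eps'(s).
E[\phi]=\int_{\Omega}\left(\tfrac{1}{2}\,\epsilon\!\left(|\nabla\phi|\right)
|\nabla\phi|^{2}-\rho\,\phi\right)d\mathbf{r},
\qquad
\frac{\delta E}{\delta\phi}=0
\;\Longrightarrow\;
-\nabla\cdot\!\left[\tilde{\epsilon}\!\left(|\nabla\phi|\right)\nabla\phi\right]=\rho,
\quad
\tilde{\epsilon}(s)=\epsilon(s)+\tfrac{s}{2}\,\epsilon'(s).
```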
NASA Astrophysics Data System (ADS)
Sun, Yuan; Bhattacherjee, Anol
2011-11-01
Information technology (IT) usage within organisations is a multi-level phenomenon that is influenced by individual-level and organisational-level variables. Yet, current theories, such as the unified theory of acceptance and use of technology, describe IT usage as solely an individual-level phenomenon. This article postulates a model of organisational IT usage that integrates salient organisational-level variables, such as user training, top management support and technical support, within an individual-level model, yielding a multi-level model of IT usage. The multi-level model was then empirically validated using multi-level data collected from 128 end users and 26 managers in 26 firms in China regarding their use of enterprise resource planning systems, analysed using the multi-level structural equation modelling (MSEM) technique. We demonstrate the utility of MSEM analysis of multi-level data relative to the more common structural equation modelling analysis of single-level data, and show how single-level data can be aggregated to approximate multi-level analysis when multi-level data collection is not possible. We hope that this article will motivate future scholars to employ multi-level data and multi-level analysis for understanding organisational phenomena that are truly multi-level in nature.
Transverse forces on a vortex in lattice models of superfluids
NASA Astrophysics Data System (ADS)
Sonin, E. B.
2013-12-01
The paper derives the transverse forces (the Magnus and the Lorentz forces) in the lattice models of superfluids in the continuous approximation. The continuous approximation restores translational invariance absent in the original lattice model, but the theory is not Galilean invariant. As a result, calculation of the two transverse forces on the vortex, Magnus force and Lorentz force, requires the analysis of two balances, for the true momentum of particles in the lattice (Magnus force) and for the quasimomentum (Lorentz force) known from the Bloch theory of particles in the periodic potential. While the developed theory yields the same Lorentz force, which was well known before, a new general expression for the Magnus force was obtained. The theory demonstrates how a small Magnus force emerges in the Josephson-junction array if the particle-hole symmetry is broken. The continuous approximation for the Bose-Hubbard model close to the superfluid-insulator transition was developed, which was used for calculation of the Magnus force. The theory shows that there is an area in the phase diagram for the Bose-Hubbard model, where the Magnus force has an inverse sign with respect to that which is expected from the sign of velocity circulation.
ERIC Educational Resources Information Center
Hsu, Shun-Yi
An instructional model based on a learning cycle including correlation, analysis, and generalization (CAG) was developed and applied to design an instructional module for grade 8 students in Taiwan, Republic of China. The CAG model was based on Piagetian theory and a concept model (Pella, 1975). The module developed for heat and temperature was…
NASA Astrophysics Data System (ADS)
Hayashi, Tomohiko; Oshima, Hiraku; Harano, Yuichi; Kinoshita, Masahiro
2016-09-01
For neutral hard-sphere solutes, we compare the reduced density profile of water around a solute g(r), solvation free energy μ, energy U, and entropy S under the isochoric condition predicted by two theories: the dielectrically consistent reference interaction site model (DRISM) and angle-dependent integral equation (ADIE) theories. A molecular model for water pertinent to each theory is adopted. The hypernetted-chain (HNC) closure is employed in the ADIE theory, and the HNC and Kovalenko-Hirata (K-H) closures are tested in the DRISM theory. We also calculate g(r), U, S, and μ of the same solute in a hard-sphere solvent whose molecular diameter and number density are set at those of water, in which case the radial-symmetric integral equation (RSIE) theory is employed. The dependences of μ, U, and S on the excluded volume and solvent-accessible surface area are analyzed using the morphometric approach (MA). The results from the ADIE theory are in far better agreement with those from computer simulations available for g(r), U, and μ. For the DRISM theory, g(r) in the vicinity of the solute is quite high and becomes progressively higher as the solute diameter d_U increases. By contrast, for the ADIE theory, it is much lower and becomes further lower as d_U increases. Due to unphysically positive U and significantly larger |S|, μ from the DRISM theory becomes too high. It is interesting that μ, U, and S from the K-H closure are worse than those from the HNC closure. Overall, the results from the DRISM theory with a molecular model for water are quite similar to those from the RSIE theory with the hard-sphere solvent. Based on the results of the MA analysis, we comparatively discuss the different theoretical methods for cases where they are applied to studies on the solvation of a protein.
A computational model for simulating text comprehension.
Lemaire, Benoît; Denhière, Guy; Bellissens, Cédrick; Jhean-Larose, Sandra
2006-11-01
In the present article, we outline the architecture of a computer program for simulating the process by which humans comprehend texts. The program is based on psycholinguistic theories about human memory and text comprehension processes, such as the construction-integration model (Kintsch, 1998), the latent semantic analysis theory of knowledge representation (Landauer & Dumais, 1997), and the predication algorithms (Kintsch, 2001; Lemaire & Bianco, 2003), and it is intended to help psycholinguists investigate the way humans comprehend texts.
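Among the components listed, the latent semantic analysis layer is the easiest to sketch: a truncated SVD of a term-by-document count matrix, with semantic similarity measured by cosine in the reduced space. The toy matrix below is hypothetical and far smaller than any real LSA corpus.

```python
import numpy as np

# Minimal sketch of the latent semantic analysis (LSA) component:
# truncated SVD of a term-by-document count matrix, then cosine
# similarity between terms in the reduced semantic space.
# The tiny corpus counts below are hypothetical.

A = np.array([[2, 0, 1, 0],     # term "memory"
              [1, 0, 2, 0],     # term "comprehension"
              [0, 2, 0, 1],     # term "syntax"
              [0, 1, 0, 2.0]])  # term "grammar"

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2                            # retained latent dimensions
terms = U[:, :k] * s[:k]         # term vectors in the latent space

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(terms[0], terms[1]))   # high: terms co-occur in documents
print(cosine(terms[0], terms[2]))   # low: different latent dimension
```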
2D problems of surface growth theory with applications to additive manufacturing
NASA Astrophysics Data System (ADS)
Manzhirov, A. V.; Mikhin, M. N.
2018-04-01
We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.
NASA Astrophysics Data System (ADS)
Soleimani, Ahmad; Naei, Mohammad Hasan; Mashhadi, Mahmoud Mosavi
In this paper, the first-order shear deformation theory (FSDT) is used to investigate the postbuckling behavior of an orthotropic single-layered graphene sheet (SLGS) under in-plane loadings. Nonlocal elasticity theory and the von Karman nonlinear model, in combination with isogeometric analysis (IGA), have been applied to study the postbuckling characteristics of SLGSs. In contrast to the classical model, the nonlocal continuum model developed in this work captures size effects on the postbuckling characteristics of SLGSs. FSDT takes into account the effects of shear deformations through the thickness of the plate. A geometric imperfection, defined as a very small transverse displacement of the mid-plane, is applied to the undeformed nanoplate to create an initial deviation of the graphene sheet from being perfectly flat. Nonlinear governing equations of motion for the SLGS are derived from the principle of virtual work and a variational formulation. Finally, the results are presented as postbuckling equilibrium paths of the SLGS. The influence of various parameters such as edge length, nonlocal parameter, compression ratio, boundary conditions and aspect ratio on the postbuckling path is investigated. The results of this work show the high accuracy of nonlocal FSDT-based analysis for the postbuckling behavior of graphene sheets.
Anomalies, conformal manifolds, and spheres
NASA Astrophysics Data System (ADS)
Gomis, Jaume; Hsin, Po-Shen; Komargodski, Zohar; Schwimmer, Adam; Seiberg, Nathan; Theisen, Stefan
2016-03-01
The two-point function of exactly marginal operators leads to a universal contribution to the trace anomaly in even dimensions. We study aspects of this trace anomaly, emphasizing its interpretation as a sigma model, whose target space {M} is the space of conformal field theories (a.k.a. the conformal manifold). When the underlying quantum field theory is supersymmetric, this sigma model has to be appropriately supersymmetrized. As examples, we consider in some detail {N}=(2,2) and {N}=(0,2) supersymmetric theories in d = 2 and {N}=2 supersymmetric theories in d = 4. This reasoning leads to new information about the conformal manifolds of these theories, for example, we show that the manifold is Kähler-Hodge and we further argue that it has vanishing Kähler class. For {N}=(2,2) theories in d = 2 and {N}=2 theories in d = 4 we also show that the relation between the sphere partition function and the Kähler potential of {M} follows immediately from the appropriate sigma models that we construct. Along the way we find several examples of potential trace anomalies that obey the Wess-Zumino consistency conditions, but can be ruled out by a more detailed analysis.
Allometric scaling theory applied to FIA biomass estimation
David C. Chojnacky
2002-01-01
Tree biomass estimates in the Forest Inventory and Analysis (FIA) database are derived from numerous methodologies whose abundance and complexity raise questions about consistent results throughout the U.S. A new model based on allometric scaling theory ("WBE") offers simplified methodology and a theoretically sound basis for improving the reliability and...
Practical Guide to Conducting an Item Response Theory Analysis
ERIC Educational Resources Information Center
Toland, Michael D.
2014-01-01
Item response theory (IRT) is a psychometric technique used in the development, evaluation, improvement, and scoring of multi-item scales. This pedagogical article provides the necessary information needed to understand how to conduct, interpret, and report results from two commonly used ordered polytomous IRT models (Samejima's graded…
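As a minimal sketch of the polytomous model named above, the snippet below computes category probabilities for Samejima's graded response model under the usual logistic form; the item parameters are illustrative, not taken from the article.

```python
import numpy as np

def grm_category_probs(theta, a, b):
    """Category probabilities for Samejima's graded response model.
    theta: latent trait level; a: item discrimination;
    b: increasing category thresholds (K-1 values for K categories)."""
    # Cumulative probabilities of responding in category k or above
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))
    p_star = np.concatenate(([1.0], p_star, [0.0]))
    # Each category probability is a difference of adjacent cumulatives
    return p_star[:-1] - p_star[1:]

# Illustrative 5-category item evaluated at three trait levels
for theta in (-1.0, 0.0, 1.5):
    print(theta, grm_category_probs(theta, a=1.2, b=[-1.5, -0.5, 0.4, 1.3]).round(3))
```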
ERIC Educational Resources Information Center
Dalgleish, Tim
2004-01-01
The evolution of multirepresentational cognitive theorizing in psychopathology is illustrated by detailed discussion and analysis of a number of prototypical models of posttraumatic stress disorder (PTSD). Network and schema theories, which focus on a single, explicit aspect/format of mental representation, are compared with theories that focus on…
Modelling by Differential Equations
ERIC Educational Resources Information Center
Chaachoua, Hamid; Saglam, Ayse
2006-01-01
This paper aims to show the close relation between physics and mathematics taking into account especially the theory of differential equations. By analysing the problems posed by scientists in the seventeenth century, we note that physics is very important for the emergence of this theory. Taking into account this analysis, we show the…
Overview of a Linguistic Theory of Design. AI Memo 383A.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…
Stability analysis of dynamic collaboration model with control signals on two lanes
NASA Astrophysics Data System (ADS)
Li, Zhipeng; Zhang, Run; Xu, Shangzhi; Qian, Yeqing; Xu, Juan
2014-12-01
In this paper, the influence of control signals on the stability of two-lane traffic flow is studied by applying control theory with lane-changing behaviors. We present a two-lane dynamic collaboration model with lateral friction and give expressions for the feedback control signals. Applying delayed-feedback control theory to the two-lane dynamic collaboration model with control signals, we investigate the stability of traffic flow theoretically and derive stability conditions for both lanes, finding that the forward and lateral feedback signals can improve the stability of traffic flow while the backward feedback signals cannot. Direct simulations are conducted to verify the theoretical analysis; they show that the feedback signals have a significant effect on the running state of the two vehicle groups, and the simulation results agree with the theoretical analysis.
Hollnagel, H; Malterud, K
1995-12-01
The study was designed to present and apply theoretical and empirical knowledge for the construction of a clinical model intended to shift the attention of the general practitioner from objective risk factors to self-assessed health resources in male and female patients. Selected theoretical models of personal health resources were reviewed, discussed, and analyzed, assessing existing theories according to their emphasis on self-assessed vs. doctor-assessed health resources, specific health resources vs. life and coping in general, abstract vs. clinically applicable theory, and whether a gender perspective is explicitly included. Relevant theoretical models of health and coping (salutogenesis, coping and social support, control/demand, locus of control, the health belief model, quality of life) and the perspective of the underprivileged Other (critical theory, feminist standpoint theory, the patient-centred clinical method) were presented and assessed. Components from Antonovsky's salutogenic perspective and McWhinney's patient-centred clinical method, supported by gender perspectives, were integrated into the clinical model presented here. General practitioners are recommended to shift their attention from objective risk factors to self-assessed health resources by means of this clinical model. The relevance and feasibility of the model should be explored in empirical research.
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
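As a toy illustration in the spirit of the modal-choice tests above (not the QTest package itself), the sketch below checks hypothetical binary choice counts against a fixed preference order a > b > c with an error rate bounded by tau; the counts, tau, and significance threshold are all assumptions.

```python
from scipy.stats import binomtest

# Hypothetical counts: times the first option was chosen over the second,
# out of n repeated trials per gamble pair.
choices = {("a", "b"): (38, 50), ("a", "c"): (41, 50), ("b", "c"): (29, 50)}

# A fixed preference a > b > c with error rate at most tau predicts that the
# preferred option is chosen with probability at least 1 - tau in every pair.
tau = 0.25
for pair, (k, n) in choices.items():
    test = binomtest(k, n, p=1 - tau, alternative="less")  # H1: p < 1 - tau
    verdict = "violates the bound" if test.pvalue < 0.05 else "consistent"
    print(pair, f"observed {k/n:.2f},", verdict)
```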
Theory of agent-based market models with controlled levels of greed and anxiety
NASA Astrophysics Data System (ADS)
Papadopoulos, P.; Coolen, A. C. C.
2010-01-01
We use generating functional analysis to study minority-game-type market models with generalized strategy valuation updates that control the psychology of agents' actions. The agents' choice between trend-following and contrarian trading, and their vigor in each, depends on the overall state of the market. Even in 'fake history' models, the theory now involves an effective overall bid process (coupled to the effective agent process) which can exhibit profound remanence effects and new phase transitions. For some models the bid process can be solved directly, others require Maxwell-construction-type approximations.
MAC/GMC 4.0 User's Manual: Keywords Manual. Volume 2
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2002-01-01
This document is the second volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, this document is the Keywords Manual, and Volume 3 is the Example Problem Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, applications of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume describes the basic information required to use the MAC/GMC 4.0 software, including a 'Getting Started' section, and an in-depth description of each of the 22 keywords used in the input file to control the execution of the code.
ERIC Educational Resources Information Center
Brown, Steven D.; Tramayne, Selena; Hoxha, Denada; Telander, Kyle; Fan, Xiaoyan; Lent, Robert W.
2008-01-01
This study tested Social Cognitive Career Theory's (SCCT) academic performance model using a two-stage approach that combined meta-analytic and structural equation modeling methodologies. Unbiased correlations obtained from a previously published meta-analysis [Robbins, S. B., Lauver, K., Le, H., Davis, D., & Langley, R. (2004). Do psychosocial…
Evaluating Cognitive Theory: A Joint Modeling Approach Using Responses and Response Times
ERIC Educational Resources Information Center
Klein Entink, Rinke H.; Kuhn, Jorg-Tobias; Hornke, Lutz F.; Fox, Jean-Paul
2009-01-01
In current psychological research, the analysis of data from computer-based assessments or experiments is often confined to accuracy scores. Response times, although being an important source of additional information, are either neglected or analyzed separately. In this article, a new model is developed that allows the simultaneous analysis of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bastero-Gil, Mar; Cerezo, Rafael; Berera, Arjun
2012-11-01
The effects of bulk viscosity are examined for inflationary dynamics in which dissipation and thermalization are present. A complete stability analysis is done for the background inflaton evolution equations, which includes both inflaton dissipation and radiation bulk viscous effects. Three representative approaches of bulk viscous irreversible thermodynamics are analyzed: the Eckart noncausal theory, the linear and causal theory of Israel-Stewart and a more recent nonlinear and causal bulk viscous theory. It is found that the causal theories allow for larger bulk viscosities before encountering an instability in comparison to the noncausal Eckart theory. It is also shown that the causal theories tend to suppress the radiation production due to bulk viscous pressure, because of the presence of relaxation effects implicit in these theories. Bulk viscosity coefficients derived from quantum field theory are applied to warm inflation model building and an analysis is made of the effects to the duration of inflation. The treatment of bulk pressure would also be relevant to the reheating phase after inflation in cold inflation dynamics and during the radiation dominated regime, although very little work in both areas has been done; the methodology developed in this paper could be extended to apply to these other problems.
Using a theory-driven conceptual framework in qualitative health research.
Macfarlane, Anne; O'Reilly-de Brún, Mary
2012-05-01
The role and merits of highly inductive research designs in qualitative health research are well established, and there has been a powerful proliferation of grounded theory method in the field. However, tight qualitative research designs informed by social theory can be useful to sensitize researchers to concepts and processes that they might not necessarily identify through inductive processes. In this article, we provide a reflexive account of our experience of using a theory-driven conceptual framework, the Normalization Process Model, in a qualitative evaluation of general practitioners' uptake of a free, pilot, language interpreting service in the Republic of Ireland. We reflect on our decisions about whether or not to use the Model, and describe our actual use of it to inform research questions, sampling, coding, and data analysis. We conclude with reflections on the added value that the Model and tight design brought to our research.
Derivative Trade Optimizing Model Utilizing GP Based on Behavioral Finance Theory
NASA Astrophysics Data System (ADS)
Matsumura, Koki; Kawamoto, Masaru
This paper proposes a new technique that builds strategy trees for derivative (option) trading investment decisions based on behavioral finance theory and optimizes them using evolutionary computation, in order to achieve high profitability. The strategy tree uses technical analysis based on statistical, experience-based techniques for the investment decision. The trading model is represented by various technical indexes, and the strategy tree is optimized by genetic programming (GP), one of the evolutionary computation methods. Moreover, the paper proposes a method using prospect theory, from behavioral finance, to set psychological biases for profit and loss, and attempts to select the appropriate strike price of the option for higher investment efficiency. This technique produced good results, demonstrating the effectiveness of the trading model with the optimized dealing strategy.
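As a minimal illustration of the prospect-theory ingredient above, the sketch below implements the Kahneman-Tversky value function; the defaults are the published 1992 median parameter estimates, while the GP trading layer and strike selection from the abstract are not reproduced here.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper for losses (loss aversion) relative to a reference point.
    Defaults are the published 1992 median parameter estimates."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

# Loss aversion: a loss of 100 units weighs far more than a gain of 100
print(prospect_value(100.0), prospect_value(-100.0))
```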
NASA Technical Reports Server (NTRS)
Marr, W. A., Jr.
1972-01-01
The behavior of finite element models employing different constitutive relations to describe the stress-strain behavior of soils is investigated. Three models, which assume small strain theory is applicable, include a nondilatant, a dilatant and a strain hardening constitutive relation. Two models are formulated using large strain theory and include a hyperbolic and a Tresca elastic perfectly plastic constitutive relation. These finite element models are used to analyze retaining walls and footings. Methods of improving the finite element solutions are investigated. For nonlinear problems better solutions can be obtained by using smaller load increment sizes and more iterations per load increment than by increasing the number of elements. Suitable methods of treating tension stresses and stresses which exceed the yield criteria are discussed.
NASA Technical Reports Server (NTRS)
Williams, M.; Harris, W. L.
1984-01-01
The purpose of the analysis is to determine if inflow turbulence distortion may be a cause of experimentally observed changes in sound pressure levels when the rotor mean loading is varied. The effect of helicopter rotor mean aerodynamics on inflow turbulence is studied within the framework of the turbulence rapid distortion theory developed by Pearson (1959) and Deissler (1961). The distorted inflow turbulence is related to the resultant noise by conventional broadband noise theory. A comparison of the distortion model with experimental data shows that the theoretical model is unable to totally explain observed increases in model rotor sound pressures with increased rotor mean thrust. Comparison of full scale rotor data with the theoretical model shows that a shear-type distortion may explain decreasing sound pressure levels with increasing thrust.
Impelluso, Thomas J
2003-06-01
An algorithm for bone remodeling is presented which allows for both a redistribution of density and a continuous change of principal material directions for the orthotropic material properties of bone. It employs a modal analysis to add density for growth and a local effective-strain-based analysis to redistribute density. General redistribution functions are presented. The model utilizes theories of cellular solids to relate density and strength. The code predicts the same general density distributions and local orthotropy as observed in reality.
Democratic parenting: paradoxical messages in democratic parent education theories
NASA Astrophysics Data System (ADS)
Oryan, Shlomit; Gastil, John
2013-06-01
Some prominent parent education theories in the United States and other Western countries base their educational viewpoint explicitly on democratic values, such as mutual respect, equality and personal freedom. These democratic parenting theories advocate sharing power with children and including them in family decision making. This study presents a textual analysis of two such theories, the Adlerian model of parent education and the Parent Effectiveness Training (PET) model, as they are embodied in two original bestselling textbooks. Through content and argumentation analysis of these influential texts, this study examines the paradoxes inherent in these two theories when they articulate how to implement fully democratic principles within the parent-child relationship. We discover that in spite of their democratic rationale, both books offer communication practices that guide the child to modify misbehaviour, enforce parental power, and manipulate the child to make decisions that follow parental judgment, and thus do not endorse the use of a truly democratic parenting style. We suggest, as an alternative to the democratic parenting style, that parents be introduced to a guardianship management style, in which they do not share authority with children, but seek opportunities for enabling children to make more autonomous decisions and participate in more family decision making.
Edwards, Katie M; Gidycz, Christine A; Murphy, Megan J
2015-10-01
The purpose of the current study was to build on the existing literature to better understand young women's leaving processes in abusive dating relationships using a prospective design. Two social psychological models, the investment model and the theory of planned behavior, were tested. According to the investment model, relationship continuation is predicted by commitment, which is a function of investment, satisfaction, and low quality of alternatives. The theory of planned behavior asserts that a specific behavior is predicted by an individual's intention to use a behavior, which is a function of the individual's attitudes toward the behavior, the subjective norms toward the behavior, and the individual's perceived behavioral control over the behavior. College women (N = 169 young women in abusive relationships) completed surveys at two time points, approximately 4 months apart: the initial session assessed for the presence of intimate partner violence (IPV) in a current relationship and measured investment model and theory of planned behavior variables, and the 4-month follow-up session determined whether women had remained in or terminated their abusive relationship. Path analytic results demonstrated that both the theory of planned behavior and investment models were good fits to the data in prospectively predicting abused women's stay/leave decisions. However, the theory of planned behavior was a better fit to the data than the investment model. Implications for future research and intervention are discussed. © The Author(s) 2014.
Model of superconductivity formation on ideal crystal lattice defect–twin or twin boundary (MSC-TB)
NASA Astrophysics Data System (ADS)
Chizhov, V. A.; Zaitsev, F. S.; Bychkov, V. L.
2018-03-01
The report provides a review of the experimental material on superconductivity (SC) accumulated by 2017, gives a critical analysis of the Bardeen-Cooper-Schrieffer (BCS) theory, and presents a new model of the superconductivity effect proposed in the works of V.A. Chizhov. The new model makes it possible to understand the mechanism of SC formation and to explain many experimental facts on the basis of the theory of processes occurring in an ideal defect of the crystal lattice, the twinning boundary (MSC-TB). Specific materials, including new ones, are described which, in accordance with the MSC-TB theory, should have improved SC properties, and promising directions for further research are formulated.
[A competency model of rural general practitioners: theory construction and empirical study].
Yang, Xiu-Mu; Qi, Yu-Long; Shne, Zheng-Fu; Han, Bu-Xin; Meng, Bei
2015-04-01
To perform theory construction and an empirical study of the competency model of rural general practitioners. Through literature study, job analysis, interviews, and expert team discussion, a questionnaire on rural general practitioners' competency was constructed. A total of 1458 rural general practitioners in 6 central provinces were surveyed with the questionnaire. The common factors were constructed using the principal component method of exploratory factor analysis and confirmatory factor analysis. The influence of the competency characteristics on work performance was analyzed using regression analysis. The Cronbach's alpha coefficient of the questionnaire was 0.974. The model consisted of 9 dimensions and 59 items. The 9 competency dimensions included basic public health service ability, basic clinical skills, system analysis capability, information management capability, communication and cooperation ability, occupational moral ability, non-medical professional knowledge, personal traits, and psychological adaptability. The explained cumulative total variance was 76.855%. The model fit indices were χ²/df = 1.88, GFI = 0.94, NFI = 0.96, NNFI = 0.98, PNFI = 0.91, RMSEA = 0.068, CFI = 0.97, IFI = 0.97, RFI = 0.96, indicating good model fit. Regression analysis showed that the competency characteristics had a significant effect on job performance. The rural general practitioners competency model provides a reference for rural doctor training, order-oriented training of medical students for rural areas, and competency-based performance management of rural general practitioners.
Ising versus SU(2)_2 string-net ladder
NASA Astrophysics Data System (ADS)
Vidal, Julien
2018-03-01
We consider the string-net model obtained from SU(2)_2 fusion rules. These fusion rules are shared by two different sets of anyon theories. In this paper, we study the competition between the two corresponding non-Abelian quantum phases in the ladder geometry. A detailed symmetry analysis shows that the nontrivial low-energy sector corresponds to the transverse-field cluster model that displays a critical point described by the so(2)_1 conformal field theory. Other sectors are obtained by freezing spins in this model.
The anchoring bias reflects rational use of cognitive resources.
Lieder, Falk; Griffiths, Thomas L; M Huys, Quentin J; Goodman, Noah D
2018-02-01
Cognitive biases, such as the anchoring bias, pose a serious challenge to rational accounts of human cognition. We investigate whether rational theories can meet this challenge by taking into account the mind's bounded cognitive resources. We asked what reasoning under uncertainty would look like if people made rational use of their finite time and limited cognitive resources. To answer this question, we applied a mathematical theory of bounded rationality to the problem of numerical estimation. Our analysis led to a rational process model that can be interpreted in terms of anchoring-and-adjustment. This model provided a unifying explanation for ten anchoring phenomena including the differential effect of accuracy motivation on the bias towards provided versus self-generated anchors. Our results illustrate the potential of resource-rational analysis to provide formal theories that can unify a wide range of empirical results and reconcile the impressive capacities of the human mind with its apparently irrational cognitive biases.
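A minimal sketch of a resource-bounded anchoring-and-adjustment process of the kind the abstract describes: with fewer adjustment steps (less computation), the final estimate stays biased toward the anchor. The step size, noise level, and anchor/target values are illustrative assumptions, not the paper's model.

```python
import random

def anchor_and_adjust(anchor, target, n_steps, step=10.0, noise=2.0):
    """Resource-bounded anchoring-and-adjustment: start at the anchor and
    take a limited number of noisy steps toward the target estimate."""
    estimate = anchor
    for _ in range(n_steps):
        direction = 1.0 if target > estimate else -1.0
        estimate += direction * step + random.gauss(0.0, noise)
    return estimate

random.seed(1)
# Fewer adjustment steps (less computation) leave more bias toward the anchor
for steps in (5, 20, 80):
    mean = sum(anchor_and_adjust(1000.0, 400.0, steps) for _ in range(500)) / 500
    print(f"{steps:3d} steps -> mean estimate {mean:6.0f}")
```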
[Transpersonal caring in nursing: an analysis grounded in a conceptual model].
Favero, Luciane; Pagliuca, Lorita Marlena Freitag; Lacerda, Maria Ribeiro
2013-04-01
This theoretical study aimed to analyze the attributes, antecedents, and consequences of the transpersonal caring concept using the Concept Analysis Model. For this purpose, books published on Jean Watson's theory in Portuguese and English from 1979 to 2012 were listed; after applying the inclusion criteria, only six works remained. The time of care and the intention to be in relationship were the most cited antecedents. The most frequent attributes were intersubjectivity and relationships among those involved in the process. With regard to consequences, the most frequent element was that transpersonal caring provides restoration/reconstitution (healing). The study revealed small changes in the definition of the concept over the years and across publications of the theory. Analyzing the attributes, antecedents, and consequences of the concept thus provided a better understanding of it and of its importance in the Human Caring Theory proposed by the American theorist.
Psychoacoustic entropy theory and its implications for performance practice
NASA Astrophysics Data System (ADS)
Strohman, Gregory J.
This dissertation attempts to motivate, derive, and suggest potential uses for a generalized perceptual theory of musical harmony called psychoacoustic entropy theory. This theory treats the human auditory system as a physical system which takes acoustic measurements. As a result, the human auditory system is subject to all the appropriate uncertainties and limitations of other physical measurement systems. This is the theoretic basis for defining psychoacoustic entropy. Psychoacoustic entropy is a numerical quantity which indexes the degree to which the human auditory system perceives instantaneous disorder within a sound pressure wave. Chapter one explains the importance of harmonic analysis as a tool for performance practice. It also outlines the critical limitations of many of the most influential historical approaches to modeling harmonic stability, particularly when compared to available scientific research in psychoacoustics. Rather than analyze a musical excerpt, psychoacoustic entropy is calculated directly from sound pressure waves themselves. This frames psychoacoustic entropy theory in the most general possible terms as a theory of musical harmony, enabling it to be invoked for any perceivable sound. Chapter two provides and examines many widely accepted mathematical models of the acoustics and psychoacoustics of these sound pressure waves. Chapter three introduces entropy as a precise way of measuring perceived uncertainty in sound pressure waves. Entropy is used, in combination with the acoustic and psychoacoustic models introduced in chapter two, to motivate the mathematical formulation of psychoacoustic entropy theory. Chapter four shows how to use psychoacoustic entropy theory to analyze certain types of musical harmonies, while chapter five applies the analytical tools developed in chapter four to two short musical excerpts to inform their interpretation. Almost every form of harmonic analysis invokes some degree of mathematical reasoning. However, the limited scope of most harmonic systems used for Western common practice music greatly simplifies the necessary level of mathematical detail. Psychoacoustic entropy theory requires a greater degree of mathematical complexity due to its sheer scope as a generalized theory of musical harmony. Fortunately, under specific assumptions the theory can take on vastly simpler forms. Psychoacoustic entropy theory appears to be highly compatible with the latest scientific research in psychoacoustics. However, the theory itself should be regarded as a hypothesis and this dissertation an experiment in progress. The evaluation of psychoacoustic entropy theory as a scientific theory of human sonic perception must wait for more rigorous future research.
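For the entropy measure at the heart of the theory, a crude stand-in can be computed directly from a sound pressure wave as the Shannon entropy of its normalized power spectrum; the dissertation's psychoacoustic entropy adds explicit auditory-system modeling on top of this idea, so the sketch below is only an analogy with illustrative signals.

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum: a crude
    stand-in for perceived disorder in a sound pressure wave (the
    dissertation's measure adds explicit auditory-system modeling)."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    p = power / power.sum()
    p = p[p > 0]                      # drop empty bins before taking logs
    return float(-(p * np.log2(p)).sum())

t = np.linspace(0.0, 0.5, 22050, endpoint=False)   # 0.5 s at 44.1 kHz
tone = np.sin(2 * np.pi * 440.0 * t)               # ordered: low entropy
noise = np.random.default_rng(3).standard_normal(t.size)  # disordered: high
print(spectral_entropy(tone), spectral_entropy(noise))
```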
NASA Technical Reports Server (NTRS)
Mathison, Steven R.; Herakovich, Carl T.; Pindera, Marek-Jerzy; Shuart, Mark J.
1987-01-01
The objective was to determine the effect of nonlinear material behavior on the response and failure of unnotched and notched angle-ply laminates under uniaxial compressive loading. The endochronic theory was chosen as the constitutive theory to model the AS4/3502 graphite-epoxy material system. Three-dimensional finite element analysis incorporating the endochronic theory was used to determine the stresses and strains in the laminates. An incremental/iterative initial strain algorithm was used in the finite element program. To increase computational efficiency, a 180 deg rotational symmetry relationship was utilized and the finite element program was vectorized to run on a supercomputer. Laminate response was compared to experiments, revealing excellent agreement for both the unnotched and notched angle-ply laminates. Predicted stresses in the region of the hole were examined and are presented, comparing the linear elastic analysis to the inelastic endochronic theory analysis. A failure analysis of the unnotched and notched laminates was performed using the quadratic tensor polynomial. Predicted fracture loads compared well with experiments for the unnotched laminates, but were very conservative in comparison with experiments for the notched laminates.
Film and membrane-model thermodynamics of free thin liquid films.
Radke, C J
2015-07-01
In spite of over 7 decades of effort, the thermodynamics of thin free liquid films (as in emulsions and foams) lacks clarity. Following a brief review of the meaning and measurement of thin-film forces (i.e., conjoining/disjoining pressures), we offer a consistent analysis of thin-film thermodynamics. By carefully defining film reversible work, two distinct thermodynamic formalisms emerge: a film model with two zero-volume membranes each of film tension γ(f) and a membrane model with a single zero-volume membrane of membrane tension 2γ(m). In both models, detailed thermodynamic analysis gives rise to thin-film Gibbs adsorption equations that allow calculation of film and membrane tensions from measurements of disjoining-pressure isotherms. A modified Young-Laplace equation arises in the film model to calculate film-thickness profiles from the film center to the surrounding bulk meniscus. No corresponding relation exists in the membrane model. Illustrative calculations of disjoining-pressure isotherms for water are presented using square-gradient theory. We report considerable deviations from Hamaker theory for films less than about 3 nm in thickness. Such thin films are considerably more attractive than in classical Hamaker theory. Available molecular simulations reinforce this finding. Copyright © 2014 Elsevier Inc. All rights reserved.
Social judgment theory based model on opinion formation, polarization and evolution
NASA Astrophysics Data System (ADS)
Chau, H. F.; Wong, C. Y.; Chow, F. K.; Fung, Chi-Hang Fred
2014-12-01
The dynamical origin of opinion polarization in the real world is an interesting topic that physical scientists may help to understand. To properly model the dynamics, the theory must be fully compatible with findings by social psychologists on microscopic opinion change. Here we introduce a generic model of opinion formation with homogeneous agents based on the well-known social judgment theory in social psychology by extending a similar model proposed by Jager and Amblard. The agents’ opinions will eventually cluster around extreme and/or moderate opinions forming three phases in a two-dimensional parameter space that describes the microscopic opinion response of the agents. The dynamics of this model can be qualitatively understood by mean-field analysis. More importantly, first-order phase transition in opinion distribution is observed by evolving the system under a slow change in the system parameters, showing that punctuated equilibria in public opinion can occur even in a fully connected social network.
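A minimal sketch of a social-judgment-style opinion update of the kind extended here, in one simple variant after Jager and Amblard: opinions within the latitude of acceptance attract, opinions beyond the latitude of rejection repel. The latitudes, learning rate, population size, and iteration count are illustrative assumptions.

```python
import random

def interact(op, u=0.2, t=1.0, mu=0.05, lo=-1.0, hi=1.0):
    """One pairwise interaction in a social-judgment-style opinion model:
    opinions closer than the latitude of acceptance u attract; opinions
    farther than the latitude of rejection t repel; otherwise no change."""
    i, j = random.sample(range(len(op)), 2)
    d = op[j] - op[i]
    if abs(d) < u:        # assimilation
        op[i] += mu * d
    elif abs(d) > t:      # contrast (repulsion)
        op[i] -= mu * d
    op[i] = max(lo, min(hi, op[i]))  # keep opinions on the bounded scale

random.seed(0)
opinions = [random.uniform(-1.0, 1.0) for _ in range(200)]
for _ in range(100_000):
    interact(opinions)
# A narrow acceptance band yields clusters at the extremes and/or the middle
print(sorted(round(o, 2) for o in opinions)[::20])
```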
Steerability Analysis of Tracked Vehicles: Theory and User’s Guide for Computer Program TVSTEER
1986-08-01
Baladi, George Y.; Barnes, Donald E.; Berger, Rebecca P.
Structures Laboratory, Department of the Army, Waterways Experiment Station, Corps of Engineers, PO Box ... The mathematical model was formulated by Drs. George Y. Baladi and Behzad Rohani. The logic and computer programming were accomplished by Dr. Baladi and
Quantum walks with an anisotropic coin II: scattering theory
NASA Astrophysics Data System (ADS)
Richard, S.; Suzuki, A.; de Aldecoa, R. Tiedra
2018-05-01
We perform the scattering analysis of the evolution operator of quantum walks with an anisotropic coin, and we prove a weak limit theorem for their asymptotic velocity. The quantum walks that we consider include one-defect models, two-phase quantum walks, and topological phase quantum walks as special cases. Our analysis is based on an abstract framework for the scattering theory of unitary operators in a two-Hilbert spaces setting, which is of independent interest.
The application of sensitivity analysis to models of large scale physiological systems
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1974-01-01
A survey of the literature on sensitivity analysis as it applies to biological systems is reported, along with a brief development of sensitivity theory. A simple population model and a more complex thermoregulatory model illustrate the investigatory techniques and the interpretation of parameter sensitivity analysis. The role of sensitivity analysis in validating and verifying models, in identifying relative parameter influence, and in estimating errors in model behavior due to uncertainty in input data is presented. This analysis is valuable to the simulationist and the experimentalist in allocating resources for data collection. A method is presented for reducing highly complex, nonlinear models to simple linear algebraic models that could be useful for making rapid, first-order calculations of system behavior.
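A minimal sketch of this kind of parameter sensitivity analysis, using normalized finite-difference sensitivities of a toy logistic population model; the model, parameter values, and step sizes are illustrative assumptions.

```python
def logistic_growth(r, K, n0=10.0, dt=0.1, steps=300):
    """Forward-Euler integration of dn/dt = r * n * (1 - n/K)."""
    n = n0
    for _ in range(steps):
        n += dt * r * n * (1.0 - n / K)
    return n

def sensitivity(f, params, name, rel=0.01):
    """Normalized finite-difference sensitivity: fractional change in the
    output per fractional change in one parameter, others held fixed."""
    base = f(**params)
    bumped = dict(params, **{name: params[name] * (1.0 + rel)})
    return ((f(**bumped) - base) / base) / rel

params = {"r": 0.5, "K": 100.0}
for name in params:
    print(name, round(sensitivity(logistic_growth, params, name), 3))
# Near equilibrium the population tracks K almost exactly (sensitivity ~ 1)
# while r has little residual influence, guiding where data matter most.
```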
Time-dependence of graph theory metrics in functional connectivity analysis
Chiang, Sharon; Cassese, Alberto; Guindani, Michele; Vannucci, Marina; Yeh, Hsiang J.; Haneef, Zulfi; Stern, John M.
2016-01-01
Brain graphs provide a useful way to computationally model the network structure of the connectome, and this has led to increasing interest in the use of graph theory to quantitate and investigate the topological characteristics of the healthy brain and brain disorders on the network level. The majority of graph theory investigations of functional connectivity have relied on the assumption of temporal stationarity. However, recent evidence increasingly suggests that functional connectivity fluctuates over the length of the scan. In this study, we investigate the stationarity of brain network topology using a Bayesian hidden Markov model (HMM) approach that estimates the dynamic structure of graph theoretical measures of whole-brain functional connectivity. In addition to extracting the stationary distribution and transition probabilities of commonly employed graph theory measures, we propose two estimators of temporal stationarity: the S-index and N-index. These indexes can be used to quantify different aspects of the temporal stationarity of graph theory measures. We apply the method and proposed estimators to resting-state functional MRI data from healthy controls and patients with temporal lobe epilepsy. Our analysis shows that several graph theory measures, including small-world index, global integration measures, and betweenness centrality, may exhibit greater stationarity over time and therefore be more robust. Additionally, we demonstrate that accounting for subject-level differences in the level of temporal stationarity of network topology may increase discriminatory power in discriminating between disease states. Our results confirm and extend findings from other studies regarding the dynamic nature of functional connectivity, and suggest that using statistical models which explicitly account for the dynamic nature of functional connectivity in graph theory analyses may improve the sensitivity of investigations and consistency across investigations. PMID:26518632
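A minimal sketch of the windowed graph-metric computation that such an analysis starts from; the Bayesian hidden Markov layer of the paper is not reproduced, and the synthetic signals, window length, and correlation threshold are illustrative assumptions.

```python
import numpy as np
import networkx as nx

def windowed_graph_metrics(ts, win=60, step=10, thresh=0.4):
    """Sliding-window network metrics for a (time x region) signal array:
    correlate regions in each window, threshold to a binary adjacency
    matrix, and record graph-theory measures per window."""
    out = []
    for start in range(0, ts.shape[0] - win + 1, step):
        corr = np.corrcoef(ts[start:start + win].T)
        adj = (np.abs(corr) > thresh).astype(int)
        np.fill_diagonal(adj, 0)
        g = nx.from_numpy_array(adj)
        bc = nx.betweenness_centrality(g)
        out.append((nx.average_clustering(g), max(bc.values())))
    return out

rng = np.random.default_rng(0)
fake_bold = rng.standard_normal((300, 20))  # synthetic resting-state signals
for clust, max_bc in windowed_graph_metrics(fake_bold)[:5]:
    print(f"clustering={clust:.3f}  max betweenness={max_bc:.3f}")
```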
Unexpected Results are Usually Wrong, but Often Interesting
NASA Astrophysics Data System (ADS)
Huber, M.
2014-12-01
In climate modeling, an unexpected result is usually wrong, arising from some sort of mistake. Despite the fact that we all bemoan uncertainty in climate, the field is underlain by a robust, successful body of theory and any properly conducted modeling experiment is posed and conducted within that context. Consequently, if results from a complex climate model disagree with theory or from expectations from simpler models, much skepticism is in order. But, this exposes the fundamental tension of using complex, sophisticated models. If simple models and theory were perfect there would be no reason for complex models--the entire point of sophisticated models is to see if unexpected phenomena arise as emergent properties of the system. In this talk, I will step through some paleoclimate examples, drawn from my own work, of unexpected results that emerge from complex climate models arising from mistakes of two kinds. The first kind of mistake, is what I call a 'smart mistake'; it is an intentional incorporation of assumptions, boundary conditions, or physics that is in violation of theoretical or observational constraints. The second mistake, a 'dumb mistake', is just that, an unintentional violation. Analysis of such mistaken simulations provides some potentially novel and certainly interesting insights into what is possible and right in paleoclimate modeling by forcing the reexamination of well-held assumptions and theories.
Melonic Phase Transition in Group Field Theory
NASA Astrophysics Data System (ADS)
Baratin, Aristide; Carrozza, Sylvain; Oriti, Daniele; Ryan, James; Smerlak, Matteo
2014-08-01
Group field theories have recently been shown to admit a 1/N expansion dominated by so-called `melonic graphs', dual to triangulated spheres. In this note, we deepen the analysis of this melonic sector. We obtain a combinatorial formula for the melonic amplitudes in terms of a graph polynomial related to a higher-dimensional generalization of the Kirchhoff tree-matrix theorem. Simple bounds on these amplitudes show the existence of a phase transition driven by melonic interaction processes. We restrict our study to the Boulatov-Ooguri models, which describe topological BF theories and are the basis for the construction of 4-dimensional models of quantum gravity.
Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.
2013-01-01
Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σmin) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r2 = 0.56). Under bending, FEA values of maximum principal stress (σmax) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τtorsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
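For reference, the beam-theory side of such a comparison is a one-line formula. The sketch below evaluates sigma = M*c/I for an idealized hollow circular diaphysis; the dimensions and bending moment are illustrative, not taken from the study's specimens.

```python
import math

def bending_stress(M, c, I):
    """Classic beam theory: outer-fibre bending stress sigma = M * c / I."""
    return M * c / I

# Idealize a diaphysis as a hollow circular section (illustrative values, SI)
r_out, r_in = 0.012, 0.007                      # outer/inner radii, m
I = math.pi / 4.0 * (r_out ** 4 - r_in ** 4)    # second moment of area, m^4
sigma = bending_stress(M=50.0, c=r_out, I=I)    # bending moment 50 N*m
print(f"sigma_max = {sigma / 1e6:.1f} MPa")
```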
PDF-based heterogeneous multiscale filtration model.
Gong, Jian; Rutland, Christopher J
2015-04-21
Motivated by modeling of gasoline particulate filters (GPFs), a probability density function (PDF) based heterogeneous multiscale filtration (HMF) model is developed to calculate filtration efficiency of clean particulate filters. A new methodology based on statistical theory and classic filtration theory is developed in the HMF model. Based on the analysis of experimental porosimetry data, a pore size probability density function is introduced to represent heterogeneity and multiscale characteristics of the porous wall. The filtration efficiency of a filter can be calculated as the sum of the contributions of individual collectors. The resulting HMF model overcomes the limitations of classic mean filtration models which rely on tuning of the mean collector size. Sensitivity analysis shows that the HMF model recovers the classical mean model when the pore size variance is very small. The HMF model is validated by fundamental filtration experimental data from different scales of filter samples. The model shows a good agreement with experimental data at various operating conditions. The effects of the microstructure of filters on filtration efficiency as well as the most penetrating particle size are correctly predicted by the model.
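A minimal sketch of the PDF-weighting idea, assuming a placeholder single-collector efficiency and an illustrative lognormal-like pore-size density; the actual HMF model derives these terms from classic filtration theory and measured porosimetry data.

```python
import numpy as np

def overall_efficiency(d, pdf, eta_single):
    """PDF-weighted filtration efficiency: rather than tuning one mean
    collector size, weight each collector size's capture efficiency by
    the pore-size probability density and sum the contributions."""
    w = pdf / pdf.sum()                # discrete normalization of the PDF
    return float(np.sum(w * eta_single(d)))

# Placeholder single-collector efficiency decreasing with pore size; a real
# model combines diffusion, interception, etc. from classic filtration theory.
eta_toy = lambda d: np.minimum(1.0, 5.0 / d)

d = np.linspace(5.0, 40.0, 200)                              # micrometres
pdf = np.exp(-((np.log(d) - np.log(15.0)) ** 2) / 0.32) / d  # lognormal-like
print(f"clean-filter efficiency ~ {overall_efficiency(d, pdf, eta_toy):.3f}")
```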
2012-09-01
intelligence continues to evolve as attention to cognitive processes and mechanisms, a deeper understanding of related issues, and new theories ... hierarchical models that describe specific abilities arranged according to increasing specificity and developmental complexity [6-8]. Theories have also ... persistence) not tapped directly by existing measures of intellectual ability. Wechsler's theory of intelligence is central to the development of the mostly
Group decisions in biodiversity conservation: implications from game theory.
Frank, David M; Sarkar, Sahotra
2010-05-27
Decision analysis and game theory have proved useful tools in various biodiversity conservation planning and modeling contexts. This paper shows how game theory may be used to inform group decisions in biodiversity conservation scenarios by modeling conflicts between stakeholders to identify Pareto-inefficient Nash equilibria. These are cases in which each agent pursuing individual self-interest leads to a worse outcome for all, relative to other feasible outcomes. Three case studies from biodiversity conservation contexts showing this feature are modeled to demonstrate how game-theoretical representation can inform group decision-making. The mathematical theory of games is used to model three biodiversity conservation scenarios with Pareto-inefficient Nash equilibria: (i) a two-agent case involving wild dogs in South Africa; (ii) a three-agent raptor and grouse conservation scenario from the United Kingdom; and (iii) an n-agent fish and coral conservation scenario from the Philippines. In each case there is reason to believe that traditional mechanism-design solutions that appeal to material incentives may be inadequate, and the game-theoretical analysis recommends a resumption of further deliberation between agents and the initiation of trust- and confidence-building measures. Game theory can and should be used as a normative tool in biodiversity conservation contexts: identifying scenarios with Pareto-inefficient Nash equilibria enables constructive action in order to achieve (closer to) optimal conservation outcomes, whether by policy solutions based on mechanism design or otherwise. However, there is mounting evidence that formal mechanism-design solutions may backfire in certain cases. Such scenarios demand a return to group deliberation and the creation of reciprocal relationships of trust.
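A minimal sketch of the game-theoretic diagnosis described above: enumerate the pure-strategy Nash equilibria of a two-stakeholder game and flag those that are Pareto-dominated. The payoffs are hypothetical, not taken from the case studies.

```python
def pure_nash(payoffs):
    """Pure-strategy Nash equilibria of a two-player game given as
    payoffs[(row, col)] = (payoff to player 1, payoff to player 2)."""
    rows = {r for r, _ in payoffs}
    cols = {c for _, c in payoffs}
    return [(r, c) for (r, c) in payoffs
            if all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in rows)
            and all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in cols)]

def pareto_dominated(cell, payoffs):
    """True if some other outcome is at least as good for both players
    and strictly better for at least one."""
    a = payoffs[cell]
    return any(b[0] >= a[0] and b[1] >= a[1] and (b[0] > a[0] or b[1] > a[1])
               for b in payoffs.values())

# Hypothetical two-stakeholder conservation game: mutual defection (ignoring
# the plan) is the unique Nash equilibrium, yet cooperation is better for both.
game = {("coop", "coop"): (3, 3), ("coop", "defect"): (0, 4),
        ("defect", "coop"): (4, 0), ("defect", "defect"): (1, 1)}
for eq in pure_nash(game):
    print(eq, "Pareto-inefficient" if pareto_dominated(eq, game) else "efficient")
```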
Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A
2014-01-01
A recent criticism of social epidemiological studies, and multi-level studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach, which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and which assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed-method multilevel cross-sectional study design is described. The Emergent Phase uses interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual framework and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for theory construction. The study will contribute to defining the role that realism and mixed methods can play in explaining the social determinants and developmental origins of health and disease.
NASA Astrophysics Data System (ADS)
Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman
2017-12-01
In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered in modeling the deformable electrode of the NEMS. The governing differential equation of motion is derived using Hamilton's principle based on the modified couple stress theory (MCST), a non-classical theory that accounts for length-scale effects. The nonlinear partial differential equation of motion is discretized into nonlinear Duffing-type ODEs using the Galerkin method. Static and dynamic pull-in instabilities obtained with the classical theory and with MCST are compared. In the second stage of the analysis, a shooting technique is used to obtain the frequency-response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is determined by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under harmonic AC actuation, together with the size dependence, is investigated.
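As a rough illustration of the frequency-response idea, the sketch below sweeps a generic damped Duffing oscillator by brute-force time integration; the coefficients are illustrative, and direct integration substitutes here for the paper's shooting/Floquet machinery.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Duffing-type reduced model: x'' + c x' + k x + k3 x^3 = f cos(w t)
def steady_amplitude(w, c=0.1, k=1.0, k3=0.5, f=0.2):
    rhs = lambda t, y: [y[1],
                        f * np.cos(w * t) - c * y[1] - k * y[0] - k3 * y[0] ** 3]
    sol = solve_ivp(rhs, (0.0, 400.0), [0.0, 0.0], max_step=0.05)
    tail = sol.y[0][sol.t > 300.0]       # discard the transient
    return 0.5 * (tail.max() - tail.min())

# A hardening cubic stiffness bends the resonance peak toward higher frequency
for w in np.linspace(0.6, 1.6, 11):
    print(f"w={w:.2f}  amplitude={steady_amplitude(w):.3f}")
```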
Analysis of the UCSF Symptom Management Theory: Implications for Pediatric Oncology Nursing
Linder, Lauri
2015-01-01
Symptom management research is a priority for both children and adults with cancer. The UCSF Symptom Management Theory (SMT) is a middle range theory depicting symptom management as a multidimensional process. A theory analysis using the process described by Walker and Avant evaluated the SMT with attention to application in research involving children with cancer. Application of the SMT in studies involving children has been limited to descriptive studies testing only portions of the theory. Findings of these studies have provided empiric support for the relationships proposed within the SMT. Considerations for future research involving children include attention to measurement of symptoms and clarity regarding the location of the parents and family within the model. With additional testing and refinement, the SMT has the potential to guide nursing research and practice to improve symptoms for children with cancer. PMID:20639345
Cluster Analysis for Cognitive Diagnosis: Theory and Applications
ERIC Educational Resources Information Center
Chiu, Chia-Yi; Douglas, Jeffrey A.; Li, Xiaodong
2009-01-01
Latent class models for cognitive diagnosis often begin with specification of a matrix that indicates which attributes or skills are needed for each item. Then by imposing restrictions that take this into account, along with a theory governing how subjects interact with items, parametric formulations of item response functions are derived and…
Iberian Spanish "Macho": Vantages and Polysemy in Culturally Defined Meaning
ERIC Educational Resources Information Center
Grace, Caroline A.; Glaz, Adam
2010-01-01
This study explores some specific aspects of compatibility between cognitive models. Robert E. MacLaury's theory of vantages as arrangements of coordinates and Lakoff's concept of radial categories are mutually reinforcing to an analysis of semantic polysemy. Vantage Theory (VT) includes the notions of "zooming in" and "zooming out", allowing…
ERIC Educational Resources Information Center
Smith, Jennifer L.; Skinner, S. Rachel; Fenwick, Jennifer
2011-01-01
Grounded theory principles were systematically employed to reveal key differences in pregnancy risk and underlying disparities in contraceptive use in (a) never-pregnant (b) pregnant-terminated and (c) pregnant-continued teenagers. Analysis of 69 semistructured interviews revealed a bicausal model of pregnancy protection that accounted for…
Prosumers and Emirecs: Analysis of Two Confronted Theories
ERIC Educational Resources Information Center
Aparici, Roberto; García-Marín, David
2018-01-01
In the 1970s, the publications of Alvin Toffler and Jean Cloutier were essential for the emergence of two concepts, prosumer and emirec, whose meanings have been mistakenly equated by numerous scholars and researchers. At the same time, the mercantilist theories linked to prosumption have made invisible the models of communication designed by…
Experiential Learning Theory as One of the Foundations of Adult Learning Practice Worldwide
ERIC Educational Resources Information Center
Dernova, Maiya
2015-01-01
The paper presents an analysis of existing theories, assumptions, and models of adult experiential learning. Experiential learning is learning based on a learning cycle guided by the dual dialectics of action-reflection and experience-abstraction. It defines learning as a process of knowledge creation through the transformation of experience, so…
A Note on Powers in Finite Fields
ERIC Educational Resources Information Center
Aabrandt, Andreas; Hansen, Vagn Lundsgaard
2016-01-01
The study of solutions to polynomial equations over finite fields has a long history in mathematics and is an interesting area of contemporary research. In recent years, the subject has found important applications in the modelling of problems from applied mathematical fields such as signal analysis, system theory, coding theory and cryptology. In…
Development and Validation of the Sorokin Psychosocial Love Inventory for Divorced Individuals
ERIC Educational Resources Information Center
D'Ambrosio, Joseph G.; Faul, Anna C.
2013-01-01
Objective: This study describes the development and validation of the Sorokin Psychosocial Love Inventory (SPSLI) measuring love actions toward a former spouse. Method: Classical measurement theory and confirmatory factor analysis (CFA) were utilized with an a priori theory and factor model to validate the SPSLI. Results: A 15-item scale…
Comparison of Reliability Measures under Factor Analysis and Item Response Theory
ERIC Educational Resources Information Center
Cheng, Ying; Yuan, Ke-Hai; Liu, Cheng
2012-01-01
Reliability of test scores is one of the most pervasive psychometric concepts in measurement. Reliability coefficients based on a unifactor model for continuous indicators include maximal reliability rho and an unweighted sum score-based omega, among many others. With increasing popularity of item response theory, a parallel reliability measure pi…
Usability and Gratifications--Towards a Website Analysis Model.
ERIC Educational Resources Information Center
Bunz, Ulla K.
This paper discusses Web site usability issues. Specifically, it assumes that the usability of a Web site depends more on the perception of the user than on the objectively assessable usability criteria of the Web site. Two pilot studies, based on theoretical notions of uses and gratifications theory and similar theories, are presented. In the…
Class-Related Emotions in Secondary Physical Education: A Control-Value Theory Approach
ERIC Educational Resources Information Center
Simonton, Kelly L.; Garn, Alex C.; Solmon, Melinda Ann
2017-01-01
Purpose: Grounded in control-value theory, a model of students' achievement emotions in physical education (PE) was investigated. Methods: A path analysis tested hypotheses that students' (N = 529) perceptions of teacher responsiveness, assertiveness, and clarity predict control and value beliefs which, in turn, predict enjoyment and boredom.…
ERIC Educational Resources Information Center
Wetzel, Eunike; Hell, Benedikt
2014-01-01
Vocational interest inventories are commonly analyzed using a unidimensional approach, that is, each subscale is analyzed separately. However, the theories on which these inventories are based often postulate specific relationships between the interest traits. This article presents a multidimensional approach to the analysis of vocational interest…
A Contingency View of Problem Solving in Schools: A Case Analysis.
ERIC Educational Resources Information Center
Hanson, E. Mark; Brown, Michael E.
Patterns of problem-solving activity in one middle-class urban high school are examined and a problem solving model rooted in a conceptual framework of contingency theory is presented. Contingency theory stresses that as political, economic, and social conditions in an organization's environment become problematic, the internal structures of the…
Dynamics of essential collective motions in proteins: Theory
NASA Astrophysics Data System (ADS)
Stepanova, Maria
2007-11-01
A general theoretical background is introduced for characterization of conformational motions in protein molecules, and for building reduced coarse-grained models of proteins, based on the statistical analysis of their phase trajectories. Using the projection operator technique, a system of coupled generalized Langevin equations is derived for essential collective coordinates, which are generated by principal component analysis of molecular dynamics trajectories. The number of essential degrees of freedom is not limited in the theory. An explicit analytic relation is established between the generalized Langevin equation for essential collective coordinates and that for the all-atom phase trajectory projected onto the subspace of essential collective degrees of freedom. The theory introduced is applied to identify correlated dynamic domains in a macromolecule and to construct coarse-grained models representing the conformational motions in a protein through a few interacting domains embedded in a dissipative medium. A rigorous theoretical background is provided for identification of dynamic correlated domains in a macromolecule. Examples of domain identification in protein G are given and employed to interpret NMR experiments. Challenges and potential outcomes of the theory are discussed.
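To make the projection step concrete, here is a minimal sketch (not the authors' code) of extracting essential collective coordinates by principal component analysis of a trajectory; the synthetic array below merely stands in for aligned atomic coordinates from a molecular dynamics run.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in for an aligned MD trajectory: 1000 frames x 30 coordinates
# (10 atoms in 3D), with decreasing variance along synthetic directions.
frames = rng.normal(size=(1000, 30)) * np.linspace(3.0, 0.1, 30)

centered = frames - frames.mean(axis=0)
cov = np.cov(centered, rowvar=False)        # positional covariance matrix
evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

k = 3                                       # number of essential modes retained
essential = centered @ evecs[:, :k]         # essential collective coordinates
print(f"variance captured by {k} modes: {evals[:k].sum() / evals.sum():.1%}")
```

The retained columns of `essential` play the role of the essential collective coordinates for which the coupled generalized Langevin equations are written.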
Theories of reasoned action and planned behavior as models of condom use: a meta-analysis.
Albarracín, D; Johnson, B T; Fishbein, M; Muellerleile, P A
2001-01-01
To examine how well the theories of reasoned action and planned behavior predict condom use, the authors synthesized 96 data sets (N = 22,594) containing associations between the models' key variables. Consistent with the theory of reasoned action's predictions, (a) condom use was related to intentions (weighted mean r = .45), (b) intentions were based on attitudes (r = .58) and subjective norms (r = .39), and (c) attitudes were associated with behavioral beliefs (r = .56) and norms were associated with normative beliefs (r = .46). Consistent with the theory of planned behavior's predictions, perceived behavioral control was related to condom use intentions (r = .45) and condom use (r = .25), but in contrast to the theory, it did not contribute significantly to condom use. The strength of these associations, however, was influenced by the consideration of past behavior. Implications of these results for HIV prevention efforts are discussed.
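As a hedged illustration of how weighted mean correlations of this kind can be synthesized, the sketch below pools per-study correlations with a sample-size-weighted Fisher z transform; the study values are invented, and the authors' exact pooling procedure may differ.

```python
import numpy as np

r = np.array([0.52, 0.41, 0.47, 0.38])   # hypothetical per-study correlations
n = np.array([150, 220, 90, 310])        # hypothetical sample sizes

z = np.arctanh(r)                        # Fisher r-to-z transform
w = n - 3                                # inverse-variance weights for Fisher z
z_bar = np.average(z, weights=w)
r_bar = np.tanh(z_bar)                   # back-transform to a weighted mean r
print(f"weighted mean r = {r_bar:.2f}")
```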
The design of patient decision support interventions: addressing the theory-practice gap.
Elwyn, Glyn; Stiel, Mareike; Durand, Marie-Anne; Boivin, Jacky
2011-08-01
Although an increasing number of decision support interventions for patients (including decision aids) are produced, few make explicit use of theory. We argue the importance of using theory to guide design. The aim of this work was to address this theory-practice gap and to examine how a range of selected decision-making theories could inform the design and evaluation of decision support interventions. We reviewed the decision-making literature and selected relevant theories. We assessed their key principles, theoretical pathways and predictions in order to determine how they could inform the design of two core components of decision support interventions, namely, information and deliberation components, and to specify theory-based outcome measures. Eight theories were selected: (1) the expected utility theory; (2) the conflict model of decision making; (3) prospect theory; (4) fuzzy-trace theory; (5) the differentiation and consolidation theory; (6) the ecological rationality theory; (7) the rational-emotional model of decision avoidance; and finally, (8) the Attend, React, Explain, Adapt model of affective forecasting. Some theories have strong relevance to information design (e.g. prospect theory); some are more relevant to deliberation processes (conflict theory, differentiation theory and ecological rationality). None of the theories in isolation was sufficient to inform the design of all the necessary components of decision support interventions. It was also clear that most work in theory-building has focused on explaining or describing how humans think rather than on how tools could be designed to help humans make good decisions. It is not surprising therefore that a large theory-practice gap exists as we consider decision support for patients. There was no relevant theory that integrated all the necessary contributions to the task of making good decisions in collaborative interactions. Initiatives such as the International Patient Decision Aids Standards Collaboration influence standards for the design of decision support interventions. However, this analysis points to the need to undertake more work in providing theoretical foundations for these interventions. © 2010 Blackwell Publishing Ltd.
ERIC Educational Resources Information Center
Chou, Yeh-Tai; Wang, Wen-Chung
2010-01-01
Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that an eigenvalue greater than 1.5 for the first eigenvalue signifies a violation of unidimensionality when there…
What are the mechanics of quantum cognition?
Navarro, Daniel Joseph; Fuss, Ian
2013-06-01
Pothos & Busemeyer (P&B) argue that quantum probability (QP) provides a descriptive model of behavior and can also provide a rational analysis of a task. We discuss QP models using Marr's levels of analysis, arguing that they make most sense as algorithmic level theories. We also highlight the importance of having clear interpretations for basic mechanisms such as interference.
On holographic Rényi entropy in some modified theories of gravity
NASA Astrophysics Data System (ADS)
Dey, Anshuman; Roy, Pratim; Sarkar, Tapobrata
2018-04-01
We perform a detailed analysis of holographic entanglement Rényi entropy in some modified theories of gravity with four dimensional conformal field theory duals. First, we construct perturbative black hole solutions in a recently proposed model of Einsteinian cubic gravity in five dimensions, and compute the Rényi entropy as well as the scaling dimension of the twist operators in the dual field theory. Consistency of these results is verified from the AdS/CFT correspondence, via a corresponding computation of the Weyl anomaly on the gravity side. Similar analyses are then carried out for three other examples of modified gravity in five dimensions that include a chemical potential, namely Born-Infeld gravity, charged quasi-topological gravity and a class of Weyl corrected gravity theories with a gauge field, with the last example being treated perturbatively. Some interesting bounds on the dual conformal field theory parameters in quasi-topological gravity are pointed out. We also provide arguments on the validity of our perturbative analysis, whenever applicable.
Prediction of tautomer ratios by embedded-cluster integral equation theory
NASA Astrophysics Data System (ADS)
Kast, Stefan M.; Heil, Jochen; Güssregen, Stefan; Schmidt, K. Friedemann
2010-04-01
The "embedded cluster reference interaction site model" (EC-RISM) approach combines statistical-mechanical integral equation theory and quantum-chemical calculations for predicting thermodynamic data for chemical reactions in solution. The electronic structure of the solute is determined self-consistently with the structure of the solvent that is described by 3D RISM integral equation theory. The continuous solvent-site distribution is mapped onto a set of discrete background charges ("embedded cluster") that represent an additional contribution to the molecular Hamiltonian. The EC-RISM analysis of the SAMPL2 challenge set of tautomers proceeds in three stages. Firstly, the group of compounds for which quantitative experimental free energy data was provided was taken to determine appropriate levels of quantum-chemical theory for geometry optimization and free energy prediction. Secondly, the resulting workflow was applied to the full set, allowing for chemical interpretations of the results. Thirdly, disclosure of experimental data for parts of the compounds facilitated a detailed analysis of methodical issues and suggestions for future improvements of the model. Without specifically adjusting parameters, the EC-RISM model yields the smallest value of the root mean square error for the first set (0.6 kcal mol-1) as well as for the full set of quantitative reaction data (2.0 kcal mol-1) among the SAMPL2 participants.
Development of a unified constitutive model for an isotropic nickel base superalloy Rene 80
NASA Technical Reports Server (NTRS)
Ramaswamy, V. G.; Vanstone, R. H.; Laflen, J. H.; Stouffer, D. C.
1988-01-01
Accurate analysis of stress-strain behavior is of critical importance in the evaluation of life capabilities of hot section turbine engine components such as turbine blades and vanes. The constitutive equations used in the finite element analysis of such components must be capable of modeling a variety of complex behavior exhibited at high temperatures by cast superalloys. The classical separation of plasticity and creep employed in most of the finite element codes in use today is known to be deficient in modeling elevated temperature time dependent phenomena. Rate dependent, unified constitutive theories can overcome many of these difficulties. A new unified constitutive theory was developed to model the high temperature, time dependent behavior of Rene' 80 which is a cast turbine blade and vane nickel base superalloy. Considerations in model development included the cyclic softening behavior of Rene' 80, rate independence at lower temperatures and the development of a new model for static recovery.
Molenaar, Dylan; Tuerlinckx, Francis; van der Maas, Han L J
2015-01-01
A generalized linear modeling framework for the analysis of responses and response times is outlined. In this framework, referred to as bivariate generalized linear item response theory (B-GLIRT), separate generalized linear measurement models are specified for the responses and the response times that are subsequently linked by cross-relations. The cross-relations can take various forms. Here, we focus on cross-relations with a linear or interaction term for ability tests, and cross-relations with a curvilinear term for personality tests. In addition, we discuss how popular existing models from the psychometric literature are special cases in the B-GLIRT framework depending on restrictions in the cross-relation. This allows us to compare existing models conceptually and empirically. We discuss various extensions of the traditional models motivated by practical problems. We also illustrate the applicability of our approach using various real data examples, including data on personality and cognitive ability.
NASA Astrophysics Data System (ADS)
Fitkov-Norris, Elena; Yeghiazarian, Ara
2016-11-01
The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of systems dynamics, a qualitatively based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the systems dynamics modelling methodology, the advantages and the limiting factors of systems dynamics for potential applications in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the systems dynamics methodology, and measurement theory is proposed as a ready solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.
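A minimal sketch of the stock-and-flow style of system dynamics simulation discussed here, using Euler integration; the stock, flows, and parameter values are invented for illustration.

```python
# Toy system dynamics model: one stock with a constant inflow and a
# proportional outflow, integrated with the Euler method.
dt, t_end = 0.25, 40.0
steps = int(t_end / dt)

stock = 100.0                      # hypothetical stock, e.g. engaged participants
inflow_rate, outflow_frac = 12.0, 0.15

history = []
for _ in range(steps):
    inflow = inflow_rate               # constant recruitment flow
    outflow = outflow_frac * stock     # proportional attrition flow
    stock += dt * (inflow - outflow)   # Euler update of the stock
    history.append(stock)

print(f"final stock: {history[-1]:.1f} "
      f"(analytic equilibrium: {inflow_rate / outflow_frac:.1f})")
```

Calibrating parameters such as `outflow_frac` against measured latent constructs is exactly the step for which the article proposes measurement theory.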
1996-06-01
for Software Synthesis." KBSE . IEEE, 1993. 51. Kang, Kyo C., et al. Feature-Oriented Domain Analysis ( FODA ) Feasibility Study. Technical Report...and usefulness in domain analysis and modeling. Rumbaugh uses three distinct views to describe a domain: (1) the object model describes structural...Gibbons describe a methodology where Structured Analysis is used to build a hierarchical system structure chart. This structure chart is then translated
Transverse Vibration of Tapered Single-Walled Carbon Nanotubes Embedded in Viscoelastic Medium
NASA Astrophysics Data System (ADS)
Lei, Y. J.; Zhang, D. P.; Shen, Z. B.
2017-12-01
Based on nonlocal theory, Euler-Bernoulli beam theory and the Kelvin viscoelastic foundation model, free transverse vibration is studied for a tapered viscoelastic single-walled carbon nanotube (visco-SWCNT) embedded in a viscoelastic medium. First, the governing equations for vibration analysis are established. Then, the natural frequencies are derived in closed form for SWCNTs with arbitrary boundary conditions by applying the transfer function method and a perturbation method. Numerical results are also presented to discuss the effects of the nonlocal parameter, relaxation time and taper parameter of the SWCNTs, and the material property parameters of the medium. This study demonstrates that the proposed model is suitable for vibration analysis of the tapered SWCNT-viscoelastic medium coupling system.
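For context, nonlocal beam models of this kind typically start from Eringen's differential constitutive relation; the form below is the standard textbook statement (the paper's full formulation additionally includes the Kelvin viscoelasticity of the tube and the surrounding medium):

```latex
% Standard Eringen nonlocal constitutive relation (textbook form):
\begin{equation}
  \left(1 - (e_0 a)^2 \,\partial_x^2\right)\sigma_{xx} = E\,\varepsilon_{xx}
\end{equation}
```

Here e0a is the small-length-scale (nonlocal) parameter whose influence on the natural frequencies the paper quantifies.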
How to do a grounded theory study: a worked example of a study of dental practices
Sbaraini, Alexandra; Carter, Stacy M; Evans, R Wendell; Blinkhorn, Anthony
2011-09-09
Background Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods We documented a worked example of using grounded theory methodology in practice. Results We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community. PMID:21902844
A queueing theory based model for business continuity in hospitals.
Miniati, R; Cecconi, G; Dori, F; Frosini, F; Iadanza, E; Biffi Gentili, G; Niccolini, F; Gusinu, R
2013-01-01
Clinical activities can be seen as the result of a precise, defined succession of events, where every phase is characterized by a waiting time that includes working duration and possible delay. Technology is part of this process. For proper business continuity management, planning the minimum number of devices according to the working load alone is not enough. A risk analysis of the whole process should be carried out in order to define which interventions and extra purchases have to be made. Markov models and reliability engineering approaches can be used to evaluate the possible interventions and to protect the whole system from technology failures. This paper reports a case study on the application of the proposed integrated model, combining a risk analysis approach with a queueing theory model, for defining the number of devices essential to guarantee medical activity and to comply with business continuity management requirements in hospitals.
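To illustrate the kind of calculation a queueing model contributes beyond working load alone, here is a hedged M/M/c (Erlang C) sketch that sizes a device pool against a waiting-probability target; the rates and the target are invented, not taken from the case study.

```python
from math import factorial

def erlang_c_wait_prob(lam, mu, c):
    """Probability that an arriving request must wait (M/M/c queue)."""
    a = lam / mu                      # offered load in Erlangs
    rho = a / c
    if rho >= 1:
        return 1.0                    # unstable: everyone eventually waits
    tail = a**c / (factorial(c) * (1 - rho))
    p0_inv = sum(a**k / factorial(k) for k in range(c)) + tail
    return tail / p0_inv

lam, mu = 4.0, 1.5                    # requests/hour and services/hour per device
c = 1
while erlang_c_wait_prob(lam, mu, c) > 0.10:   # target: <10% chance of waiting
    c += 1
print(f"devices needed: {c}")
```

Note that the pure working load (lam/mu, about 2.7 devices here) understates the requirement: meeting the waiting target demands more units, which is the point the abstract makes about planning by workload alone.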
Modelling and finite-time stability analysis of psoriasis pathogenesis
NASA Astrophysics Data System (ADS)
Oza, Harshal B.; Pandey, Rakesh; Roper, Daniel; Al-Nuaimi, Yusur; Spurgeon, Sarah K.; Goodfellow, Marc
2017-08-01
A new systems model of psoriasis is presented and analysed from the perspective of control theory. Cytokines are treated as actuators to the plant model that govern the cell population under the reasonable assumption that cytokine dynamics are faster than the cell population dynamics. The analysis of various equilibria is undertaken based on singular perturbation theory. Finite-time stability and stabilisation have been studied in various engineering applications where the principal paradigm uses non-Lipschitz functions of the states. A comprehensive study of the finite-time stability properties of the proposed psoriasis dynamics is carried out. It is demonstrated that the dynamics are finite-time convergent to certain equilibrium points rather than asymptotically or exponentially convergent. This feature of finite-time convergence motivates the development of a modified version of the Michaelis-Menten function, frequently used in biology. This framework is used to model cytokines as fast finite-time actuators.
Nonlinear Poisson equation for heterogeneous media.
Hu, Langhua; Wei, Guo-Wei
2012-08-22
The Poisson equation is a widely accepted model for electrostatic analysis. However, it is derived for electric polarizations in a linear, isotropic, and homogeneous dielectric medium. This article introduces a nonlinear Poisson equation to account for hyperpolarization effects due to intensive charges and possibly nonlinear, anisotropic, and heterogeneous media. A variational principle is utilized to derive the nonlinear Poisson model from an electrostatic energy functional. To apply the proposed nonlinear Poisson equation to solvation analysis, we also construct a nonpolar solvation energy functional based on the nonlinear Poisson equation by using geometric measure theory. At a fixed temperature, the proposed nonlinear Poisson theory is extensively validated by electrostatic analysis of the Kirkwood model and a set of 20 proteins, and by solvation analysis of a set of 17 small molecules for which experimental measurements are also available for comparison. Moreover, the nonlinear Poisson equation is further applied to the solvation analysis of 21 compounds at different temperatures. Numerical results are compared to theoretical predictions, experimental measurements, and those obtained from other theoretical methods in the literature. The good agreement between our results and experimental data as well as theoretical results suggests that the proposed nonlinear Poisson model is a potentially useful model for electrostatic analysis involving hyperpolarization effects. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
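As a toy illustration of solving a nonlinear Poisson-type equation (not the authors' 3D solver or their specific nonlinearity), the sketch below applies Newton iteration to a 1D finite-difference discretization of -phi'' + sinh(phi) = 0, a Poisson-Boltzmann-like problem, with Dirichlet boundary values.

```python
import numpy as np

n, L = 101, 1.0
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
phi = np.linspace(1.0, 0.0, n)          # initial guess; phi(0)=1, phi(L)=0

for _ in range(20):
    # Residual of -phi'' + sinh(phi) = 0 at interior nodes.
    F = np.zeros(n)
    F[1:-1] = (-(phi[2:] - 2*phi[1:-1] + phi[:-2]) / h**2
               + np.sinh(phi[1:-1]))
    # Jacobian: tridiagonal from the Laplacian plus diagonal cosh term.
    J = np.zeros((n, n))
    J[0, 0] = J[-1, -1] = 1.0           # pin the boundary values
    for i in range(1, n - 1):
        J[i, i-1] = J[i, i+1] = -1.0 / h**2
        J[i, i] = 2.0 / h**2 + np.cosh(phi[i])
    dphi = np.linalg.solve(J, -F)
    phi += dphi
    if np.max(np.abs(dphi)) < 1e-10:
        break

print(f"midpoint potential: {phi[n // 2]:.4f}")
```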
Holographic renormalization group and cosmology in theories with quasilocalized gravity
NASA Astrophysics Data System (ADS)
Csáki, Csaba; Erlich, Joshua; Hollowood, Timothy J.; Terning, John
2001-03-01
We study the long distance behavior of brane theories with quasilocalized gravity. The five-dimensional (5D) effective theory at large scales follows from a holographic renormalization group flow. As intuitively expected, the graviton is effectively four dimensional at intermediate scales and becomes five dimensional at large scales. However, in the holographic effective theory the essentially 4D radion dominates at long distances and gives rise to scalar antigravity. The holographic description shows that at large distances the Gregory-Rubakov-Sibiryakov (GRS) model is equivalent to the model recently proposed by Dvali, Gabadadze, and Porrati (DGP), where a tensionless brane is embedded into 5D Minkowski space, with an additional induced 4D Einstein-Hilbert term on the brane. In the holographic description the radion of the GRS model is automatically localized on the tensionless brane, and provides the ghostlike field necessary to cancel the extra graviton polarization of the DGP model. Thus, there is a holographic duality between these theories. This analysis provides physical insight into how the GRS model works at intermediate scales; in particular it sheds light on the size of the width of the graviton resonance, and also demonstrates how the holographic renormalization group can be used as a practical tool for calculations.
Cohn, Amy M.; Hagman, Brett T.; Graff, Fiona S.; Noel, Nora E.
2011-01-01
Objective: The present study examined the latent continuum of alcohol-related negative consequences among first-year college women using methods from item response theory and classical test theory. Method: Participants (N = 315) were college women in their freshman year who reported consuming any alcohol in the past 90 days and who completed assessments of alcohol consumption and alcohol-related negative consequences using the Rutgers Alcohol Problem Index. Results: Item response theory analyses showed poor model fit for five items identified in the Rutgers Alcohol Problem Index. Two-parameter item response theory logistic models were applied to the remaining 18 items to examine estimates of item difficulty (i.e., severity) and discrimination parameters. The item difficulty parameters ranged from 0.591 to 2.031, and the discrimination parameters ranged from 0.321 to 2.371. Classical test theory analyses indicated that the omission of the five misfit items did not significantly alter the psychometric properties of the construct. Conclusions: Findings suggest that those consequences that had greater severity and discrimination parameters may be used as screening items to identify female problem drinkers at risk for an alcohol use disorder. PMID:22051212
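The two-parameter logistic (2PL) model used in these analyses has a simple closed form; the sketch below evaluates it with a difficulty and a discrimination at the upper ends of the reported ranges (0.591 to 2.031 and 0.321 to 2.371, respectively).

```python
import numpy as np

def p_endorse(theta, a, b):
    """2PL item response function: P(endorse | latent severity theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-3, 3, 7)     # grid of latent problem-severity values
a, b = 2.371, 2.031               # a severe, highly discriminating consequence
print(np.round(p_endorse(theta, a, b), 3))
```

Items like this one, with both parameters high, are endorsed almost exclusively by respondents far up the severity continuum, which is why the authors suggest them as screening items.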
From atoms to steps: The microscopic origins of crystal evolution
NASA Astrophysics Data System (ADS)
Patrone, Paul N.; Einstein, T. L.; Margetis, Dionisios
2014-07-01
The Burton-Cabrera-Frank (BCF) theory of crystal growth has been successful in describing a wide range of phenomena in surface physics. Typical crystal surfaces are slightly misoriented with respect to a facet plane; thus, the BCF theory views such systems as composed of staircase-like structures of steps separating terraces. Adsorbed atoms (adatoms), which are represented by a continuous density, diffuse on terraces, and steps move by absorbing or emitting these adatoms. Here we shed light on the microscopic origins of the BCF theory by deriving a simple, one-dimensional (1D) version of the theory from an atomistic, kinetic restricted solid-on-solid (KRSOS) model without external material deposition. We define the time-dependent adatom density and step position as appropriate ensemble averages in the KRSOS model, thereby exposing the non-equilibrium statistical mechanics origins of the BCF theory. Our analysis reveals that the BCF theory is valid in a low adatom-density regime, much in the same way that an ideal gas approximation applies to dilute gases. We find conditions under which the surface remains in a low-density regime and discuss the microscopic origin of corrections to the BCF model.
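For reference, a common textbook statement of the 1D BCF step-flow picture that the paper derives as ensemble averages over the KRSOS dynamics reads (notation illustrative):

```latex
\begin{align}
  \partial_t c &= D\,\partial_x^2 c
    && \text{adatom diffusion on a terrace} \\
  c &= c_{\mathrm{eq}} \quad \text{at each step}
    && \text{local-equilibrium boundary condition} \\
  v_{\mathrm{step}} &= \Omega\, D \left( \partial_x c\,\big|_{+}
    - \partial_x c\,\big|_{-} \right)
    && \text{mass conservation at a moving step}
\end{align}
```

with Omega the atomic area; the low adatom-density condition identified in the paper is what licenses treating c as a continuous field obeying these equations.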
A concept analysis of forensic risk.
Kettles, A M
2004-08-01
Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.
Modeling Adversaries in Counterterrorism Decisions Using Prospect Theory.
Merrick, Jason R W; Leclerc, Philip
2016-04-01
Counterterrorism decisions have been an intense area of research in recent years. Both decision analysis and game theory have been used to model such decisions, and more recently approaches have been developed that combine the techniques of the two disciplines. However, each of these approaches assumes that the attacker is maximizing its utility. Experimental research shows that human beings do not make decisions by maximizing expected utility without aid, but instead deviate in specific ways such as loss aversion or likelihood insensitivity. In this article, we modify existing methods for counterterrorism decisions. We keep expected utility as the defender's paradigm for seeking the rational decision, but we use prospect theory to solve for the attacker's decision, descriptively modeling the attacker's loss aversion and likelihood insensitivity. We study the effects of this approach in a critical decision, whether to screen containers entering the United States for radioactive materials. We find that the defender's optimal decision is sensitive to the attacker's levels of loss aversion and likelihood insensitivity, meaning that understanding such descriptive decision effects is important in making such decisions. © 2014 Society for Risk Analysis.
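A hedged sketch of the standard prospect theory ingredients applied to the attacker: a loss-averse value function and an inverse-S probability weighting function. The parameter values are the common Tversky-Kahneman literature defaults, not the article's calibration, and the gamble is invented.

```python
import numpy as np

def value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    return np.where(x >= 0, np.abs(x)**alpha, -lam * np.abs(x)**alpha)

def weight(p, gamma=0.61):
    """Probability weighting: overweights small p, underweights large p."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Prospect value of a gamble: 30% chance of gaining 100, 70% of losing 40.
outcomes = np.array([100.0, -40.0])
probs = np.array([0.3, 0.7])
pv = np.sum(weight(probs) * value(outcomes))
print(f"prospect value: {pv:.2f}")   # negative: loss aversion dominates
```

Sweeping `lam` and `gamma` over plausible ranges reproduces, in miniature, the sensitivity analysis the article performs on the defender's screening decision.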
Introducing local property tax for fiscal decentralization and local authority autonomy
NASA Astrophysics Data System (ADS)
Dimopoulos, Thomas; Labropoulos, Tassos; Hadjimitsis, Diafantos G.
2015-06-01
Charles Tiebout (1956), in his work "A Pure Theory of Local Expenditures", provides a vision of the workings of the local public sector, acknowledging many similarities to the features of a competitive market, yet omitting any reference to local taxation. Contrary to other researchers' claims that the Tiebout model and the theory of fiscal decentralization are by no means synonymous, this paper aims to expand Tiebout's theory by adding the local property tax to the context, introducing a fair, ad valorem property taxation system based on the automated assessment of the value of real estate properties within the boundaries of local authorities. Computer Assisted Mass Appraisal (CAMA) methodology, integrated with Remote Sensing technology and GIS analysis, is applied to local authorities' property registries and cadastral data, building a spatial relational database whose data are statistically processed through Multiple Regression Analysis modeling. The proposed scheme achieves economies of scale through CAMA procedures, while also making local authorities self-sufficient through a decentralized, fair, locally calibrated property taxation model that provides rational income administration.
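A minimal sketch of the mass-appraisal core: a hedonic multiple regression of property value on attributes, fitted by ordinary least squares. The attributes, coefficients, and data are all invented; a real CAMA model would be calibrated on the cadastral records described above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
area = rng.uniform(50, 250, n)       # floor area, m^2
age = rng.uniform(0, 60, n)          # building age, years
dist = rng.uniform(0.1, 10, n)       # distance to centre, km
value = (1500 * area - 800 * age - 4000 * dist
         + 50_000 + rng.normal(0, 10_000, n))   # synthetic sale prices

X = np.column_stack([np.ones(n), area, age, dist])
beta, *_ = np.linalg.lstsq(X, value, rcond=None)
print("intercept, per-m2, per-year, per-km:", np.round(beta, 1))

# Assessed value of a hypothetical 120 m^2, 20-year-old home 3 km out:
print("assessment:", round(float(np.array([1, 120, 20, 3]) @ beta)))
```

The fitted coefficients give the locally calibrated valuation schedule; applying them uniformly across the registry is what makes the resulting ad valorem tax base consistent within a local authority.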
Why involve families in acute mental healthcare? A collaborative conceptual review.
Dirik, Aysegul; Sandhu, Sima; Giacco, Domenico; Barrett, Katherine; Bennison, Gerry; Collinson, Sue; Priebe, Stefan
2017-09-27
Family involvement is strongly recommended in clinical guidelines but suffers from poor implementation. To explore this topic at a conceptual level, a multidisciplinary review team including academics, clinicians and individuals with lived experience undertook a review to explore the theoretical background of family involvement models in acute mental health treatment and how this relates to their delivery. A conceptual review was undertaken, including a systematic search and narrative synthesis. Included family models were mapped onto the most commonly referenced underlying theories: the diathesis-stress model, systems theories and postmodern theories of mental health. Common components of the models were summarised and compared. Lastly, a thematic analysis was undertaken to explore the role of patients and families in the delivery of the approaches. General adult acute mental health treatment. Six distinct family involvement models were identified: Calgary Family Assessment and Intervention Models, ERIC (Equipe Rapide d'Intervention de Crise), Family Psychoeducation Models, Family Systems Approach, Open Dialogue and the Somerset Model. Findings indicated that despite wide variation in the theoretical models underlying family involvement models, there were many commonalities in their components, such as a focus on communication, language use and joint decision-making. Thematic analysis of the role of patients and families identified several issues for implementation. This included potential harms that could emerge during delivery of the models, such as imposing linear 'patient-carer' relationships and the risk of perceived coercion. We conclude that future staff training may benefit from discussing the chosen family involvement model within the context of other theories of mental health. This may help to clarify the underlying purpose of family involvement and address the diverse needs and world views of patients, families and professionals in acute settings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
The Dissipation Rate Transport Equation and Subgrid-Scale Models in Rotating Turbulence
NASA Technical Reports Server (NTRS)
Rubinstein, Robert; Ye, Zhou
1997-01-01
The dissipation rate transport equation remains the most uncertain part of turbulence modeling. The difficulties are increased when external agencies like rotation prevent straightforward dimensional analysis from determining the correct form of the modelled equation. In this work, the dissipation rate transport equation and subgrid scale models for rotating turbulence are derived from an analytical statistical theory of rotating turbulence. In the strong rotation limit, the theory predicts a turbulent steady state in which the inertial range energy spectrum scales as k^(-2) and the turbulent time scale is the inverse rotation rate. This scaling has been derived previously by heuristic arguments.
2005-05-01
Verification of the BondJo bonded-joint analysis code: adhesive stress comparisons between BondJo, Ansys solid-model finite element analysis, and Delale and Erdogan plate theory for homogeneous isotropic and orthotropic bonded joints, covering six examples from the Delale and Erdogan publication and a tank wall case.
ERIC Educational Resources Information Center
Awuah, Lawrence J.
2012-01-01
Understanding citizens' adoption of electronic-government (e-government) is an important topic, as the use of e-government has become an integral part of governance. Success of such initiatives depends largely on the efficient use of e-government services. The unified theory of acceptance and use of technology (UTAUT) model has provided a…
Research on Capturing of Customer Requirements Based on Innovation Theory
NASA Astrophysics Data System (ADS)
junwu, Ding; dongtao, Yang; zhenqiang, Bao
To capture customer requirements information exactly and effectively, a new method for modeling the capture of customer requirements is proposed. Based on an analysis of the function requirement models of previous products and the application of the technology system evolution laws of the Theory of Inventive Problem Solving (TRIZ), customer requirements can be evolved from existing product designs by modifying the functional requirement units and confirming the direction of the evolution design. Finally, a case study is provided to illustrate the feasibility of the proposed approach.
1998-04-28
2.1 Economic Replacement Theory: decisions about heavy equipment should be made based on sound economic principles, not emotions. … (Economic Life) will be less than L*; the converse is also true. 2.1.3 The Repair Limit Theory: a different way of looking at the economic replacement decision. … Summary: three different economic models are reviewed, each with distinct output; one seeks to minimize costs, one seeks to…
Modelling Bathymetric Control of Near Coastal Wave Climate: Report 3
1992-02-01
…complexity would occur if we were to make the full set of restrictions appropriate to the parabolic approximation of the KP equation (Kadomtsev, B.B. and Petviashvili, V.I., 1970, "On the stability of solitary waves in weakly dispersing media", Sov. Phys. Dokl., 15, 539-541). … bar theory. Theory for Small Amplitude Bars: the theory which provides the framework for analysis here is given by an extended mild-slope equation
The current status of REH theory. [Random Evolutionary Hits in biological molecular evolution
NASA Technical Reports Server (NTRS)
Holmquist, R.; Jukes, T. H.
1981-01-01
A response is made to Fitch's (1980) evaluation of REH (random evolutionary hits) theory for the evolutionary divergence of proteins and nucleic acids. Correct calculations for the beta hemoglobin mRNAs of the human, mouse and rabbit in the absence and presence of selective constraints are summarized, and it is shown that Fitch's alternative evolutionary analysis underestimates the total fixed mutations. It is further shown that the model used by Fitch to test the completeness of the count of total base substitutions is in fact a variant of REH theory. Considerations of the variance inherent in evolutionary estimation are also presented, showing that the REH model produces no more variance than other evolutionary models. In the reply, it is argued that, despite the objections raised, REH theory applied to proteins gives inaccurate estimates of total gene substitutions. It is further contended that REH theory developed for nucleic acid sequences suffers from problems relating to the frequency of nucleotide substitutions, the identity of the codons accepting silent and amino acid-changing substitutions, and estimate uncertainties.
Application of grey system theory in telecare.
Huang, Jui-Chen
2011-05-01
An advantage of grey models over conventional statistical models is that they require only a limited amount of data to estimate the behaviour of unknown systems. Grey system theory can be used for assessing effective factors, and in settings where large samples are unavailable or where it is uncertain whether the data are representative. Therefore, the purpose of this study was to adopt grey system theory to discuss older adult users' opinions on telecare and its effect on their quality of life. This study surveyed older adult users in Taiwan. User perceptions of the telecare services were collected via face-to-face interviews. Grey system theory was used to examine the model. The results showed that overall living quality has the greatest effect on the perceived effects of telecare on quality of life, followed by the acquisition of information, accessibility of medical care services, and safety. This finding may serve as a reference for future studies, and it also shows that grey system theory is a feasible analysis method. Copyright © 2011 Elsevier Ltd. All rights reserved.
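As a hedged illustration of why grey models need so little data, here is the canonical GM(1,1) forecasting model (the abstract does not state which grey model was used; this is the standard one) fitted to an invented five-point series.

```python
import numpy as np

x0 = np.array([2.87, 3.28, 3.34, 3.72, 3.92])    # small raw data series
x1 = np.cumsum(x0)                               # accumulated generating operation
z = 0.5 * (x1[1:] + x1[:-1])                     # background values

# Least-squares estimate of the development (a) and control (b) coefficients
# from x0[k] = -a * z[k] + b.
B = np.column_stack([-z, np.ones(len(z))])
a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

def x0_hat(k):
    """Fitted/forecast raw value at index k (k >= 1)."""
    return (x0[0] - b / a) * np.exp(-a * k) * (1 - np.exp(a))

print("one-step-ahead forecast:", round(x0_hat(len(x0)), 3))
```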
Vectorlike particles, Z′ and Yukawa unification in F-theory inspired E6
NASA Astrophysics Data System (ADS)
Karozas, Athanasios; Leontaris, George K.; Shafi, Qaisar
2018-03-01
We explore the low energy implications of an F-theory inspired E6 model whose breaking yields, in addition to the MSSM gauge symmetry, a Z′ gauge boson associated with a U(1) symmetry broken at the TeV scale. The zero mode spectrum of the effective low energy theory is derived from the decomposition of the 27 and 27-bar representations of E6, and we parametrise their multiplicities in terms of a minimum number of flux parameters. We perform a two-loop renormalisation group analysis of the gauge and Yukawa couplings of the effective theory and estimate lower bounds on the new vectorlike particles predicted in the model. We compute the third generation Yukawa couplings in an F-theory context assuming an E8 point of enhancement and express our results in terms of the local flux densities associated with the gauge symmetry breaking. We find that their values are compatible with the ones computed by the renormalisation group equations, and we identify points in the parameter space of the flux densities where the t-b-τ Yukawa couplings unify.
NASA Astrophysics Data System (ADS)
Zhu, Yichao; Wei, Yihai; Guo, Xu
2017-12-01
In the present paper, the well-established Gurtin-Murdoch theory of surface elasticity (Gurtin and Murdoch, 1975, 1978) is revisited from an orbital-free density functional theory (OFDFT) perspective by taking the boundary layer into consideration. Our analysis indicates that, firstly, the quantities introduced in the Gurtin-Murdoch theory of surface elasticity all find explicit expressions in the derived OFDFT-based theoretical model. Secondly, the derived expression for the surface energy density captures a competition between the surface normal derivatives of the electron density and the electrostatic potential, which rationalises the onset of signed elastic constants observed both experimentally and computationally. Thirdly, the established model naturally yields an inversely linear relationship between a material's surface stiffness and its size, conforming to relevant findings in the literature. Since the proposed OFDFT-based model is established under arbitrarily imposed boundary conditions on the electron density, the electrostatic potential and the external load, it also has the potential to be used to investigate the electro-mechanical behaviour of nanoscale materials manifesting surface effects.
Does theory influence the effectiveness of health behavior interventions? Meta-analysis.
Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan
2014-05-01
To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations between extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) with specific theoretical constructs and 91% did not report links between all the specified constructs with BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships between type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.
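For reference, the effect size named here, Hedges's g, is Cohen's d with a small-sample bias correction; a minimal sketch with invented group summaries follows.

```python
import numpy as np

n1, m1, s1 = 60, 5.2, 1.9      # intervention group: n, mean, SD (invented)
n2, m2, s2 = 58, 4.4, 2.1      # control group (invented)

# Pooled standard deviation and Cohen's d.
sp = np.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (m1 - m2) / sp

J = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction factor
g = J * d                          # Hedges's g
print(f"g = {g:.3f}")
```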
MAC/GMC 4.0 User's Manual: Example Problem Manual. Volume 3
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Arnold, Steven M.
2002-01-01
This document is the third volume in the three volume set of User's Manuals for the Micromechanics Analysis Code with Generalized Method of Cells Version 4.0 (MAC/GMC 4.0). Volume 1 is the Theory Manual, Volume 2 is the Keywords Manual, and this document is the Example Problems Manual. MAC/GMC 4.0 is a composite material and laminate analysis software program developed at the NASA Glenn Research Center. It is based on the generalized method of cells (GMC) micromechanics theory, which provides access to the local stress and strain fields in the composite material. This access grants GMC the ability to accommodate arbitrary local models for inelastic material behavior and various types of damage and failure analysis. MAC/GMC 4.0 has been built around GMC to provide the theory with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermo-mechanical loading, generation of output results, and selection of architectures to represent the composite material, have been automated in MAC/GMC 4.0. Finally, classical lamination theory has been implemented within MAC/GMC 4.0 wherein GMC is used to model the composite material response of each ply. Consequently, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. This volume provides in-depth descriptions of 43 example problems, which were specially designed to highlight many of the most important capabilities of the code. The actual input files associated with each example problem are distributed with the MAC/GMC 4.0 software; thus providing the user with a convenient starting point for their own specialized problems of interest.
Performance and state-space analyses of systems using Petri nets
NASA Technical Reports Server (NTRS)
Watson, James Francis, III
1992-01-01
The goal of any modeling methodology is to develop a mathematical description of a system that is accurate in its representation and also permits analysis of structural and/or performance properties. Inherently, trade-offs exist between the level of detail in the model and the ease with which analysis can be performed. Petri nets (PN's), a highly graphical modeling methodology for Discrete Event Dynamic Systems, permit representation of shared resources, finite capacities, conflict, synchronization, concurrency, and timing between state changes. By restricting the state transition time delays to the family of exponential density functions, Markov chain analysis of performance problems is possible. One major drawback of PN's is the tendency for the state-space to grow rapidly (exponential complexity) compared to increases in the PN constructs. It is the state space, or the Markov chain obtained from it, that is needed in the solution of many problems. The theory of state-space size estimation for PN's is introduced. The problem of state-space size estimation is defined, its complexities are examined, and estimation algorithms are developed. Both top-down and bottom-up approaches are pursued, and the advantages and disadvantages of each are described. Additionally, the author's research in non-exponential transition modeling for PN's is discussed. An algorithm for approximating non-exponential transitions is developed. Since only basic PN constructs are used in the approximation, theory already developed for PN's remains applicable. Comparison to results from entropy theory shows the transition performance is close to the theoretic optimum. Inclusion of non-exponential transition approximations improves performance results at the expense of increased state-space size. The state-space size estimation theory provides insight and algorithms for evaluating this trade-off.
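A toy sketch of the state space being estimated: brute-force enumeration of a Petri net's reachability set. The three-place net below is invented; nets of practical interest are far larger, which is exactly what motivates estimation rather than enumeration.

```python
from collections import deque

# Toy Petri net: transitions as (consume, produce) vectors over 3 places.
transitions = [
    ((1, 0, 0), (0, 1, 0)),   # t1: move a token from p0 to p1
    ((0, 1, 0), (0, 0, 1)),   # t2: move a token from p1 to p2
    ((0, 0, 1), (1, 0, 0)),   # t3: recycle a token from p2 back to p0
]
m0 = (2, 0, 0)                # initial marking: two tokens in p0

seen, queue = {m0}, deque([m0])
while queue:
    m = queue.popleft()
    for consume, produce in transitions:
        if all(mi >= ci for mi, ci in zip(m, consume)):   # transition enabled?
            m2 = tuple(mi - ci + pi
                       for mi, ci, pi in zip(m, consume, produce))
            if m2 not in seen:
                seen.add(m2)
                queue.append(m2)

print(f"reachable markings: {len(seen)}")   # 6 for this toy net
```

Even here the count (6) grows combinatorially with the token count, illustrating the exponential blow-up the thesis's estimation algorithms aim to predict without full enumeration.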
Viscosity models for pure hydrocarbons at extreme conditions: A review and comparative study
Baled, Hseen O.; Gamwo, Isaac K.; Enick, Robert M.; ...
2018-01-12
Here, viscosity is a critical fundamental property required in many applications in the chemical and oil industries. In this review the performance of seven select viscosity models, representative of various predictive and correlative approaches, is discussed and evaluated by comparison to experimental data of 52 pure hydrocarbons including straight-chain alkanes, branched alkanes, cycloalkanes, and aromatics. This analysis considers viscosity data to extremely high-temperature, high-pressure conditions up to 573 K and 300 MPa. Unsatisfactory results are found, particularly at high pressures, with the Chung-Ajlan-Lee-Starling, Pedersen-Fredenslund, and Lohrenz-Bray-Clark models commonly used for oil reservoir simulation. If sufficient experimental viscosity data are readily available to determine model-specific parameters, the free volume theory and the expanded fluid theory models provide generally comparable results that are superior to those obtained with the friction theory, particularly at pressures higher than 100 MPa. Otherwise, the entropy scaling method by Lötgering-Lin and Gross is recommended as the best predictive model.
Windt, Jennifer M; Noreika, Valdas
2011-12-01
In this paper, we address the different ways in which dream research can contribute to interdisciplinary consciousness research. As a second global state of consciousness aside from wakefulness, dreaming is an important contrast condition for theories of waking consciousness. However, programmatic suggestions for integrating dreaming into broader theories of consciousness, for instance by regarding dreams as a model system of standard or pathological wake states, have not yielded straightforward results. We review existing proposals for using dreaming as a model system, taking into account concerns about the concept of modeling and the adequacy and practical feasibility of dreaming as a model system. We conclude that existing modeling approaches are premature and rely on controversial background assumptions. Instead, we suggest that contrastive analysis of dreaming and wakefulness presents a more promising strategy for integrating dreaming into a broader research context and solving many of the problems involved in the modeling approach. Copyright © 2010 Elsevier Inc. All rights reserved.
Quick, Brian L; Anker, Ashley E; Feeley, Thomas Hugh; Morgan, Susan E
2016-01-01
An inconsistency in the research indicates positive attitudes toward organ donation do not map reliably onto organ donor registrations. Various models have sought to explain this inconsistency, and the current analysis formally compared three models: the Bystander Intervention Model (BIM), the Organ Donor Model (ODM), and Vested Interest Theory (VIT). Mature adults (N = 688) between the ages of 50 and 64 years completed surveys related to organ donation. Results revealed that VIT accounted for the most variance in organ donation registrations, followed by the BIM and ODM. The discussion emphasizes the importance of employing theories to explain a phenomenon as well as the practical implications of the findings.
NASA Technical Reports Server (NTRS)
Diamante, J. M.; Englar, T. S., Jr.; Jazwinski, A. H.
1977-01-01
Estimation theory, which originated in guidance and control research, is applied to the analysis of air quality measurements and atmospheric dispersion models to provide reliable area-wide air quality estimates. A method for low dimensional modeling (in terms of the estimation state vector) of the instantaneous and time-average pollutant distributions is discussed. In particular, the fluctuating plume model of Gifford (1959) is extended to provide an expression for the instantaneous concentration due to an elevated point source. Individual models are also developed for all parameters in the instantaneous and the time-average plume equations, including the stochastic properties of the instantaneous fluctuating plume.
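For context, time-average plume models of the kind extended here build on the standard Gaussian plume formula for an elevated point source with ground reflection; a hedged sketch with invented source and dispersion parameters follows.

```python
import numpy as np

def plume(y, z, Q=10.0, u=4.0, H=50.0, sig_y=30.0, sig_z=20.0):
    """Time-average concentration at crosswind offset y and height z.

    Q: emission rate (g/s), u: wind speed (m/s), H: effective stack
    height (m), sig_y/sig_z: lateral/vertical dispersion parameters (m).
    All values are illustrative.
    """
    lateral = np.exp(-y**2 / (2 * sig_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sig_z**2))
                + np.exp(-(z + H)**2 / (2 * sig_z**2)))   # ground reflection
    return Q / (2 * np.pi * u * sig_y * sig_z) * lateral * vertical

print(f"centreline ground-level concentration: {plume(0.0, 0.0):.2e} g/m^3")
```

In the estimation-theory framework described above, parameters such as H, sig_y, and sig_z become components of the state vector to be updated from measurements.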
Wu, Jun; Li, Chengbing; Huo, Yueying
2014-01-01
Safety of dangerous goods transport is directly related to the operational safety of dangerous goods transport enterprises. To address the high accident rate and severe harm in dangerous goods logistics, this paper casts the group decision making problem, based on integration and coordination, as a multiagent multiobjective group decision making problem; a secondary decision model is established and applied to the safety assessment of dangerous goods transport enterprises. First, dynamic multivalue background and entropy theory are used to build the first-level multiobjective decision model. Second, experts are weighted according to the principle of clustering analysis, and, combined with relative entropy theory, a secondary optimization model for group decision making based on relative entropy is established and its solution discussed. Next, after investigation and analysis, a safety evaluation index system for dangerous goods transport enterprises is established. Finally, a case analysis of five dangerous goods transport enterprises in the Inner Mongolia Autonomous Region validates the feasibility and effectiveness of the model, providing a vital decision making basis for evaluating dangerous goods transport enterprises. PMID:25477954
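An illustrative sketch (not the paper's exact formulation) of two of its building blocks: entropy-style normalization of expert judgments and relative-entropy (KL divergence) distances from a provisional group judgment, so that experts closer to the group view receive larger weights. The judgment matrix is invented.

```python
import numpy as np

# Rows: 4 experts' normalized scores for one enterprise over 3 criteria.
scores = np.array([[0.30, 0.45, 0.25],
                   [0.28, 0.50, 0.22],
                   [0.40, 0.35, 0.25],
                   [0.31, 0.44, 0.25]])

group = scores.mean(axis=0)            # provisional group judgment

def kl(p, q):
    """Relative entropy D(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Experts with smaller relative entropy to the group view get larger weights.
d = np.array([kl(p, group) for p in scores])
w = 1.0 / (d + 1e-9)
w /= w.sum()
print("expert weights:", np.round(w, 3))
print("weighted group judgment:", np.round(w @ scores, 3))
```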
A Model of Small Group Facilitator Competencies
ERIC Educational Resources Information Center
Kolb, Judith A.; Jin, Sungmi; Song, Ji Hoon
2008-01-01
This study used small group theory, quantitative and qualitative data collected from experienced practicing facilitators at three points of time, and a building block process of collection, analysis, further collection, and consolidation to develop a model of small group facilitator competencies. The proposed model has five components:…
Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z2 have been extensively used in image analysis in a Bayesian framework as a priori models for the ... of image analysis. If the Bayesian paradigm of image analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications
NASA Astrophysics Data System (ADS)
Avitabile, Peter; Callahan, John O.
Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.
Theory and Circuit Model for Lossy Coaxial Transmission Line
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genoni, T. C.; Anderson, C. N.; Clark, R. E.
2017-04-01
The theory of signal propagation in lossy coaxial transmission lines is revisited and new approximate analytic formulas for the line impedance and attenuation are derived. The accuracy of these formulas from DC to 100 GHz is demonstrated by comparison to numerical solutions of the exact field equations. Based on this analysis, a new circuit model is described which accurately reproduces the line response over the entire frequency range. Circuit model calculations are in excellent agreement with the numerical and analytic results, and with finite-difference time-domain simulations which resolve the skin depths of the conducting walls.
NASA Astrophysics Data System (ADS)
Mariño, Marcos
2015-09-01
Preface; Part I. Instantons: 1. Instantons in quantum mechanics; 2. Unstable vacua in quantum field theory; 3. Large order behavior and Borel summability; 4. Non-perturbative aspects of Yang-Mills theories; 5. Instantons and fermions; Part II. Large N: 6. Sigma models at large N; 7. The 1/N expansion in QCD; 8. Matrix models and matrix quantum mechanics at large N; 9. Large N QCD in two dimensions; 10. Instantons at large N; Appendix A. Harmonic analysis on S3; Appendix B. Heat kernel and zeta functions; Appendix C. Effective action for large N sigma models; References; Author index; Subject index.
Shedge, Sapana V; Zhou, Xiuwen; Wesolowski, Tomasz A
2014-09-01
Recent applications of the Frozen-Density Embedding Theory based continuum model of the solvent, which is used for calculating solvatochromic shifts in the UV/Vis range, are reviewed. In this model, the solvent is represented as a non-uniform continuum taking into account both the statistical nature of the solvent and specific solute-solvent interactions. It offers, therefore, a computationally attractive alternative to methods in which the solvent is described at the atomistic level. The evaluation of the solvatochromic shift involves only two calculations of excitation energy instead of the hundreds, at least, needed to account for inhomogeneous broadening. The present review provides a detailed graphical analysis of the key quantities of this model: the average charge density of the solvent (<ρB>) and the corresponding Frozen-Density Embedding Theory derived embedding potential for coumarin 153.
Dolman, M; Chase, J
1996-08-01
A small-scale study was undertaken to test the relative predictive power of the Health Belief Model and Subjective Expected Utility Theory for the uptake of a behaviour (pelvic floor exercises) to reduce post-partum urinary incontinence in primigravida females. A structured questionnaire was used to gather data relevant to both models from a sample of antenatal and postnatal primigravida women. Questions examined the perceived probability of becoming incontinent, the perceived (dis)utility of incontinence, the perceived probability of pelvic floor exercises preventing future urinary incontinence, the costs and benefits of performing pelvic floor exercises, and sources of information and knowledge about incontinence. Multiple regression analysis focused on whether or not respondents intended to perform pelvic floor exercises and the factors influencing their decisions. Aggregated data were analysed to compare the Health Belief Model and Subjective Expected Utility Theory directly.
A Design Quality Learning Unit in Relational Data Modeling Based on Thriving Systems Properties
ERIC Educational Resources Information Center
Waguespack, Leslie J.
2013-01-01
This paper presents a learning unit that addresses quality design in relational data models. The focus on modeling allows the learning to span analysis, design, and implementation enriching pedagogy across the systems development life cycle. Thriving Systems Theory presents fifteen choice properties that convey design quality in models integrating…
NASA Astrophysics Data System (ADS)
Cheng, Fen; Hu, Wanxin
2017-05-01
Based on an analysis of the impact of parking policy experience at home and abroad, an impact analysis process for parking strategy is designed. First, group decision theory is used to create a parking strategy index system and calculate its weights; the index system covers government, parking operators, and travelers. Then, multi-level extension theory is used to analyze the CBD parking strategy, and the strategy is assessed by calculating the correlation of each indicator. Finally, a parking charging strategy is assessed through a case study, providing a scientific and reasonable basis for evaluating parking strategy. The results show that the model can effectively analyze multi-target, multi-attribute parking policy evaluation.
Toward a Qualitative Analysis of Standardized Tests Using an Information Processing Model.
ERIC Educational Resources Information Center
Armour-Thomas, Eleanor
The use of standardized tests and test data to detect and address differences in cognitive styles is advocated here. To this end, the paper describes the componential theory of intelligence addressed by Sternberg et al. This theory defines the components of intelligence by function and level of generality, including: (1) metacomponents: higher…
ERIC Educational Resources Information Center
Cooper, Richard P.
2007-01-01
It has been suggested that the enterprise of developing mechanistic theories of the human cognitive architecture is flawed because the theories produced are not directly falsifiable. Newell attempted to sidestep this criticism by arguing for a Lakatosian model of scientific progress in which cognitive architectures should be understood as theories…
ERIC Educational Resources Information Center
MacMillan, Peter D.
2000-01-01
Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…
ERIC Educational Resources Information Center
Huang, Jie-Tsuen; Hsieh, Hui-Hsien
2011-01-01
The purpose of this study was to investigate the contributions of socioeconomic status (SES) in predicting social cognitive career theory (SCCT) factors. Data were collected from 738 college students in Taiwan. The results of the partial least squares (PLS) analyses indicated that SES significantly predicted career decision self-efficacy (CDSE);…
Deary, Lauri; Roche, Joan; Plotkin, Karen; Zahourek, Rothlyn
2011-01-01
Hatha yoga increases self-awareness and well-being. Intentionality is creating motivation and then action. This qualitative study explored intentionality during hatha yoga sessions using narrative analysis. The results supported and expanded Zahourek's theory of intentionality, the matrix of healing, and provide new insights into intentionality in healing.
ERIC Educational Resources Information Center
Westling Allodi, Mara
2007-01-01
The Goals, Attitudes and Values in School (GAVIS) questionnaire was developed on the basis of theoretical frameworks concerning learning environments, universal human values and studies of students' experience of learning environments. The theory hypothesises that learning environments can be described and structured in a circumplex model using…
Using the PLUM procedure of SPSS to fit unequal variance and generalized signal detection models.
DeCarlo, Lawrence T
2003-02-01
The recent addition of a procedure in SPSS for the analysis of ordinal regression models offers a simple means for researchers to fit the unequal variance normal signal detection model and other extended signal detection models. The present article shows how to implement the analysis and how to interpret the SPSS output. Examples of fitting the unequal variance normal model and other generalized signal detection models are given. The approach offers a convenient means for applying signal detection theory to a variety of research.
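For orientation (a hypothetical illustration of the quantities such models estimate, not the SPSS PLUM syntax), the equal-variance sensitivity d′ and criterion c, and the unequal-variance summary measure d_a based on the z-ROC slope, can be computed directly from hit and false-alarm rates; the rates and slope below are invented for the example:

    from scipy.stats import norm

    hit_rate, fa_rate = 0.82, 0.28            # hypothetical hit / false-alarm rates
    z_h, z_f = norm.ppf(hit_rate), norm.ppf(fa_rate)

    d_prime = z_h - z_f                       # equal-variance sensitivity d'
    criterion = -0.5 * (z_h + z_f)            # response criterion c

    # In the unequal-variance normal model the z-ROC is linear with slope
    # s = sigma_noise / sigma_signal; d_a summarizes sensitivity:
    s = 0.8                                   # hypothetical z-ROC slope estimate
    d_a = (2.0 / (1.0 + s**2)) ** 0.5 * (z_h - s * z_f)
    print(d_prime, criterion, d_a)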
NASA Astrophysics Data System (ADS)
Boukarm, Riadh; Houam, Abdelkader; Fredj, Mohammed; Boucif, Rima
2017-12-01
The aim of our work is to check stability during tunnel excavation work in the rock mass of Kherrata, on the route connecting the cities of Bejaia and Setif. Characterization using the Q system (Barton's method) and RMR (Bieniawski classification) allowed us to conclude that the quality of the rock mass is average in limestone and poor in fractured limestone. Modelling of the excavation phase using the block theory method (UNWEDGE software), with parameters taken from the classification recommendations, then allowed us to check stability and to conclude that the combined use of geomechanical classification and block theory can be considered reliable for preliminary design.
Farno, E; Coventry, K; Slatter, P; Eshtiaghi, N
2018-06-15
Sludge pumps in wastewater treatment plants are often oversized due to uncertainty in the calculation of pressure drop. This issue costs industry millions of dollars to purchase and operate the oversized pumps. Beyond cost, the higher electricity consumption is associated with extra CO2 emissions, which creates a large environmental impact. Calculation of pressure drop via current pipe flow theory requires a model fit to flow curve data, which depends on the regression analysis and also varies with the natural variation of rheological data. This study investigates the impact of variation in rheological data and regression analysis on the variation of pressure drop calculated via current pipe flow theories. The results compare the variation of calculated pressure drop between different models and regression methods and indicate the suitability of each method. Copyright © 2018 Elsevier Ltd. All rights reserved.
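As a sketch of the flow-curve regression step at issue (not the authors' code; the data and starting values are invented), a Herschel-Bulkley model tau = tau_y + K * gamma_dot**n can be fitted with scipy, and the parameter covariance indicates how rheological variation propagates into pressure-drop estimates:

    import numpy as np
    from scipy.optimize import curve_fit

    def herschel_bulkley(gamma_dot, tau_y, K, n):
        # Shear stress as a function of shear rate for a yield-stress fluid
        return tau_y + K * gamma_dot**n

    gamma_dot = np.array([1, 5, 10, 50, 100, 300], dtype=float)  # shear rate, 1/s
    tau = np.array([12.1, 15.0, 17.2, 25.9, 31.8, 45.3])         # shear stress, Pa

    params, cov = curve_fit(herschel_bulkley, gamma_dot, tau,
                            p0=[10.0, 1.0, 0.5], maxfev=10000)
    tau_y, K, n = params
    perr = np.sqrt(np.diag(cov))   # parameter standard errors: the regression
    print(tau_y, K, n, perr)       # uncertainty that feeds pressure-drop scatter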
Advanced statistical energy analysis
NASA Astrophysics Data System (ADS)
Heron, K. H.
1994-09-01
A high-frequency theory (advanced statistical energy analysis (ASEA)) is developed which takes account of the mechanism of tunnelling and uses a ray theory approach to track the power flowing around a plate or a beam network, and then uses statistical energy analysis (SEA) to take care of any residual power. ASEA divides the energy of each sub-system into energy that is freely available for transfer to other sub-systems and energy that is fixed within the sub-system. ASEA can be interpreted as a series of mathematical models, the first of which is identical to standard SEA, with subsequent higher-order models converging on an accurate prediction. Using a structural assembly of six rods as an example, ASEA is shown to converge onto the exact results while SEA is shown to overpredict by up to 60 dB.
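For orientation, standard SEA, the first model in the ASEA hierarchy, reduces in the two-subsystem case to a small linear power-balance solve; a minimal numpy sketch with assumed loss-factor values (all numbers invented):

    import numpy as np

    omega = 2 * np.pi * 1000.0      # analysis band centre frequency, rad/s
    eta1, eta2 = 0.01, 0.02         # damping loss factors (assumed)
    eta12, eta21 = 0.005, 0.003     # coupling loss factors (assumed)

    # Power balance: injected power = dissipated + net transmitted power
    L = np.array([[eta1 + eta12, -eta21],
                  [-eta12, eta2 + eta21]])
    P = np.array([1.0, 0.0])        # 1 W injected into subsystem 1

    E = np.linalg.solve(omega * L, P)   # band-averaged subsystem energies
    print(E)

ASEA's higher-order models refine this by tracking the tunnelled, ray-like portion of the power before applying the SEA balance to the remainder.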
Doshmangir, P; Jahangiry, L; Farhangi, M A; Doshmangir, L; Faraji, L
2018-02-01
The prevalence of type 2 diabetes is rising rapidly around the world. A number of systematic reviews have provided evidence for the effectiveness of lifestyle interventions in diabetic patients, but the effectiveness of theory- and model-based education-lifestyle interventions for diabetic patients is unclear. This systematic review and meta-analysis aimed to evaluate and quantify the impact of theory-based lifestyle interventions on type 2 diabetes. A literature search of authentic electronic resources, including PubMed, Scopus, and the Cochrane collaboration, was performed to identify papers published between January 2002 and July 2016. The PICOs (participants, intervention, comparison, and outcomes) elements were used to select studies meeting the inclusion and exclusion criteria. Mean differences and standard deviations of hemoglobin A1c (HbA1c [mmol/mol]) levels at baseline and follow-up in the intervention and control groups were used for data synthesis. A random-effects model was used for estimating pooled effect sizes. To investigate sources of heterogeneity, predefined subgroup analyses were performed using trial duration, baseline HbA1c (mmol/mol) level, and the age of participants. Meta-regression was performed to examine the contributions of trial duration, baseline HbA1c (mmol/mol) level, and the age of participants to the mean differences in HbA1c (mmol/mol) level. The significance level was set at P < 0.05. Eighteen studies with 2384 participants met the inclusion criteria. The pooled main outcome from the random-effects model showed a significant improvement in HbA1c (mmol/mol) of -5.35% (95% confidence interval = -6.3, -4.40; P < 0.001), with evidence of heterogeneity across studies. The findings of this meta-analysis suggest that theory- and model-based lifestyle interventions have positive effects on HbA1c (mmol/mol) indices in patients with type 2 diabetes. Health education theories have been applied as a useful tool for lifestyle change among people with type 2 diabetes. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
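A minimal sketch of DerSimonian-Laird random-effects pooling, the model class behind such pooled estimates (the study-level values below are invented, not taken from the review):

    import numpy as np

    y = np.array([-4.0, -6.2, -5.1, -3.3])   # hypothetical per-study mean differences
    v = np.array([1.2, 0.8, 1.5, 2.0])       # hypothetical within-study variances

    w = 1.0 / v
    y_fe = np.sum(w * y) / np.sum(w)          # fixed-effect pooled estimate
    Q = np.sum(w * (y - y_fe) ** 2)           # Cochran's Q heterogeneity statistic
    k = len(y)
    tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1.0 / (v + tau2)                   # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)    # random-effects pooled estimate
    se_re = np.sqrt(1.0 / np.sum(w_re))
    print(y_re, y_re - 1.96 * se_re, y_re + 1.96 * se_re)   # estimate and 95% CI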
Anchor Selection Strategies for DIF Analysis: Review, Assessment, and New Approaches
ERIC Educational Resources Information Center
Kopf, Julia; Zeileis, Achim; Strobl, Carolin
2015-01-01
Differential item functioning (DIF) indicates the violation of the invariance assumption, for instance, in models based on item response theory (IRT). For item-wise DIF analysis using IRT, a common metric for the item parameters of the groups that are to be compared (e.g., for the reference and the focal group) is necessary. In the Rasch model,…
Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults
ERIC Educational Resources Information Center
Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.
2007-01-01
Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…
ERIC Educational Resources Information Center
Goltz, Sonia M.
2013-01-01
In the present analysis the author utilizes the groups as patches model (Goltz, 2009, 2010) to extend fairness heuristic theory (Lind, 2001) in which the concept of fairness is thought to be a heuristic that allows individuals to match responses to consequences they receive from groups. In this model, individuals who are reviewing possible groups…
A demographic-economic explanation of political stability: Mauritius as a microcosm.
Lempert, D
1987-06-01
"This paper examines current models of economic and political development--social modernization theory, political and economic characteristics of stable regimes, and cross country analysis of political stability--and tests them on the Indian Ocean Island of Mauritius. The analysis continues with a causal explanation for political stability in Mauritius' recent history, derived from an examination of economic policies and demographic patterns. Political change in Mauritius over the past sixty years seems to be explained best by a model for political stability which integrates specific economic and demographic factors. The model, applicable to development in other third world nations, revises Malthus' conclusion that population and economic conditions move in an oscillatory relationship and replaces it with a more comprehensive theory, suggesting that political stability is a function of both economic development and a repeating cyclical relationship between economics and population." excerpt
Electromagnetic game modeling through Tensor Analysis of Networks and Game Theory
NASA Astrophysics Data System (ADS)
Maurice, Olivier; Reineix, Alain; Lalléchère, Sébastien
2014-10-01
A complex system involves events coming from natural behaviors. However complicated machines may appear, they are still far from the complexity of natural systems. Currently, economics is one of the rare sciences trying to find ways to model human behavior; these attempts involve game theory and psychology. Our purpose is to develop a formalism able to handle both game and hardware modeling. We first present the Tensorial Analysis of Networks, used for the material part of the system. Then, we detail the mathematical objects defined in order to describe the evolution of the system and its gaming side. To illustrate the discussion we consider the case of a drone whose electronics can be disturbed by a radar field, but which must fly as close as possible to this radar.
Hamiltonian Effective Field Theory Study of the N^{*}(1535) Resonance in Lattice QCD.
Liu, Zhan-Wei; Kamleh, Waseem; Leinweber, Derek B; Stokes, Finn M; Thomas, Anthony W; Wu, Jia-Jun
2016-02-26
Drawing on experimental data for baryon resonances, Hamiltonian effective field theory (HEFT) is used to predict the positions of the finite-volume energy levels to be observed in lattice QCD simulations of the lowest-lying J^{P}=1/2^{-} nucleon excitation. In the initial analysis, the phenomenological parameters of the Hamiltonian model are constrained by experiment and the finite-volume eigenstate energies are a prediction of the model. The agreement between HEFT predictions and lattice QCD results obtained on volumes with spatial lengths of 2 and 3 fm is excellent. These lattice results also admit a more conventional analysis where the low-energy coefficients are constrained by lattice QCD results, enabling a determination of resonance properties from lattice QCD itself. Finally, the role and importance of various components of the Hamiltonian model are examined.
Time Factor in the Theory of Anthropogenic Risk Prediction in Complex Dynamic Systems
NASA Astrophysics Data System (ADS)
Ostreikovsky, V. A.; Shevchenko, Ye N.; Yurkov, N. K.; Kochegarov, I. I.; Grishko, A. K.
2018-01-01
The article reviews anthropogenic risk models that take into consideration the development in time of the different factors that influence a complex system. Three classes of mathematical models are analyzed for use in assessing the anthropogenic risk of complex dynamic systems. These models take the time factor into consideration in determining the prospective safety changes of critical systems. The originality of the study lies in the analysis of five time postulates in the theory of anthropogenic risk and the safety of highly important objects. It has to be stressed that the given postulates are still rarely used in practical assessment of the equipment service life of critically important systems. The results of the study presented in the article can therefore be used in safety engineering and in the analysis of critically important complex technical systems.
Theoretical analysis of impact in composite plates
NASA Technical Reports Server (NTRS)
Moon, F. C.
1973-01-01
The stresses and displacements induced in anisotropic plates by short-duration impact forces are calculated and presented. The theoretical model represents the response of fiber composite turbine fan blades to impact by foreign objects such as stones and hailstones. In this model the impact force is determined using the Hertz impact theory. The plate response analysis treats the laminated blade as an equivalent anisotropic material using a form of Mindlin's theory for crystal plates. The analysis makes use of a computational tool called the fast Fourier transform. Results are presented in the form of stress contour plots in the plane of the plate for various times after impact. Examination of the maximum stresses due to impact versus ply layup angle reveals that the + or - 15 deg layup angle gives lower flexural stresses than the 0 deg, + or - 30 deg, and + or - 45 deg cases.
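For reference, the Hertz impact theory invoked here relates contact force to indentation nonlinearly; in its simplest isotropic form (a standard textbook result, not quoted from the paper):

    F = k\,\alpha^{3/2}

where α is the local indentation and k is a contact stiffness set by the impactor radius and the elastic constants of the two bodies, so the impact force history must be solved together with the plate response.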
QTest: Quantitative Testing of Theories of Binary Choice
Regenwetter, Michel; Davis-Stober, Clintin P.; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of “Random Cumulative Prospect Theory.” A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences. PMID:24999495
Four photon parametric amplification. [in unbiased Josephson junction
NASA Technical Reports Server (NTRS)
Parrish, P. T.; Feldman, M. J.; Ohta, H.; Chiao, R. Y.
1974-01-01
An analysis is presented describing four-photon parametric amplification in an unbiased Josephson junction. Central to the theory is the model of the Josephson effect as a nonlinear inductance. Linear, small signal analysis is applied to the two-fluid model of the Josephson junction. The gain, gain-bandwidth product, high frequency limit, and effective noise temperature are calculated for a cavity reflection amplifier. The analysis is extended to multiple (series-connected) junctions and subharmonic pumping.
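The nonlinear-inductance picture central to such an analysis follows from the Josephson relations I = I_c sin(φ) and V = (Φ_0/2π) dφ/dt, which give the standard effective phase-dependent inductance (a textbook result, stated here for orientation):

    L_J(\varphi) = \frac{\Phi_0}{2\pi I_c \cos\varphi}

A pump signal that modulates φ therefore modulates L_J, which is the mechanism exploited for parametric gain.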
A Study on Urban Road Traffic Safety Based on Matter Element Analysis
Hu, Qizhou; Zhou, Zhuping; Sun, Xu
2014-01-01
This paper examines a new evaluation of urban road traffic safety based on matter element analysis, avoiding the difficulties found in other traffic safety evaluations. The issue of urban road traffic safety is investigated through matter element analysis theory. The chief aim of the present work is to investigate the features of urban road traffic safety. Emphasis is placed on the construction of a criterion function by which traffic safety is evaluated against a hierarchical system of objectives. Matter element analysis theory is used to create a comprehensive appraisal model of urban road traffic safety, employing a newly developed and versatile matter element analysis algorithm. The matter element matrix resolves the uncertainty and incompatibility of the factors used to assess urban road traffic safety. The application results show the superiority of the evaluation model, and a didactic example is included to illustrate the computational procedure. PMID:25587267
Equilibria of perceptrons for simple contingency problems.
Dawson, Michael R W; Dupuis, Brian
2012-08-01
The contingency between cues and outcomes is fundamentally important to theories of causal reasoning and to theories of associative learning. Researchers have computed the equilibria of Rescorla-Wagner models for a variety of contingency problems, and have used these equilibria to identify situations in which the Rescorla-Wagner model is consistent, or inconsistent, with normative models of contingency. Mathematical analyses that directly compare artificial neural networks to contingency theory have not been performed, because of the assumed equivalence between the Rescorla-Wagner learning rule and the delta rule training of artificial neural networks. However, recent results indicate that this equivalence is not as straightforward as typically assumed, suggesting a strong need for mathematical accounts of how networks deal with contingency problems. One such analysis is presented here, where it is proven that the structure of the equilibrium for a simple network trained on a basic contingency problem is quite different from the structure of the equilibrium for a Rescorla-Wagner model faced with the same problem. However, these structural differences lead to functionally equivalent behavior. The implications of this result for the relationships between associative learning, contingency theory, and connectionism are discussed.
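A minimal sketch (not the authors' analysis) of delta-rule/Rescorla-Wagner updating on an invented contingency design, with the cue present on half the trials and a context cue always present; at equilibrium the cue's associative strength approaches the contingency ΔP = P(O|C) − P(O|~C):

    import random

    alpha_beta = 0.1                 # combined learning-rate parameter
    lam = 1.0                        # asymptote on outcome trials
    V_cue, V_ctx = 0.0, 0.0          # associative strengths: cue and context

    p_o_given_cue, p_o_given_ctx = 0.8, 0.2   # assumed outcome probabilities
    random.seed(1)
    for trial in range(20000):
        cue_present = random.random() < 0.5
        p_outcome = p_o_given_cue if cue_present else p_o_given_ctx
        outcome = lam if random.random() < p_outcome else 0.0
        pred = V_ctx + (V_cue if cue_present else 0.0)
        error = outcome - pred                   # prediction error
        V_ctx += alpha_beta * error              # context updated on every trial
        if cue_present:
            V_cue += alpha_beta * error
    print(V_cue, p_o_given_cue - p_o_given_ctx)  # both near 0.6 at equilibrium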
The theory of planned behaviour: reactions and reflections.
Ajzen, Icek
2011-09-01
The seven articles in this issue, and the accompanying meta-analysis in Health Psychology Review [McEachan, R.R.C., Conner, M., Taylor, N., & Lawton, R.J. (2011). Prospective prediction of health-related behaviors with the theory of planned behavior: A meta-analysis. Health Psychology Review, 5, 97-144], illustrate the wide application of the theory of planned behaviour [Ajzen, I. (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179-211] in the health domain. In this editorial, Ajzen reflects on some of the issues raised by the different authors. Among the topics addressed are the nature of intentions and the limits of predictive validity; rationality, affect and emotions; past behaviour and habit; the prototype/willingness model; and the role of such background factors as the big five personality traits and social comparison tendency.
Hsu, Sze-Bi; Yang, Ya-Tang
2016-04-01
We present the theory of a microfluidic bioreactor with a two-compartment growth chamber and periodic serial dilution. In the model, coexisting planktonic and biofilm populations exchange by adsorption and detachment. The criteria for coexistence and global extinction are determined by stability analysis of the global extinction state. Stability analysis yields the operating diagram in terms of the dilution and removal ratios, constrained by the plumbing action of the bioreactor. The special case of equal uptake function and logistic growth is analytically solved and explicit growth curves are plotted. The presented theory is applicable to generic microfluidic bioreactors with discrete growth chambers and periodic dilution at discrete time points. Therefore, the theory is expected to assist the design of microfluidic devices for investigating microbial competition and microbial biofilm growth under serial dilution conditions.
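A toy sketch of the logistic special case under periodic serial dilution (planktonic compartment only, with the biofilm exchange of the full model omitted; all parameter values assumed): at low density the population persists when regrowth per period offsets dilution, i.e. when r*T + ln(f) > 0 for retention fraction f.

    import numpy as np

    r, K = 0.8, 1.0          # growth rate (1/h) and carrying capacity (assumed)
    f = 0.1                   # fraction retained at each dilution event
    T, dt = 12.0, 0.01        # dilution period (h) and Euler time step

    x = 0.05
    for cycle in range(50):
        for _ in range(int(T / dt)):
            x += dt * r * x * (1.0 - x / K)   # logistic growth between dilutions
        x *= f                                 # periodic serial dilution
    print(x, r * T + np.log(f))               # persists since 9.6 - 2.3 > 0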
Analysis of general power counting rules in effective field theory
Gavela, Belen; Jenkins, Elizabeth E.; Manohar, Aneesh V.; ...
2016-09-02
We derive the general counting rules for a quantum effective field theory (EFT) in d dimensions. The rules are valid for strongly and weakly coupled theories, and they predict that all kinetic energy terms are canonically normalized. They determine the energy dependence of scattering cross sections in the range of validity of the EFT expansion. We show that the size of the cross sections is controlled by the Λ power counting of EFT, not by chiral counting, even for chiral perturbation theory (χPT). The relation between Λ and f is generalized to d dimensions. We show that the naive dimensional analysis 4π counting is related to ℏ counting. The EFT counting rules are applied to χPT, low-energy weak interactions, Standard Model EFT and the non-trivial case of Higgs EFT.
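For orientation, in d = 4 the naive dimensional analysis normalization referred to here takes the schematic master-formula form (standard NDA as stated in the general literature, not quoted from the paper itself):

    \mathcal{L} \sim \frac{\Lambda^4}{16\pi^2} \left[\frac{\partial}{\Lambda}\right]^{N_p} \left[\frac{4\pi\,\phi}{\Lambda}\right]^{N_\phi} \left[\frac{4\pi\,\psi}{\Lambda^{3/2}}\right]^{N_\psi} \left[\frac{g}{4\pi}\right]^{N_g}

with the scales related by Λ = 4πf, so each additional weak coupling costs a factor of g/4π; this loop-suppression bookkeeping is the ℏ counting the abstract connects to the 4π counting.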
Samuel, Douglas B.; Widiger, Thomas A.
2008-01-01
Theory and research have suggested that the personality disorders contained within the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders (DSM-IV-TR) can be understood as maladaptive variants of the personality traits included within the five-factor model (FFM). The current meta-analysis of FFM personality disorder research both replicated and extended the 2004 work of Saulsman and Page (The five-factor model and personality disorder empirical literature: A meta-analytic review. Clinical Psychology Review, 23, 1055-1085) through a facet-level analysis that provides a more specific and nuanced description of each DSM-IV-TR personality disorder. The empirical FFM profiles generated for each personality disorder were generally congruent at the facet level with hypothesized FFM translations of the DSM-IV-TR personality disorders. However, notable exceptions to the hypotheses did occur and even some findings that were consistent with FFM theory could be said to be instrument specific. PMID:18708274
Efficient Computation of Info-Gap Robustness for Finite Element Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stull, Christopher J.; Hemez, Francois M.; Williams, Brian J.
2012-07-05
A recent research effort at LANL proposed info-gap decision theory as a framework by which to measure the predictive maturity of numerical models. Info-gap theory explores the trade-offs between accuracy, that is, the extent to which predictions reproduce the physical measurements, and robustness, that is, the extent to which predictions are insensitive to modeling assumptions. Both accuracy and robustness are necessary to demonstrate predictive maturity. However, conducting an info-gap analysis can present a formidable challenge from the standpoint of the required computational resources, because a robustness function requires the resolution of multiple optimization problems. This report offers an alternative, adjoint methodology to assess the info-gap robustness of Ax = b-like numerical models solved for a solution x. Two situations that can arise in structural analysis and design are briefly described and contextualized within the info-gap decision theory framework. The treatments of the info-gap problems using the adjoint methodology are outlined in detail, and the latter problem is solved for four separate finite element models. As compared to statistical sampling, the proposed methodology offers highly accurate approximations of info-gap robustness functions for the finite element models considered in the report, at a small fraction of the computational cost. It is noted that this report considers only linear systems; a natural follow-on study would extend the methodologies described herein to include nonlinear systems.
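In Ben-Haim's standard notation (a generic statement of the robustness function, not this report's specific formulation), the robustness of a design q against a performance requirement R(q,u) ≤ r_c over nested uncertainty sets U(α) is

    \hat{\alpha}(q, r_c) = \max\left\{ \alpha \ge 0 \;:\; \max_{u \in \mathcal{U}(\alpha)} R(q, u) \le r_c \right\}

that is, the largest horizon of uncertainty over which the worst case still meets the requirement; each evaluation of the inner maximum is one of the optimization problems whose cost the adjoint method is designed to avoid.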
NASA Technical Reports Server (NTRS)
Schmid, R. M.
1973-01-01
The vestibulo-ocular system is examined from the standpoint of system theory. The evolution of a mathematical model of the vestibulo-ocular system in an attempt to match more and more experimental data is followed step by step. The final model explains many characteristics of the eye movement in vestibularly induced nystagmus. The analysis of the dynamic behavior of the model at the different stages of its development is illustrated in time domain, mainly in a qualitative way.
Implicit theories of a desire for fame.
Maltby, John; Day, Liz; Giles, David; Gillett, Raphael; Quick, Marianne; Langcaster-James, Honey; Linley, P Alex
2008-05-01
The aim of the present studies was to generate implicit theories of a desire for fame among the general population. In Study 1, we developed a factor-analytic model of conceptions of the desire to be famous that initially comprised nine separate factors: ambition, meaning derived through comparison with others, psychological vulnerability, attention seeking, conceitedness, social access, altruism, positive affect, and glamour. Analysis examining the replicability of these factors suggested that three factors (altruism, positive affect, and glamour) displayed neither factor congruence nor adequate internal reliability. A second study examined the validity of these factors in predicting profiles of individuals who may desire fame. The findings from this study suggested that two of the nine factors (positive affect and altruism) could not be considered strong factors within the model. Overall, the findings suggest that implicit theories of a desire for fame comprise six factors. The discussion focuses on how an implicit model of a desire for fame might progress into formal theories of a desire for fame.
Induced Power of the Helicopter Rotor
NASA Technical Reports Server (NTRS)
Ormiston, Robert A.
2004-01-01
A simplified rotor model was used to explore the fundamental behavior of lifting rotor induced power at moderate and high advance ratios. Several rotor inflow theories, including dynamic inflow theory and prescribed-wake vortex theory, together with idealized notional airfoil stall models, were employed. A number of unusual results were encountered at high advance ratios, including trim control reversal and multiple trim solutions. Significant increases in rotor induced power (torque) above the ideal minimum were observed at moderately high advance ratio. Very high induced power was observed near and above unity advance ratio. The results were sensitive to the stall characteristics of the airfoil models used. An equivalent wing analysis was developed to determine induced power from Prandtl lifting line theory and to help interpret the rotor induced power behavior in terms of the spanwise airload distribution. The equivalent wing approach was successful in capturing the principal variations of induced power for different configurations and operating conditions. The effects of blade root cutout were found to be significant for rotor trim and induced power at high advance ratios.
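The lifting-line benchmark behind such an equivalent-wing analysis is the classical induced-drag result (standard theory, stated here for orientation rather than as the paper's derivation):

    C_{D_i} = \frac{C_L^2}{\pi\, e\, AR}

where AR is the aspect ratio and the span-efficiency factor e ≤ 1 equals 1 only for an elliptical spanwise load distribution; departures of the rotor's effective airload from that ideal appear directly as excess induced power.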
Konik, R. M.; Palmai, T.; Takacs, G.; ...
2015-08-24
We study the SU(2)_k Wess-Zumino-Novikov-Witten (WZNW) theory perturbed by the trace of the primary field in the adjoint representation, a theory governing the low-energy behaviour of a class of strongly correlated electronic systems. While the model is non-integrable, its dynamics can be investigated using the numerical technique of the truncated conformal spectrum approach combined with numerical and analytical renormalization groups (TCSA+RG). The numerical results so obtained provide support for a semiclassical analysis valid at k » 1. Namely, we find that the low energy behavior is sensitive to the sign of the coupling constant, λ. Moreover for λ > 0 this behavior depends on whether k is even or odd. With k even, we find definitive evidence that the model at low energies is equivalent to the massive O(3) sigma model. For k odd, the numerical evidence is more equivocal, but we find indications that the low energy effective theory is critical.
Analysis of an unswept propfan blade with a semiempirical dynamic stall model
NASA Technical Reports Server (NTRS)
Reddy, T. S. R.; Kaza, K. R. V.
1989-01-01
The time history response of a propfan wind tunnel model with dynamic stall is studied analytically. The response obtained from the analysis is compared with available experimental data. The governing equations of motion are formulated in terms of blade normal modes which are calculated using the COSMIC-NASTRAN computer code. The response analysis considered the blade plunging and pitching motions. The lift, drag and moment coefficients for angles of attack below the static stall angle are obtained from a quasi-steady theory. For angles above static stall angles, a semiempirical dynamic stall model based on a correction to angle of attack is used to obtain lift, drag and moment coefficients. Using these coefficients, the aerodynamic forces are calculated at a selected number of strips, and integrated to obtain the total generalized forces. The combined momentum-blade element theory is used to calculate the induced velocity. The semiempirical stall model predicted a limit cycle oscillation near the setting angle at which large vibratory stresses were observed in an experiment. The predicted mode and frequency of oscillation also agreed with those measured in the experiment near the setting angle.
NASA Astrophysics Data System (ADS)
Chen, Y. J.; Scarpa, F.; Farrow, I. R.; Liu, Y. J.; Leng, J. S.
2013-04-01
This paper describes the manufacturing, characterization and parametric modeling of a novel fiber-reinforced composite flexible skin with in-plane negative Poisson’s ratio (auxetic) behavior. The elastic mechanical performance of the auxetic skin is evaluated using a three-dimensional analytical model based on the classical laminate theory (CLT) and Sun’s thick laminate theory. Good agreement is observed between in-plane Poisson’s ratios and Young’s moduli of the composite skin obtained by the theoretical model and the experimental results. A parametric analysis carried out with the validated model shows that significant changes in the in-plane negative Poisson’s ratio can be achieved through different combinations of matrix and fiber materials and stacking sequences. It is also possible to identify fiber-reinforced composite skin configurations with the same in-plane auxeticity but different orthotropic stiffness performance, or the same orthotropic stiffness performance but different in-plane auxeticity. The analysis presented in this work provides useful guidelines to develop and manufacture flexible skins with negative Poisson’s ratio for applications focused on morphing aircraft wing designs.
Spectral stability of unitary network models
NASA Astrophysics Data System (ADS)
Asch, Joachim; Bourget, Olivier; Joye, Alain
2015-08-01
We review various unitary network models used in quantum computing, spectral analysis or condensed matter physics and establish relationships between them. We show that symmetric one-dimensional quantum walks are universal, as are CMV matrices. We prove spectral stability and propagation properties for general asymptotically uniform models by means of unitary Mourre theory.
An Aggregate IRT Procedure for Exploratory Factor Analysis
ERIC Educational Resources Information Center
Camilli, Gregory; Fox, Jean-Paul
2015-01-01
An aggregation strategy is proposed to potentially address practical limitation related to computing resources for two-level multidimensional item response theory (MIRT) models with large data sets. The aggregate model is derived by integration of the normal ogive model, and an adaptation of the stochastic approximation expectation maximization…
Averaging Models: Parameters Estimation with the R-Average Procedure
ERIC Educational Resources Information Center
Vidotto, G.; Massidda, D.; Noventa, S.
2010-01-01
The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…
Integrative Models in Environmental Planning and Policy Making.
ERIC Educational Resources Information Center
Kyler, David Clinton
1984-01-01
Discusses conceptual models of thought that have recently emerged to confront the conventional approaches to analysis and solution to complex environmental problems. In addition to a critical attack on the tradition of specialization and reductionism, several models are summarized that originated from ecology, cybernetics, and system theory. (BC)
Inverse Modelling Problems in Linear Algebra Undergraduate Courses
ERIC Educational Resources Information Center
Martinez-Luaces, Victor E.
2013-01-01
This paper will offer an analysis from a theoretical point of view of mathematical modelling, applications and inverse problems of both causation and specification types. Inverse modelling problems give the opportunity to establish connections between theory and practice and to show this fact, a simple linear algebra example in two different…
[Development of a program theory as a basis for the evaluation of a dementia special care unit].
Adlbrecht, Laura; Bartholomeyczik, Sabine; Mayer, Hanna
2018-06-01
Background: An existing dementia special care unit was to be evaluated. In order to build a sound foundation for the evaluation, a deep theoretical understanding of the implemented intervention is needed, which had not yet been made explicit. One way to achieve this is the development of a program theory. Aim: The aim is to present a method for developing a program theory for the existing living and care concept of the dementia special care unit, which is used in a larger project to evaluate the concept in a theory-driven way. Method: The evaluation is embedded in the framework of van Belle et al. (2010), and an action model and a change model (Chen, 2015) are created. For the specification of the change model, contribution analysis (Mayne, 2011) is applied. Data were collected in workshops with the developers and the nurses of the dementia special care unit, and a literature search concerning interventions and outcomes was carried out. The results were synthesized in a consensus workshop. Results: The action model describes the interventions of the dementia special care unit, the implementers, the organization, and the context. The change model comprises the mechanisms through which interventions achieve outcomes. Conclusions: The results of the program theory can be employed to choose data collection methods and instruments for the evaluation. On the basis of the evaluation results, the program theory can be refined and adapted.
Stenner, A Jackson; Fisher, William P; Stone, Mark H; Burdick, Donald S
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
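For reference, the dichotomous Rasch model underlying this argument gives the probability of a correct response in the standard form

    P(X_{pi} = 1 \mid \theta_p, b_i) = \frac{\exp(\theta_p - b_i)}{1 + \exp(\theta_p - b_i)}

so only the difference θ_p − b_i matters: in the reading example, raising reader ability θ_p and text complexity b_i by equal amounts is exactly the trade-off that holds the expected count correct constant.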
Kuhn's Ontological Relativism.
ERIC Educational Resources Information Center
Sankey, Howard
2000-01-01
Discusses Kuhn's model of scientific theory change. Documents Kuhn's move away from conceptual relativism and rational relativism. Provides an analysis of his present ontological form of relativism. (CCM)
A note on the theory of fast money flow dynamics
NASA Astrophysics Data System (ADS)
Sokolov, A.; Kieu, T.; Melatos, A.
2010-08-01
The gauge theory of arbitrage was introduced by Ilinski in [K. Ilinski, preprint arXiv:hep-th/9710148 (1997)] and applied to fast money flows in [A. Ilinskaia, K. Ilinski, preprint arXiv:cond-mat/9902044 (1999); K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)]. The theory of fast money flow dynamics attempts to model the evolution of currency exchange rates and stock prices on short, e.g. intra-day, time scales. It has been used to explain some of the heuristic trading rules, known as technical analysis, that are used by professional traders in the equity and foreign exchange markets. A critique of some of the underlying assumptions of the gauge theory of arbitrage was presented by Sornette in [D. Sornette, Int. J. Mod. Phys. C 9, 505 (1998)]. In this paper, we present a critique of the theory of fast money flow dynamics, which was not examined by Sornette. We demonstrate that the choice of the input parameters used in [K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)] results in sinusoidal oscillations of the exchange rate, in conflict with the results presented in [K. Ilinski, Physics of finance: gauge modelling in non-equilibrium pricing (Wiley, 2001)]. We also find that the dynamics predicted by the theory are generally unstable in most realistic situations, with the exchange rate tending to zero or infinity exponentially.
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
NASA Astrophysics Data System (ADS)
Lotfy, Kh.
2017-07-01
The dual-phase-lag (DPL) model with two different time translations and the Lord-Shulman (LS) theory with one relaxation time are applied to study the effect of hydrostatic initial stress on a medium under the influence of the two-temperature parameter (a new model is introduced using two-temperature theory) and photothermal theory. We solve for the thermal loading at the free surface of a semi-infinite semiconducting medium with coupled plasma waves, including the effect of mechanical force during a photothermal process. Exact expressions for the considered variables are obtained using normal mode analysis, and the two-temperature coefficient ratios are obtained analytically. Numerical results for the field quantities are given in the physical domain and illustrated graphically under the effects of several parameters. Comparisons are made between the results of the two models with and without the two-temperature parameter, and for two different values of the hydrostatic initial stress. A comparison is also carried out between the considered variables as calculated from generalized thermoelasticity based on the DPL model and on the LS theory, in the absence and presence of the thermoelastic and thermoelectric coupling parameters.
Kiviniemi, Marc T.; Bennett, Alyssa; Zaiter, Marie; Marshall, James R.
2010-01-01
Compliance with colorectal cancer screening recommendations requires considerable conscious effort on the part of the individual patient, making an individual's decisions about engagement in screening an important contributor to compliance or noncompliance. The objective of this paper was to examine the effectiveness of individual-level behavior theories and their associated constructs in accounting for engagement in colorectal cancer screening behavior. We reviewed the literature examining constructs from formal models of individual-level health behavior as factors associated with compliance with screening for colorectal cancer. All published studies examining one or more constructs from the health belief model, theory of planned behavior, transtheoretical model, or social cognitive theory and their relation to screening behavior or behavioral intentions were included in the analysis. By and large, results of studies supported the theory-based predictions for the influence of constructs on cancer screening behavior. However, the evidence base for many of these relations, especially for models other than the health belief model, is quite limited. Suggestions are made for future research on individual-level determinants of colorectal cancer screening. PMID:21954045
CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS
Shpitser, Ilya; Tchetgen, Eric Tchetgen
2017-01-01
Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl’s front-door criterion. PMID:28919652
Calder, Samuel Christian; Davidson, Graham R; Ho, Robert
2011-06-01
There has been limited research to date into methods for increasing people's intentions to use omega-3 polyunsaturated fatty acids (n-3 PUFA), which have been linked with decreased risk of suffering from numerous major diseases. The present study employed a cross-sectional design with 380 university students, employees, and visitors to investigate the efficacy of the protection motivation (PM) theory and the ordered protection motivation (OPM) theory, to predict behavioral intention to consume omega-3 rich foods and dietary supplements. Analysis of model fit indicated that both the PM model and the OPM model adequately represented the structural relationships between the cognitive variables and intention to consume n-3 PUFA. Further evaluation of relative fit of the two competing models suggested that the PM model might provide a better representation of decision-making following evaluation of the health threat of n-3 PUFA deficiency. Path analysis indicated that the component of coping appraisal was significantly associated with the behavioral intention to consume n-3 PUFA. Threat appraisal was found to be significantly associated with behavioral intention to consume n-3 PUFA only for the OPM model. Overall, the findings contribute to a better understanding of the roles that cognitive appraisal processes play in young and healthy individuals' protective health decision-making regarding consumption of n-3 PUFA. Implications of the findings and recommendations, which include (a) encouraging the consumption of n-3 PUFA as an effective barrier against the incidence of disease, and (b) effective health messaging that focuses on beliefs about the effectiveness of n-3 PUFA in reducing health risks, are discussed.
NASA Astrophysics Data System (ADS)
Pham, Tien Hung; Rühaak, Wolfram; Sass, Ingo
2017-04-01
Extensive groundwater extraction leads to a drawdown of the groundwater table. Consequently, soil effective stress increases and can cause land subsidence. Analysis of land subsidence generally requires a numerical model based on poroelasticity theory, first proposed by Biot (1941). In their review of regional land subsidence accompanying groundwater extraction, Galloway and Burbey (2011) stated that more research and application are needed in the coupling of stress-dependent land subsidence processes. In the geotechnical field, the constant rate of strain (CRS) test was first introduced in 1969 (Smith and Wahls 1969) and was standardized in 1982 through ASTM designation D4186-82. From the readings of CRS tests, the stress-dependent parameters of the poroelasticity model can be calculated. So far, no research has linked poroelasticity theory with CRS tests in modelling land subsidence due to groundwater extraction. One-dimensional CRS tests using a conventional compression cell and three-dimensional CRS tests using a Rowe cell were performed. The tests were also modelled using the finite element method with mixed elements. A back-analysis technique is used to find suitable values of hydraulic conductivity and bulk modulus that depend on the stress or void ratio. Finally, the obtained results are used in land subsidence models. Biot, M. A. (1941). "General theory of three-dimensional consolidation." Journal of Applied Physics 12(2): 155-164. Galloway, D. L. and T. J. Burbey (2011). "Review: Regional land subsidence accompanying groundwater extraction." Hydrogeology Journal 19(8): 1459-1486. Smith, R. E. and H. E. Wahls (1969). "Consolidation under constant rates of strain." Journal of Soil Mechanics & Foundations Division.
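In the one-dimensional limit relevant to such consolidation tests, Biot's theory reduces to the classical consolidation equation for excess pore pressure u (a standard result, stated here for orientation):

    \frac{\partial u}{\partial t} = c_v \frac{\partial^2 u}{\partial z^2}, \qquad c_v = \frac{k}{m_v \gamma_w}

where k is the hydraulic conductivity, m_v the coefficient of volume compressibility and γ_w the unit weight of water; the stress dependence of k and m_v is exactly what the CRS-based back-analysis seeks to recover.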
CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS.
Shpitser, Ilya; Tchetgen, Eric Tchetgen
2016-12-01
Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects, may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use, have resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl's front-door criterion.
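For concreteness, here is a minimal sketch of the ordinary g-formula for a single point treatment, E[Y(a)] = Σ_l E[Y | A = a, L = l] P(L = l), which the paper's extended and edge g-formulas generalize. The dataframe and column names are hypothetical.

```python
import pandas as pd

def g_formula(df: pd.DataFrame, a: int,
              treatment: str = "A", outcome: str = "Y",
              confounder: str = "L") -> float:
    """Standardize the outcome over the distribution of a discrete
    confounder L: E[Y(a)] = sum_l E[Y | A=a, L=l] * P(L=l)."""
    total = 0.0
    for l, p_l in df[confounder].value_counts(normalize=True).items():
        stratum = df[(df[treatment] == a) & (df[confounder] == l)]
        total += stratum[outcome].mean() * p_l
    return total

# Hypothetical usage: average causal effect of treatment
# effect = g_formula(df, a=1) - g_formula(df, a=0)
```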
Multi-agent fare optimization model of two modes problem and its analysis based on edge of chaos
NASA Astrophysics Data System (ADS)
Li, Xue-yan; Li, Xue-mei; Li, Xue-wei; Qiu, He-ting
2017-03-01
This paper proposes a new fare optimization and game model framework for studying the competition between two travel modes (high speed railway and civil aviation) in which passengers' group behavior is taken into consideration. The small-world network is introduced to construct the multi-agent model of passengers' travel mode choice. Cumulative prospect theory is adopted to depict passengers' bounded rationality, and the heterogeneity of passengers' reference points is depicted using the idea of group emotion computing. The concepts of the "Langton parameter" and "evolution entropy" from the theory of the "edge of chaos" are introduced to create passengers' "decision coefficient" and "evolution entropy of travel mode choice", which are used to quantify passengers' group behavior. The numerical simulation and the analysis of passengers' behavior show that (1) the new model inherits the features of the traditional model well and the idea of self-organizing traffic flow evolution fully embodies passengers' bounded rationality, and (2) compared with the traditional (logit) model, when passengers are in the "edge of chaos" state, the total profit of the transportation system is higher.
Information theory applications for biological sequence analysis.
Vinga, Susana
2014-05-01
Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison have greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has provided models based on communication systems theory to describe information transmission channels at the cell level and during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
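As a small concrete example of the block-entropy estimation mentioned above, the sketch below computes the Shannon entropy of the distribution of overlapping k-mers in a nucleotide sequence; the sequence is made up for illustration.

```python
import math
from collections import Counter

def block_entropy(seq: str, k: int) -> float:
    """Shannon entropy (bits) of the empirical distribution of
    overlapping k-mers ('blocks') in a sequence."""
    blocks = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(block_entropy("ACGTACGTAACCGGTT", k=2))  # toy sequence
```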
SurfKin: an ab initio kinetic code for modeling surface reactions.
Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K
2014-10-05
In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
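The transition-state-theory part of such a rate calculation reduces, in its simplest Eyring form, to the expression sketched below. SurfKin's actual treatment also involves partition functions from the DFT-derived thermodynamic data and collision theory for adsorption steps, so this is only an illustrative fragment with an assumed activation free energy.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def eyring_rate(T: float, dG_act_kJ_per_mol: float) -> float:
    """Eyring TST rate constant k = (kB*T/h) * exp(-dG_act / (R*T))."""
    return (KB * T / H) * math.exp(-dG_act_kJ_per_mol * 1e3 / (R * T))

# e.g. a surface step with an assumed 120 kJ/mol barrier at 800 K
print(f"{eyring_rate(800.0, 120.0):.3e} s^-1")
```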
NASA Technical Reports Server (NTRS)
Noll, Thomas E.
1990-01-01
The paper describes recent accomplishments and current research projects along four main thrusts in aeroservoelasticity at NASA Langley. One activity focuses on enhancing the modeling and analysis procedures to accurately predict aeroservoelastic interactions. Improvements to the minimum-state method of approximating unsteady aerodynamics are shown to provide precise low-order models for design and simulation tasks. Recent extensions in aerodynamic correction-factor methodology are also described. With respect to analysis procedures, the paper reviews novel enhancements to matched filter theory and random process theory for predicting the critical gust profile and the associated time-correlated gust loads for structural design considerations. Two research projects leading towards improved design capability are also summarized: (1) an integrated structure/control design capability and (2) procedures for obtaining low-order robust digital control laws for aeroelastic applications.
Digital morphogenesis via Schelling segregation
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2018-04-01
Schelling’s model of segregation looks to explain the way in which particles or agents of two types may come to arrange themselves spatially into configurations consisting of large homogeneous clusters, i.e. connected regions consisting of only one type. As one of the earliest agent-based models studied by economists and perhaps the most famous model of self-organising behaviour, it also has direct links to areas at the interface between computer science and statistical mechanics, such as the Ising model and the study of contagion and cascading phenomena in networks. While the model has been extensively studied, it has largely resisted rigorous analysis, prior results from the literature generally pertaining to variants of the model which are tweaked so as to be amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory. Brandt et al (2012 Proc. 44th Annual ACM Symp. on Theory of Computing) provided the first rigorous analysis of the unperturbed model, for a specific set of input parameters. Here we provide a rigorous analysis of the model’s behaviour much more generally and establish some surprising forms of threshold behaviour, notably the existence of situations where an increased level of intolerance for neighbouring agents of opposite type leads almost certainly to decreased segregation.
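To make the dynamics concrete, here is a toy one-dimensional Schelling simulation in which unhappy agents of opposite types swap places; the ring size, window radius, and intolerance threshold tau are illustrative choices, not the parameter regime analysed in the paper.

```python
import random

def schelling_1d(n=200, tau=0.5, steps=100_000, seed=1):
    """Toy Schelling dynamics on a ring: an agent is happy if at least a
    fraction tau of the four agents within distance 2 share its type;
    two unhappy agents of opposite types swap positions."""
    random.seed(seed)
    ring = [random.randint(0, 1) for _ in range(n)]

    def happy(i):
        window = [ring[(i + d) % n] for d in (-2, -1, 1, 2)]
        return window.count(ring[i]) / len(window) >= tau

    for _ in range(steps):
        i, j = random.randrange(n), random.randrange(n)
        if ring[i] != ring[j] and not happy(i) and not happy(j):
            ring[i], ring[j] = ring[j], ring[i]
    return ring  # inspect run lengths to gauge segregation
```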
Correcting for population structure and kinship using the linear mixed model: theory and extensions.
Hoffman, Gabriel E
2013-01-01
Population structure and kinship are widespread confounding factors in genome-wide association studies (GWAS). It has been standard practice to include principal components of the genotypes in a regression model in order to account for population structure. More recently, the linear mixed model (LMM) has emerged as a powerful method for simultaneously accounting for population structure and kinship. The statistical theory underlying the differences in empirical performance between modeling principal components as fixed versus random effects has not been thoroughly examined. We undertake an analysis to formalize the relationship between these widely used methods and elucidate the statistical properties of each. Moreover, we introduce a new statistic, effective degrees of freedom, that serves as a metric of model complexity and a novel low rank linear mixed model (LRLMM) to learn the dimensionality of the correction for population structure and kinship, and we assess its performance through simulations. A comparison of the results of LRLMM and a standard LMM analysis applied to GWAS data from the Multi-Ethnic Study of Atherosclerosis (MESA) illustrates how our theoretical results translate into empirical properties of the mixed model. Finally, the analysis demonstrates the ability of the LRLMM to substantially boost the strength of an association for HDL cholesterol in Europeans.
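One schematic reading of the effective-degrees-of-freedom idea: in a mixed model with kinship matrix K, the fitted random effect is a linear smoother of the phenotype, and the trace of that smoother measures how much model complexity the structure correction consumes. The sketch below follows the generic smoother-trace definition and is only an illustration; the paper's exact statistic may differ.

```python
import numpy as np

def effective_df(K: np.ndarray, sigma2_g: float, sigma2_e: float) -> float:
    """Trace of the BLUP smoother S = sigma2_g * K * V^{-1}, where
    V = sigma2_g * K + sigma2_e * I, as a measure of the complexity
    of the population-structure/kinship correction."""
    n = K.shape[0]
    V = sigma2_g * K + sigma2_e * np.eye(n)
    return float(np.trace(sigma2_g * K @ np.linalg.inv(V)))
```

As sigma2_g approaches 0 the trace goes to 0 (no correction), and for a full-rank K with large sigma2_g it approaches n, bracketing the fixed-effects principal-component correction in between.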
NASA Astrophysics Data System (ADS)
Sun, Di-Hua; Zhang, Geng; Zhao, Min; Cheng, Sen-Lin; Cao, Jian-Dong
2018-03-01
Recently, the influence of drivers' individual behaviors on traffic stability has become a research hotspot with the fast-developing transportation cyber-physical systems. In this paper, a new traffic lattice hydrodynamic model is proposed with consideration of the driver's feedforward anticipation optimal flux difference. The neutral stability condition of the new model is obtained through linear stability analysis theory. The results show that the stable region on the phase diagram is enlarged when the feedforward anticipation optimal flux difference effect is taken into account. In order to depict traffic jamming transition properties theoretically, the mKdV equation near the critical point is derived via the nonlinear reductive perturbation method. The propagation behavior of traffic density waves can be described by the kink-antikink solution of the mKdV equation. Numerical simulations are conducted to verify the analytical results, and all the results confirm that traffic stability can be enhanced significantly by considering the feedforward anticipation optimal flux difference in traffic lattice hydrodynamic theory.
NASA Astrophysics Data System (ADS)
Kayumov, R. A.; Muhamedova, I. Z.; Tazyukov, B. F.; Shakirzjanov, F. R.
2018-03-01
In this paper, based on the analysis of some experimental data, hereditary models of deformation for reinforced polymeric composite materials, such as organic plastic, carbon plastic, and the matrix of a film-fabric composite, were studied and selected. On the basis of an analysis of a series of experiments, it has been established that organic plastic samples behave like viscoelastic bodies. It is shown that for sufficiently large load levels, the behavior of the material in question should be described by the relations of the nonlinear theory of heredity. An attempt to describe the process of deformation by means of linear relations of the theory of heredity leads to large discrepancies between the experimental and calculated deformation values. The use of the theory of accumulation of micro-damages leads to a much better description of the experimental results. With the help of the hierarchical approach, a good approximation of the experimental values was achieved only in the first three sections of loading.
Liguori, Gabriel R; Jeronimus, Bertus F; de Aquinas Liguori, Tácia T; Moreira, Luiz Felipe P; Harmsen, Martin C
2017-12-01
Animal experimentation requires a solid and rational moral foundation. Objective and empathic decision-making and protocol evaluation by researchers and ethics committees remain a difficult and sensitive matter. This article presents three perspectives that facilitate a consideration of the minimally acceptable standard for animal experiments, in particular in tissue engineering (TE) and regenerative medicine. First, we review the boundaries provided by law and public opinion in America and Europe. Second, we review contemporary moral theory to introduce the Neo-Rawlsian contractarian theory to objectively evaluate the ethics of animal experiments. Third, we introduce the importance of available reduction, replacement, and refinement strategies, which should be accounted for in moral decision-making and protocol evaluation of animal experiments. The three perspectives are integrated into an algorithmic and graphic harm-benefit analysis tool based on the most relevant aspects of animal models in TE. We conclude with a consideration of future avenues to improve animal experiments.
Using chemical organization theory for model checking
Kaleta, Christoph; Richter, Stephan; Dittrich, Peter
2009-01-01
Motivation: The increasing number and complexity of biomodels makes automatic procedures for checking the models' properties and quality necessary. Approaches like elementary mode analysis, flux balance analysis, deficiency analysis and chemical organization theory (OT) require only the stoichiometric structure of the reaction network for derivation of valuable information. In formalisms like Systems Biology Markup Language (SBML), however, information about the stoichiometric coefficients required for an analysis of chemical organizations can be hidden in kinetic laws. Results: First, we introduce an algorithm that uncovers stoichiometric information that might be hidden in the kinetic laws of a reaction network. This allows us to apply OT to SBML models using modifiers. Second, using the new algorithm, we performed a large-scale analysis of the 185 models contained in the manually curated BioModels Database. We found that for 41 models (22%) the set of organizations changes when modifiers are considered correctly. We discuss one of these models in detail (BIOMD149, a combined model of the ERK- and Wnt-signaling pathways), whose set of organizations drastically changes when modifiers are considered. Third, we found inconsistencies in 5 models (3%) and identified their characteristics. Compared with flux-based methods, OT is able to identify those species and reactions more accurately [in 26 cases (14%)] that can be present in a long-term simulation of the model. We conclude that our approach is a valuable tool that helps to improve the consistency of biomodels and their repositories. Availability: All data and a JAVA applet to check SBML models are available from http://www.minet.uni-jena.de/csb/prj/ot/tools Contact: dittrich@minet.uni-jena.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19468053
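To illustrate the stoichiometric reasoning behind OT, the sketch below checks the closure half of the organization property: a species set is closed if no reaction applicable within the set produces a species outside it. The self-maintenance half additionally requires a linear-programming check for a positive flux vector, which is omitted here. The toy network is hypothetical.

```python
def is_closed(species: set, reactions: list) -> bool:
    """A set is closed if every reaction whose reactants all lie in the
    set also has all of its products in the set. Organizations are the
    closed AND self-maintaining sets (the latter needs an LP check)."""
    for reactants, products in reactions:
        if set(reactants) <= species and not set(products) <= species:
            return False
    return True

# Hypothetical toy network: a + b -> c, c -> a
rxns = [({"a", "b"}, {"c"}), ({"c"}, {"a"})]
print(is_closed({"a", "b", "c"}, rxns))  # True
print(is_closed({"a", "b"}, rxns))       # False: would produce c
```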
NASA Astrophysics Data System (ADS)
Sǎraru, Silviu-Constantin
Topological field theories originate in the papers of Schwarz and Witten. Initially, Schwarz showed that one of the topological invariants, namely the Ray-Singer torsion, can be represented as the partition function of a certain quantum field theory. Subsequently, Witten constructed a framework for understanding Morse theory in terms of supersymmetric quantum mechanics. These two constructions represent the prototypes of all topological field theories. The model used by Witten has been applied to classical index theorems and, moreover, suggested some generalizations that led to new mathematical results on holomorphic Morse inequalities. Starting with these results, further developments in the domain of topological field theories have been achieved. The Becchi-Rouet-Stora-Tyutin (BRST) symmetry allowed for a new definition of topological field theories as theories whose BRST-invariant Hamiltonian is also BRST-exact. An important class of topological theories of Schwarz type is the class of BF models. This type of model describes three-dimensional quantum gravity and is useful in the study of four-dimensional quantum gravity in the Ashtekar-Rovelli-Smolin formulation. Two-dimensional BF models are related to the Poisson sigma models of various two-dimensional gravities. Poisson sigma models, including their relationship to two-dimensional gravity and their classical solutions, have been intensively studied in the literature. In this thesis we approach the problem of constructing some classes of interacting BF models in the context of the BRST formalism. In view of this, we use the method of the deformation of the BRST charge and of the BRST-invariant Hamiltonian. Both methods rely on specific techniques of local BRST cohomology. The main hypotheses under which we construct the above-mentioned interactions are: space-time locality, Poincare invariance, smoothness of deformations in the coupling constant, and preservation of the number of derivatives on each field. The first two hypotheses imply that the resulting interacting theory must be local in space-time and Poincare invariant. The smoothness of deformations means that the deformed objects that contribute to the construction of interactions must be smooth in the coupling constant and reduce to the objects corresponding to the free theory in the zero limit of the coupling constant. The preservation of the number of derivatives on each field implies two aspects that must be simultaneously fulfilled: (i) the differential order of each free field equation must coincide with that of the corresponding interacting field equation; (ii) the maximum number of space-time derivatives in the interaction vertices cannot exceed the maximum number of derivatives in the free Lagrangian. The main results obtained can be synthesized as: obtaining self-interactions for certain classes of BF models; generation of couplings between some classes of BF theories and matter theories; construction of interactions between a class of BF models and a system of massless vector fields.
Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter
2010-01-01
In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources and their impact on treatment effects and costs often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques are used to evaluate systems that include queuing or waiting, for example, discrete event simulation. To include queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers get an understanding of queue characteristics, modeling features, and their strengths. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities, to incorporate waiting lines and queues in the decision-analytic modeling example.
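As a pointer to the kind of closed-form results queuing theory provides before one resorts to discrete event simulation, here are the standard M/M/1 performance measures (Poisson arrivals, one exponential server); the rates in the usage line are hypothetical, not taken from the stent-placement example.

```python
def mm1_measures(lam: float, mu: float) -> dict:
    """Closed-form performance measures of an M/M/1 queue with arrival
    rate lam and service rate mu (same time unit for both)."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu                      # server utilization
    return {
        "utilization": rho,
        "L":  rho / (1 - rho),          # mean number in system
        "Lq": rho**2 / (1 - rho),       # mean number in queue
        "W":  1 / (mu - lam),           # mean time in system
        "Wq": rho / (mu - lam),         # mean waiting time
    }

print(mm1_measures(lam=4.0, mu=5.0))    # hypothetical patients/hour
```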
NASA Astrophysics Data System (ADS)
Shen, L.; Mickley, L. J.; Gilleland, E.
2016-04-01
We develop a statistical model using extreme value theory to estimate the 2000-2050 changes in ozone episodes across the United States. We model the relationships between daily maximum temperature (Tmax) and maximum daily 8 h average (MDA8) ozone in May-September over 2003-2012 using a Point Process (PP) model. At ~20% of the sites, a marked decrease in the ozone-temperature slope occurs at high temperatures, defined as ozone suppression. The PP model sometimes fails to capture ozone-Tmax relationships, so we refit the ozone-Tmax slope using logistic regression and a generalized Pareto distribution model. We then apply the resulting hybrid-extreme value theory model to projections of Tmax from an ensemble of downscaled climate models. Assuming constant anthropogenic emissions at the present level, we find an average increase of 2.3 d a⁻¹ in ozone episodes (>75 ppbv) across the United States by the 2050s, with a change of +3 to 9 d a⁻¹ at many sites.
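A minimal sketch of the peaks-over-threshold step in such a hybrid model: fit a generalized Pareto distribution to exceedances of daily ozone over a high threshold and evaluate a tail probability. The data here are synthetic stand-ins, not the study's observations, and the distributional parameters are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mda8 = rng.gamma(shape=16.0, scale=3.0, size=1530)  # synthetic MDA8 ozone, ppbv

# Peaks-over-threshold: model exceedances above the 95th percentile
u = np.quantile(mda8, 0.95)
exceedances = mda8[mda8 > u] - u
shape, _, scale = stats.genpareto.fit(exceedances, floc=0.0)

# Tail probability of an ozone episode (>75 ppbv), given an exceedance of u
p_tail = stats.genpareto.sf(75.0 - u, shape, loc=0.0, scale=scale)
print(u, shape, scale, p_tail)
```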
School Management and Moral Literacy: A Conceptual Analysis of the Model
ERIC Educational Resources Information Center
Sagnak, Mesut
2012-01-01
The aim of this study is to analyze the moral literacy model developed by Tuana; discuss its strengths and limitations; and establish the theoretical conditions of a new model by drawing on previous research and theories on this subject. The model states that moral literacy is composed of three main components as ethics…
The Communication Model and the Nature of Change in Terms of Deforestation in China since 1949
ERIC Educational Resources Information Center
Tian, Dexin; Chao, Chin-Chung
2010-01-01
This article explores the communication model and nature of change in terms of deforestation in China since 1949. Through Lasswell's communication model and the theory of change and via historical analysis and extended literature review, we have discovered: First, Mao's government adopted an effective one-way top-down communication model with…
A Note on Item-Restscore Association in Rasch Models
ERIC Educational Resources Information Center
Kreiner, Svend
2011-01-01
To rule out the need for a two-parameter item response theory (IRT) model during item analysis by Rasch models, it is important to check the Rasch model's assumption that all items have the same item discrimination. Biserial and polyserial correlation coefficients measuring the association between items and restscores are often used in an informal…
Normal Theory Two-Stage ML Estimator When Data Are Missing at the Item Level
ERIC Educational Resources Information Center
Savalei, Victoria; Rhemtulla, Mijke
2017-01-01
In many modeling contexts, the variables in the model are linear composites of the raw items measured for each participant; for instance, regression and path analysis models rely on scale scores, and structural equation models often use parcels as indicators of latent constructs. Currently, no analytic estimation method exists to appropriately…
ERIC Educational Resources Information Center
de Barbaro, Kaya; Chiba, Andrea; Deak, Gedeon O.
2011-01-01
A current theory of attention posits that several micro-indices of attentional vigilance are dependent on activation of the locus coeruleus, a brainstem nucleus that regulates cortical norepinephrine activity (Aston-Jones et al., 1999). This theory may account for many findings in the infant literature, while highlighting important new areas for…
ERIC Educational Resources Information Center
Klinger, Don A.; Rogers, W. Todd
2003-01-01
The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) were compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…
Computer modeling of electromagnetic problems using the geometrical theory of diffraction
NASA Technical Reports Server (NTRS)
Burnside, W. D.
1976-01-01
Some applications of the geometrical theory of diffraction (GTD), a high frequency ray optical solution to electromagnetic problems, are presented. GTD extends geometric optics, which does not take into account the diffractions occurring at edges, vertices, and various other discontinuities. Diffraction solutions, analysis of basic structures, construction of more complex structures, and coupling using GTD are discussed.
ERIC Educational Resources Information Center
Wagner, William J.
The application of a linear learning model, which combines learning theory with a structural analysis of the exercises given to students, to an elementary mathematics curriculum is examined. Elementary arithmetic items taken by about 100 second-grade students on 26 weekly tests form the data base. Weekly predictions of group performance on…
ERIC Educational Resources Information Center
Brackenbury, Tim; Zickar, Michael J.; Munson, Benjamin; Storkel, Holly L.
2017-01-01
Purpose: Item response theory (IRT) is a psychometric approach to measurement that uses latent trait abilities (e.g., speech sound production skills) to model performance on individual items that vary by difficulty and discrimination. An IRT analysis was applied to preschoolers' productions of the words on the Goldman-Fristoe Test of…
A statistical mechanics approach to autopoietic immune networks
NASA Astrophysics Data System (ADS)
Barra, Adriano; Agliari, Elena
2010-07-01
In this work we aim to bridge theoretical immunology and disordered statistical mechanics. We introduce a model for the behavior of B-cells which naturally merges the clonal selection theory and the autopoietic network theory as a whole. From the analysis of its features we recover several basic phenomena such as low-dose tolerance, dynamical memory of antigens and self/non-self discrimination.
ERIC Educational Resources Information Center
Pongsophon, Pongprapan; Herman, Benjamin C.
2017-01-01
Given the abundance of literature describing the strong relationship between inquiry-based teaching and student achievement, more should be known about the factors impacting science teachers' classroom inquiry implementation. This study utilises the theory of planned behaviour to propose and validate a causal model of inquiry-based teaching…
Jordan, Pascal; Shedden-Mora, Meike C; Löwe, Bernd
2017-01-01
The Generalized Anxiety Disorder scale (GAD-7) is one of the most frequently used diagnostic self-report scales for screening, diagnosis and severity assessment of anxiety disorder. Its psychometric properties from the view of the Item Response Theory paradigm have rarely been investigated. We aimed to close this gap by analyzing the GAD-7 within a large sample of primary care patients with respect to its psychometric properties and its implications for scoring using Item Response Theory. Robust, nonparametric statistics were used to check unidimensionality of the GAD-7. A graded response model was fitted using a Bayesian approach. The model fit was evaluated using posterior predictive p-values, item information functions were derived and optimal predictions of anxiety were calculated. The sample included N = 3404 primary care patients (60% female; mean age 52.2 years; standard deviation 19.2). The analysis indicated no deviations of the GAD-7 scale from unidimensionality and a decent fit of the graded response model. The commonly suggested ultra-brief measure consisting of the first two items, the GAD-2, was supported by item information analysis. The first four items discriminated better than the last three items with respect to latent anxiety. The information provided by the first four items should be weighted more heavily. Moreover, estimates corresponding to low to moderate levels of anxiety show greater variability. The psychometric validity of the GAD-2 was supported by our analysis.
Rasmussen's legacy: A paradigm change in engineering for safety.
Leveson, Nancy G
2017-03-01
This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or have treated them rather superficially, for example, by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Applying a contemporary grounded theory methodology.
Licqurish, Sharon; Seibold, Carmel
2011-01-01
The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.
Dynamic fuzzy hierarchy analysis for evaluation of professionalization degree
NASA Astrophysics Data System (ADS)
Jin, Lin; Min, Luo; Ma, Jingxi
2016-06-01
This paper presents a dynamic fuzzy hierarchy analysis model for evaluating the degree of professionalization, combining dynamic fuzzy theory with the AHP, which can capture the changes and trends in the value of each professionalization index.
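The crisp AHP core that the dynamic fuzzy model builds on reduces to extracting priority weights from a pairwise-comparison matrix via its principal eigenvector, with Saaty's consistency index as a sanity check. A minimal sketch with a hypothetical 3-index comparison matrix:

```python
import numpy as np

def ahp_weights(A: np.ndarray):
    """Priority weights = normalized principal eigenvector of the
    pairwise-comparison matrix A; CI = (lambda_max - n) / (n - 1)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)
    return w, ci

A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])  # hypothetical
w, ci = ahp_weights(A)
print(w, ci)   # weights sum to 1; ci near 0 indicates consistency
```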
Attribution Theory and Crisis Intervention Therapy.
ERIC Educational Resources Information Center
Skilbeck, William M.
It was proposed that existing therapeutic procedures may influence attributions about emotional states. Therefore an attributional analysis of crisis intervention, a model of community-based, short-term consultation, was presented. This analysis suggested that crisis intervention provides attributionally-relevant information about both the source…
A stochastic dynamic model for human error analysis in nuclear power plants
NASA Astrophysics Data System (ADS)
Delgado-Loperena, Dharma
Nuclear disasters like Three Mile Island and Chernobyl indicate that human performance is a critical safety issue, sending a clear message about the need to include environmental press and competence aspects in research. This investigation was undertaken to serve as a roadmap for studying human behavior through the formulation of a general solution equation. The theoretical model integrates models from two heretofore-disassociated disciplines (behavior specialists and technical specialists) that have historically studied the nature of error and human behavior independently; it includes concepts derived from fractal and chaos theory and suggests re-evaluation of base theory regarding human error. The results of this research were based on comprehensive analysis of patterns of error, with the omnipresent underlying structure of chaotic systems. The study of patterns led to a dynamic formulation that can serve as a basis for other formulations used to study the consequences of human error. The literature search regarding error yielded insight into the need to include concepts rooted in chaos theory and strange attractors, heretofore unconsidered by mainstream researchers who investigated human error in nuclear power plants or who employed the ecological model in their work. The study of patterns obtained from the simulation of a steam generator tube rupture (SGTR) event provided a direct application to aspects of control room operations in nuclear power plants. In doing so, a conceptual foundation based on understanding the patterns of human error can be gleaned, helping to reduce and prevent undesirable events.
Theory of the decision/problem state
NASA Technical Reports Server (NTRS)
Dieterly, D. L.
1980-01-01
A theory of the decision-problem state was introduced and elaborated. Starting with the basic model of a decision-problem condition, an attempt was made to explain how a major decision-problem may consist of subsets of decision-problem conditions composing different condition sequences. In addition, the basic classical decision-tree model was modified to allow for the introduction of a series of characteristics that may be encountered in an analysis of a decision-problem state. The resulting hierarchical model reflects the unique attributes of the decision-problem state. The basic model of a decision-problem condition was used as a base to evolve a more complex model that is more representative of the decision-problem state and may be used to initiate research on decision-problem states.
Comparing theories of reference-dependent choice.
Bhatia, Sudeep
2017-09-01
Preferences are influenced by the presence or absence of salient choice options, known as reference points. This behavioral tendency is traditionally attributed to the loss aversion and diminishing sensitivity assumptions of prospect theory. In contrast, some psychological research suggests that reference dependence is caused by attentional biases that increase the subjective weighting of the reference point's primary attributes. Although both theories are able to successfully account for behavioral findings involving reference dependence, this article shows that these theories make diverging choice predictions when available options are inferior to the reference point. It presents the results of 2 studies that use settings with inferior choice options to compare these 2 theories. The analysis involves quantitative fits to participant-level choice data, and the results indicate that most participants are better described by models with attentional bias than they are by models with loss aversion and diminishing sensitivity. These differences appear to be caused by violations of loss aversion and diminishing sensitivity in losses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
On the Connection between Kinetic Monte Carlo and the Burton-Cabrera-Frank Theory
NASA Astrophysics Data System (ADS)
Patrone, Paul; Margetis, Dionisios; Einstein, T. L.
2013-03-01
In the many years since it was first proposed, the Burton-Cabrera-Frank (BCF) model of step-flow has been experimentally established as one of the cornerstones of surface physics. However, many questions remain regarding the underlying physical processes and theoretical assumptions that give rise to the BCF theory. In this work, we formally derive the BCF theory from an atomistic, kinetic Monte Carlo model of the surface in 1+1 dimensions with one step. Our analysis (i) shows how the BCF theory describes a surface with a low density of adsorbed atoms, and (ii) establishes a set of near-equilibrium conditions ensuring that the theory remains valid for all times. Support for PP was provided by the NIST-ARRA Fellowship Award No. 70NANB10H026 through UMD. Support for TLE and PP was also provided by the CMTC at UMD, with ancillary support from the UMD MRSEC. Support for DM was provided by NSF DMS0847587 at UMD.
Knot invariants and M-theory: Proofs and derivations
NASA Astrophysics Data System (ADS)
Errasti Díez, Verónica
2018-01-01
We construct two distinct yet related M-theory models that provide suitable frameworks for the study of knot invariants. We then focus on the four-dimensional gauge theory that follows from appropriately compactifying one of these M-theory models. We show that this theory has indeed all required properties to host knots. Our analysis provides a unifying picture of the various recent works that attempt an understanding of knot invariants using techniques of four-dimensional physics. This is a companion paper to K. Dasgupta, V. Errasti Díez, P. Ramadevi, and R. Tatar, Phys. Rev. D 95, 026010 (2017), 10.1103/PhysRevD.95.026010, covering all but Sec. III C. It presents a detailed mathematical derivation of the main results there, as well as additional material. Among the new insights, those related to supersymmetry and the topological twist are highlighted. This paper offers an alternative, complementary formulation of the contents in the first paper, but is self-contained and can be read independently.
Francis, Jill J; Stockton, Charlotte; Eccles, Martin P; Johnston, Marie; Cuthbertson, Brian H; Grimshaw, Jeremy M; Hyde, Chris; Tinmouth, Alan; Stanworth, Simon J
2009-11-01
Many theories of behaviour are potentially relevant to predictive and intervention studies, but most studies investigate a narrow range of theories. Michie et al. (2005) agreed 12 'theoretical domains' from 33 theories that explain behaviour change. They developed a 'Theoretical Domains Interview' (TDI) for identifying relevant domains for specific clinical behaviours, but the framework has not been used for selecting theories for predictive studies. It was used here to investigate clinicians' transfusion behaviour in intensive care units (ICUs). Evidence suggests that red blood cell transfusion could be reduced for some patients without reducing quality of care. (1) To identify the domains relevant to transfusion practice in ICUs and neonatal intensive care units (NICUs), using the TDI. (2) To use the identified domains to select appropriate theories for a study predicting transfusion behaviour. An adapted TDI about managing a patient with borderline haemoglobin by watching and waiting instead of transfusing red blood cells was used to conduct semi-structured, one-to-one interviews with 18 intensive care consultants and neonatologists across the UK. Relevant theoretical domains were: knowledge, beliefs about capabilities, beliefs about consequences, social influences, and behavioural regulation. Further analysis at the construct level resulted in selection of seven theoretical approaches relevant to this context: Knowledge-Attitude-Behaviour Model, Theory of Planned Behaviour, Social Cognitive Theory, Operant Learning Theory, Control Theory, Normative Model of Work Team Effectiveness and Action Planning Approaches. This study illustrated the use of the TDI to identify relevant domains in a complex area of inpatient care. This approach is potentially valuable for selecting theories relevant to predictive studies and resulted in greater breadth of potential explanations than would be achieved if a single theoretical model had been adopted.
On selecting evidence to test hypotheses: A theory of selection tasks.
Ragni, Marco; Kola, Ilir; Johnson-Laird, Philip N
2018-05-21
How individuals choose evidence to test hypotheses is a long-standing puzzle. According to an algorithmic theory that we present, it is based on dual processes: individuals' intuitions depending on mental models of the hypothesis yield selections of evidence matching instances of the hypothesis, but their deliberations yield selections of potential counterexamples to the hypothesis. The results of 228 experiments using Wason's selection task corroborated the theory's predictions. Participants made dependent choices of items of evidence: the selections in 99 experiments were significantly more redundant (using Shannon's measure) than those of 10,000 simulations of each experiment based on independent selections. Participants tended to select evidence corresponding to instances of hypotheses, or to its counterexamples, or to both. Given certain contents, instructions, or framings of the task, they were more likely to select potential counterexamples to the hypothesis. When participants received feedback about their selections in the "repeated" selection task, they switched from selections of instances of the hypothesis to selection of potential counterexamples. These results eliminated most of the 15 alternative theories of selecting evidence. In a meta-analysis, the model theory yielded a better fit of the results of 228 experiments than the one remaining theory based on reasoning rather than meaning. We discuss the implications of the model theory for hypothesis testing and for a well-known paradox of confirmation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
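One plausible reading of the Shannon-based dependence measure mentioned above: treat each participant's selection as a subset of the four cards and compute the redundancy 1 − H/Hmax of the pattern distribution (Hmax = log2 16 for four cards). The sketch below uses made-up selections, not the experimental results.

```python
import math
from collections import Counter

def selection_redundancy(selections) -> float:
    """Shannon redundancy of the distribution of selection patterns;
    a uniform distribution over the 16 possible subsets gives 0."""
    counts = Counter(frozenset(s) for s in selections)
    n = sum(counts.values())
    h = -sum(c / n * math.log2(c / n) for c in counts.values())
    return 1.0 - h / math.log2(16)

data = [{"p", "q"}, {"p", "q"}, {"p"}, {"p", "not-q"}, {"p", "q"}]
print(selection_redundancy(data))  # toy data
```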
A theory of drug tolerance and dependence I: a conceptual analysis.
Peper, Abraham
2004-08-21
A mathematical model of drug tolerance and its underlying theory is presented. The model extends a first approach, published previously. The model is essentially more complex than the generally used model of homeostasis, which is demonstrated to fail in describing tolerance development to repeated drug administrations. The model assumes the development of tolerance to a repeatedly administered drug to be the result of a regulated adaptive process. The oral detection and analysis of exogenous substances is proposed to be the primary stimulus for the mechanism of drug tolerance. Anticipation and environmental cues are considered secondary stimuli in the model, becoming primary only in dependence and addiction or when the drug administration bypasses the natural oral route, as is the case when drugs are administered intravenously. The model considers adaptation to the effect of a drug and adaptation to the interval between drug takings to be autonomous tolerance processes. Simulations with the mathematical model demonstrate the model's behavior to be consistent with important characteristics of the development of tolerance to repeatedly administered drugs: the gradual decrease in drug effect when tolerance develops, the high sensitivity to small changes in drug dose, the rebound phenomenon and the large reactions following withdrawal in dependence. The mathematical model verifies the proposed theory and provides a basis for the implementation of mathematical models of specific physiological processes. In addition, it establishes a relation between the drug dose at any moment and the resulting drug effect, and relates the magnitude of the reactions following withdrawal to the rate of tolerance and other parameters involved in the tolerance process. The present paper analyses the concept behind the model. The next paper discusses the mathematical model.
Amide I vibrational circular dichroism of dipeptide: Conformation dependence and fragment analysis
NASA Astrophysics Data System (ADS)
Choi, Jun-Ho; Cho, Minhaeng
2004-03-01
The amide I vibrational circular dichroic response of alanine dipeptide analog (ADA) was theoretically investigated and the density functional theory calculation and fragment analysis results are presented. A variety of vibrational spectroscopic properties, local and normal mode frequencies, coupling constant, dipole, and rotational strengths, are calculated by varying two dihedral angles determining the three-dimensional ADA conformation. Considering two monopeptide fragments separately, we show that the amide I vibrational circular dichroism of the ADA can be quantitatively predicted. For several representative conformations of the model ADA, vibrational circular dichroism spectra are calculated by using both the density functional theory calculation and fragment analysis methods.
Hao, Yong; Sun, Xu-Dong; Yang, Qiang
2012-12-01
A variable selection strategy combined with locally linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling the complex samples. The results showed that MCUVE can both extract informative variables effectively and improve the precision of the models. Compared with PLSR models, LLE-PLSR models can achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
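A sketch of the MCUVE idea under common assumptions: fit PLS models on many Monte Carlo subsamples and score each wavelength by the stability (mean over standard deviation) of its regression coefficient; variables with small absolute stability are candidates for elimination. This is a schematic reading of the method, not the authors' code.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def mcuve_stability(X, y, n_runs=100, frac=0.8, n_components=5, seed=0):
    """MCUVE-style stability score for each spectral variable."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    coefs = []
    for _ in range(n_runs):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        pls = PLSRegression(n_components=n_components).fit(X[idx], y[idx])
        coefs.append(np.ravel(pls.coef_))
    coefs = np.asarray(coefs)
    return coefs.mean(axis=0) / coefs.std(axis=0)

# keep, e.g., the wavelengths whose |stability| exceeds a cutoff:
# selected = np.abs(mcuve_stability(X, y)) > 2.0
```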
The ABC (in any D) of logarithmic CFT
NASA Astrophysics Data System (ADS)
Hogervorst, Matthijs; Paulos, Miguel; Vichi, Alessandro
2017-10-01
Logarithmic conformal field theories have a vast range of applications, from critical percolation to systems with quenched disorder. In this paper we thoroughly examine the structure of these theories based on their symmetry properties. Our analysis is model-independent and holds for any spacetime dimension. Our results include a determination of the general form of correlation functions and conformal block decompositions, clearing the path for future bootstrap applications. Several examples are discussed in detail, including logarithmic generalized free fields, holographic models, self-avoiding random walks and critical percolation.
NASA Astrophysics Data System (ADS)
Zhang, Yufeng; Long, Man; Luo, Sida; Bao, Yu; Shen, Hanxia
2015-12-01
Transit route choice models are a key technology for public transit system planning and management. Traditional route choice models are mostly based on expected utility theory, which has an evident shortcoming: it cannot accurately portray travelers' subjective route choice behavior, because their risk preferences are not taken into consideration. Cumulative prospect theory (CPT) can be used to describe travelers' decision-making process under uncertainty of transit supply and the risk preferences of multiple types of travelers. The method used to calibrate the reference point, a key parameter of a CPT-based transit route choice model, determines the precision of the model to a great extent. In this paper, a new method is put forward to obtain the value of the reference point, combining theoretical calculation with field investigation results. Comparing the proposed method with the traditional method shows that the new method improves the quality of the CPT-based model by more accurately simulating travelers' route choice behavior, based on a transit trip investigation from Nanjing City, China. The proposed method is of great significance for sound transit planning and management, and to some extent remedies the defect that the reference point has previously been obtained through qualitative analysis alone.
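For readers unfamiliar with CPT, its two ingredients are a reference-dependent value function and an inverse-S probability weighting function. The sketch below uses the common Tversky-Kahneman (1992) parameter values; the paper's own parameterization, with group-emotion-based heterogeneous reference points, differs.

```python
def cpt_value(x: float, alpha=0.88, beta=0.88, lam=2.25) -> float:
    """Value of a gain/loss x relative to the reference point:
    concave for gains, convex and loss-averse (lam > 1) for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def cpt_weight(p: float, gamma=0.61) -> float:
    """Inverse-S probability weighting:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# e.g. a 10-minute delay relative to the reference trip time
print(cpt_value(-10.0), cpt_weight(0.1))
```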
Shen, Minxue; Cui, Yuanwu; Hu, Ming; Xu, Linyong
2017-01-13
The study aimed to validate a scale to assess the severity of the "Yin deficiency, intestine heat" pattern of functional constipation based on modern test theory. Pooled longitudinal data of 237 patients with the "Yin deficiency, intestine heat" pattern of constipation from a prospective cohort study were used to validate the scale. Exploratory factor analysis was used to examine the common factors of items. A multidimensional item response model was used to assess the scale in the presence of multidimensionality. Cronbach's alpha ranged from 0.79 to 0.89, and the split-half reliability ranged from 0.67 to 0.79 at different measurements. Exploratory factor analysis identified two common factors, and all items had cross factor loadings. The bidimensional model had better goodness of fit than the unidimensional model. The multidimensional item response model showed that all items had moderate to high discrimination parameters. The parameters indicated that the first latent trait signified intestine heat, while the second trait characterized Yin deficiency. The information function showed that items demonstrated the highest discrimination power among patients with moderate to high levels of disease severity. Multidimensional item response theory provides a useful and rational approach to validating scales for assessing the severity of patterns in traditional Chinese medicine.
Separate-channel analysis of two-channel microarrays: recovering inter-spot information.
Smyth, Gordon K; Altman, Naomi S
2013-05-26
Two-channel (or two-color) microarrays are cost-effective platforms for comparative analysis of gene expression. They are traditionally analysed in terms of the log-ratios (M-values) of the two channel intensities at each spot, but this analysis does not use all the information available in the separate channel observations. Mixed models have been proposed to analyse intensities from the two channels as separate observations, but such models can be complex to use and the gain in efficiency over the log-ratio analysis is difficult to quantify. Mixed models yield test statistics for which the null distributions can be specified only approximately, and some approaches do not borrow strength between genes. This article reformulates the mixed model to clarify the relationship with the traditional log-ratio analysis, to facilitate information borrowing between genes, and to obtain an exact distributional theory for the resulting test statistics. The mixed model is transformed to operate on the M-values and A-values (average log-expression for each spot) instead of on the log-expression values. The log-ratio analysis is shown to ignore information contained in the A-values. The relative efficiency of the log-ratio analysis is shown to depend on the size of the intra-spot correlation. A new separate channel analysis method is proposed that assumes a constant intra-spot correlation coefficient across all genes. This approach permits the mixed model to be transformed into an ordinary linear model, allowing the data analysis to use a well-understood empirical Bayes analysis pipeline for linear modeling of microarray data. This yields statistically powerful test statistics that have an exact distributional theory. The log-ratio, mixed model and common correlation methods are compared using three case studies. The results show that separate channel analyses that borrow strength between genes are more powerful than log-ratio analyses. The common correlation analysis is the most powerful of all. The common correlation method proposed in this article for separate-channel analysis of two-channel microarray data is no more difficult to apply in practice than the traditional log-ratio analysis. It provides an intuitive and powerful means to conduct analyses and make comparisons that might otherwise not be possible.
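The M-value/A-value transform at the heart of the reformulation is simple; here is a minimal sketch for one array, assuming the intensity arrays are already background-corrected and normalized:

```python
import numpy as np

def ma_transform(red: np.ndarray, green: np.ndarray):
    """Per-spot M = log2(R/G) (the traditional log-ratio) and
    A = (log2 R + log2 G) / 2 (the average log-expression that a
    pure log-ratio analysis discards)."""
    lr, lg = np.log2(red), np.log2(green)
    return lr - lg, (lr + lg) / 2.0
```

Operating on (M, A) pairs rather than on the raw log-intensities is what lets the mixed model with a constant intra-spot correlation collapse into an ordinary linear model.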
The black hole at the Galactic Center: Observations and models
NASA Astrophysics Data System (ADS)
Zakharov, Alexander F.
One of the most interesting astronomical objects is the Galactic Center. It has been a subject of intensive astronomical observations in different spectral bands in recent years. We concentrate our discussion on a theoretical analysis of observational data for bright stars in the IR band obtained with large telescopes. We also discuss the importance of VLBI observations of bright structures which could characterize the shadow at the Galactic Center. If we adopt general relativity (GR), there are a number of theoretical models for the Galactic Center, such as a cluster of neutron stars, boson stars, neutrino balls, etc. Some of these models were rejected, or the ranges of their parameters were significantly constrained, by subsequent observations and theoretical analysis. In recent years, a number of alternative theories of gravity have been proposed in response to the dark matter (DM) and dark energy (DE) problems. An alternative theory of gravity may be considered one possible solution for such problems. Some of these theories have black hole solutions, while other theories have no such solutions. There are attempts to describe the Galactic Center with alternative theories of gravity, and in this case one can constrain the parameters of such theories with observational data for the Galactic Center. In particular, theories of massive gravity are developing intensively, and theorists have overcome pathologies present in the initial versions of these theories. In theories of massive gravity, the graviton is massive, in contrast with GR where the graviton is massless. Now these theories are considered an alternative to GR. For example, the LIGO-Virgo collaboration obtained a graviton mass constraint of about 1.2 × 10⁻²² eV in their first publication about the discovery of the first gravitational wave detection event, which resulted from the merger of two massive black holes. Surprisingly, one can obtain a consistent and comparable constraint on the graviton mass, at a level around m_g < 2.9 × 10⁻²¹ eV, from the analysis of observational data on the trajectory of the star S2 near the Galactic Center. Therefore, observations of bright stars with existing and forthcoming telescopes such as the European Extremely Large Telescope (E-ELT) and the Thirty Meter Telescope (TMT) are extremely useful for investigating the structure of the Galactic Center in the framework of GR, but these observations also give a tool to confirm, rule out or constrain alternative theories of gravity. As we noted earlier, VLBI observations with current and forthcoming global networks (like the Event Horizon Telescope) are used to check the hypothesis about the presence of a supermassive black hole at the Galactic Center.
Saloranta, Tuomo M; Andersen, Tom; Naes, Kristoffer
2006-01-01
Rate constant bioaccumulation models are applied to simulate the flow of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in the coastal marine food web of Frierfjorden, a contaminated fjord in southern Norway. We apply two different ways of parameterizing the rate constants in the model, perform global sensitivity analysis of the models using the Extended Fourier Amplitude Sensitivity Test (Extended FAST) method, and draw on results from general linear system theory, in order to obtain a more thorough insight into the system's behavior and the flow pathways of the PCDD/Fs. We calibrate our models against observed body concentrations of PCDD/Fs in the food web of Frierfjorden. Differences between the predictions from the two models (using the same forcing and parameter values) are of the same magnitude as their individual deviations from observations, and the models can be said to perform about equally well in our case. Sensitivity analysis indicates that the success or failure of the models in predicting the PCDD/F concentrations in the food web organisms depends highly on the adequate estimation of the truly dissolved concentrations in water and sediment pore water. We discuss the pros and cons of such models in understanding and estimating the present and future concentrations and bioaccumulation of persistent organic pollutants in aquatic food webs.
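The model class itself is a set of coupled first-order rate equations. A one-compartment sketch (with made-up rate constants, not the calibrated Frierfjorden values, and without the dietary-uptake terms that link the food web):

```python
from scipy.integrate import solve_ivp

def one_compartment(t, c, k_uptake, k_elim, c_water):
    """dC/dt = k_uptake * C_water - k_elim * C for a single organism."""
    return [k_uptake * c_water - k_elim * c[0]]

sol = solve_ivp(one_compartment, (0.0, 365.0), [0.0],
                args=(0.05, 0.01, 2.0))       # hypothetical units, per day
# the steady state approaches k_uptake * C_water / k_elim = 10
print(sol.y[0, -1])
```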
Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model.
Reyna, Valerie F; Brainerd, Charles J
2011-09-01
From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals, in which reasoning biases emerge with development, have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles applies to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects, in which risk preferences shift when the same decisions are phrased in terms of gains versus losses, emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making, prospect theory, can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes.
SIMP model at NNLO in chiral perturbation theory
NASA Astrophysics Data System (ADS)
Hansen, Martin; Langæble, Kasper; Sannino, Francesco
2015-10-01
We investigate the phenomenological viability of a recently proposed class of composite dark matter models where the relic density is determined by 3→2 number-changing processes in the dark sector. Here the pions of the strongly interacting field theory constitute the dark matter particles. By performing a consistent next-to-leading- and next-to-next-to-leading-order chiral perturbative investigation we demonstrate that the leading-order analysis cannot be used to draw conclusions about the viability of the model. We further show that higher-order corrections substantially increase the tension with phenomenological constraints, challenging the viability of the simplest realization of the strongly interacting massive particle paradigm.
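The 3→2 freeze-out mechanism can be caricatured with a single comoving Boltzmann equation, dY/dx = -(λ/x⁵)(Y³ - Y²Y_eq), where x = m/T and the x⁻⁵ factor collects the radiation-era scalings of entropy density and Hubble rate. The sketch below is schematic: λ and the Y_eq normalization are arbitrary illustrative numbers, not the chiral-perturbation-theory cross sections of the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Schematic 3->2 freeze-out of a comoving yield Y(x), x = m/T.
    lam = 1e12                                    # collects s(m)^2 <sigma v^2> / H(m)
    Y_eq = lambda x: 0.145 * x**1.5 * np.exp(-x)  # nonrelativistic equilibrium yield

    rhs = lambda x, Y: -(lam / x**5) * (Y**3 - Y**2 * Y_eq(x))
    sol = solve_ivp(rhs, (1.0, 100.0), [Y_eq(1.0)], method="LSODA", rtol=1e-8)
    print("relic yield Y(x=100) ~", sol.y[0, -1])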
Psychophysical Models for Signal Detection with Time Varying Uncertainty. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Gai, E.
1975-01-01
Psychophysical models are developed for the behavior of the human operator in detection tasks that include changes in detectability, correlation between observations, and deferred decisions. Classical Signal Detection Theory (SDT) is discussed, and its emphasis on sensory processes is contrasted with decision strategies. The analysis of decision strategies utilizes detection tasks with time-varying signal strength. The classical theory is modified to include such tasks, and several optimal decision strategies are explored. Two methods of classifying strategies are suggested. The first method is similar to the analysis of ROC curves, while the second is based on the relation between the criterion level (CL) and the detectability. Experiments to verify the analysis of tasks with changes of signal strength are designed. The results show that subjects are aware of changes in detectability and tend to use strategies that involve changes in the CLs.
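For reference, the two classical SDT indices mentioned here, sensitivity and criterion level, can be computed from hit and false-alarm rates under the equal-variance Gaussian model; the rates below are made-up examples.

    from scipy.stats import norm

    def sdt_indices(hit_rate, fa_rate):
        """Equal-variance Gaussian SDT: d' = z(H) - z(F), c = -(z(H)+z(F))/2."""
        zh, zf = norm.ppf(hit_rate), norm.ppf(fa_rate)
        return zh - zf, -0.5 * (zh + zf)

    print(sdt_indices(0.80, 0.20))  # d' ~ 1.68, neutral criterion c = 0
    print(sdt_indices(0.70, 0.30))  # lower detectability, same criterion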
Buatois, Simon; Retout, Sylvie; Frey, Nicolas; Ueckert, Sebastian
2017-10-01
This manuscript aims to precisely describe the natural disease progression of Parkinson's disease (PD) patients and to evaluate approaches for increasing the power to detect a drug effect. An item response theory (IRT) longitudinal model was built to describe the natural disease progression of 423 de novo PD patients followed for 48 months, while taking into account the heterogeneous nature of the MDS-UPDRS. Clinical trial simulations were then used to compare the drug effect detection power of IRT-based and sum-of-item-scores-based analyses under different analysis endpoints and drug effects. The IRT longitudinal model accurately describes the evolution of patients with and without PD medications while estimating different progression rates for the subscales. When comparing analysis methods, the IRT-based one consistently provided the highest power. IRT is a powerful tool for capturing the heterogeneous nature of the MDS-UPDRS.
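The IRT machinery involved can be illustrated with Samejima's graded response model for a single polytomous item; the discrimination and threshold values below are invented for illustration, not the fitted MDS-UPDRS estimates.

    import numpy as np

    def grm_category_probs(theta, a, b):
        """P(score = k) for latent severity theta, discrimination a, and
        ordered thresholds b, under the graded response model."""
        cum = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))  # P(X >= k+1)
        cum = np.concatenate(([1.0], cum, [0.0]))
        return -np.diff(cum)

    # One item scored 0-4, evaluated for a mildly affected patient:
    print(grm_category_probs(theta=0.5, a=1.3, b=[-1.0, 0.0, 1.2, 2.5]))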
An Analysis of Measured Pressure Signatures From Two Theory-Validation Low-Boom Models
NASA Technical Reports Server (NTRS)
Mack, Robert J.
2003-01-01
Two wing/fuselage/nacelle/fin concepts were designed to check the validity and the applicability of sonic-boom minimization theory, sonic-boom analysis methods, and the low-boom design methodology in use at the end of the 1980s. Models of these concepts were built, and the pressure signatures they generated were measured in the wind tunnel. The results of these measurements led to three conclusions: (1) the existing methods could adequately predict the sonic-boom characteristics of wing/fuselage/fin(s) configurations if the equivalent area distributions of each component were smooth and continuous; (2) these methods needed revision so that the engine-nacelle volume and the nacelle-wing interference lift disturbances could be accurately predicted; and (3) current nacelle-configuration integration methods had to be updated. With these changes in place, the existing sonic-boom analysis and minimization methods could be effectively applied to supersonic-cruise concepts for acceptable/tolerable sonic-boom overpressures during cruise.
Probability theory versus simulation of petroleum potential in play analysis
Crovelli, R.A.
1987-01-01
An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
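The efficiency argument rests on the fact that, for independent geologic factors, the moments of the product can be written in closed form and attached to a probability of occurrence, avoiding simulation altogether. A toy comparison follows, with invented lognormal parameters rather than an actual play assessment.

    import numpy as np

    rng = np.random.default_rng(0)

    # Resource = Bernoulli(occurrence) * product of independent lognormal
    # factors; E[prod X_i] = prod E[X_i] for independent factors.
    p_occur = 0.3
    mus, sigmas = np.array([2.0, 1.0]), np.array([0.5, 0.4])

    mean_analytic = p_occur * np.prod(np.exp(mus + sigmas**2 / 2))

    n = 200_000
    factors = np.prod([rng.lognormal(m, s, n) for m, s in zip(mus, sigmas)], axis=0)
    mc = np.where(rng.random(n) < p_occur, factors, 0.0)
    print(f"analytic mean {mean_analytic:.2f} vs Monte Carlo {mc.mean():.2f}")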
Scale-up of ecological experiments: Density variation in the mobile bivalve Macomona liliana
Schneider, David C.; Walters, R.; Thrush, S.; Dayton, P.
1997-01-01
At present, the problem of scaling up from controlled experiments (necessarily at a small spatial scale) to questions of regional or global importance is perhaps the most pressing issue in ecology. Most of the proposed techniques recommend iterative cycling between theory and experiment. We present a graphical technique that facilitates this cycling by allowing the scope of experiments, surveys, and natural history observations to be compared with the scope of models and theory. We apply this scope analysis to the problem of understanding the population dynamics of a bivalve exposed to environmental stress at the scale of a harbour. Previous lab and field experiments were found not to be 1:1 scale models of harbour-wide processes. Scope analysis allowed small-scale experiments to be linked to larger-scale surveys and to a spatially explicit model of population dynamics.
Meshless analysis of shear deformable shells: the linear model
NASA Astrophysics Data System (ADS)
Costa, Jorge C.; Tiago, Carlos M.; Pimenta, Paulo M.
2013-10-01
This work develops a kinematically linear shell model departing from a consistent nonlinear theory. The initial geometry is mapped from a flat reference configuration by a stress-free finite deformation, after which the actual shell motion takes place. The model maintains the features of a complete stress-resultant theory with Reissner-Mindlin kinematics based on an inextensible director. A hybrid displacement variational formulation is presented, in which the domain displacements and kinematic boundary reactions are independently approximated. Resorting to a flat reference configuration allows discretization using 2-D Multiple Fixed Least-Squares (MFLS) approximations on the domain. The consistent definition of stress resultants and the consequent plane stress assumption lead to a neat formulation for the analysis of shells. The consistent linear approximation, combined with MFLS, makes efficient computations with a desired degree of continuity possible, leading to smooth results for the displacement, strain and stress fields, as shown by several numerical examples.
NASA Astrophysics Data System (ADS)
Kruk, D.; Earle, K. A.; Mielczarek, A.; Kubica, A.; Milewska, A.; Moscicki, J.
2011-12-01
A general theory of lineshapes in nuclear quadrupole resonance (NQR), based on the stochastic Liouville equation, is presented. The description is valid for arbitrary motional conditions (in particular, beyond the range of validity of perturbation approaches) and interaction strengths. It can be applied to the computation of NQR spectra for any spin quantum number and for any applied magnetic field. The treatment presented here is an adaptation to NQR spectral analysis of the "Swedish slow motion theory" [T. Nilsson and J. Kowalewski, J. Magn. Reson. 146, 345 (2000), 10.1006/jmre.2000.2125], originally formulated for paramagnetic systems. The description is formulated for simple (Brownian) diffusion, free diffusion, and jump diffusion models; the two latter models account for molecular cooperativity effects in dense systems (such as liquids of high viscosity or molecular glasses). The sensitivity of NQR slow motion spectra to the mechanism of the motional processes modulating the nuclear quadrupole interaction is discussed.
Analysis of the Efficacy of an Intervention to Improve Parent-Adolescent Problem Solving
Semeniuk, Yulia Yuriyivna; Brown, Roger L.; Riesch, Susan K.
2016-01-01
We conducted a two-group longitudinal, partially nested, randomized controlled trial to examine whether young adolescent youth-parent dyads participating in Mission Possible: Parents and Kids Who Listen, in contrast to a comparison group, would demonstrate improved problem-solving skills. The intervention is based on the Circumplex Model and Social Problem Solving Theory. The Circumplex Model posits that balanced families, that is, those characterized by high cohesion, high flexibility, and open communication, function best. Social Problem Solving Theory informs the process and skills of problem solving. The Conditional Latent Growth Modeling analysis revealed no statistically significant differences in problem solving among the final sample of 127 dyads in the intervention and comparison groups. Analyses of effect sizes indicated large-magnitude group effects for selected scales for youth and dyads, suggesting a potential for efficacy and identifying for whom the intervention may be efficacious if study limitations and lessons learned were addressed. PMID:26936844
Complexity analysis and mathematical tools towards the modelling of living systems.
Bellomo, N; Bianca, C; Delitala, M
2009-09-01
This paper is a review and critical analysis of the mathematical kinetic theory of active particles applied to the modelling of large living systems made up of interacting entities. The first part of the paper is focused on a general presentation of the mathematical tools of the kinetic theory of active particles. The second part provides a review of a variety of mathematical models in the life sciences, namely complex social systems, opinion formation, evolution of epidemics with virus mutations, and vehicular traffic, crowds and swarms. All the applications are technically related to the mathematical structures reviewed in the first part of the paper. The overall contents are based on the concept that living systems, unlike inert matter, have the ability to develop behaviour geared towards their survival, or simply towards improving the quality of their life. In some cases, the behaviour evolves in time and generates destructive and/or proliferative events.
Stochastic modeling of mode interactions via linear parabolized stability equations
NASA Astrophysics Data System (ADS)
Ran, Wei; Zare, Armin; Hack, M. J. Philipp; Jovanovic, Mihailo
2017-11-01
Low-complexity approximations of the Navier-Stokes equations have been widely used in the analysis of wall-bounded shear flows. In particular, the parabolized stability equations (PSE) and Floquet theory have been employed to capture the evolution of primary and secondary instabilities in spatially-evolving flows. We augment linear PSE with Floquet analysis to formally treat modal interactions and the evolution of secondary instabilities in the transitional boundary layer via a linear progression. To this end, we leverage Floquet theory by incorporating the primary instability into the base flow and accounting for different harmonics in the flow state. A stochastic forcing is introduced into the resulting linear dynamics to model the effect of nonlinear interactions on the evolution of modes. We examine the H-type transition scenario to demonstrate how our approach can be used to model nonlinear effects and capture the growth of the fundamental and subharmonic modes observed in direct numerical simulations and experiments.
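The essence of the stochastically forced linear model is that second-order statistics of the modal amplitudes follow from an algebraic Lyapunov equation. Below is a toy two-mode sketch standing in for, not reproducing, the actual PSE/Floquet operator.

    import numpy as np
    from scipy.linalg import solve_continuous_lyapunov

    # Steady-state covariance P of dx = A x dt + B dW for stable A:
    # A P + P A^T + B B^T = 0.
    A = np.array([[-0.5, 1.0],
                  [ 0.0, -0.2]])   # toy stable dynamics with mode coupling
    B = np.array([[0.0],
                  [1.0]])          # stochastic forcing enters the second mode
    P = solve_continuous_lyapunov(A, -B @ B.T)
    print("mode-amplitude variances:", np.diag(P))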
USDA-ARS's Scientific Manuscript database
The objective of this work is to develop a new thermodynamic mathematical model for evaluating the effect of temperature on the rate of microbial growth. The new mathematical model is derived by combining the Arrhenius equation and the Eyring-Polanyi transition theory. The new model, suitable for ...
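The abstract is truncated before the model details, but its Arrhenius ingredient, k(T) = A·exp(-Ea/(RT)), is standard and easy to evaluate; the constants below are generic placeholders, not the manuscript's fitted values, and the Eyring-Polanyi part is omitted.

    import numpy as np

    R = 8.314           # gas constant, J/(mol*K)
    A, Ea = 1e9, 6e4    # pre-exponential (1/h) and activation energy (J/mol)

    # Arrhenius growth-rate constant over a culturing temperature range:
    for T_c in (10, 20, 30, 37):
        T = T_c + 273.15
        print(f"{T_c:2d} C: k = {A * np.exp(-Ea / (R * T)):.4f} 1/h")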
Health Belief Model and Reasoned Action Theory in Predicting Water Saving Behaviors in Yazd, Iran
Morowatisharifabad, Mohammad Ali; Momayyezi, Mahdieh; Ghaneian, Mohammad Taghi
2012-01-01
Background: People's behaviors and intentions about healthy behaviors depend on their beliefs, values, and knowledge about the issue. Various models of health education are used in determining predictors of different healthy behaviors, but their efficacy for cultural behaviors, such as water saving behaviors, has not been studied. The study was conducted to explain water saving behaviors in Yazd, Iran on the basis of the Health Belief Model and Reasoned Action Theory. Methods: The cross-sectional study used random cluster sampling to recruit 200 heads of households. The survey questionnaire was tested for its content validity and reliability. Analysis of the data included descriptive statistics, simple correlation, and hierarchical multiple regression. Results: Simple correlations between water saving behaviors and Reasoned Action Theory and Health Belief Model constructs were statistically significant. Health Belief Model and Reasoned Action Theory constructs explained 20.80% and 8.40% of the variance in water saving behaviors, respectively. Perceived barriers were the strongest predictor. Additionally, there was a statistically positive correlation between water saving behaviors and intention. Conclusion: In designing interventions aimed at preventing water waste, barriers to water saving behaviors should be addressed first, followed by people's attitudes towards water saving. Health Belief Model constructs, with the exception of perceived severity and benefits, are more powerful than Reasoned Action Theory in predicting water saving behavior and may be used as a framework for educational interventions aimed at improving water saving behaviors. PMID:24688927
Alien calculus and a Schwinger-Dyson equation: two-point function with a nonperturbative mass scale
NASA Astrophysics Data System (ADS)
Bellon, Marc P.; Clavier, Pierre J.
2018-02-01
Starting from the Schwinger-Dyson equation and the renormalization group equation for the massless Wess-Zumino model, we compute the dominant nonperturbative contributions to the anomalous dimension of the theory, which are related by alien calculus to singularities of the Borel transform at integer points. The sum of these dominant contributions has an analytic expression. When applied to the two-point function, this analysis gives a tame evolution in the deep Euclidean domain at this level of approximation, casting doubt on arguments for the triviality of quantum field theories with positive β-function. On the other hand, the propagator has a singularity for timelike momenta of the order of the renormalization-group-invariant scale of the theory, which has a nonperturbative relationship with the renormalization point of the theory. None of these results seems to have an interpretation in terms of a semiclassical analysis of a Feynman path integral.
NASA Astrophysics Data System (ADS)
Tian, F.; Lu, Y.
2017-12-01
Based on socioeconomic and hydrological data from three arid inland basins and on error analysis, the dynamics of human water consumption (HWC) are found to be asymmetric: HWC increases rapidly in wet periods but holds steady or decreases slightly in dry periods. Beyond the qualitative explanation that abundant water availability in wet periods spurs HWC to grow fast, while in dry periods the now-expanded economy is sustained by over-exploitation, two quantitative models are established and tested, based on expected utility theory (EUT) and prospect theory (PT), respectively. EUT states that humans make decisions based on the total expected utility, namely the sum over possible outcomes of the utility of each outcome multiplied by its probability, while PT states that the utility function is defined over gains and losses separately, and that probabilities are replaced by a probability weighting function.
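The two decision rules can be stated compactly in code. The sketch below uses the standard Tversky-Kahneman value and weighting functions as placeholders; the paper's own calibration for water-consumption decisions may differ, and the gamble is purely illustrative.

    import numpy as np

    def expected_utility(outcomes, probs, u=np.sqrt):
        """EUT: utilities weighted by raw probabilities."""
        return float(np.sum(np.asarray(probs) * u(np.asarray(outcomes))))

    def prospect_value(outcomes, probs, alpha=0.88, lam=2.25, gamma=0.61):
        """PT: value over gains/losses, separable probability weighting."""
        x, p = np.asarray(outcomes, float), np.asarray(probs, float)
        v = np.where(x >= 0, np.abs(x)**alpha, -lam * np.abs(x)**alpha)
        w = p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)
        return float(np.sum(w * v))

    # Stylized wet-year gain vs dry-year loss from expanding water use:
    print(expected_utility([100.0, 0.0], [0.7, 0.3]))
    print(prospect_value([100.0, -40.0], [0.7, 0.3]))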
Vectorlike fermions and Higgs effective field theory revisited
Chen, Chien-Yi; Dawson, S.; Furlan, Elisabetta
2017-07-10
Heavy vectorlike quarks (VLQs) appear in many models of beyond the Standard Model physics. Direct experimental searches require these new quarks to be heavy, ≳ 800-1000 GeV. Here, we perform a global fit of the parameters of simple VLQ models in minimal representations of SU(2)_L to precision data and Higgs rates. One interesting connection between anomalous $Zb\bar{b}$ interactions and Higgs physics in VLQ models is discussed. Finally, we present our analysis in an effective field theory (EFT) framework and show that the parameters of VLQ models are already highly constrained. Exact and approximate analytical formulas for the S and T parameters in the VLQ models we consider are available in the Supplemental Material as Mathematica files.
On various refined theories in the bending analysis of angle-ply laminates
NASA Astrophysics Data System (ADS)
Savithri, S.; Varadan, T. K.
1992-05-01
The accuracies of six shear-deformation theories are compared by analyzing the bending of angle-ply laminates and studying the results in the light of exact solutions. The shear-deformation theories used are those of Ren (1986), Savithri and Varadan (1990), Bhaskar and Varadan (1991), Murakami (1986), and Pandya and Kant (1988), and combinations of these. The analytical methods are similar in that the number of unknown variables in the displacement field is independent of the number of layers in the laminate. The model by Ren is based on a parabolic distribution of transverse shear stresses in each laminate layer. This model is shown to give good predictions of deflections and stresses in two-layer antisymmetric and three-layer symmetric angle-ply laminates.
Computation of magnetic suspension of maglev systems using dynamic circuit theory
NASA Technical Reports Server (NTRS)
He, J. L.; Rote, D. M.; Coffey, H. T.
1992-01-01
Dynamic circuit theory is applied to several magnetic suspensions associated with maglev systems: the loop-shaped coil guideway, the figure-eight-shaped null-flux coil guideway, and the continuous sheet guideway. Mathematical models, which can be used for the development of computer codes, are provided for each of these suspension systems. The differences and similarities of these models within dynamic circuit theory are discussed. The paper emphasizes transient and dynamic analysis and computer simulation of maglev systems. In general, the method discussed here can be applied to many electrodynamic suspension system design concepts. It is also suited to computing the performance of maglev propulsion systems. Numerical examples are presented.
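In its simplest form, a dynamic-circuit computation reduces to a circuit ODE in which the motion of the vehicle magnet enters through a time-varying mutual inductance. The single-loop sketch below uses an invented Gaussian M(x) and round-number constants rather than any real guideway design.

    import numpy as np
    from scipy.integrate import solve_ivp

    # One guideway loop swept by a magnet of effective MMF I_m moving at
    # speed v: R*i + L*di/dt = -I_m * dM/dt, with M the mutual inductance.
    R, L = 1e-3, 1e-6              # loop resistance (ohm) and inductance (H)
    I_m, v, w = 5e5, 100.0, 0.5    # magnet MMF (A), speed (m/s), coupling width (m)
    M = lambda x: 1e-8 * np.exp(-(x / w)**2)
    dMdx = lambda x: -2.0 * x / w**2 * M(x)

    def rhs(t, i):
        x = v * t - 5.0 * w        # magnet starts well upstream of the loop
        return [(-I_m * dMdx(x) * v - R * i[0]) / L]

    sol = solve_ivp(rhs, (0.0, 10.0 * w / v), [0.0], max_step=1e-4)
    print("peak induced current ~", np.abs(sol.y[0]).max(), "A")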
Uncertainty quantification and propagation in nuclear density functional theory
Schunck, N.; McDonnell, J. D.; Higdon, D.; ...
2015-12-23
Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While ongoing efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties, and Bayesian inference methods. Illustrative examples are taken from the literature.
NASA Astrophysics Data System (ADS)
Nguyen, Trung N.; Siegmund, Thomas; Tomar, Vikas; Kruzic, Jamie J.
2017-12-01
Size effects occur in non-uniformly plastically deformed metals confined to volumes on the micrometer or sub-micrometer scale. Such problems have been well studied using strain gradient rate-independent plasticity theories. Yet plasticity theories describing the time-dependent behavior of metals in the presence of size effects remain limited, and there is no consensus about how size effects vary with strain rate or whether the two interact. This paper introduces a constitutive model which enables the analysis of complex load scenarios, including loading-rate sensitivity, creep, relaxation, and interactions thereof, under consideration of plastic strain gradient effects. A strain gradient viscoplasticity constitutive model based on the Kocks-Mecking theory of dislocation evolution, namely the strain gradient Kocks-Mecking (SG-KM) model, is established and captures both rate and size effects and their interaction. A formulation of the model in the finite element analysis framework is derived, and numerical examples are presented. In a virtual creep test in the presence of plastic strain gradients, creep rates are found to diminish with decreasing specimen size and also to depend on the loading rate of an initial ramp loading step. Stress relaxation in a solid medium containing cylindrical microvoids is predicted to increase with decreasing void radius and with the strain rate of a prior ramp loading step.
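The Kocks-Mecking core of the model is a dislocation-density evolution law, dρ/dε = k₁√ρ - k₂ρ, paired with the Taylor relation σ = αMGb√ρ. The sketch below integrates just this local, rate-independent skeleton with generic textbook-scale constants; the strain-gradient and viscoplastic terms of the SG-KM model are omitted.

    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 1e8, 10.0                          # storage (1/m) and recovery (-)
    alpha, M, G, b = 0.3, 3.06, 26e9, 2.86e-10  # Taylor constant, Taylor factor,
                                                # shear modulus (Pa), Burgers vector (m)

    rhs = lambda eps, rho: k1 * np.sqrt(rho) - k2 * rho
    sol = solve_ivp(rhs, (0.0, 0.2), [1e12], t_eval=np.linspace(0.0, 0.2, 5))
    for eps, rho in zip(sol.t, sol.y[0]):
        sigma = alpha * M * G * b * np.sqrt(rho) / 1e6
        print(f"eps = {eps:.2f}: rho = {rho:.2e} 1/m^2, flow stress ~ {sigma:.0f} MPa")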
McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W
2015-03-27
Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
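The emulator step can be mimicked in a few lines: fit a Gaussian process to a handful of expensive model runs, then push a posterior parameter sample through it. Everything below is a schematic stand-in (a cheap analytic "model" and a mock one-parameter posterior), not the Skyrme-functional setup of the study.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(1)

    f = lambda x: np.sin(3 * x) + 0.5 * x        # stand-in for the costly model
    X_train = np.linspace(0.0, 2.0, 8)[:, None]  # eight "model evaluations"
    gp = GaussianProcessRegressor(kernel=RBF(0.5)).fit(X_train, f(X_train[:, 0]))

    theta = rng.normal(1.0, 0.15, size=(2000, 1))  # mock parameter posterior
    mean, std = gp.predict(theta, return_std=True)
    obs = rng.normal(mean, std)                    # emulator + parameter spread
    print(f"propagated observable: {obs.mean():.3f} +/- {obs.std():.3f}")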
NASA Technical Reports Server (NTRS)
Yamauchi, G.; Johnson, W.
1984-01-01
A computationally efficient body analysis designed to couple with a comprehensive helicopter analysis is developed in order to calculate the body-induced aerodynamic effects on rotor performance and loads. A modified slender body theory is used as the body model. With the objective of demonstrating the accuracy, efficiency, and application of the method, the analysis at this stage is restricted to axisymmetric bodies at zero angle of attack. By comparing with results from an exact analysis for simple body shapes, it is found that the modified slender body theory provides an accurate potential flow solution for moderately thick bodies, with only a 10%-20% increase in computational effort over that of an isolated rotor analysis. The computational ease of this method provides a means for routine assessment of body-induced effects on a rotor. Results are given for several configurations that typify those being used in the Ames 40- by 80-Foot Wind Tunnel and in the rotor-body aerodynamic interference tests being conducted at Ames. A rotor-hybrid airship configuration is also analyzed.
Practical research on the teaching of Optical Design
NASA Astrophysics Data System (ADS)
Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin
2017-08-01
Optical design, together with applied optics, forms a complete system spanning basic theory to applied theory, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design and product processing. Through the theoretical knowledge, students master aberration theory and the design principles of typical optical systems. Using ZEMAX (imaging design software), TRACEPRO (illumination design software), and SOLIDWORKS or PRO/E (mechanical design software), students can establish a complete model of an optical system. Students can then fabricate the model with the engraving machine in the laboratory or at cooperating units. Through these three parts, students acquire the necessary practical knowledge, improve their learning and analysis abilities, and gain enough practice to develop their creativity, gradually growing from learners of optical theory into optical engineers.
Ortega, Johis; Huang, Shi; Prado, Guillermo
2012-01-01
HIV/AIDS is listed as one of the top 10 reasons for the death of Hispanics between the ages of 15 and 54 in the United States. This cross sectional, descriptive secondary study proposed that using both the systemic (ecodevelopmental) and the individually focused (theory of reasoned action) theories together would lead to an increased understanding of the risk and protective factors that influence HIV risk behaviors in this population. The sample consisted of 493 Hispanic adolescent 7th and 8th graders and their immigrant parents living in Miami, Florida. Structural Equation Modeling (SEM) was used for the data analysis. Family functioning emerged as the heart of the model, embedded within a web of direct and mediated relationships. The data support the idea that family can play a central role in the prevention of Hispanic adolescents’ risk behaviors. PMID:23152718
Pelaccia, Thierry; Tardif, Jacques; Triby, Emmanuel; Charlin, Bernard
2011-03-14
Clinical reasoning plays a major role in the ability of doctors to make diagnoses and decisions. It is considered the physician's most critical competence and has been widely studied by physicians, educationalists, psychologists and sociologists. Since the 1970s, many theories about clinical reasoning in medicine have been put forward. This paper aims to explore a comprehensive approach: the "dual-process theory," a model developed by cognitive psychologists over the last few years. After 40 years of sometimes contradictory studies on clinical reasoning, the dual-process theory gives us many answers on how doctors think while making diagnoses and decisions. It highlights the importance of physicians' intuition and the high level of interaction between analytical and non-analytical processes. However, it has not received much attention in the medical education literature. The implications of dual-process models of reasoning for medical education are discussed.
Investigating synoptic-scale monsoonal disturbances in an idealized moist model
NASA Astrophysics Data System (ADS)
Clark, S.; Ming, Y.
2017-12-01
Recent studies have highlighted the potential utility of a theory of "moisture-dynamical" instability in explaining the time and spatial scales of intraseasonal variability associated with the Indian summer monsoon. These studies suggest that a localized region in the subtropics with mean low-level westerly winds and mean temperature increasing poleward will allow the formation of westward-propagating precipitation anomalies associated with moist Rossby-like waves. Here we test this theory in an idealized moist model with realistic radiative transfer by inducing a local poleward-increasing temperature gradient, achieved by placing a continent with simplified hydrology in the subtropics. We experiment with different treatments of land-surface hydrology, ranging from the extreme (treating land as having the same heat capacity as the slab ocean used in the model, and turning off evaporation completely over land) to the more realistic (bucket hydrology with a decreased heat capacity over land), and with different continental shapes, ranging from a zonally symmetric continent to Earth-like continental geometry. Precipitation rates produced by the simulations are analyzed using space-time spectral analysis and connected to variability in the winds through regression analysis. The observed behavior is discussed with respect to predictions from the theory.
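A bare-bones version of the space-time spectral diagnostic is a 2-D FFT of a time-longitude precipitation field into wavenumber-frequency space. The synthetic field below contains one westward-moving wave; with the sign convention used here, power at positive wavenumber and positive frequency corresponds to westward propagation. No symmetric/antisymmetric decomposition or background removal is attempted.

    import numpy as np

    nt, nx = 240, 128                    # days, longitude points
    t = np.arange(nt)[:, None]
    x = np.arange(nx)[None, :]
    k0, f0 = 6, 0.2                      # zonal wavenumber 6, 5-day period
    precip = np.cos(2 * np.pi * (f0 * t + k0 * x / nx))  # westward-moving wave

    P = np.abs(np.fft.fftshift(np.fft.fft2(precip)))**2
    freq = np.fft.fftshift(np.fft.fftfreq(nt))           # cycles per day
    wavenum = np.fft.fftshift(np.fft.fftfreq(nx)) * nx   # zonal wavenumber
    pos = freq > 0
    i, j = np.unravel_index(P[pos].argmax(), P[pos].shape)
    print(f"peak: wavenumber {wavenum[j]:.0f}, frequency {freq[pos][i]:.3f} cpd")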