Topics in Computational Learning Theory and Graph Algorithms.
ERIC Educational Resources Information Center
Board, Raymond Acton
This thesis addresses problems from two areas of theoretical computer science. The first area is that of computational learning theory, which is the study of the phenomenon of concept learning using formal mathematical models. The goal of computational learning theory is to investigate learning in a rigorous manner through the use of techniques…
Non-Determinism: An Abstract Concept in Computer Science Studies
ERIC Educational Resources Information Center
Armoni, Michal; Gal-Ezer, Judith
2007-01-01
Non-determinism is one of the most important, yet abstract, recurring concepts of Computer Science. It plays an important role in Computer Science areas such as formal language theory, computability theory, distributed computing, and operating systems. We conducted a series of studies on the perception of non-determinism. In the current research,…
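The formal-language side of non-determinism can be made concrete with a small sketch (our illustration, not part of the record above): a nondeterministic finite automaton is simulated by tracking the set of all states it could be in, the hypothetical three-state NFA here accepting binary strings ending in "01".

```python
# Illustrative sketch (hypothetical example, not from the study): simulating
# a nondeterministic finite automaton (NFA) by tracking the set of all
# states it could simultaneously be in.  This NFA accepts strings over
# {0, 1} that end in "01".

NFA = {
    # state -> {symbol: set of possible next states}
    "q0": {"0": {"q0", "q1"}, "1": {"q0"}},
    "q1": {"1": {"q2"}},
    "q2": {},
}
ACCEPTING = {"q2"}

def nfa_accepts(word):
    """Accept if at least one nondeterministic execution path accepts."""
    current = {"q0"}                      # all states reachable so far
    for symbol in word:
        current = {nxt
                   for state in current
                   for nxt in NFA[state].get(symbol, set())}
    return bool(current & ACCEPTING)
```

The subset-tracking loop is exactly the classical powerset construction applied on the fly, which is why a deterministic machine can simulate the nondeterministic one at exponential cost in states.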
Computer Game Theories for Designing Motivating Educational Software: A Survey Study
ERIC Educational Resources Information Center
Ang, Chee Siang; Rao, G. S. V. Radha Krishna
2008-01-01
The purpose of this study is to evaluate computer game theories for educational software. We propose a framework for designing engaging educational games based on contemporary game studies which includes ludology and narratology. Ludology focuses on the study of computer games as play and game activities, while narratology revolves around the…
ERIC Educational Resources Information Center
Presno, Caroline
1997-01-01
Discusses computer instruction in light of Bruner's theory of three forms of representation (action, icons, and symbols). Examines how studies regarding Paivio's dual-coding theory and studies focusing on procedural knowledge support Bruner's theory. Provides specific examples for instruction in three categories: demonstrations, pictures and…
Computer-based teaching module design: principles derived from learning theories.
Lau, K H Vincent
2014-03-01
The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research in classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning') for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repeating), and the detail level (managing text, managing devices). This review examined the literature in the application of learning theories to CAL to develop a set of principles that guide CBTM design. 
Further research will enable educators to take advantage of this unique teaching format as it gains increasing importance in medical education. © 2014 John Wiley & Sons Ltd.
Do All Roads Lead to Rome? ("or" Reductions for Dummy Travelers)
ERIC Educational Resources Information Center
Kilpelainen, Pekka
2010-01-01
Reduction is a central ingredient of computational thinking, and an important tool in algorithm design, in computability theory, and in complexity theory. Reduction has been recognized to be a difficult topic for students to learn. Previous studies on teaching reduction have concentrated on its use in special courses on the theory of computing. As…
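A minimal worked reduction may help fix the idea (a classic textbook example, assumed here, not taken from the article): a vertex set S is an independent set of a graph exactly when its complement V \ S is a vertex cover, so deciding independence reduces to deciding coverage.

```python
# Illustrative sketch (standard textbook reduction, not from the article):
# deciding INDEPENDENT-SET by reduction to VERTEX-COVER.

VERTICES = {"a", "b", "c"}                 # a small path graph a - b - c
EDGES = {("a", "b"), ("b", "c")}

def is_vertex_cover(vertices, edges, cover):
    """Every edge has at least one endpoint inside `cover`."""
    return all(u in cover or v in cover for (u, v) in edges)

def is_independent_set(vertices, edges, subset):
    """Reduction: S is independent  <=>  V \\ S is a vertex cover."""
    return is_vertex_cover(vertices, edges, vertices - subset)
```

The reduction does no search of its own; it simply transforms one question into the other, which is the point students are asked to grasp.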
Modeling Human-Computer Decision Making with Covariance Structure Analysis.
ERIC Educational Resources Information Center
Coovert, Michael D.; And Others
Arguing that sufficient theory exists about the interplay between human information processing, computer systems, and the demands of various tasks to construct useful theories of human-computer interaction, this study presents a structural model of human-computer interaction and reports the results of various statistical analyses of this model.…
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
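The Bayesian learning mechanism invoked above can be sketched in deliberately toy form (the two-hypothesis space and the 0.9/0.1 likelihoods are our assumptions, not the authors' model): a learner updates the probability that an object causally activates a detector as outcomes accumulate.

```python
# Minimal sketch (our toy, not the authors' model): Bayesian updating over
# two causal hypotheses.  H1: the object is causal and activates the
# detector with probability 0.9; H0: activations are background noise
# occurring with probability 0.1.

def posterior(prior_h1, observations):
    """Update P(H1) after a sequence of detector outcomes (True/False)."""
    p_h1, p_h0 = prior_h1, 1.0 - prior_h1
    for activated in observations:
        like_h1 = 0.9 if activated else 0.1   # P(outcome | H1)
        like_h0 = 0.1 if activated else 0.9   # P(outcome | H0)
        p_h1, p_h0 = p_h1 * like_h1, p_h0 * like_h0
    return p_h1 / (p_h1 + p_h0)               # normalise
```

Two activations push an initially agnostic learner to near certainty, while a single non-activation pulls belief the other way, the kind of graded, evidence-driven revision the theory theory attributes to children.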
Women and Computer Based Technologies: A Feminist Perspective.
ERIC Educational Resources Information Center
Morritt, Hope
The use of computer based technologies by professional women in education is examined through a feminist standpoint theory in this paper. The theory is grounded in eight claims which form the basis of the conceptual framework for the study. The experiences of nine women participants with computer based technologies were categorized using three…
ERIC Educational Resources Information Center
Goldberg, Adele; Suppes, Patrick
An interactive computer-assisted system for teaching elementary logic is described, which was designed to handle formalizations of first-order theories suitable for presentation in a computer-assisted instruction environment. The system provides tools with which the user can develop and then study a nonlogical axiomatic theory along whatever lines…
NASA Astrophysics Data System (ADS)
Pietropolli Charmet, Andrea; Stoppa, Paolo; Tasinato, Nicola; Giorgianni, Santi
2017-05-01
This work presents a benchmark study on the calculation of sextic centrifugal distortion constants employing cubic force fields computed by means of density functional theory (DFT). For a set of semi-rigid halogenated organic compounds several functionals (B2PLYP, B3LYP, B3PW91, M06, M06-2X, O3LYP, X3LYP, ωB97XD, CAM-B3LYP, LC-ωPBE, PBE0, B97-1 and B97-D) were used for computing the sextic centrifugal distortion constants. The effects related to the size of the basis sets and the performance of hybrid approaches, in which harmonic data obtained at a higher level of electronic correlation are coupled with cubic force constants yielded by DFT functionals, are presented and discussed. The predicted values were compared both to the available data published in the literature and to those obtained by calculations carried out at increasing levels of electronic correlation: Hartree-Fock Self-Consistent Field (HF-SCF), second-order Møller-Plesset perturbation theory (MP2), and the coupled-cluster singles and doubles (CCSD) level of theory. Different hybrid approaches, having the cubic force field computed at the DFT level of theory coupled with harmonic data computed at increasing levels of electronic correlation (up to the CCSD level of theory augmented by a perturbative estimate of the effects of connected triple excitations, CCSD(T)), were considered. The obtained results demonstrate that these hybrid approaches can represent reliable and computationally affordable methods for predicting sextic centrifugal terms with an accuracy almost comparable to that yielded by the more expensive anharmonic force fields fully computed at the MP2 and CCSD levels of theory. In view of their reduced computational cost, these hybrid approaches pave the way to the study of more complex systems.
ERIC Educational Resources Information Center
Khan, Misbah Mahmood; Reed, Jonathan
2011-01-01
Games Based Learning needs to be linked to good learning theory to become an important educational intervention. This study examines the effectiveness of a collection of computer games called Neurogames®. Neurogames are a group of computer games aimed at improving reading and basic maths and are designed using neuropsychological theory. The…
Kiper, Pawel; Szczudlik, Andrzej; Venneri, Annalena; Stozek, Joanna; Luque-Moreno, Carlos; Opara, Jozef; Baba, Alfonc; Agostini, Michela; Turolla, Andrea
2016-10-15
Computational approaches for modelling the central nervous system (CNS) aim to develop theories on processes occurring in the brain that allow the transformation of all information needed for the execution of motor acts. Computational models have been proposed in several fields, to interpret not only the CNS functioning, but also its efferent behaviour. Computational model theories can provide insights into neuromuscular and brain function allowing us to reach a deeper understanding of neuroplasticity. Neuroplasticity is the process occurring in the CNS that is able to permanently change both structure and function due to interaction with the external environment. To understand such a complex process several paradigms related to motor learning and computational modeling have been put forward. These paradigms have been explained through several internal model concepts, and supported by neurophysiological and neuroimaging studies. Therefore, it has been possible to make theories about the basis of different learning paradigms according to known computational models. Here we review the computational models and motor learning paradigms used to describe the CNS and neuromuscular functions, as well as their role in the recovery process. These theories have the potential to provide a way to rigorously explain all the potential of CNS learning, providing a basis for future clinical studies. Copyright © 2016 Elsevier B.V. All rights reserved.
The computational neurobiology of learning and reward.
Daw, Nathaniel D; Doya, Kenji
2006-04-01
Following the suggestion that midbrain dopaminergic neurons encode a signal, known as a 'reward prediction error', used by artificial intelligence algorithms for learning to choose advantageous actions, the study of the neural substrates for reward-based learning has been strongly influenced by computational theories. In recent work, such theories have been increasingly integrated into experimental design and analysis. Such hybrid approaches have offered detailed new insights into the function of a number of brain areas, especially the cortex and basal ganglia. In part this is because these approaches enable the study of neural correlates of subjective factors (such as a participant's beliefs about the reward to be received for performing some action) that the computational theories purport to quantify.
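The reward-prediction-error signal described above is the core of temporal-difference learning; a minimal sketch (the standard textbook update rule, not code from the article) looks like this:

```python
# Illustrative sketch (standard TD(0) rule, not the article's own code):
# the update is driven by a reward prediction error of the kind attributed
# to midbrain dopaminergic neurons.

def td_update(values, state, next_state, reward, alpha=0.1, gamma=0.9):
    """One TD(0) step: move V(state) toward reward + gamma * V(next_state)."""
    delta = reward + gamma * values[next_state] - values[state]  # prediction error
    values[state] += alpha * delta
    return delta
```

Repeated over experience, delta shrinks as predictions improve; in the dopamine interpretation, a fully predicted reward eventually elicits no phasic response.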
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbero, E.J.
1989-01-01
In this study, a computational model for accurate analysis of composite laminates, including laminates with delaminated interfaces, is developed. An accurate prediction of stress distributions, including interlaminar stresses, is obtained by using the Generalized Laminate Plate Theory of Reddy, in which a layer-wise linear approximation of the displacements through the thickness is used. Analytical as well as finite-element solutions of the theory are developed for bending and vibrations of laminated composite plates for the linear theory. Geometrical nonlinearity, including buckling and postbuckling, is included and used to perform stress analysis of laminated plates. A general two-dimensional theory of laminated cylindrical shells is also developed in this study. Geometrical nonlinearity and transverse compressibility are included. Delaminations between layers of composite plates are modelled by jump discontinuity conditions at the interfaces. The theory includes multiple delaminations through the thickness. Geometric nonlinearity is included to capture layer buckling. The strain energy release rate distribution along the boundary of delaminations is computed by a novel algorithm. The computational models presented herein are accurate for global behavior and particularly appropriate for the study of local effects.
ERIC Educational Resources Information Center
Wu, Zhiwei
2018-01-01
Framed from positioning theory and dynamic systems theory, the paper reports on a naturalistic study involving four Chinese participants and their American peers in an intercultural asynchronous computer-mediated communication (ACMC) activity. Based on the moment-by-moment analysis and triangulation of forum posts, reflective essays, and…
ERIC Educational Resources Information Center
Jarosz, Gaja
2010-01-01
This study examines the interacting roles of implicational markedness and frequency from the joint perspectives of formal linguistic theory, phonological acquisition and computational modeling. The hypothesis that child grammars are rankings of universal constraints, as in Optimality Theory (Prince & Smolensky, 1993/2004), that learning involves a…
NASA Astrophysics Data System (ADS)
Kim, Joonho; Kim, Seok; Lee, Kimyeong; Park, Jaemo; Vafa, Cumrun
2017-09-01
We study a family of 2d N=(0, 4) gauge theories which describes at low energy the dynamics of E-strings, the M2-branes suspended between a pair of M5 and M9 branes. The gauge theory is engineered using a duality with type IIA theory, leading to the D2-branes suspended between an NS5-brane and 8 D8-branes on an O8-plane. We compute the elliptic genus of this family of theories, and find agreement with the known results for single and two E-strings. The partition function can in principle be computed for arbitrary number of E-strings, and we compute them explicitly for low numbers. We test our predictions against the partially known results from topological strings, as well as from the instanton calculus of 5d Sp(1) gauge theory. Given the relation to topological strings, our computation provides the all genus partition function of the refined topological strings on the canonical bundle over 1/2K3.
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
ERIC Educational Resources Information Center
Anil, Duygu
2008-01-01
In this study, the predictive power of experts' predictions of item characteristics, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed according to classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
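The two-parameter logistic (2PL) model mentioned above has a simple closed form; a sketch (with invented parameter values, not the study's data) is:

```python
import math

# Illustrative sketch (assumed parameter values, not the study's data):
# the 2PL item characteristic curve of latent trait theory gives the
# probability of a correct response given ability theta, item
# discrimination a, and item difficulty b.

def p_correct(theta, a, b):
    """2PL model: P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

At theta equal to the difficulty b the probability is exactly 0.5 whatever the discrimination, which is what makes b interpretable on the same scale as ability.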
Impact of Blended Learning Environments Based on Algo-Heuristic Theory on Some Variables
ERIC Educational Resources Information Center
Aygün, Mustafa; Korkmaz, Özgen
2012-01-01
In this study, the effects of Algo-Heuristic Theory based blended learning environments on students' computer skills in their preparation of presentations, levels of attitudes towards computers, and levels of motivation regarding the information technology course were investigated. The research sample was composed of 71 students. A semi-empirical…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.
2005-12-27
Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures, which has emerged as one of the most widely used frameworks for the representation of grammar formalisms.
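The feature-structure idea can be illustrated concretely (the structure below is our invented example, not from the record): a feature structure is a rooted directed graph with feature-labelled edges, queried by following a path of labels.

```python
# Illustrative sketch (our invented example): a feature structure, as used
# in grammar formalisms, encoded as nested dictionaries, i.e. a rooted
# directed graph whose edges are labelled with features.

FS = {
    "cat": "np",
    "agreement": {"number": "singular", "person": "third"},
}

def get_path(fs, path):
    """Follow a sequence of feature labels through the structure."""
    node = fs
    for feature in path:
        node = node[feature]
    return node
```

Unification-based formalisms build on exactly this graph view: two structures are compatible when their shared paths lead to consistent values.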
Wallen, Erik S; Mulloy, Karen B
2006-10-01
Occupational diseases are a significant problem affecting public health. Safety training is an important method of preventing occupational illness. Training is increasingly being delivered by computer, although theories of learning from computer-based multimedia have been tested almost entirely on college students. This study was designed to determine whether these theories might also be applied to safety training applications for working adults. Participants viewed computer-based multimedia respirator use training with concurrent narration, the same training with narration prior to the animation, or unrelated safety training. Participants then took a five-item transfer test which measured their ability to use their knowledge in new and creative ways. Participants who viewed either of the computer-based multimedia trainings did significantly better than the control group on the transfer test. The results of this pilot study suggest that design guidelines developed for younger learners may be effective for training workers in occupational safety and health, although more investigation is needed.
Design of transonic airfoil sections using a similarity theory
NASA Technical Reports Server (NTRS)
Nixon, D.
1978-01-01
A study of the available methods for transonic airfoil and wing design indicates that the most powerful technique is the numerical optimization procedure. However, the computer time for this method is relatively large because of the amount of computation required in the searches during optimization. The optimization method requires that base and calibration solutions be computed to determine a minimum drag direction. The design space is then computationally searched in this direction; it is these searches that dominate the computation time. A recent similarity theory allows certain transonic flows to be calculated rapidly from the base and calibration solutions. In this paper the application of the similarity theory to design problems is examined with the object of at least partially eliminating the costly searches of the design optimization method. An example of an airfoil design is presented.
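The base/calibration scheme described above can be caricatured in one dimension (the quadratic `toy_drag` model is a stand-in assumption; the real method evaluates a transonic flow solver): two solutions give a finite-difference estimate of the drag gradient, which fixes the search direction for the costly line search.

```python
# Toy sketch (our illustration, not the paper's method): a "base" and a
# "calibration" evaluation give a finite-difference drag gradient, which
# sets the descent direction for the design search.

def toy_drag(x):
    """Hypothetical smooth drag model with its minimum near x = 1."""
    return (x - 1.0) ** 2 + 0.05

def search_direction(drag, x, step=1e-3):
    """Estimate d(drag)/dx from a base and a calibration evaluation."""
    base = drag(x)                 # base solution
    calibration = drag(x + step)   # calibration solution
    gradient = (calibration - base) / step
    return -gradient               # move toward lower drag

def line_search(drag, x, direction, step=0.1, n=20):
    """Coarse search along the descent direction; keep the best candidate."""
    candidates = [x + step * k * direction for k in range(n + 1)]
    return min(candidates, key=drag)
```

Each candidate in the line search costs one full flow solution in the real setting, which is why a similarity theory that predicts those solutions cheaply from the base and calibration flows pays off.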
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zaborszky, J.; Venkatasubramanian, V.
1995-10-01
Taxonomy Theory is the first precise comprehensive theory for large power system dynamics modeled in any detail. The motivation for this project is to show that it can be used, practically, for analyzing a disturbance that actually occurred on a large system, which affected a sizable portion of the Midwest with supercritical Hopf type oscillations. This event is well documented and studied. The report first summarizes Taxonomy Theory with an engineering flavor. Then various computational approaches are cited and analyzed for their desirability for use with Taxonomy Theory. Then working equations are developed for computing a segment of the feasibility boundary that bounds the region of (operating) parameters throughout which the operating point can be moved without losing stability. Then experimental software incorporating the large EPRI software package PSAPAC is developed. After a summary of the events during the subject disturbance, numerous large scale computations, up to 7600 buses, are reported. These results are reduced into graphical and tabular forms, which are then analyzed and discussed. The report is divided into two volumes. This volume illustrates the use of the Taxonomy Theory for computing the feasibility boundary and presents evidence that the event indeed led to a Hopf type oscillation on the system. Furthermore it proves that the Feasibility Theory can indeed be used for practical computation work with very large systems. Volume 2, a separate volume, will show that the disturbance has led to a supercritical (that is, stable oscillation) Hopf bifurcation.
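The Hopf mechanism at the heart of the report can be sketched in miniature (generic numerics on a 2x2 normal form, nothing from the report's software): an oscillatory instability appears when a complex-conjugate eigenvalue pair of the system Jacobian crosses the imaginary axis as an operating parameter mu moves.

```python
import cmath

# Illustrative sketch (generic numerics, not the report's software): for a
# 2x2 Jacobian the eigenvalues follow from the quadratic formula, and a
# Hopf-type loss of stability is flagged when their real part changes sign.

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via trace and determinant."""
    trace, det = a + d, a * d - b * c
    disc = cmath.sqrt(complex(trace * trace - 4.0 * det))
    return (trace + disc) / 2.0, (trace - disc) / 2.0

def hopf_crossing(jacobian_of_mu, mu_lo, mu_hi):
    """True if the leading real part changes sign between the two parameters."""
    def max_real(mu):
        return max(ev.real for ev in eigenvalues_2x2(*jacobian_of_mu(mu)))
    return max_real(mu_lo) < 0.0 < max_real(mu_hi)
```

For the normal-form Jacobian [[mu, -1], [1, mu]] the eigenvalues are mu +/- i, so the crossing sits exactly at mu = 0; locating such crossings over thousands of buses is, in essence, what the feasibility-boundary computation does at scale.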
A Pilot Study of the Naming Transaction Shell
1991-06-01
effective computer-based instructional design. AIDA will take established theories of knowledge, learning, and instruction and incorporate the theories...felt that anyone could learn to use the system both in design and delivery modes. Traditional course development (non-computer instruction) for the...students were studying and learning the material in the text. This often resulted in wasted effort in the simulator. By ensuring that the students knew the
Investigations in Computer-Aided Instruction and Computer-Aided Controls. Final Report.
ERIC Educational Resources Information Center
Rosenberg, R.C.; And Others
These research projects, designed to delve into certain relationships between humans and computers, are focused on computer-assisted instruction and on man-computer interaction. One study demonstrates that within the limits of formal engineering theory, a computer simulated laboratory (Dynamic Systems Laboratory) can be built in which freshmen…
Song, Lingchun; Han, Jaebeom; Lin, Yen-lin; Xie, Wangshen; Gao, Jiali
2009-10-29
The explicit polarization (X-Pol) method has been examined using ab initio molecular orbital theory and density functional theory. The X-Pol potential was designed to provide a novel theoretical framework for developing next-generation force fields for biomolecular simulations. Importantly, the X-Pol potential is a general method, which can be employed with any level of electronic structure theory. The present study illustrates the implementation of the X-Pol method using ab initio Hartree-Fock theory and hybrid density functional theory. The computational results are illustrated by considering a set of bimolecular complexes of small organic molecules and ions with water. The computed interaction energies and hydrogen bond geometries are in good accord with CCSD(T) calculations and B3LYP/aug-cc-pVDZ optimizations.
Bounds on the power of proofs and advice in general physical theories.
Lee, Ciarán M; Hoban, Matty J
2016-06-01
Quantum theory presents us with the tools for computational and communication advantages over classical theory. One approach to uncovering the source of these advantages is to determine how computation and communication power vary as quantum theory is replaced by other operationally defined theories from a broad framework of such theories. Such investigations may reveal some of the key physical features required for powerful computation and communication. In this paper, we investigate how simple physical principles bound the power of two different computational paradigms which combine computation and communication in a non-trivial fashion: computation with advice and interactive proof systems. We show that the existence of non-trivial dynamics in a theory implies a bound on the power of computation with advice. Moreover, we provide an explicit example of a theory with no non-trivial dynamics in which the power of computation with advice is unbounded. Finally, we show that the power of simple interactive proof systems in theories where local measurements suffice for tomography is non-trivially bounded. This result provides a proof that [Formula: see text] is contained in [Formula: see text], which does not make use of any uniquely quantum structure (such as the fact that observables correspond to self-adjoint operators) and thus may be of independent interest.
ERIC Educational Resources Information Center
Giannakos, Michail N.
2014-01-01
Computer Science (CS) courses comprise both Programming and Information and Communication Technology (ICT) issues; however these two areas have substantial differences, inter alia the attitudes and beliefs of the students regarding the intended learning content. In this research, factors from the Social Cognitive Theory and Unified Theory of…
ERIC Educational Resources Information Center
Buraphadeja, Vasa; Dawson, Kara
2008-01-01
This article reviews content analysis studies aimed to assess critical thinking in computer-mediated communication. It also discusses theories and content analysis models that encourage critical thinking skills in asynchronous learning environments and reviews theories and factors that may foster critical thinking skills and new knowledge…
ERIC Educational Resources Information Center
Fredrick, Rona M.
2007-01-01
An interpretive case study framed by critical race theory (CRT) and African-centered theory is used to examine the teaching practices of two transformative African American teachers who transformed the thinking and lives of their students. The analysis illustrates that computer technology has helped teachers engage in…
ERIC Educational Resources Information Center
Ruddick, Kristie R.; Parrill, Abby L.; Petersen, Richard L.
2012-01-01
In this study, a computational molecular orbital theory experiment was implemented in a first-semester honors general chemistry course. Students used the GAMESS (General Atomic and Molecular Electronic Structure System) quantum mechanical software (as implemented in ChemBio3D) to optimize the geometry for various small molecules. Extended Huckel…
Dynamics of Numerics & Spurious Behaviors in CFD Computations. Revised
NASA Technical Reports Server (NTRS)
Yee, Helen C.; Sweby, Peter K.
1997-01-01
The global nonlinear behavior of finite discretizations for constant time steps and fixed or adaptive grid spacings is studied using tools from dynamical systems theory. Detailed analysis of commonly used temporal and spatial discretizations for simple model problems is presented. The role of dynamics in the understanding of long time behavior of numerical integration and the nonlinear stability, convergence, and reliability of using time-marching approaches for obtaining steady-state numerical solutions in computational fluid dynamics (CFD) is explored. The study is complemented with examples of spurious behavior observed in steady and unsteady CFD computations. The CFD examples were chosen to illustrate non-apparent spurious behavior that was difficult to detect without extensive grid and temporal refinement studies and some knowledge from dynamical systems theory. Studies revealed the various possible dangers of misinterpreting numerical simulation of realistic complex flows that are constrained by available computing power. In large scale computations where the physics of the problem under study is not well understood and numerical simulations are the only viable means of solution, extreme care must be taken in both computation and interpretation of the numerical data. The goal of this paper is to explore the important role that dynamical systems theory can play in the understanding of the global nonlinear behavior of numerical algorithms and to aid the identification of the sources of numerical uncertainties in CFD.
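A standard model problem of the kind studied in the paper makes the danger concrete (our illustration, not the authors' test cases): explicit Euler applied to u' = u(1 - u) converges to the true steady state u = 1 for small time steps, but above the linearised stability limit (h = 2 here) the iterates lock onto a spurious period-2 cycle that is an artifact of the discretisation, not of the ODE.

```python
# Illustrative sketch (classic model problem, not the paper's own cases):
# explicit Euler for u' = u * (1 - u).  For h > 2 the fixed point u = 1 of
# the map u <- u + h*u*(1-u) is unstable and a spurious period-2 cycle
# attracts the iterates.

def euler_orbit(u0, h, n):
    """Iterate u <- u + h*u*(1-u) for n steps and return the final iterate."""
    u = u0
    for _ in range(n):
        u = u + h * u * (1.0 - u)
    return u

stable = euler_orbit(0.5, h=0.5, n=200)      # converges to the true state u = 1
spurious_a = euler_orbit(0.5, h=2.2, n=200)  # one phase of the spurious cycle ...
spurious_b = euler_orbit(0.5, h=2.2, n=201)  # ... and the alternating phase
```

The map is conjugate to the logistic map with parameter r = 1 + h, so the whole period-doubling cascade of the logistic map shows up as numerical artifacts, which is exactly the dynamical-systems viewpoint the paper advocates.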
Construction of Rational Maps on the Projective Line with Given Dynamical Structure
2016-05-11
This is a paper in arithmetic dynamics, a relatively young field at the intersection of the older studies of number theory...computers became available. The exponentially increased computational power and access to larger data sets rocketed the field forward, allowing...theory and dynamical systems, have come together to create a new field: arithmetic dynamics. Relative to the study of mathematics as a whole
Too Good to be True? Ideomotor Theory from a Computational Perspective
Herbort, Oliver; Butz, Martin V.
2012-01-01
In recent years, Ideomotor Theory has regained widespread attention and sparked the development of a number of theories on goal-directed behavior and learning. However, there are two issues with previous studies’ use of Ideomotor Theory. Although Ideomotor Theory is seen as very general, it is often studied in settings that are considerably more simplistic than most natural situations. Moreover, Ideomotor Theory’s claim that effect anticipations directly trigger actions and that action-effect learning is based on the formation of direct action-effect associations is hard to address empirically. We address these points from a computational perspective. A simple computational model of Ideomotor Theory was tested in tasks with different degrees of complexity. The model evaluation showed that Ideomotor Theory is a computationally feasible approach for understanding efficient action-effect learning for goal-directed behavior if the following preconditions are met: (1) The range of potential actions and effects has to be restricted. (2) Effects have to follow actions within a short time window. (3) Actions have to be simple and may not require sequencing. The first two preconditions also limit human performance and thus support Ideomotor Theory. The last precondition can be circumvented by extending the model with more complex, indirect action generation processes. In conclusion, we suggest that Ideomotor Theory offers a comprehensive framework to understand action-effect learning. However, we also suggest that additional processes may mediate the conversion of effect anticipations into actions in many situations. PMID:23162524
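The core claim tested above, that action-effect learning rests on direct action-effect associations and that effect anticipations trigger actions, can be sketched in a far simpler form than the authors' model (the scenario and numbers are our toy assumptions):

```python
from collections import defaultdict

# Minimal sketch (our toy, much simpler than the authors' model): ideomotor
# learning as a table of direct action-effect association strengths;
# anticipating an effect then triggers the most strongly associated action.

class IdeomotorAgent:
    def __init__(self):
        self.assoc = defaultdict(float)   # (action, effect) -> strength

    def learn(self, action, effect, rate=1.0):
        """Strengthen the direct association between an action and its effect."""
        self.assoc[(action, effect)] += rate

    def act_for(self, goal_effect, actions):
        """Effect anticipation selects the action that best produced it."""
        return max(actions, key=lambda a: self.assoc[(a, goal_effect)])
```

The three preconditions identified in the abstract map directly onto this sketch: a small action/effect set keeps the table tractable, tight temporal pairing determines which (action, effect) entry is credited, and single-step actions avoid any need for sequencing.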
The 6th International Conference on Computer Science and Computational Mathematics (ICCSCM 2017)
NASA Astrophysics Data System (ADS)
2017-09-01
The ICCSCM 2017 (The 6th International Conference on Computer Science and Computational Mathematics) aimed to provide a platform to discuss computer science and mathematics related issues including Algebraic Geometry, Algebraic Topology, Approximation Theory, Calculus of Variations, Category Theory; Homological Algebra, Coding Theory, Combinatorics, Control Theory, Cryptology, Geometry, Difference and Functional Equations, Discrete Mathematics, Dynamical Systems and Ergodic Theory, Field Theory and Polynomials, Fluid Mechanics and Solid Mechanics, Fourier Analysis, Functional Analysis, Functions of a Complex Variable, Fuzzy Mathematics, Game Theory, General Algebraic Systems, Graph Theory, Group Theory and Generalizations, Image Processing, Signal Processing and Tomography, Information Fusion, Integral Equations, Lattices, Algebraic Structures, Linear and Multilinear Algebra; Matrix Theory, Mathematical Biology and Other Natural Sciences, Mathematical Economics and Financial Mathematics, Mathematical Physics, Measure Theory and Integration, Neutrosophic Mathematics, Number Theory, Numerical Analysis, Operations Research, Optimization, Operator Theory, Ordinary and Partial Differential Equations, Potential Theory, Real Functions, Rings and Algebras, Statistical Mechanics, Structure of Matter, Topological Groups, Wavelets and Wavelet Transforms, 3G/4G Network Evolutions, Ad-Hoc, Mobile, Wireless Networks and Mobile Computing, Agent Computing & Multi-Agent Systems, All topics related to Image/Signal Processing, Any topics related to Computer Networks, Any topics related to ISO SC-27 and SC-17 standards, Any topics related to PKI (Public Key Infrastructures), Artificial Intelligence (A.I.) & Pattern/Image Recognition, Authentication/Authorization Issues, Biometric authentication and algorithms, CDMA/GSM Communication Protocols, Combinatorics, Graph Theory, and Analysis of Algorithms, Cryptography and Foundations of Computer Security, Data Base (D.B.) Management & Information Retrieval, Data Mining, Web Image Mining, & Applications, Defining Spectrum Rights and Open Spectrum Solutions, E-Commerce, Ubiquitous, RFID, Applications, Fingerprint/Hand/Biometrics Recognitions and Technologies, Foundations of High-performance Computing, IC-card Security, OTP, and Key Management Issues, IDS/Firewall, Anti-Spam mail, Anti-virus issues, Mobile Computing for E-Commerce, Network Security Applications, Neural Networks and Biomedical Simulations, Quality of Service and Communication Protocols, Quantum Computing, Coding, and Error Controls, Satellite and Optical Communication Systems, Theory of Parallel Processing and Distributed Computing, Virtual Visions, 3-D Object Retrievals, & Virtual Simulations, Wireless Access Security, etc. The success of ICCSCM 2017 is reflected in the papers received from authors in many countries around the world, allowing a highly multinational and multicultural exchange of ideas and experience. The accepted papers of ICCSCM 2017 are published in this book. Please check http://www.iccscm.com for further news. A conference such as ICCSCM 2017 can only become successful through a team effort, so we want to thank the International Technical Committee and the Reviewers for their efforts in the review process as well as their valuable advice. We are thankful to all those who contributed to the success of ICCSCM 2017. The Secretary
ERIC Educational Resources Information Center
Ramsey, Gregory W.
2010-01-01
This dissertation proposes and tests a theory explaining how people make decisions to achieve a goal in a specific task environment. The theory is represented as a computational model and implemented as a computer program. The task studied was primary care physicians treating patients with type 2 diabetes. Some physicians succeed in achieving…
ERIC Educational Resources Information Center
Baumeister, Antonia E.; Engelmann, Tanja; Hesse, Friedrich W.
2017-01-01
This experimental study extends conflict elaboration theory (1) by revealing social influence dynamics for a knowledge-rich computer-supported socio-cognitive conflict task not investigated in the context of this theory before and (2) by showing the impact of individual differences in social comparison orientation. Students in two conditions…
The fast algorithm of spark in compressive sensing
NASA Astrophysics Data System (ADS)
Xie, Meihua; Yan, Fengxia
2017-01-01
Compressed Sensing (CS) is an advanced theory of signal sampling and reconstruction. In CS theory, the reconstruction condition of a signal is an important theoretical problem, and the spark is a good index for studying it. But computing the spark is NP-hard. In this paper, we study the problem of computing the spark. For some special matrices, for example, the Gaussian random matrix and the 0-1 random matrix, we obtain some conclusions. Furthermore, for a Gaussian random matrix with fewer rows than columns, we prove that its spark equals the number of its rows plus one with probability 1. For a general matrix, two methods are given to compute its spark: a direct search method and a dual-tree search method. By simulating 24 Gaussian random matrices and 18 0-1 random matrices, we tested the computation time of these two methods. Numerical results showed that the dual-tree search method was more efficient than direct search, especially for matrices with nearly as many rows as columns.
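The spark of a matrix is the smallest number of columns that are linearly dependent. The paper's own implementations are not shown here, but a minimal brute-force sketch of the direct-search idea (illustrative only; the dual-tree method the paper favors is more efficient) might look like:

```python
import itertools
import numpy as np

def spark(A, tol=1e-10):
    """Smallest number of linearly dependent columns of A
    (np.inf if all columns are independent)."""
    m, n = A.shape
    for k in range(1, n + 1):
        for cols in itertools.combinations(range(n), k):
            # a set of k columns is dependent iff its rank is below k
            if np.linalg.matrix_rank(A[:, cols], tol=tol) < k:
                return k
    return np.inf

# for a Gaussian random matrix with m < n, spark = m + 1 with probability 1
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
print(spark(A))  # almost surely 4
```

The nested loop over column subsets makes the NP-hardness mentioned in the abstract concrete: the cost grows combinatorially in the number of columns.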
Tebb, Kathleen P; Erenrich, Rebecca K; Jasik, Carolyn Bradner; Berna, Mark S; Lester, James C; Ozer, Elizabeth M
2016-06-17
Alcohol use and binge drinking among adolescents and young adults remain frequent causes of preventable injuries, disease, and death, and there has been growing attention to computer-based modes of intervention delivery to prevent/reduce alcohol use. Research suggests that health interventions grounded in established theory are more effective than those with no theoretical basis. The goal of this study was to conduct a literature review of computer-based interventions (CBIs) designed to address alcohol use among adolescents and young adults (aged 12-21 years) and examine the extent to which CBIs use theories of behavior change in their development and evaluations. This study also provides an update on extant CBIs addressing alcohol use among youth and their effectiveness. Between November and December of 2014, a literature review of CBIs aimed at preventing or reducing alcohol use was conducted in PsychINFO, PubMed, and Google Scholar. The use of theory in each CBI was examined using a modified version of the classification system developed by Painter et al. (Ann Behav Med 35:358-362, 2008). The search yielded 600 unique articles; 500 were excluded because they did not meet the inclusion criteria. The 100 remaining articles were retained for analysis. Many articles were written about a single intervention; thus, the search revealed a total of 42 unique CBIs. In examining the use of theory, 22 CBIs (52 %) explicitly named one or more theoretical frameworks. Primary theories mentioned were social cognitive theory, the transtheoretical model, the theory of planned behavior and reasoned action, and the health belief model. Just under half (48 %) did not name a theory but mentioned either a theoretical construct (such as self-efficacy) or an intervention technique (e.g., manipulating social norms). Only a few articles provided detailed information about how the theory was applied to the CBI; the vast majority included little to no information.
Given the importance of theory in guiding interventions, greater emphasis on the selection and application of theory is needed. The classification system used in this review offers a guiding framework for reporting how theory-based principles can be applied to computer-based interventions.
Cao, Siqin; Sheong, Fu Kit; Huang, Xuhui
2015-08-07
Reference interaction site model (RISM) has recently become a popular approach in the study of thermodynamic and structural properties of the solvent around macromolecules. On the other hand, it was widely suggested that there exists water density depletion around large hydrophobic solutes (>1 nm), and this may pose a great challenge to the RISM theory. In this paper, we develop a new analytical theory, the Reference Interaction Site Model with Hydrophobicity induced density Inhomogeneity (RISM-HI), to compute the solvent radial distribution function (RDF) around a large hydrophobic solute in water as well as its mixture with other polyatomic organic solvents. To achieve this, we have explicitly considered the density inhomogeneity at the solute-solvent interface using the framework of the Yvon-Born-Green hierarchy, and the RISM theory is used to obtain the solute-solvent pair correlation. In order to efficiently solve the relevant equations while maintaining reasonable accuracy, we have also developed a new closure called the D2 closure. With this new theory, the solvent RDFs around a large hydrophobic particle in water and different water-acetonitrile mixtures could be computed, which agree well with the results of the molecular dynamics simulations. Furthermore, we show that our RISM-HI theory can also efficiently compute the solvation free energies of solutes with a wide range of hydrophobicity in various water-acetonitrile solvent mixtures with reasonable accuracy. We anticipate that our theory could be widely applied to compute the thermodynamic and structural properties for the solvation of hydrophobic solutes.
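The RISM-HI equations themselves are beyond a short example, but the quantity being validated, a solvent radial distribution function, can be estimated directly from simulation coordinates. A minimal histogram-based sketch (illustrative, not the paper's method; assumes a cubic periodic box and a cutoff below half the box length):

```python
import numpy as np

def rdf(positions, box, r_max, n_bins=50):
    """g(r) from one configuration of particles in a cubic periodic box."""
    n = len(positions)
    rho = n / box**3                          # bulk number density
    d = positions[:, None, :] - positions[None, :, :]
    d -= box * np.round(d / box)              # minimum-image convention
    r = np.sqrt((d**2).sum(-1))[np.triu_indices(n, k=1)]  # unique pair distances
    hist, edges = np.histogram(r, bins=n_bins, range=(0.0, r_max))
    shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    # normalize by the ideal-gas pair count expected in each shell
    return edges[:-1], hist / (shell * rho * n / 2)

# for uniformly random (ideal-gas) positions, g(r) fluctuates around 1
rng = np.random.default_rng(3)
pos = rng.uniform(0.0, 10.0, size=(500, 3))
r, g = rdf(pos, box=10.0, r_max=4.0)
```

In a real solvent, g(r) would show the depletion and layering structure the abstract discusses; averaging over many configurations reduces the noise.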
NASA Astrophysics Data System (ADS)
Gunceler, Deniz
Solvents are of great importance in many technological applications, but are difficult to study using standard, off-the-shelf ab initio electronic structure methods. This is because a single configuration of molecular positions in the solvent (a "snapshot" of the fluid) is not necessarily representative of the thermodynamic average. To obtain any thermodynamic averages (e.g. free energies), the phase space of the solvent must be sampled, typically using molecular dynamics. This greatly increases the computational cost involved in studying solvated systems. Joint density-functional theory has made its mark by being a computationally efficient yet rigorous theory by which to study solvation. It replaces the need for thermodynamic sampling with an effective continuum description of the solvent environment that is in-principle exact, computationally efficient and intuitive (easier to interpret). It has been very successful in aqueous systems, with potential applications in (among others) energy materials discovery, catalysis and surface science. In this dissertation, we develop accurate and fast joint density functional theories for complex, non-aqueous solvent environments, including organic solvents and room temperature ionic liquids, as well as new methods for calculating electron excitation spectra in such systems. These theories are then applied to a range of physical problems, from dendrite formation in lithium-metal batteries to the optical spectra of solvated ions.
What Communication Theories Can Teach the Designer of Computer-Based Training.
ERIC Educational Resources Information Center
Larsen, Ronald E.
1985-01-01
Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…
Control Theory and Statistical Generalizations.
ERIC Educational Resources Information Center
Powers, William T.
1990-01-01
Contrasts modeling methods in control theory to the methods of statistical generalizations in empirical studies of human or animal behavior. Presents a computer simulation that predicts behavior based on variables (effort and rewards) determined by the invariable (desired reward). Argues that control theory methods better reflect relationships to…
ERIC Educational Resources Information Center
Rias, Riaza Mohd; Zaman, Halimah Badioze
2011-01-01
Higher learning based instruction may be primarily concerned in most cases with the content of their academic lessons, and not very much with their instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…
Intentions of hospital nurses to work with computers: based on the theory of planned behavior.
Shoham, Snunith; Gonen, Ayala
2008-01-01
The purpose of this study was to determine registered nurses' attitudes related to intent to use computers in the hospital setting as a predictor of their future behavior. The study was further aimed at identifying the relationship between these attitudes and selected sociological, professional, and personal factors and to describe a research model integrating these various factors. The study was based on the theory of planned behavior. A random sample of 411 registered nurses was selected from a single large medical center in Israel. The study tool was a Likert-style questionnaire. Nine different indices were used: (1) behavioral intention toward computer use; (2) general attitudes toward computer use; (3) nursing attitudes toward computer use; (4) threat involved in computer use; (5) challenge involved in computer use; (6) organizational climate; (7) departmental climate; (8) attraction to technological innovations/innovativeness; (9) self-efficacy, ability to control behavior. Strong significant positive correlations were found between the nurses' attitudes (general attitudes and nursing attitudes), self-efficacy, innovativeness, and intentions to use computers. Higher correlations were found between departmental climate and attitudes than between organizational climate and attitudes. The threat and challenge that are involved in computer use were shown as important mediating variables to the understanding of the process of predicting attitudes and intentions toward using computers.
Ziegler, Sigurd; Pedersen, Mads L; Mowinckel, Athanasia M; Biele, Guido
2016-12-01
Attention deficit hyperactivity disorder (ADHD) is characterized by altered decision-making (DM) and reinforcement learning (RL), for which competing theories propose alternative explanations. Computational modelling contributes to understanding DM and RL by integrating behavioural and neurobiological findings, and could elucidate pathogenic mechanisms behind ADHD. This review of neurobiological theories of ADHD describes predictions for the effect of ADHD on DM and RL as described by the drift-diffusion model of DM (DDM) and a basic RL model. Empirical studies employing these models are also reviewed. While theories often agree on how ADHD should be reflected in model parameters, each theory implies a unique combination of predictions. Empirical studies agree with the theories' assumptions of a lowered DDM drift rate in ADHD, while findings are less conclusive for boundary separation. The few studies employing RL models support a lower choice sensitivity in ADHD, but not an altered learning rate. The discussion outlines research areas for further theoretical refinement in the ADHD field. Copyright © 2016 Elsevier Ltd. All rights reserved.
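The review's "basic RL model" is not spelled out here, so the following is a generic Rescorla-Wagner/softmax sketch of the two parameters under discussion: the learning rate (alpha), which the reviewed studies find unaltered in ADHD, and the choice sensitivity (beta), which they find lowered.

```python
import numpy as np

def softmax(q, beta):
    """Choice probabilities; beta is the choice sensitivity (inverse temperature).
    Lower beta -> more random, less value-driven choices."""
    e = np.exp(beta * (q - q.max()))
    return e / e.sum()

def rw_update(q, choice, reward, alpha):
    """Rescorla-Wagner update; alpha scales the prediction error."""
    q = q.copy()
    q[choice] += alpha * (reward - q[choice])
    return q

q = np.zeros(2)
q = rw_update(q, choice=0, reward=1.0, alpha=0.3)   # q becomes [0.3, 0.0]
print(softmax(q, beta=5.0))   # option 0 clearly preferred
print(softmax(q, beta=0.5))   # nearly indifferent: low choice sensitivity
```

Fitting alpha and beta to trial-by-trial choices is how such studies compare parameter estimates between ADHD and control groups.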
NASA Astrophysics Data System (ADS)
Pietropolli Charmet, Andrea; Cornaton, Yann
2018-05-01
This work presents an investigation of the theoretical predictions yielded by anharmonic force fields in which the cubic and quartic force constants are computed analytically by means of density functional theory (DFT) using the recursive scheme developed by M. Ringholm et al. (J. Comput. Chem. 35 (2014) 622). Different functionals (namely B3LYP, PBE, PBE0 and PW86x) and basis sets were used for calculating the anharmonic vibrational spectra of two halomethanes. The benchmark analysis carried out demonstrates the reliability and overall good performance offered by hybrid approaches, in which the harmonic data obtained at the coupled-cluster level with single and double excitations augmented by a perturbational estimate of the effects of connected triple excitations, CCSD(T), are combined with the fully analytic higher-order force constants yielded by DFT functionals. These methods lead to reliable and computationally affordable calculations of anharmonic vibrational spectra with an accuracy comparable to that yielded by hybrid force fields whose anharmonic force constants are computed at the second-order Møller-Plesset perturbation theory (MP2) level using numerical differentiation, but without the corresponding issues related to computational cost and numerical errors.
ERIC Educational Resources Information Center
Reed, Cajah S.
2012-01-01
This study sought to find evidence for a beneficial learning theory to teach computer software programs. Additionally, software was analyzed for each learning theory's applicability to resolve whether certain software requires a specific method of education. The results are meant to give educators more effective teaching tools, so students…
Representational geometry: integrating cognition, computation, and the brain
Kriegeskorte, Nikolaus; Kievit, Rogier A.
2013-01-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. PMID:23876494
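The representational distance matrices described above reduce to a simple computation on condition-by-unit activity patterns. A minimal sketch (illustrative; function names are ours, and 1 - Pearson correlation is just one common dissimilarity choice):

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the activity patterns of each pair of conditions.
    `patterns` has shape (n_conditions, n_units)."""
    return 1.0 - np.corrcoef(patterns)

def compare_rdms(a, b):
    """Correlate the off-diagonal entries of two RDMs, e.g. to compare
    a computational model's geometry with a measured neural geometry."""
    iu = np.triu_indices_from(a, k=1)
    return np.corrcoef(a[iu], b[iu])[0, 1]

rng = np.random.default_rng(1)
brain = rng.standard_normal((4, 50))                 # 4 conditions x 50 "neurons"
model = brain + 0.1 * rng.standard_normal((4, 50))   # a model matching the brain
print(compare_rdms(rdm(brain), rdm(model)))          # close to 1
```

Because the comparison happens at the level of distance matrices, the brain data and the model need not share units or dimensionality, which is what makes the approach a useful intermediate level of description.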
Logic as Marr's Computational Level: Four Case Studies.
Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter
2015-04-01
We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.
Implicit Theories of Creativity in Computer Science in the United States and China
ERIC Educational Resources Information Center
Tang, Chaoying; Baer, John; Kaufman, James C.
2015-01-01
To study implicit concepts of creativity in computer science in the United States and mainland China, we first asked 308 Chinese computer scientists for adjectives that would describe a creative computer scientist. Computer scientists and non-computer scientists from China (N = 1069) and the United States (N = 971) then rated how well those…
NASA Astrophysics Data System (ADS)
Cook, Perry R.
This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).
A Quantitative Exploration of Preservice Teachers' Intent to Use Computer-based Technology
ERIC Educational Resources Information Center
Kim, Kioh; Jain, Sachin; Westhoff, Guy; Rezabek, Landra
2008-01-01
Based on Bandura's (1977) social learning theory, the purpose of this study is to identify the relationship of preservice teachers' perceptions of faculty modeling of computer-based technology and preservice teachers' intent of using computer-based technology in educational settings. There were 92 participants in this study; they were enrolled in…
Computation in generalised probabilistic theories
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Barrett, Jonathan
2015-08-01
From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case.
Nevertheless, it is possible to define computation relative to a ‘classical oracle’. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.
Hamiltonian lattice field theory: Computer calculations using variational methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zako, Robert L.
1991-12-03
I develop a variational method for systematic numerical computation of physical quantities -- bound state energies and scattering amplitudes -- in quantum field theory. An infinite-volume, continuum theory is approximated by a theory on a finite spatial lattice, which is amenable to numerical computation. I present an algorithm for computing approximate energy eigenvalues and eigenstates in the lattice theory and for bounding the resulting errors. I also show how to select basis states and choose variational parameters in order to minimize errors. The algorithm is based on the Rayleigh-Ritz principle and Kato's generalizations of Temple's formula. The algorithm could be adapted to systems such as atoms and molecules. I show how to compute Green's functions from energy eigenvalues and eigenstates in the lattice theory, and relate these to physical (renormalized) coupling constants, bound state energies and Green's functions. Thus one can compute approximate physical quantities in a lattice theory that approximates a quantum field theory with specified physical coupling constants. I discuss the errors in both approximations. In principle, the errors can be made arbitrarily small by increasing the size of the lattice, decreasing the lattice spacing and computing sufficiently long. Unfortunately, I do not understand the infinite-volume and continuum limits well enough to quantify errors due to the lattice approximation. Thus the method is currently incomplete. I apply the method to real scalar field theories using a Fock basis of free particle states. All needed quantities can be calculated efficiently with this basis. The generalization to more complicated theories is straightforward. I describe a computer implementation of the method and present numerical results for simple quantum mechanical systems.
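The Rayleigh-Ritz principle at the heart of the algorithm can be illustrated on a small matrix eigenproblem (a sketch only, not the dissertation's lattice code): project the Hamiltonian onto the span of a few trial vectors and diagonalize the small projected matrix; the lowest Ritz value is then a variational upper bound on the true ground-state energy.

```python
import numpy as np

def ritz_values(H, basis):
    """Rayleigh-Ritz: orthonormalize the trial vectors (columns of `basis`),
    project H onto their span, and diagonalize the projected matrix."""
    Q, _ = np.linalg.qr(basis)            # orthonormal columns
    return np.linalg.eigvalsh(Q.T @ H @ Q)

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
H = (M + M.T) / 2                         # a toy Hermitian "Hamiltonian"
basis = rng.standard_normal((6, 3))       # three trial vectors
ritz = ritz_values(H, basis)
exact = np.linalg.eigvalsh(H)
# variational principle: the lowest Ritz value bounds the ground state from above
print(ritz[0] >= exact[0])  # True
```

Improving the basis (here, adding or optimizing trial vectors) can only lower the Ritz bound, which is why basis selection matters so much in the method described above; Temple-style formulas then bound the remaining error.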
Theory of electron-phonon-dislon interacting system—toward a quantized theory of dislocations
NASA Astrophysics Data System (ADS)
Li, Mingda; Tsurimaki, Yoichiro; Meng, Qingping; Andrejevic, Nina; Zhu, Yimei; Mahan, Gerald D.; Chen, Gang
2018-02-01
We provide a comprehensive theoretical framework to study how crystal dislocations influence the functional properties of materials, based on the idea of a quantized dislocation, namely a ‘dislon’. In contrast to previous work on dislons which focused on exotic phenomenology, here we focus on their theoretical structure and computational power. We first provide a pedagogical introduction that explains the necessity and benefits of taking the dislon approach and why the dislon Hamiltonian takes its current form. Then, we study the electron-dislocation and phonon-dislocation scattering problems using the dislon formalism. Both the effective electron and phonon theories are derived, from which the role of dislocations on electronic and phononic transport properties is computed. Compared with traditional dislocation scattering studies, which are intrinsically single-particle, low-order perturbation and classical quenched defect in nature, the dislon theory not only allows easy incorporation of quantum many-body effects such as electron correlation, electron-phonon interaction, and higher-order scattering events, but also allows proper consideration of the dislocation’s long-range strain field and dynamic aspects on equal footing for arbitrary types of straight-line dislocations. This means that instead of developing individual models for specific dislocation scattering problems, the dislon theory allows for the calculation of electronic structure and electrical transport, thermal transport, optical and superconducting properties, etc, under one unified theory. Furthermore, the dislon theory has another advantage over empirical models in that it requires no fitting parameters. The dislon theory could serve as a major computational tool to understand the role of dislocations on multiple materials’ functional properties at an unprecedented level of clarity, and may have wide applications in dislocated energy materials.
Causal hydrodynamics of gauge theory plasmas from AdS/CFT duality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Natsuume, Makoto; Okamura, Takashi; Department of Physics, Kwansei Gakuin University, Sanda, Hyogo, 669-1337
2008-03-15
We study causal hydrodynamics (Israel-Stewart theory) of gauge theory plasmas from the AdS/CFT duality. Causal hydrodynamics requires new transport coefficients (relaxation times) and we compute them for a number of supersymmetric gauge theories including the N=4 super Yang-Mills theory. However, the relaxation times obtained from the 'shear mode' do not agree with the ones from the 'sound mode', which implies that the Israel-Stewart theory is not a sufficient framework to describe the gauge theory plasmas.
Goli, Mohammad; Shahbazian, Shant
2018-06-20
Recently we have proposed an effective Hartree-Fock (EHF) theory for the electrons of the muonic molecules that is formally equivalent to the HF theory within the context of the nuclear-electronic orbital theory [Phys. Chem. Chem. Phys., 2018, 20, 4466]. In the present report we extend the muon-specific effective electronic structure theory beyond the EHF level by introducing the effective second order Møller-Plesset perturbation theory (EMP2) and the effective coupled-cluster theory at single and double excitation levels (ECCSD) as well as an improved version including perturbative triple excitations (ECCSD(T)). These theories incorporate electron-electron correlation into the effective paradigm and, through their computational implementation, a diverse set of small muonic species is considered as a benchmark at these post-EHF levels. A comparative computational study on this set demonstrates that the muonic bond length is in general non-negligibly longer than that of the corresponding hydrogenic analogs. Next, the developed post-EHF theories are applied to the muoniated N-heterocyclic carbene/silylene/germylene and the muoniated triazolium cation, revealing the relative stability of the sticking sites of the muon in each species. The computational results, in line with previously reported experimental data, demonstrate that the muon generally prefers to attach to the divalent atom with carbeneic nature. A detailed comparison of these muonic adducts with the corresponding hydrogenic adducts reveals subtle differences that have previously been overlooked.
DOE Office of Scientific and Technical Information (OSTI.GOV)
D'Ariano, Giacomo Mauro
2010-05-04
I will argue that the proposal of establishing operational foundations of Quantum Theory should have top priority, and that Lucien Hardy's program on Quantum Gravity should be paralleled by an analogous program on Quantum Field Theory (QFT), which needs to be reformulated, notwithstanding its experimental success. In this paper, after reviewing recently suggested operational 'principles of the quantumness', I address the problem of whether Quantum Theory and Special Relativity are unrelated theories, or instead, if the one implies the other. I show how Special Relativity can indeed be derived from causality of Quantum Theory, within the computational paradigm 'the universe is a huge quantum computer', reformulating QFT as a Quantum-Computational Field Theory (QCFT). In QCFT Special Relativity emerges from the fabric of the computational network, which also naturally embeds gauge invariance. In this scheme even the quantization rule and the Planck constant can in principle be derived as emergent from the underlying causal tapestry of space-time. In this way Quantum Theory remains the only theory operating the huge computer of the universe. Is the computational paradigm only a speculative tautology (theory as simulation of reality), or does it have a scientific value? The answer will come from Occam's razor, depending on the mathematical simplicity of QCFT. Here I will just start scratching the surface of QCFT, analyzing simple field theories, including Dirac's. The number of problems and unmotivated recipes that plague QFT strongly motivates us to undertake the QCFT project, since QCFT makes all such problems manifest, and forces a re-foundation of QFT.
ERIC Educational Resources Information Center
Campbell, Robert D.
2017-01-01
This paper presents an example of an approach to teaching financial theory at the college and post-graduate levels that I call "teaching backwards". In the more traditional approach, instructors begin by explaining financial theory, then proceed to give examples of the way this theory can be applied to a business problem, structuring…
ERIC Educational Resources Information Center
Liu, Xiufeng
2006-01-01
Based on current theories of chemistry learning, this study intends to test a hypothesis that computer modeling enhanced hands-on chemistry laboratories are more effective than hands-on laboratories or computer modeling laboratories alone in facilitating high school students' understanding of chemistry concepts. Thirty-three high school chemistry…
ERIC Educational Resources Information Center
Arnold, Nike
2007-01-01
Many studies (e.g., [Beauvois, M.H., 1998. "E-talk: Computer-assisted classroom discussion--attitudes and motivation." In: Swaffar, J., Romano, S., Markley, P., Arens, K. (Eds.), "Language learning online: Theory and practice in the ESL and L2 computer classroom." Labyrinth Publications, Austin, TX, pp. 99-120; Bump, J., 1990. "Radical changes in…
Representational geometry: integrating cognition, computation, and the brain.
Kriegeskorte, Nikolaus; Kievit, Rogier A
2013-08-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. Copyright © 2013 Elsevier Ltd. All rights reserved.
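The representational-distance-matrix analysis summarized above reduces to a few array operations; the following is an illustrative reconstruction (the variable names and the choice of correlation distance are assumptions, not the authors' code):

```python
import numpy as np

def representational_distance_matrix(patterns):
    """Pairwise correlation distances (1 - Pearson r) between
    condition-wise activity patterns (rows: conditions, cols: units)."""
    r = np.corrcoef(patterns)   # condition-by-condition correlation matrix
    return 1.0 - r              # correlation distance

def compare_rdms(rdm_a, rdm_b):
    """Second-order similarity: correlate the upper triangles of two RDMs."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]

# Toy example: 4 experimental conditions measured over 10 channels.
rng = np.random.default_rng(0)
patterns = rng.standard_normal((4, 10))
rdm = representational_distance_matrix(patterns)
```

Comparing a model's RDM to a brain RDM with `compare_rdms` is what allows model testing without any one-to-one mapping between model units and neurons.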
1998-08-07
cognitive flexibility theory and generative learning theory, which focus primarily on the individual student's cognitive development, collaborative... develop "Handling Transfusion Hazards," a computer program based upon cognitive flexibility theory principles. The Program: Handling Transfusion Hazards... computer program was developed according to cognitive flexibility theory principles. A generative version was then developed by embedding
Evaluation of the chondral modeling theory using fe-simulation and numeric shape optimization
Plochocki, Jeffrey H; Ward, Carol V; Smith, Douglas E
2009-01-01
The chondral modeling theory proposes that hydrostatic pressure within articular cartilage regulates joint size, shape, and congruence through regional variations in rates of tissue proliferation. The purpose of this study is to develop a computational model using a nonlinear two-dimensional finite element analysis in conjunction with numeric shape optimization to evaluate the chondral modeling theory. The model employed in this analysis is generated from an MR image of the medial portion of the tibiofemoral joint in a subadult male. Stress-regulated morphological changes are simulated until skeletal maturity and evaluated against the chondral modeling theory. The computed results are found to support the chondral modeling theory. The shape-optimized model exhibits increased joint congruence, broader stress distributions in articular cartilage, and a relative decrease in joint diameter. The results for the computational model correspond well with experimental data and provide valuable insights into the mechanical determinants of joint growth. The model also provides a crucial first step toward developing a comprehensive model that can be employed to test the influence of mechanical variables on joint conformation. PMID:19438771
Interferometric Computation Beyond Quantum Theory
NASA Astrophysics Data System (ADS)
Garner, Andrew J. P.
2018-03-01
There are quantum solutions for computational problems that make use of interference at some stage in the algorithm. These stages can be mapped into the physical setting of a single particle travelling through a many-armed interferometer. There has been recent foundational interest in theories beyond quantum theory. Here, we present a generalized formulation of computation in the context of a many-armed interferometer, and explore how theories can differ from quantum theory and still perform distributed calculations in this set-up. We shall see that quaternionic quantum theory proves a suitable candidate, whereas box-world does not. We also find that a classical hidden variable model first presented by Spekkens (Phys Rev A 75(3): 32100, 2007) can also be used for this type of computation due to the epistemic restriction placed on the hidden variable.
On the use and computation of the Jordan canonical form in system theory
NASA Technical Reports Server (NTRS)
Sridhar, B.; Jordan, D.
1974-01-01
This paper investigates various aspects of the application of the Jordan canonical form of a matrix in system theory and develops a computational approach to determining the Jordan form for a given matrix. Applications include pole placement, controllability and observability studies, serving as an intermediate step in yielding other canonical forms, and theorem proving. The computational method developed in this paper is both simple and efficient. The method is based on the definition of a generalized eigenvector and a natural extension of Gauss elimination techniques. Examples are included for demonstration purposes.
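As a hedged illustration (not the paper's algorithm, which builds on generalized eigenvectors and Gauss elimination), SymPy's exact-arithmetic routine shows what a Jordan form computation produces for a defective matrix:

```python
from sympy import Matrix

# Build a matrix with known Jordan structure: eigenvalue 2 in a 2x2
# Jordan block (defective: one eigenvector short) plus a simple
# eigenvalue 3, conjugated by an arbitrary invertible change of basis.
J_true = Matrix([[2, 1, 0],
                 [0, 2, 0],
                 [0, 0, 3]])
S = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])          # det(S) = 2, so S is invertible

M = S * J_true * S.inv()

P, J = M.jordan_form()           # decomposition M = P * J * P**-1
```

The recovered `J` contains the same blocks as `J_true` (possibly reordered), which is exactly the canonical-form property the paper exploits for pole placement and controllability studies.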
Item Response Theory: A Basic Concept
ERIC Educational Resources Information Center
Mahmud, Jumailiyah
2017-01-01
With developments in computing technology, item response theory (IRT) has developed rapidly and become a user-friendly application in the psychometrics world. Limitation in classical theory is one aspect that encourages the use of IRT. In this study, the basic concept of IRT will be discussed. In addition, it will briefly review the ability…
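The core object of IRT, the item characteristic curve, is compact enough to sketch. This minimal example uses the two-parameter logistic (2PL) model; the parameter values are illustrative only:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic item response function: probability that a
    person of ability theta answers correctly an item with
    discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# When ability equals item difficulty, the probability is exactly 0.5,
# whatever the discrimination parameter is.
p_at_difficulty = irt_2pl(theta=0.0, a=1.2, b=0.0)
```

Adding a fixed guessing parameter c, as P = c + (1 - c) * logistic(...), extends this to the 3PL model commonly used for multiple-choice items.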
Relating Theory and Practice in Laboratory Work: A Variation Theoretical Study
ERIC Educational Resources Information Center
Eckerdal, Anna
2015-01-01
Computer programming education has practice-oriented as well as theory-oriented learning goals. Here, lab work plays an important role in students' learning. It is however widely reported that many students face great difficulties in learning theory as well as practice. This paper investigates the important but problematic relation between the…
A Re-Examination of Information Seeking Behaviour in the Context of Activity Theory
ERIC Educational Resources Information Center
Wilson, T. D.
2006-01-01
Introduction: Activity theory, developed in the USSR as a Marxist alternative to Western psychology, has been applied widely in educational studies and increasingly in human-computer interaction research. Argument: The key elements of activity theory, Motivation, Goal, Activity, Tools, Object, Outcome, Rules, Community and Division of labour are…
Chunks in expert memory: evidence for the magical number four ... or is it two?
Gobet, Fernand; Clarkson, Gary
2004-11-01
This study aims to test the divergent predictions of the chunking theory (Chase & Simon, 1973) and template theory (Gobet & Simon, 1996a, 2000) with respect to the number of chunks held in visual short-term memory and the size of chunks used by experts. We presented game and random chessboards in both a copy and a recall task. In a within-subject design, the stimuli were displayed using two presentation media: (a) physical board and pieces, as in Chase and Simon's (1973) study; and (b) a computer display, as in Gobet and Simon's (1998) study. Results show that, in most cases, no more than three chunks were replaced in the recall task, as predicted by template theory. In addition, with game positions in the computer condition, chess Masters replaced very large chunks (up to 15 pieces), again in line with template theory. Overall, the results suggest that the original chunking theory overestimated short-term memory capacity and underestimated the size of chunks used, in particular with Masters. They also suggest that Cowan's (2001) proposal that STM holds four chunks may be an overestimate.
ERIC Educational Resources Information Center
Navarro, Aaron B.
1981-01-01
Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
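The original simulator was written in Level II BASIC; a comparable minimal Turing machine simulator is sketched here in Python for illustration (the rule encoding is an assumption, not the original program's):

```python
def run_tm(rules, tape, state="q0", max_steps=10_000):
    """Simulate a one-tape Turing machine. rules maps
    (state, symbol) -> (write_symbol, move, next_state), where move is
    -1 (left) or +1 (right); the machine stops in state 'halt'."""
    cells = dict(enumerate(tape))          # sparse tape; blank is '_'
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells.get(head, "_"))]
        cells[head] = write
        head += move
    out = "".join(cells.get(i, "_") for i in range(min(cells), max(cells) + 1))
    return out.strip("_")

# A machine that flips every bit and halts at the first blank.
flip_bits = {
    ("q0", "0"): ("1", +1, "q0"),
    ("q0", "1"): ("0", +1, "q0"),
    ("q0", "_"): ("_", -1, "halt"),
}
result = run_tm(flip_bits, "1011")   # → "0100"
```

The transition table makes the machine's finite control explicit, which is the pedagogical point such classroom simulators aim at.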
Why Can't a Computer Be More Like a Brain?
ERIC Educational Resources Information Center
Lerner, Eric J.
1984-01-01
Engineers seeking to develop intelligent computers have looked to studies of the human brain in hope of imitating its processes. A theory (known as cooperative action) that the brain processes information with electromagnetic waves may inspire engineers to develop entirely new types of computers. (JN)
Realizing the Promise of Visualization in the Theory of Computing
ERIC Educational Resources Information Center
Cogliati, Joshua J.; Goosey, Frances W.; Grinder, Michael T.; Pascoe, Bradley A.; Ross, Rockford J.; Williams, Cheston J.
2005-01-01
Progress on a hypertextbook on the theory of computing is presented. The hypertextbook is a novel teaching and learning resource built around web technologies that incorporates text, sound, pictures, illustrations, slide shows, video clips, and--most importantly--active learning models of the key concepts of the theory of computing into an…
Measuring uncertainty by extracting fuzzy rules using rough sets
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.
1991-01-01
Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency: computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined as approaches to this problem. The fundamentals of these theories are combined to possibly provide the optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much each rule is believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.
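The certain/possible rule split described above rests on rough-set lower and upper approximations, which reduce to simple set operations; a minimal sketch (the example universe and classes are invented for illustration):

```python
def rough_approximations(partition, target):
    """Lower/upper approximations of a target set, given the partition of
    the universe into indiscernibility classes. Certain rules come from
    the lower approximation, possible rules from the upper one."""
    target = set(target)
    lower = set().union(*[set(c) for c in partition if set(c) <= target])
    upper = set().union(*[set(c) for c in partition if set(c) & target])
    return lower, upper

# Objects 1..5 grouped into indiscernibility classes by their attributes.
classes = [{1, 2}, {3}, {4, 5}]
lower, upper = rough_approximations(classes, {2, 3, 4})
# lower == {3}: only class {3} lies entirely inside the target.
# upper == {1, 2, 3, 4, 5}: every class overlaps the target.
```

The boundary region `upper - lower` is where only possible (not certain) rules can be stated, which is where the fuzzy belief measure would apply.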
ERIC Educational Resources Information Center
Cetin, Ibrahim
2015-01-01
The purpose of this study is to explore students' understanding of loops and nested loops concepts. Sixty-three mechanical engineering students attending an introductory programming course participated in the study. APOS (Action, Process, Object, Schema) is a constructivist theory developed originally for mathematics education. This study is the…
On computational Gestalt detection thresholds.
Grompone von Gioi, Rafael; Jakubowicz, Jérémie
2009-01-01
The aim of this paper is to show some recent developments of computational Gestalt theory, as pioneered by Desolneux, Moisan and Morel. The new results make it possible to predict the detection thresholds much more accurately. This step is unavoidable if one wants to analyze visual detection thresholds in the light of computational Gestalt theory. The paper first recalls the main elements of computational Gestalt theory. It points out a precision issue in this theory, essentially due to the use of discrete probability distributions. It then proposes to overcome this issue by using continuous probability distributions, and illustrates the approach on the meaningful alignment detector of Desolneux et al.
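At the heart of computational Gestalt detection is the number of false alarms (NFA), built from a binomial tail; the following is a hedged sketch of the standard discrete definition (the parameter values are illustrative), i.e. the very quantity whose precision the paper's continuous-distribution treatment improves:

```python
from math import comb

def binomial_tail(n, k, p):
    """P[B(n, p) >= k]: chance that at least k of n independent points
    are 'aligned' under the a-contrario (pure noise) model."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def nfa(num_tests, n, k, p):
    """Number of false alarms: expected count of events at least this
    extreme in noise; an event is epsilon-meaningful when
    NFA <= epsilon (epsilon = 1 is the usual choice)."""
    return num_tests * binomial_tail(n, k, p)

# 15 of 20 points aligned to within 1/16 of the angular range is far
# more meaningful (much lower NFA) than 5 of 20.
strong = nfa(10_000, 20, 15, 1 / 16)
weak = nfa(10_000, 20, 5, 1 / 16)
```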
Blazevski, Daniel; Franklin, Jennifer
2012-12-01
Scattering theory is a convenient way to describe systems that are subject to time-dependent perturbations which are localized in time. Using scattering theory, one can compute time-dependent invariant objects for the perturbed system knowing the invariant objects of the unperturbed system. In this paper, we use scattering theory to give numerical computations of invariant manifolds appearing in laser-driven reactions. In this setting, invariant manifolds separate regions of phase space that lead to different outcomes of the reaction and can be used to compute reaction rates.
Developmental Stages in School Computer Use: Neither Marx Nor Piaget.
ERIC Educational Resources Information Center
Lengel, James G.
Karl Marx's theory of stages can be applied to computer use in the schools. The first stage, the P Stage, comprises the entry of the computer into the school. Computer use at this stage is personal and tends to center around one personality. Social studies teachers are seldom among this select few. The second stage of computer use, the D Stage, is…
ERIC Educational Resources Information Center
Wyche, Susan Porter
2010-01-01
This research focuses on the development and study of Information and Communication Technology (ICT) that support religious practices and the use of standpoint theory in ICT evaluation studies. Three phases makeup this work: formative studies to understand how megachurches, their members and leaders use ICT in ways tied to their Protestant…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J. D.; Oberkampf, William Louis; Helton, Jon Craig
2006-10-01
Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
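Evidence theory's less restrictive specification of uncertainty comes from assigning mass to sets of values rather than to single values; a minimal belief/plausibility sketch (the mass assignment is an invented example, not from the report):

```python
def belief_plausibility(masses, query):
    """Belief and plausibility of a query set under a Dempster-Shafer
    basic mass assignment {frozenset(focal_element): mass}."""
    query = frozenset(query)
    bel = sum(m for s, m in masses.items() if s <= query)   # full support
    pl = sum(m for s, m in masses.items() if s & query)     # any overlap
    return bel, pl

# Epistemic uncertainty about a model input level: some evidence cannot
# discriminate between values, so its mass sits on a set.
masses = {
    frozenset({"low"}): 0.3,
    frozenset({"low", "mid"}): 0.5,
    frozenset({"low", "mid", "high"}): 0.2,
}
bel, pl = belief_plausibility(masses, {"low", "mid"})   # 0.8, 1.0
```

The interval [Bel, Pl] brackets the probability a classical analysis would be forced to pin down; propagating it through a model is what makes the sampling-based strategy above computationally demanding.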
Recent developments in rotary-wing aerodynamic theory
NASA Technical Reports Server (NTRS)
Johnson, W.
1986-01-01
Current progress in the computational analysis of rotary-wing flowfields is surveyed, and some typical results are presented in graphs. Topics examined include potential theory, rotating coordinate systems, lifting-surface theory (moving singularity, fixed wing, and rotary wing), panel methods (surface singularity representations, integral equations, and compressible flows), transonic theory (the small-disturbance equation), wake analysis (hovering rotor-wake models and transonic blade-vortex interaction), limitations on computational aerodynamics, and viscous-flow methods (dynamic-stall theories and lifting-line theory). It is suggested that the present algorithms and advanced computers make it possible to begin working toward the ultimate goal of turbulent Navier-Stokes calculations for an entire rotorcraft.
Introduction to the theory of machines and languages
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weidhaas, P. P.
1976-04-01
This text is intended to be an elementary "guided tour" through some basic concepts of modern computer science. Various models of computing machines and formal languages are studied in detail. Discussions center around questions such as, "What is the scope of problems that can or cannot be solved by computers?"
NASA Technical Reports Server (NTRS)
Finley, Gail T.
1988-01-01
This report covers the study of the relational database implementation in the NASCAD computer program system. The existing system is used primarily for computer aided design. Attention is also directed to a hidden-surface algorithm for final drawing output.
NASA Astrophysics Data System (ADS)
Saheb, Vahid; Maleki, Samira
2018-03-01
The hydrogen abstraction reactions from CH3CCl2F (R-141b) and CH3CClF2 (R-142b) by OH radicals are studied theoretically by semi-classical transition state theory. The stationary points for the reactions are located using the KMLYP density functional method with the 6-311++G(2d,2p) basis set and the MP2 method with the 6-311+G(d,p) basis set. Single-point energy calculations are performed with the CBS-Q and G4 composite methods on the geometries optimized at the KMLYP/6-311++G(2d,2p) level of theory. Vibrational anharmonicity coefficients, x_ij, which are needed for semi-classical transition state theory calculations, are computed at the KMLYP/6-311++G(2d,2p) and MP2/6-311+G(d,p) levels of theory. The computed barrier heights are slightly sensitive to the quantum-chemical method. Thermal rate coefficients are computed over the temperature range from 200 to 2000 K and are shown to be in accordance with available experimental data. On the basis of the computed rate coefficients, the tropospheric lifetimes of CH3CCl2F and CH3CClF2 are estimated to be about 6.5 and 12.0 years, respectively.
Hollow cathodes as electron emitting plasma contactors Theory and computer modeling
NASA Technical Reports Server (NTRS)
Davis, V. A.; Katz, I.; Mandell, M. J.; Parks, D. E.
1987-01-01
Several researchers have suggested using hollow cathodes as plasma contactors for electrodynamic tethers, particularly to prevent the Shuttle Orbiter from charging to large negative potentials. Previous studies have shown that fluid models with anomalous scattering can describe the electron transport in hollow cathode generated plasmas. An improved theory of the hollow cathode plasmas is developed and computational results using the theory are compared with laboratory experiments. Numerical predictions for a hollow cathode plasma source of the type considered for use on the Shuttle are presented, as are three-dimensional NASCAP/LEO calculations of the emitted ion trajectories and the resulting potentials in the vicinity of the Orbiter. The computer calculations show that the hollow cathode plasma source makes vastly superior contact with the ionospheric plasma compared with either an electron gun or passive ion collection by the Orbiter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Siqin; Department of Chemistry, Hong Kong University of Science and Technology, Clear Water Bay, Kowloon; Sheong, Fu Kit
Reference interaction site model (RISM) has recently become a popular approach in the study of thermodynamical and structural properties of the solvent around macromolecules. On the other hand, it was widely suggested that there exists water density depletion around large hydrophobic solutes (>1 nm), and this may pose a great challenge to the RISM theory. In this paper, we develop a new analytical theory, the Reference Interaction Site Model with Hydrophobicity induced density Inhomogeneity (RISM-HI), to compute solvent radial distribution function (RDF) around large hydrophobic solute in water as well as its mixture with other polyatomic organic solvents. To achieve this, we have explicitly considered the density inhomogeneity at the solute-solvent interface using the framework of the Yvon-Born-Green hierarchy, and the RISM theory is used to obtain the solute-solvent pair correlation. In order to efficiently solve the relevant equations while maintaining reasonable accuracy, we have also developed a new closure called the D2 closure. With this new theory, the solvent RDFs around a large hydrophobic particle in water and different water-acetonitrile mixtures could be computed, which agree well with the results of the molecular dynamics simulations. Furthermore, we show that our RISM-HI theory can also efficiently compute the solvation free energy of solute with a wide range of hydrophobicity in various water-acetonitrile solvent mixtures with a reasonable accuracy. We anticipate that our theory could be widely applied to compute the thermodynamic and structural properties for the solvation of hydrophobic solute.
Maymon, Rebecca; Hall, Nathan C; Goetz, Thomas; Chiarella, Andrew; Rahimi, Sonia
2018-01-01
As technology becomes increasingly integrated with education, research on the relationships between students' computing-related emotions and motivation following technological difficulties is critical to improving learning experiences. Following from Weiner's (2010) attribution theory of achievement motivation, the present research examined relationships between causal attributions and emotions concerning academic computing difficulties in two studies. Study samples consisted of North American university students enrolled in both traditional and online universities (total N = 559) who responded to either hypothetical scenarios or experimental manipulations involving technological challenges experienced in academic settings. Findings from Study 1 showed stable and external attributions to be emotionally maladaptive (more helplessness, boredom, guilt), particularly in response to unexpected computing problems. Additionally, Study 2 found stable attributions for unexpected problems to predict more anxiety for traditional students, with both external and personally controllable attributions for minor problems proving emotionally beneficial for students in online degree programs (more hope, less anxiety). Overall, hypothesized negative effects of stable attributions were observed across both studies, with mixed results for personally controllable attributions and unanticipated emotional benefits of external attributions for academic computing problems warranting further study.
Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.
ERIC Educational Resources Information Center
Edwards, Thomas O.
The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of connectionism in learning, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…
2003-03-01
sociocultural theory of learning was pioneered by Lev Vygotsky in the early twentieth century Soviet Union. Although his works were not published…
Thermodynamic and transport properties of nitrogen fluid: Molecular theory and computer simulations
NASA Astrophysics Data System (ADS)
Eskandari Nasrabad, A.; Laghaei, R.
2018-04-01
Computer simulations and various theories are applied to compute the thermodynamic and transport properties of nitrogen fluid. To model the nitrogen interaction, an existing potential in the literature is modified to obtain a close agreement between the simulation results and experimental data for the orthobaric densities. We use the Generic van der Waals theory to calculate the mean free volume and apply the results within the modified Cohen-Turnbull relation to obtain the self-diffusion coefficient. Compared to experimental data, excellent results are obtained via computer simulations for the orthobaric densities, the vapor pressure, the equation of state, and the shear viscosity. We analyze the results of the theory and computer simulations for the various thermophysical properties.
A computer architecture for intelligent machines
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Saridis, G. N.
1992-01-01
The theory of intelligent machines proposes a hierarchical organization for the functions of an autonomous robot based on the principle of increasing precision with decreasing intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed. The authors present a computer architecture that implements the lower two levels of the intelligent machine. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Execution-level controllers for motion and vision systems are briefly addressed, as well as the Petri net transducer software used to implement coordination-level functions. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.
A Flow Theory Perspective on Learner Motivation and Behavior in Distance Education
ERIC Educational Resources Information Center
Liao, Li-Fen
2006-01-01
Motivating learners to continue to study and enjoy learning is one of the critical factors in distance education. Flow theory is a useful framework for studying the individual experience of learning through using computers. In this study, I examine students' emotional and cognitive responses to distance learning systems by constructing two models…
Design and evaluation of a computer game to promote a healthy diet for young adults.
Peng, Wei
2009-03-01
This article reports the development and evaluation of a computer game (RightWay Café) as a special medium to promote a healthy diet for young adults. Structural features of computer games, such as interactive tailoring, role playing, the element of fun, and narrative, were operationalized in the RightWay Café game to afford behavior rehearsal in a safe and entertaining way. Theories such as the health belief model, social cognitive theory, and theory of reasoned action guided the content design of the game to influence mediators of behavior change, including self-efficacy, perceived benefits, perceived barriers, and behavior change intention. A randomized controlled evaluation study with pretest, posttest, and follow-up design demonstrated that this game was effective in teaching nutrition and weight management knowledge and increasing people's self-efficacy and perceived benefits of healthy eating, as well as their intention to be on a healthy diet. Limited long-term effects were also found: participants in the game-playing group had greater self-efficacy than participants in the control group after 1 month. This study validates the computer game-based approach to health promotion for young adults. Limitations and implications are also discussed.
ERIC Educational Resources Information Center
Venkatesh, Vijay P.
2013-01-01
The current computing landscape owes its roots to the birth of hardware and software technologies from the 1940s and 1950s. Since then, the advent of mainframes, miniaturized computing, and internetworking has given rise to the now prevalent cloud computing era. In the past few months just after 2010, cloud computing adoption has picked up pace…
NASA Astrophysics Data System (ADS)
Chen, Su Shing; Caulfield, H. John
1994-03-01
Adaptive Computing, as contrasted with classical computing, is emerging as a field that represents the culmination of more than 40 years of work in various scientific and technological areas, including cybernetics, neural networks, pattern recognition networks, learning machines, self-reproducing automata, genetic algorithms, fuzzy logics, probabilistic logics, chaos, electronics, optics, and quantum devices. This volume of "Critical Reviews on Adaptive Computing: Mathematics, Electronics, and Optics" is intended as a synergistic approach to this emerging field. Many researchers in these areas are working on important results. However, we have not seen a general effort to summarize and synthesize these results in theory as well as in implementation. In order to reach a higher level of synergism, we propose Adaptive Computing as the field comprising the above-mentioned computational paradigms and their various realizations. The field should include both the Theory (or Mathematics) and the Implementation. Our emphasis is on the interplay of Theory and Implementation. This interplay, an adaptive process itself, is the only "holistic" way to advance our understanding and realization of brain-like computation. We feel that a theory without implementation tends to become unrealistic and "out of touch" with reality, while an implementation without theory risks being superficial and obsolete.
Holographic P-wave superconductors in 1+1 dimensions
NASA Astrophysics Data System (ADS)
Alkac, Gokhan; Chakrabortty, Shankhadeep; Chaturvedi, Pankaj
2017-10-01
We study (1+1)-dimensional P-wave holographic superconductors described by three-dimensional Einstein-Maxwell gravity coupled to a massive complex vector field in the context of the AdS3/CFT2 correspondence. In the probe limit, where the backreaction of matter fields is neglected, we show that a vector hair forms around the black hole below a certain critical temperature. In the dual strongly coupled (1+1)-dimensional boundary theory, this holographically corresponds to the formation of a charged vector condensate which spontaneously breaks both the U(1) and SO(1,1) symmetries. We numerically compute both the free energy and the ac conductivity for the superconducting phase of the boundary field theory. Our numerical computations clearly establish that the superconducting phase of the boundary theory is favored over the normal phase, and that the presence of a magnetic moment term in the dual bulk theory affects the conductivity in the boundary field theory.
ERIC Educational Resources Information Center
Mavrou, Katerina; Lewis, Ann; Douglas, Graeme
2010-01-01
This paper discusses the results of a study of the role of the computer in scaffolding pupils' interaction and its effects on the disabled (D) pupils' participation and inclusion in the context of socio-cultural theories and the ideals of inclusive education. The study investigated the interactions of pairs of D and non-disabled (ND) pupils…
Higher-Order Adaptive Finite-Element Methods for Kohn-Sham Density Functional Theory
2012-07-03
For the benchmark systems studied, we observe diminishing returns in computational savings beyond the sixth order for accuracies commensurate with chemical accuracy. Further, we demonstrate the capability of the proposed approach to compute the electronic structure of large materials systems.
Dudding, Travis; Houk, Kendall N
2004-04-20
The catalytic asymmetric thiazolium- and triazolium-catalyzed benzoin condensations of aldehydes and ketones were studied with computational methods. Transition-state geometries were optimized by using Morokuma's IMOMO [integrated MO (molecular orbital) + MO method] variation of ONIOM (n-layered integrated molecular orbital method) with a combination of B3LYP/6-31G(d) and AM1 levels of theory, and final transition-state energies were computed with single-point B3LYP/6-31G(d) calculations. Correlations between experiment and theory were found, and the origins of stereoselection were identified. Thiazolium catalysts were predicted to be less selective than triazolium catalysts, a trend also found experimentally.
ERIC Educational Resources Information Center
Rambe, Patient
2012-01-01
Studies that employed activity theory as a theoretical lens for exploring computer-mediated interaction have not adopted social media as their object of study. However, social media provides lecturers with personalised learning environments for diagnostic and prognostic assessments of student mastery of content and deep learning. The integration…
Aspects of perturbation theory in quantum mechanics: the BenderWu MATHEMATICA® package
NASA Astrophysics Data System (ADS)
Sulejmanpasic, Tin; Ünsal, Mithat
2018-07-01
We discuss a general setup which allows the study of the perturbation theory of an arbitrary, locally harmonic 1D quantum mechanical potential as well as its multi-variable (many-body) generalization. The latter may form a prototype for regularized quantum field theory. We first generalize the method of Bender and Wu, and derive exact recursion relations which allow the determination of the perturbative wave-function and energy corrections to an arbitrary order, at least in principle. For 1D systems, we implement these equations in an easy-to-use MATHEMATICA® package we call BenderWu. Our package enables quick home-computer computation of high orders of perturbation theory (about 100 orders in 10-30 s, and 250 orders in 1-2 h) and enables practical study of a large class of problems in quantum mechanics. We have two hopes concerning the BenderWu package. One is that, due to resurgence, a large amount of non-perturbative information, such as non-perturbative energies and wave-functions (e.g. WKB wave functions), can in principle be extracted from the perturbative data. We also hope that the package may be used as a teaching tool, providing an effective bridge between perturbation theory and non-perturbative physics in textbooks. Finally, we show that for the multi-variable case, the recursion relation acquires a geometric character, and has a structure which allows parallelization to computer clusters.
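The kind of perturbative corrections the package computes recursively can be illustrated at low order. As a minimal sketch (not the BenderWu algorithm itself, and with an arbitrary basis truncation `nmax`), the following computes the first- and second-order Rayleigh-Schrödinger corrections to the ground-state energy of a harmonic oscillator perturbed by λx⁴, in the number basis with ħ = m = ω = 1; the known coefficients are 3/4 and -21/8.

```python
import numpy as np

def anharmonic_corrections(nmax=60):
    # Harmonic oscillator basis (hbar = m = omega = 1): E0_n = n + 1/2
    n = np.arange(nmax)
    E0 = n + 0.5
    # Position operator in the number basis: <n|x|n+1> = sqrt((n+1)/2)
    x = np.zeros((nmax, nmax))
    off = np.sqrt((n[:-1] + 1) / 2.0)
    x[np.arange(nmax - 1), np.arange(1, nmax)] = off
    x[np.arange(1, nmax), np.arange(nmax - 1)] = off
    V = np.linalg.matrix_power(x, 4)   # perturbation x^4 (lambda factored out)
    # Rayleigh-Schrodinger corrections for the ground state
    e1 = V[0, 0]                                 # first order: 3/4
    m = np.arange(1, nmax)
    e2 = np.sum(V[m, 0]**2 / (E0[0] - E0[m]))    # second order: -21/8
    return e1, e2

e1, e2 = anharmonic_corrections()
```

Iterating this construction to high order is exactly where recursion relations of the Bender-Wu type become essential, since the sums proliferate rapidly.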
The Dimensionality and Correlates of Flow in Human-Computer Interactions.
ERIC Educational Resources Information Center
Webster, Jane; And Others
1993-01-01
Defines playfulness in human-computer interactions in terms of flow theory and explores the dimensionality of the flow concept. Two studies are reported that investigated the factor structure and correlates of flow in human-computer interactions: one examined MBA students using Lotus 1-2-3 spreadsheet software, and one examined employees using…
Computer Aided Instruction: A Study of Student Evaluations and Academic Performance
ERIC Educational Resources Information Center
Collins, David; Deck, Alan; McCrickard, Myra
2008-01-01
Computer aided instruction (CAI) encompasses a broad range of computer technologies that supplement the classroom learning environment and can dramatically increase a student's access to information. Criticism of CAI generally focuses on two issues: it lacks an adequate foundation in educational theory and the software is difficult to implement…
Optimal and Nonoptimal Computer-Based Test Designs for Making Pass-Fail Decisions
ERIC Educational Resources Information Center
Hambleton, Ronald K.; Xing, Dehui
2006-01-01
Now that many credentialing exams are being routinely administered by computer, new computer-based test designs, along with item response theory models, are being aggressively researched to identify specific designs that can increase the decision consistency and accuracy of pass-fail decisions. The purpose of this study was to investigate the…
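For readers unfamiliar with the item response theory models mentioned above, a minimal sketch may help: under the two-parameter logistic (2PL) model, the probability of a correct response depends on examinee ability θ and an item's discrimination a and difficulty b. The item parameters and cut score below are invented for illustration, not taken from the study.

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: P(correct | ability theta)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def expected_score(theta, items):
    """Expected number-correct score over a list of (a, b) item parameters."""
    return sum(p_correct(theta, a, b) for a, b in items)

# Hypothetical 3-item test with a number-correct cut score of 2
items = [(1.0, -0.5), (1.2, 0.0), (0.8, 0.5)]
passes = expected_score(1.0, items) >= 2.0
```

Test designs differ in which (a, b) pairs are administered at each ability level, which is what drives the decision consistency and accuracy the study investigates.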
NASA Technical Reports Server (NTRS)
Chu, Y. Y.
1978-01-01
A unified formulation of computer-aided, multi-task decision making is presented. A strategy for the allocation of decision-making responsibility between human and computer is developed. The plans of a flight management system are studied. A model based on queueing theory was implemented.
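A queueing-theory formulation of the kind mentioned can be sketched with the standard M/M/1 results (the rates below are illustrative, not from the study): with task arrival rate λ and service rate μ, utilization is ρ = λ/μ, the mean number of tasks in the system is ρ/(1-ρ), and the mean time in system is 1/(μ-λ).

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics for a single decision-making server."""
    rho = arrival_rate / service_rate          # utilization, must be < 1
    if rho >= 1:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    l = rho / (1 - rho)                        # mean number of tasks in system
    w = 1 / (service_rate - arrival_rate)      # mean time a task spends in system
    return rho, l, w

rho, l, w = mm1_metrics(arrival_rate=2.0, service_rate=5.0)
```

Note that l = λ·w (Little's law), which provides a quick sanity check on any such model.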
Engaging or Distracting: Children's Tablet Computer Use in Education
ERIC Educational Resources Information Center
McEwen, Rhonda N.; Dubé, Adam K.
2015-01-01
Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…
Dualities and Topological Field Theories from Twisted Geometries
NASA Astrophysics Data System (ADS)
Markov, Ruza
I will present three studies of string theory on twisted geometries. In the first calculation included in this dissertation we use gauge/gravity duality to study the Coulomb branch of an unusual type of nonlocal field theory, called Puff Field Theory. On the gravity side, this theory is given in terms of D3-branes in type IIB string theory with a geometric twist. The field theory description, available in the IR limit, is a deformation of Yang-Mills gauge theory by an order-seven operator, which we compute here. In the rest of this dissertation we explore N = 4 super Yang-Mills (SYM) theory compactified on a circle with S-duality and R-symmetry twists that preserve N = 6 supersymmetry in 2 + 1D. It was shown that the abelian theory on a flat manifold gives Chern-Simons theory in the low-energy limit, and here we are interested in the non-abelian counterpart. To that end, we introduce external static supersymmetric quark and anti-quark sources into the theory and calculate the Witten Index of the resulting Hilbert space of ground states on a two-torus. Using these results we compute the action of simple Wilson loops on the Hilbert space of ground states without sources. In some cases we find disagreement between our results for the Wilson loop eigenvalues and previous conjectures about a connection with Chern-Simons theory. The last result discussed in this dissertation demonstrates a connection between gravitational Chern-Simons theory and N = 4 four-dimensional SYM theory compactified on an S-duality-twisted circle, where the remaining three-manifold is not flat, starting from the explicit geometric realization of S-duality in terms of the (2, 0) theory.
On Evaluating Human Problem Solving of Computationally Hard Problems
ERIC Educational Resources Information Center
Carruthers, Sarah; Stege, Ulrike
2013-01-01
This article is concerned with how computer science, and more exactly computational complexity theory, can inform cognitive science. In particular, we suggest factors to be taken into account when investigating how people deal with computational hardness. This discussion will address the two upper levels of Marr's Level Theory: the computational…
Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory
NASA Technical Reports Server (NTRS)
Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.
2015-01-01
An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface-mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures, because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.
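Theodorsen's theory, referenced above, centers on the lift-deficiency function C(k) of reduced frequency k, expressible through Hankel functions of the second kind. A minimal numerical sketch (using SciPy's standard special functions, not the authors' code) is:

```python
from scipy.special import hankel2

def theodorsen(k):
    """Theodorsen lift-deficiency function
    C(k) = H1^(2)(k) / (H1^(2)(k) + i * H0^(2)(k))."""
    h1 = hankel2(1, k)
    h0 = hankel2(0, k)
    return h1 / (h1 + 1j * h0)
```

C(k) → 1 as k → 0 (the quasi-steady limit) and its real part tends to 1/2 at high reduced frequency, which is the attenuation and phase lag of unsteady lift that the surge and pitch experiments probe.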
Perceptions of teaching and learning automata theory in a college-level computer science course
NASA Astrophysics Data System (ADS)
Weidmann, Phoebe Kay
This dissertation identifies and describes student and instructor perceptions that contribute to effective teaching and learning of Automata Theory in a competitive college-level Computer Science program. Effective teaching is the ability to create an appropriate learning environment in order to provide effective learning. We define effective learning as the ability of a student to meet instructor-set learning objectives, demonstrating this by passing the course, while reporting a good learning experience. We conducted our investigation through a detailed qualitative case study of two sections (118 students) of Automata Theory (CS 341) at The University of Texas at Austin taught by Dr. Lily Quilt. Because Automata Theory has a largely fixed curriculum, in the sense that many curricula and textbooks agree on what Automata Theory contains and differ mainly in the depth and amount of material covered in a single course, a case study would allow for generalizable findings. Automata Theory is especially problematic in a Computer Science curriculum since students are not experienced in abstract thinking before taking this course, fail to understand the relevance of the theory, and prefer classes with more concrete activities such as programming. This creates a special challenge for any instructor of Automata Theory, as motivation becomes critical for student learning. Through the use of student surveys, instructor interviews, classroom observation, and material and course grade analysis, we sought to understand what students perceived, what instructors expected of students, and how those perceptions played out in the classroom in terms of structure and instruction. Our goal was to create suggestions that would lead to a better designed course and thus a higher student success rate in Automata Theory. We created a unique theoretical basis, pedagogical positivism, on which to study college-level courses.
Pedagogical positivism states that through examining instructor and student perceptions of teaching and learning, improvements to a course are possible. These improvements can eventually develop a "best practice" instructional environment. This view is not possible under a strictly constructivist learning theory as there is no way to teach a group of individuals in a "best" way. Using this theoretical basis, we examined the gathered data from CS 341. (Abstract shortened by UMI.)
Computational Complexity and Human Decision-Making.
Bossaerts, Peter; Murawski, Carsten
2017-12-01
The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
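The intractability argument can be made concrete with the 0/1 knapsack task often used in this literature: an exhaustive decision-maker must examine 2^n subsets of options, so the work doubles with each added item. The toy below is an illustration of that combinatorial explosion, not the authors' experimental task.

```python
from itertools import combinations

def best_subset(values, weights, capacity):
    """Brute-force 0/1 knapsack: examines all 2^n subsets of items."""
    n = len(values)
    best_value, evaluated = 0, 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            evaluated += 1
            w = sum(weights[i] for i in subset)
            if w <= capacity:
                best_value = max(best_value, sum(values[i] for i in subset))
    return best_value, evaluated

value, n_eval = best_subset([6, 10, 12], [1, 2, 3], capacity=5)
```

Even at n = 3 the search touches 8 subsets; at n = 30 it would touch over a billion, which is the resource constraint the article argues theories of decision-making must respect.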
Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation
NASA Astrophysics Data System (ADS)
Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri
2017-10-01
We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ˜1.6 at about 3Eg, where Eg is the electronic gap.
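The competition the Boltzmann equation describes between phonon-mediated relaxation and exciton multiplication can be caricatured with a toy rate model (the rates below are illustrative; the paper's collision integrals come from many-body perturbation theory): a hot exciton either relaxes, yielding 1 exciton, or multiplies into a biexciton, yielding 2, so the quantum efficiency interpolates between 1 and 2.

```python
def quantum_efficiency(k_relax, k_mult, dt=1e-3, t_max=20.0):
    """Euler-integrate a toy branching model:
    hot exciton -> exciton (rate k_relax) or biexciton (rate k_mult)."""
    hot, exciton, biexciton = 1.0, 0.0, 0.0
    for _ in range(int(t_max / dt)):
        dh = -(k_relax + k_mult) * hot * dt
        exciton += k_relax * hot * dt
        biexciton += k_mult * hot * dt
        hot += dh
    return exciton + 2.0 * biexciton   # excitons per absorbed photon

qe = quantum_efficiency(k_relax=1.0, k_mult=0.6)
```

The result matches the analytic branching ratio (k_relax + 2·k_mult)/(k_relax + k_mult); the photon-energy dependence of QE in the paper arises because multiplication only opens above the 2Eg threshold.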
Effects of shock on hypersonic boundary layer stability
NASA Astrophysics Data System (ADS)
Pinna, F.; Rambaud, P.
2013-06-01
The design of hypersonic vehicles requires the estimate of the laminar to turbulent transition location for an accurate sizing of the thermal protection system. Linear stability theory is a fast scientific way to study the problem. Recent improvements in computational capabilities allow computing the flow around a full vehicle instead of using only simplified boundary layer equations. In this paper, the effect of the shock is studied on a mean flow provided by steady Computational Fluid Dynamics (CFD) computations and simplified boundary layer calculations.
The entropic boundary law in BF theory
NASA Astrophysics Data System (ADS)
Livine, Etera R.; Terno, Daniel R.
2009-01-01
We compute the entropy of a closed bounded region of space for pure 3d Riemannian gravity formulated as a topological BF theory for the gauge group SU(2) and show its holographic behavior. More precisely, we consider a fixed graph embedded in space and study the flat connection spin network state without and with particle-like topological defects. We regularize and compute exactly the entanglement for a bipartite splitting of the graph and show it scales at leading order with the number of vertices on the boundary (or equivalently with the number of loops crossing the boundary). More generally these results apply to BF theory with any compact gauge group in any space-time dimension.
NASA Astrophysics Data System (ADS)
Palmer, T. N.
2012-12-01
This essay discusses a proposal that draws together the three great revolutionary theories of 20th Century physics: quantum theory, relativity theory and chaos theory. Motivated by the Bohmian notion of implicate order, and by what in chaos theory would be described as a strange attractor, the proposal attributes special ontological significance to certain non-computable, dynamically invariant state-space geometries for the universe as a whole. Studying the phenomenon of quantum interference, it is proposed that quantum wave-particle duality, and indeed classical electromagnetism, be understood in terms of particles in space-time and waves on this state-space geometry. In the context of the EPR experiment, the acausal constraints that this invariant geometry places on spatially distant degrees of freedom provide a way for the underlying dynamics to be consistent with the Bell theorem yet relativistically covariant ("nonlocality without nonlocality"). It is suggested that the physical basis for such non-computable geometries lies in properties of gravity, with the information irreversibility implied by black-hole no-hair theorems being crucial. In conclusion it is proposed that quantum theory may be emergent from an extended theory of gravity which is geometric not only in space-time, but also in state space. Such a notion would undermine most current attempts to "quantise gravity".
The Schwarzian theory — origins
NASA Astrophysics Data System (ADS)
Mertens, Thomas G.
2018-05-01
In this paper we further study the 1d Schwarzian theory, the universal low-energy limit of Sachdev-Ye-Kitaev models, using the link with 2d Liouville theory. We provide a path-integral derivation of the structural link between both theories, and study the relation between 3d gravity, 2d Jackiw-Teitelboim gravity, 2d Liouville and the 1d Schwarzian. We then generalize the Schwarzian double-scaling limit to rational models, relevant for SYK-type models with internal symmetries. We identify the holographic gauge theory as a 2d BF theory and compute correlators of the holographically dual 1d particle-on-a-group action, decomposing these into diagrammatic building blocks, in a manner very similar to the Schwarzian theory.
T-Duality for Orientifolds and Twisted KR-Theory
NASA Astrophysics Data System (ADS)
Doran, Charles; Méndez-Diez, Stefan; Rosenberg, Jonathan
2014-08-01
D-brane charges in orientifold string theories are classified by the KR-theory of Atiyah. However, this is assuming that all O-planes have the same sign. When there are O-planes of different signs, physics demands a "KR-theory with a sign choice" which up until now has not been studied by mathematicians (with the unique exception of Moutuou, who did not have a specific application in mind). We give a definition of this theory and compute it for orientifold theories compactified on S^1 and T^2. We also explain how and why additional "twisting" is implemented. We show that our results satisfy all possible T-duality relationships for orientifold string theories on elliptic curves, which will be studied further in subsequent work.
NASA Astrophysics Data System (ADS)
Binti Shamsuddin, Norsila
Technology advancement and development in a higher learning institution offers students an opportunity to be motivated to learn the information technology areas in depth. Students should take hold of the opportunity to build their skills in these technologies in preparation for graduation. The curriculum itself can raise students' interest and persuade them to be directly involved in the evolution of the technology. The aim of this study is to examine how deeply students are involved with, and how far they accept, the technology used in Computer Graphics and Image Processing subjects. The study covers Bachelor students in the Faculty of Industrial Information Technology (FIIT), Universiti Industri Selangor (UNISEL): Bac. in Multimedia Industry, BSc. Computer Science, and BSc. Computer Science (Software Engineering). This study utilizes the Unified Theory of Acceptance and Use of Technology (UTAUT) to further validate the model and enhance our understanding of the adoption of Computer Graphics and Image Processing technologies. Four of the eight independent factors in UTAUT will be studied against the dependent factor.
Density-Functional Theory Study of Materials and Their Properties at Non-Zero Temperature
NASA Astrophysics Data System (ADS)
Antolin, Nikolas
Density functional theory (DFT) has proven useful in providing energetic and structural data to inform higher levels of simulation as well as populate materials databases. However, DFT does not intrinsically include temperature effects that are critical to determining materials behavior in real-world applications. By considering the magnitude of critical energy differences in a system to be studied, one may select the appropriate level of additional theory with which to supplement DFT to obtain meaningful results with respect to temperature-induced behavior. This thesis details studies on three materials systems, representing three distinct levels of additional theory used in the study of thermally-induced behavior. After introducing the concepts involved in extracting thermal data from atomistics and density functional theory in chapters 1 and 2, chapter 3 details studies on a Ni-base superalloy system and its behavior in creep testing at high temperature due to planar defects. Chapters 4 and 5 detail work on thermal stabilization of BCC phases which are unstable without temperature effects and the progress in calculating the thermodynamic stability of vacancies in these and other BCC systems. Chapter 6 describes a study of thermal effects coupling to magnetism in indium antimonide (InSb), which are the result of previously unobserved coupling between phonons and magnetic field in a diamagnetic material. All three of the systems studied exhibit materials properties which are strongly temperature-dependent, but the level of theory necessary to study them varies from simple ground state calculations to consideration of the effects of single vibrational modes within the material. Since many of the approaches used and introduced here are computationally intensive and push the limits of publicly available computational resources, this thesis puts additional focus on optimizing code execution and choosing an appropriate level of theory to probe a given material system. 
An inappropriate level of theory can either be computationally wasteful (or even infeasible) or yield meaningless results; it is only by including the appropriate thermal effects, determined by the system to be considered, that valid results can be obtained. Though much progress has been made in generalizing the approaches described in this thesis, further research will be necessary if we hope to fulfill the lofty goal of a universally applicable method of extracting thermal data from first principles in a way that guarantees valid and useful results.
Robust flow stability: Theory, computations and experiments in near wall turbulence
NASA Astrophysics Data System (ADS)
Bobba, Kumar Manoj
Helmholtz established the field of hydrodynamic stability with his pioneering work in 1868. From then on, hydrodynamic stability became an important tool in understanding various fundamental fluid flow phenomena in engineering (mechanical, aeronautics, chemical, materials, civil, etc.) and science (astrophysics, geophysics, biophysics, etc.), and turbulence in particular. However, there are many discrepancies between classical hydrodynamic stability theory and experiments. In this thesis, the limitations of traditional hydrodynamic stability theory are shown and a framework for robust flow stability theory is formulated. A host of techniques such as gramians, singular values, and operator norms are introduced to understand the role of various kinds of uncertainty. An interesting feature of this framework is the close interplay between theory and computations. It is shown that a subset of the Navier-Stokes equations is globally, nonlinearly stable for all Reynolds numbers. Yet, invoking this new theory, it is shown that these equations produce structures (vortices and streaks) as seen in the experiments. The experiments are done in a zero-pressure-gradient transitional boundary layer on a flat plate in a free-surface tunnel. Digital particle image velocimetry, and MEMS-based laser Doppler velocimeters and shear stress sensors, have been used to make quantitative measurements of the flow. Various theoretical and computational predictions are in excellent agreement with the experimental data. A closely related topic, the modeling, simulation and complexity reduction of large mechanics problems with multiple spatial and temporal scales, is also studied. A method that rigorously quantifies the important scales and automatically gives models of the problem to various levels of accuracy is introduced. Computations done using spectral methods are presented.
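The role that operator norms play in this framework can be illustrated with the classic non-normal toy model of shear flow: a linearly stable 2×2 system whose transient energy growth sup_t ||e^{At}||² far exceeds 1 even though both eigenvalues are negative. The matrix below is a standard textbook caricature, not the thesis' operator.

```python
import numpy as np

def max_transient_growth(R, t_grid):
    """Max energy growth sup_t ||exp(A t)||_2^2 for the non-normal model
    A = [[-1/R, 1], [0, -2/R]]; both eigenvalues are stable for R > 0."""
    l1, l2 = -1.0 / R, -2.0 / R
    growth = 0.0
    for t in t_grid:
        e1, e2 = np.exp(l1 * t), np.exp(l2 * t)
        # exact matrix exponential of the upper-triangular A
        expm = np.array([[e1, (e1 - e2) / (l1 - l2)],
                         [0.0, e2]])
        growth = max(growth, np.linalg.norm(expm, 2) ** 2)
    return growth

g = max_transient_growth(R=100.0, t_grid=np.linspace(0.0, 500.0, 2001))
```

For R = 100 the peak growth is of order R²/16, a large transient amplification that eigenvalue analysis alone misses; this is the mechanism behind streak formation in subcritical transition.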
A computational model of self-efficacy's various effects on performance: Moving the debate forward.
Vancouver, Jeffrey B; Purl, Justin D
2017-04-01
Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, the explanation of the various self-efficacy effects has been grounded in a perceptual control theory view of self-regulation that social-cognitive theorists question. To provide more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects of self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role on the self-efficacy-to-performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research and directions the model inspires. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
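A perceptual-control-style discrepancy-reduction loop of the kind under debate can be sketched as a toy (this is an illustration of the mechanism, not Vancouver and Purl's model): the agent allocates effort in proportion to the perceived gap between goal and state, and self-efficacy scales the perception of progress, so inflated efficacy shrinks the perceived gap and with it the effort applied.

```python
def goal_striving(self_efficacy, goal=100.0, steps=50, gain=0.2):
    """Toy discrepancy-reduction loop where perceived progress
    is scaled by self-efficacy (an assumed, illustrative mechanism)."""
    state = 0.0
    for _ in range(steps):
        perceived = min(goal, state * self_efficacy)  # efficacy biases perception
        effort = gain * (goal - perceived)            # effort tracks perceived gap
        state += effort
    return state

low, high = goal_striving(self_efficacy=0.8), goal_striving(self_efficacy=1.2)
```

In this caricature the low-efficacy agent overshoots the goal while the high-efficacy agent falls short of it, reproducing the sign reversal at the center of the debate.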
Simulation of Nonlinear Instabilities in an Attachment-Line Boundary Layer
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.
1996-01-01
The linear and the nonlinear stability of disturbances that propagate along the attachment line of a three-dimensional boundary layer is considered. The spatially evolving disturbances in the boundary layer are computed by direct numerical simulation (DNS) of the unsteady, incompressible Navier-Stokes equations. Disturbances are introduced either by forcing at the inflow or by applying suction and blowing at the wall. Quasi-parallel linear stability theory and a nonparallel theory yield notably different stability characteristics for disturbances near the critical Reynolds number; the DNS results confirm the latter theory. Previously, a weakly nonlinear theory and computations revealed a high wave-number region of subcritical disturbance growth. More recent computations have failed to achieve this subcritical growth. The present computational results indicate the presence of subcritically growing disturbances; the results support the weakly nonlinear theory. Furthermore, an explanation is provided for the previous theoretical and computational discrepancy. In addition, the present results demonstrate that steady suction can be used to stabilize disturbances that otherwise grow subcritically along the attachment line.
Geometry of the perceptual space
NASA Astrophysics Data System (ADS)
Assadi, Amir H.; Palmer, Stephen; Eghbalnia, Hamid; Carew, John
1999-09-01
The concepts of space and geometry vary across disciplines. Following Poincare, we consider the construction of the perceptual space as a continuum equipped with a notion of magnitude. The study of the relationships of objects in the perceptual space gives rise to what we may call perceptual geometry. Computational modeling of objects and investigation of their deeper perceptual geometrical properties (beyond qualitative arguments) require a mathematical representation of the perceptual space. Within the realm of such a mathematical/computational representation, visual perception can be studied as in the well-understood logic-based geometry. This, however, does not mean that one could reduce all problems of visual perception to their geometric counterparts. Rather, visual perception, as reported by a human observer, has a subjective factor that could be analytically quantified only through statistical reasoning and in the course of repetitive experiments. Thus, the desire to experimentally verify the statements in perceptual geometry leads to an additional probabilistic structure imposed on the perceptual space, whose amplitudes are measured through intervention by human observers. We propose a model for the perceptual space and the case of perception of textured surfaces as a starting point for object recognition. To rigorously present these ideas and propose computational simulations for testing the theory, we present the model of the perceptual geometry of surfaces through an extension of the theory of Riemannian foliations in differential topology, augmented by statistical learning theory. When we refer to the perceptual geometry of a human observer, the theory takes into account the Bayesian formulation of the prior state of the knowledge of the observer and Hebbian learning. We use a Parallel Distributed Connectionist paradigm for computational modeling and experimental verification of our theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dang, Liem X.; Vo, Quynh N.; Nilsson, Mikael
We report one of the first simulations using a classical rate theory approach to predict the mechanism of the exchange process between water and aqueous uranyl ions. Using our water and ion-water polarizable force fields and molecular dynamics techniques, we computed the potentials of mean force for the uranyl ion-water pair as a function of pressure at ambient temperature. Subsequently, these simulated potentials of mean force were used to calculate rate constants using transition state theory; the time-dependent transmission coefficients were also examined using the reactive flux method and Grote-Hynes treatments of the dynamic response of the solvent. The computed activation volumes from transition state theory and the corrected rate constants are positive; thus the mechanism of this particular water exchange is a dissociative process. We discuss our rate theory results and compare them with previous studies in which non-polarizable force fields were used. This work was supported by the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. The calculations were carried out using computer resources provided by the Office of Basic Energy Sciences.
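The pipeline described above, from a potential of mean force to a transition-state-theory rate constant, can be sketched in one dimension. The double-well PMF below is purely illustrative (not the simulated uranyl-water profile), units are reduced (kB·T = 1, reduced mass 1), and the radial TST expression with the r² Jacobian is one common textbook form; the paper's reactive-flux transmission coefficient κ would multiply k_TST to give the corrected rate.

```python
import numpy as np

# Illustrative model PMF: a bound well near r = 2.5 plus a barrier at r = 3.5.
# All parameters are made up for this sketch; a real W(r) comes from simulation.
def W(r):
    return (10.0 * np.exp(-((r - 3.5) / 0.3) ** 2)
            - 8.0 * np.exp(-((r - 2.5) / 0.4) ** 2))

beta = 1.0          # reduced units: kB*T = 1
r_ts = 3.5          # dividing surface placed at the barrier top

# Reactant population integral with the radial Jacobian r^2 (simple Riemann sum).
r = np.linspace(2.0, r_ts, 4000)
dr = r[1] - r[0]
population = np.sum(r ** 2 * np.exp(-beta * W(r))) * dr

# One-dimensional radial TST rate: mean thermal speed times the Boltzmann
# weight at the dividing surface, normalized by the reactant population.
v_mean = np.sqrt(1.0 / (2.0 * np.pi))   # sqrt(kB*T / (2*pi*mu)) with mu = 1
k_tst = v_mean * r_ts ** 2 * np.exp(-beta * W(r_ts)) / population
# The reactive-flux transmission coefficient kappa corrects this: k = kappa * k_tst.
```

Raising the barrier height in `W` suppresses `k_tst` exponentially, which is the sensitivity the pressure-dependent PMFs in the study exploit to extract activation volumes.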
Shegog, Ross; Bartholomew, L Kay; Gold, Robert S; Pierrel, Elaine; Parcel, Guy S; Sockrider, Marianna M; Czyzewski, Danita I; Fernandez, Maria E; Berlin, Nina J; Abramson, Stuart
2006-01-01
Translating behavioral theories, models, and strategies to guide the development and structure of computer-based health applications is well recognized as valuable, although it remains a challenge for program developers. A stepped approach to translating behavioral theory in the design of simulations to teach chronic disease management to children is described. This includes steps to: 1) define target behaviors and their determinants, 2) identify theoretical methods to optimize behavioral change, and 3) choose educational strategies to effectively apply these methods and combine them into a cohesive computer-based simulation for health education. Asthma is used to exemplify a chronic health management problem, and a computer-based asthma management simulation (Watch, Discover, Think and Act) that has been evaluated and shown to affect asthma self-management in children is used to exemplify the application of theory to practice. Impact and outcome evaluation studies have indicated the effectiveness of these steps in providing increased rigor and accountability, suggesting their utility for educators and developers seeking to apply simulations to enhance self-management behaviors in patients.
Learner-Environment Fit: University Students in a Computer Room.
ERIC Educational Resources Information Center
Yeaman, Andrew R. J.
The purpose of this study was to apply the theory of person-environment fit in assessing student well-being in a university computer room. Subjects were 12 students enrolled in a computer literacy course. Their learning behavior and well-being were evaluated on the basis of three symptoms of video display terminal stress usually found in the…
ERIC Educational Resources Information Center
Velez-Rubio, Miguel
2013-01-01
Teaching computer programming to freshmen students in Computer Sciences and other Information Technology areas has been identified as a complex activity. Different approaches have been studied looking for the best one that could help to improve this teaching process. A proposed approach was implemented which is based in the language immersion…
ERIC Educational Resources Information Center
Heift, Trude; Schulze, Mathias
2012-01-01
This book provides the first comprehensive overview of theoretical issues, historical developments and current trends in ICALL (Intelligent Computer-Assisted Language Learning). It assumes a basic familiarity with Second Language Acquisition (SLA) theory and teaching, CALL and linguistics. It is of interest to upper undergraduate and/or graduate…
Large-N kinetic theory for highly occupied systems
NASA Astrophysics Data System (ADS)
Walz, R.; Boguslavski, K.; Berges, J.
2018-06-01
We consider an effective kinetic description for quantum many-body systems, which is not based on a weak-coupling or diluteness expansion. Instead, it employs an expansion in the number of field components N of the underlying scalar quantum field theory. Extending previous studies, we demonstrate that the large-N kinetic theory at next-to-leading order is able to describe important aspects of highly occupied systems, which are beyond standard perturbative kinetic approaches. We analyze the underlying quasiparticle dynamics by computing the effective scattering matrix elements analytically and solve numerically the large-N kinetic equation for a highly occupied system far from equilibrium. This allows us to compute the universal scaling form of the distribution function at an infrared nonthermal fixed point within a kinetic description, and we compare to existing lattice field theory simulation results.
Supersonic Leading Edge Receptivity
NASA Technical Reports Server (NTRS)
Maslov, Anatoly A.
1998-01-01
This paper describes experimental studies of leading edge boundary layer receptivity for imposed stream disturbances. Studies were conducted in the supersonic T-325 facility at ITAM and include data for both sharp and blunt leading edges. The data are in agreement with existing theory and should provide guidance for the development of more complete theories and numerical computations of this phenomena.
Accelerating MP2C dispersion corrections for dimers and molecular crystals
NASA Astrophysics Data System (ADS)
Huang, Yuanhang; Shao, Yihan; Beran, Gregory J. O.
2013-06-01
The MP2C dispersion correction of Pitonak and Hesselmann [J. Chem. Theory Comput. 6, 168 (2010)], 10.1021/ct9005882 substantially improves the performance of second-order Møller-Plesset perturbation theory for non-covalent interactions, albeit with non-trivial computational cost. Here, the MP2C correction is computed in a monomer-centered basis instead of a dimer-centered one. When applied to a single dimer MP2 calculation, this change accelerates the MP2C dispersion correction several-fold while introducing only trivial new errors. More significantly, in the context of fragment-based molecular crystal studies, combination of the new monomer basis algorithm and the periodic symmetry of the crystal reduces the cost of computing the dispersion correction by two orders of magnitude. This speed-up reduces the MP2C dispersion correction calculation from a significant computational expense to a negligible one in crystals like aspirin or oxalyl dihydrazide, without compromising accuracy.
Framing: Supporting Change for a System as an External Activity
1998-03-01
result of interacting with members of other social worlds. Activity Theory (Leontev 1978; Nardi 1996; Vygotsky) ...demonstrates how computer systems can aid people in organisations conceiving situations that change an organisation's behaviour. A theory of framing is...demonstrate how the theory of framing can be used to aid people framing situations that change an organisation's behaviour. Two case studies are used to
Design guidelines for the use of audio cues in computer interfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sumikawa, D.A.; Blattner, M.M.; Joy, K.I.
1985-07-01
A logical next step in the evolution of the computer-user interface is the incorporation of sound, thereby using our sense of hearing in our communication with the computer. This allows our visual and auditory capacities to work in unison, leading to a more effective and efficient interpretation of information received from the computer than by sight alone. In this paper we examine earcons, audio cues used in the computer-user interface to provide information and feedback to the user about computer entities (these include messages and functions, as well as states and labels). The material in this paper is part of a larger study that recommends guidelines for the design and use of audio cues in the computer-user interface. The complete work examines the disciplines of music, psychology, communication theory, advertising, and psychoacoustics to discover how sound is utilized and analyzed in those areas. The resulting information is organized according to the theory of semiotics, the theory of signs, into the syntax, semantics, and pragmatics of communication by sound. Here we present design guidelines for the syntax of earcons. Earcons are constructed from motives, short sequences of notes with a specific rhythm and pitch, embellished by timbre, dynamics, and register. Compound earcons and family earcons are introduced; these are related motives that serve to identify a family of related cues. Examples of earcons are given.
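The syntactic building blocks described here lend themselves to a small data model. The sketch below is a hypothetical illustration of motives and family earcons; the class and field names are invented for this example, not taken from the guidelines themselves.

```python
from dataclasses import dataclass

# Hypothetical data model for earcon syntax: a motive is a short note
# sequence (pitch + rhythm), embellished by timbre, register, and dynamics.

@dataclass
class Note:
    pitch: str        # e.g. "C4"
    duration: float   # in beats

@dataclass
class Motive:
    notes: list       # the rhythm/pitch pattern shared by a family
    timbre: str = "sine"
    register: int = 4   # octave
    dynamics: str = "mf"

def derive_family_member(parent, **changes):
    """A family earcon keeps the parent's note pattern but varies
    timbre/register/dynamics to signal a related message."""
    attrs = {"notes": list(parent.notes), "timbre": parent.timbre,
             "register": parent.register, "dynamics": parent.dynamics}
    attrs.update(changes)
    return Motive(**attrs)

error_parent = Motive(notes=[Note("C4", 0.25), Note("G3", 0.5)], timbre="square")
disk_error = derive_family_member(error_parent, register=3, dynamics="f")
# disk_error keeps the parent's rhythm/pitch pattern but is lower and louder,
# so a listener can recognize it as a member of the "error" family.
```

Compound earcons would follow the same idea by concatenating motives, which is why the syntax-level guidelines focus on keeping motives short and distinctive.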
Disciplines, models, and computers: the path to computational quantum chemistry.
Lenhard, Johannes
2014-12-01
Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion, and this market extends far beyond the circle of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.
NASA Technical Reports Server (NTRS)
Tessler, Alexander; Gherlone, Marco; Versino, Daniele; DiSciuva, Marco
2012-01-01
This paper reviews the theoretical foundation and computational mechanics aspects of the recently developed shear-deformation theory, called the Refined Zigzag Theory (RZT). The theory is based on a multi-scale formalism in which an equivalent single-layer plate theory is refined with a robust set of zigzag local layer displacements that are free of the usual deficiencies found in common plate theories with zigzag kinematics. In the RZT, first-order shear-deformation plate theory is used as the equivalent single-layer plate theory, which represents the overall response characteristics. Local piecewise-linear zigzag displacements are used to provide corrections to these overall response characteristics that are associated with the plate heterogeneity and the relative stiffnesses of the layers. The theory does not rely on shear correction factors and is equally accurate for homogeneous, laminated composite, and sandwich beams and plates. Regardless of the number of material layers, the theory maintains only seven kinematic unknowns that describe the membrane, bending, and transverse shear plate-deformation modes. Derived from the virtual work principle, RZT is well-suited for developing computationally efficient, C(sup 0)-continuous finite elements; formulations of several RZT-based elements are highlighted. The theory and its finite element approximations thus provide a unified and reliable computational platform for the analysis and design of high-performance load-bearing aerospace structures.
Evaluation of a Computer-Tailored Osteoporosis Prevention Intervention in Young Women
ERIC Educational Resources Information Center
Lein, Donald H., Jr.; Clark, Diane; Turner, Lori W.; Kohler, Connie L.; Snyder, Scott; Morgan, Sarah L.; Schoenberger, Yu-Mei M.
2014-01-01
Purpose: The purpose of this study was to evaluate the effectiveness of a theory-based computer-tailored osteoporosis prevention program on calcium and vitamin D intake and osteoporosis health beliefs in young women. Additionally, this study tested whether adding bone density testing to the intervention improved the outcomes. Methods: One hundred…
Learning Vocabulary in a Foreign Language: A Computer Software Based Model Attempt
ERIC Educational Resources Information Center
Yelbay Yilmaz, Yasemin
2015-01-01
This study aimed at devising a vocabulary learning software that would help learners learn and retain vocabulary items effectively. Foundation linguistics and learning theories have been adapted to the foreign language vocabulary learning context using a computer software named Parole that was designed exclusively for this study. Experimental…
NASA Astrophysics Data System (ADS)
Stockert, Sven; Wehr, Matthias; Lohmar, Johannes; Abel, Dirk; Hirt, Gerhard
2017-10-01
In the electrical and medical industries, the trend towards further miniaturization of devices is accompanied by the demand for smaller manufacturing tolerances. Such industries use a multitude of small, narrow, cold-rolled metal strips with high thickness accuracy. Conventional rolling mills can hardly improve these tolerances further. However, a model-based controller combined with an additional piezoelectric actuator for highly dynamic roll adjustment is expected to enable the production of the required metal strips with a thickness tolerance of +/-1 µm. The model-based controller must be built on a rolling theory that describes the rolling process very accurately; additionally, the required computing time must be low enough to predict the rolling process in real time. In this work, four rolling theories from the literature with different levels of complexity are tested for their suitability for the predictive controller. The rolling theories of von Kármán, Siebel, Bland & Ford, and Alexander are implemented in Matlab and afterwards transferred to the real-time computer used for the controller. The prediction accuracy of these theories is validated in rolling trials with different thickness reductions and a comparison to the calculated results. Furthermore, the required computing time on the real-time computer is measured. Adequate prediction accuracy can be achieved with the rolling theories developed by Bland & Ford and Alexander. A comparison of the computing times of these two theories reveals that Alexander's theory exceeds the 1 ms sampling period dictated by the 1 kHz sample rate of the real-time computer.
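The real-time constraint discussed above is easy to state concretely: at a 1 kHz sample rate, one model evaluation must finish within 1 ms. The sketch below checks that budget for two stand-in functions; `cheap_model` and `expensive_model` are hypothetical placeholders, not implementations of the Bland & Ford or Alexander theories, and the timing outcome depends on the host machine.

```python
import time

SAMPLE_PERIOD_S = 1.0 / 1000.0   # 1 kHz controller -> 1 ms per evaluation

def cheap_model(reduction):
    # stands in for a closed-form theory: a handful of arithmetic operations
    return 1.0 + 0.8 * reduction

def expensive_model(reduction, n_iter=200_000):
    # stands in for an iterative theory whose solve dominates the budget
    force = 1.0
    for _ in range(n_iter):
        force = 0.5 * (force + (1.0 + 0.8 * reduction) / force)
    return force

def fits_budget(model, *args):
    """Return True if one evaluation of `model` completes within the period."""
    start = time.perf_counter()
    model(*args)
    return (time.perf_counter() - start) <= SAMPLE_PERIOD_S

cheap_ok = fits_budget(cheap_model, 0.1)        # closed-form: well within 1 ms
expensive_ok = fits_budget(expensive_model, 0.1)  # iterative: likely too slow
```

A production controller would of course measure worst-case, not single-shot, timing, but the same pass/fail criterion applies.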
The AdS/CFT Correspondence: Classical, Quantum, and Thermodynamical Aspects
NASA Astrophysics Data System (ADS)
Young, Donovan
2007-06-01
Certain aspects of the AdS/CFT correspondence are studied in detail. We investigate the one-loop mass shift to certain two-impurity string states in light-cone string field theory on a plane wave background. We find that there exist logarithmic divergences in the sums over intermediate mode numbers which cancel between the cubic Hamiltonian and quartic "contact term". We argue that generically, every order in intermediate state impurities contributes to the mass shift at leading perturbative order. The same mass shift is also computed using an improved 3-string vertex proposed by Dobashi and Yoneya. The result is found to agree with gauge theory at leading order and is close but not quite in agreement at subleading order. We extend the analysis to include discrete light-cone quantization, considering states with up to three units of p+. We study the (apparently) first-order phase transition in the weakly coupled plane-wave matrix model at finite temperature. We analyze the effect of interactions by computing the relevant parts of the effective potential for the Polyakov loop operator to three loop order. We show that the phase transition is indeed of first order. We also compute the 2-loop correction to the Hagedorn temperature. Finally, correlation functions of 1/4 BPS Wilson loops with the infinite family of 1/2 BPS chiral primary operators are computed in N=4 super Yang-Mills theory by summing planar ladder diagrams. The correlation functions are also computed in the strong-coupling limit using string theory; the result is found to agree with the extrapolation of the planar ladders. The result is related to similar correlators of 1/2 BPS loops by a simple re-scaling of the coupling constant, discovered by Drukker for the case of the 1/4 BPS loop VEV.
Computational Study of the Structure of a Sepiolite/Thioindigo Mayan Pigment
Alvarado, Manuel; Chianelli, Russell C.; Arrowood, Roy M.
2012-01-01
The interaction of thioindigo and the phyllosilicate clay sepiolite is investigated using density functional theory (DFT) and molecular orbital theory (MO). The best fit to experimental UV/Vis spectra occurs when a single thioindigo molecule attaches via Van der Waals forces to a tetrahedrally coordinated Al3+ cation with an additional nearby tetrahedrally coordinated Al3+ also present. The thioindigo molecule distorts from its planar structure, a behavior consistent with a color change. Due to the weak interaction between thioindigo and sepiolite we conclude that the thioindigo molecule must be trapped in a channel, an observation consistent with previous experimental studies. Future computational studies will look at the interaction of indigo with sepiolite. PMID:23193386
Tetraquark resonances computed with static lattice QCD potentials and scattering theory
NASA Astrophysics Data System (ADS)
Bicudo, Pedro; Cardoso, Marco; Peters, Antje; Pflaumer, Martin; Wagner, Marc
2018-03-01
We study tetraquark resonances with lattice QCD potentials computed for two static quarks and two dynamical quarks, the Born-Oppenheimer approximation and the emergent wave method of scattering theory. As a proof of concept we focus on systems with isospin I = 0, but consider different relative angular momenta l of the heavy b quarks. We compute the phase shifts and search for S and T matrix poles in the second Riemann sheet. We predict a new tetraquark resonance for l = 1, decaying into two B mesons, with quantum numbers I(JP) = 0(1-), mass m = 10576 (+4/-4) MeV and decay width Γ = 112 (+90/-103) MeV.
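As a rough illustration of how a resonance of this mass and width appears in a phase shift, one can use the textbook Breit-Wigner form, in which δ(E) rises through 90° at the resonance mass. This is a deliberate simplification of the paper's emergent-wave computation and second-sheet pole search; only the central values m = 10576 MeV and Γ = 112 MeV are taken from the abstract.

```python
import numpy as np

m_res, gamma = 10576.0, 112.0   # MeV, central values from the abstract

def phase_shift(E):
    # Breit-Wigner phase shift: delta(E) = arctan[(Gamma/2) / (m - E)],
    # continued smoothly through 90 degrees via arctan2.
    return np.arctan2(gamma / 2.0, m_res - E)

E = np.linspace(10300.0, 10850.0, 2001)
delta = phase_shift(E)

# The energy where delta crosses pi/2 locates the resonance; in the full
# scattering-theory treatment this corresponds to a second-sheet pole of S.
E_cross = E[np.argmin(np.abs(delta - np.pi / 2.0))]
```

The rapid rise of δ over an interval of order Γ around `E_cross` is the signature a lattice phase-shift analysis looks for before fitting pole positions.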
Dudding, Travis; Houk, Kendall N.
2004-01-01
The catalytic asymmetric thiazolium- and triazolium-catalyzed benzoin condensations of aldehydes and ketones were studied with computational methods. Transition-state geometries were optimized by using Morokuma's IMOMO [integrated MO (molecular orbital) + MO method] variation of ONIOM (n-layered integrated molecular orbital method) with a combination of B3LYP/6-31G(d) and AM1 levels of theory, and final transition-state energies were computed with single-point B3LYP/6-31G(d) calculations. Correlations between experiment and theory were found, and the origins of stereoselection were identified. Thiazolium catalysts were predicted to be less selective than triazolium catalysts, a trend also found experimentally. PMID:15079058
Weck, Philippe F; Kim, Eunja
2014-12-07
The structure of dehydrated schoepite, α-UO2(OH)2, was investigated using computational approaches that go beyond standard density functional theory and include van der Waals dispersion corrections (DFT-D). Thermal properties of α-UO2(OH)2 were also obtained from phonon frequencies calculated with density functional perturbation theory (DFPT) including van der Waals dispersion corrections. While the isobaric heat capacity computed from first-principles reproduces available calorimetric data to within 5% up to 500 K, some entropy estimates based on calorimetric measurements for UO3·0.85H2O were found to overestimate by up to 23% the values computed in this study.
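The harmonic-phonon route from frequencies to heat capacity used above follows a standard formula: each mode contributes kB x² eˣ/(eˣ - 1)² with x = ħω/(kB T). The sketch below uses made-up frequencies (a real DFPT calculation supplies the ωᵢ) and yields the constant-volume harmonic value; the isobaric heat capacity reported in the paper additionally requires thermal-expansion data.

```python
import numpy as np

kB = 1.380649e-23        # J/K
hbar = 1.054571817e-34   # J*s

def heat_capacity(omegas, T):
    """Harmonic-oscillator heat capacity summed over phonon modes (J/K)."""
    x = hbar * np.asarray(omegas) / (kB * T)
    # Each mode: kB * x^2 e^x / (e^x - 1)^2 ; np.expm1 avoids cancellation.
    return kB * np.sum(x ** 2 * np.exp(x) / np.expm1(x) ** 2)

# Hypothetical phonon frequencies (rad/s) spanning roughly 1-20 THz:
omegas = 2.0 * np.pi * np.linspace(1e12, 2e13, 30)
cv_300 = heat_capacity(omegas, 300.0)
cv_500 = heat_capacity(omegas, 500.0)
# Cv grows with T and approaches the Dulong-Petit limit (N modes * kB).
```

In a real DFPT workflow the sum runs over all branches and a Brillouin-zone grid of q-points, but the per-mode formula is unchanged.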
Vazart, Fanny; Calderini, Danilo; Puzzarini, Cristina; Skouteris, Dimitrios
2017-01-01
We propose an integrated computational strategy aimed at providing reliable thermochemical and kinetic information on the formation processes of astrochemical complex organic molecules. The approach involves state-of-the-art quantum-mechanical computations, second-order vibrational perturbation theory, and kinetic models based on capture and transition state theory together with the master equation approach. Notably, tunneling, quantum reflection, and leading anharmonic contributions are accounted for in our model. Formamide has been selected as a case study in view of its interest as a precursor in the abiotic amino acid synthesis. After validation of the level of theory chosen for describing the potential energy surface, we have investigated several pathways of the OH+CH2NH and NH2+HCHO reaction channels. Our results indicate that both reaction channels are essentially barrier-less (in the sense that all relevant transition states lie below or only marginally above the reactants) and can, therefore, occur under the low temperature conditions of interstellar objects, provided that tunneling is taken into proper account. PMID:27689448
Electric Circuit Theory--Computer Illustrated Text.
ERIC Educational Resources Information Center
Riches, Brian
1990-01-01
Discusses the use of a computer-illustrated text (CIT) with integrated software to teach electric circuit theory to college students. Examples of software use are given, including simple animation, graphical displays, and problem-solving programs. Issues affecting electric circuit theory instruction are also addressed, including mathematical…
ERIC Educational Resources Information Center
Lorton, Paul, Jr.
EXPER-SIM (Experiment Simulation) is an instructional approach (with supporting computer programs) which allows an instructor to build a theory based model of how data would occur if an experiment were actually conducted in a world where the theory held true. The LESS version of EXPER-SIM was adapted to run on the Hewlett-Packard 2000E timesharing…
Midwives and the Computerization of Perinatal Data Entry: The Theory of Beneficial Engagement.
Craswell, Alison; Moxham, Lorna; Broadbent, Marc
2016-10-01
Theory building in nursing and midwifery, both to explain and to inform practice, is important to advance these professions via provision of a theoretical foundation. This research explored the process of perinatal data entry undertaken by midwives, examining the impact of the movement from paper to computer collection of data. Use of grounded theory methodology enabled theory building, leading to a theoretical understanding of the phenomenon and development of the Theory of Beneficial Engagement grounded in the data. Methods involved in-depth semistructured interviews with 15 users of perinatal data systems. Participants were recruited from 12 different healthcare locations and were utilizing three different electronic systems for data entry. The research question that guided the study focused on examining the influences of using the computer for perinatal data entry. Findings indicated that qualities particular to some midwives denoted engagement with perinatal data entry, suggesting a strong desire to enter complete, timely, and accurate data. The Theory of Beneficial Engagement provides a model of user engagement with systems for perinatal data entry consistent with other theories of engagement. The theory developed describes this phenomenon in a simple, elegant manner that can be applied to other areas where mandatory data entry is undertaken.
Wilson loops in supersymmetric gauge theories
NASA Astrophysics Data System (ADS)
Pestun, Vasily
This thesis is devoted to several exact computations in four-dimensional supersymmetric gauge field theories. In the first part of the thesis we prove a conjecture due to Erickson-Semenoff-Zarembo and Drukker-Gross which relates supersymmetric circular Wilson loop operators in the N = 4 supersymmetric Yang-Mills theory with a Gaussian matrix model. We also compute the partition function and give a new matrix model formula for the expectation value of a supersymmetric circular Wilson loop operator for the pure N = 2 and the N* = 2 supersymmetric Yang-Mills theory on a four-sphere. Circular supersymmetric Wilson loops in four-dimensional N = 2 superconformal gauge theory are treated similarly. In the second part we consider supersymmetric Wilson loops of arbitrary shape restricted to a two-dimensional sphere in the four-dimensional N = 4 supersymmetric Yang-Mills theory. We show that the expectation value for these Wilson loops can be exactly computed using a two-dimensional theory closely related to the topological two-dimensional Higgs-Yang-Mills theory, or two-dimensional Yang-Mills theory for the complexified gauge group.
Vibrations of cantilevered circular cylindrical shells: Shallow versus deep shell theory
NASA Technical Reports Server (NTRS)
Lee, J. K.; Leissa, A. W.; Wang, A. J.
1983-01-01
Free vibrations of cantilevered circular cylindrical shells having rectangular planforms are studied in this paper by means of the Ritz method. The deep shell theory of Novozhilov and Goldenveizer is used and compared with the usual shallow shell theory for a wide range of shell parameters. A thorough convergence study is presented along with comparisons to previously published finite element solutions and experimental results. Accurately computed frequency parameters and mode shapes for various shell configurations are presented. The present paper appears to be the first comprehensive study presenting rigorous comparisons between the two shell theories in dealing with free vibrations of cantilevered cylindrical shells.
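The Ritz procedure used in the study, assembling stiffness and mass matrices from admissible trial functions and solving a generalized eigenproblem, can be illustrated on the simplest related problem: a cantilevered Euler-Bernoulli beam with EI = ρA = L = 1 and polynomial trial functions x^(i+1) that satisfy the clamped-end conditions. This is a sketch of the method's structure only, not the shell computation itself.

```python
import numpy as np

# Ritz method for the fundamental frequency of a cantilever beam.
# Trial functions phi_a(x) = x**(a+1), a = 1..n, satisfy phi(0) = phi'(0) = 0.
# K_ab = int_0^1 phi_a'' phi_b'' dx and M_ab = int_0^1 phi_a phi_b dx have
# closed forms for monomials, so the matrices can be filled directly.

n = 4
K = np.empty((n, n))
M = np.empty((n, n))
for a in range(1, n + 1):
    for b in range(1, n + 1):
        # phi_a'' = (a+1) a x**(a-1)  ->  product integrates to /(a+b-1)
        K[a - 1, b - 1] = (a + 1) * a * (b + 1) * b / (a + b - 1)
        # phi_a phi_b = x**(a+b+2)    ->  integrates to 1/(a+b+3)
        M[a - 1, b - 1] = 1.0 / (a + b + 3)

# Generalized eigenproblem K v = omega^2 M v, solved via M^-1 K.
eigvals = np.linalg.eigvals(np.linalg.solve(M, K))
omega1 = np.sqrt(np.min(eigvals.real))
# Ritz estimates converge from above to the exact 1.8751**2 ~ 3.5160.
```

The convergence study in the paper does exactly this at a larger scale: enlarge the trial set, watch the frequency parameters settle, and compare against finite element and experimental results.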
Strong dynamics and lattice gauge theory
NASA Astrophysics Data System (ADS)
Schaich, David
In this dissertation I use lattice gauge theory to study models of electroweak symmetry breaking that involve new strong dynamics. Electroweak symmetry breaking (EWSB) is the process by which elementary particles acquire mass. First proposed in the 1960s, this process has been clearly established by experiments, and can now be considered a law of nature. However, the physics underlying EWSB is still unknown, and understanding it remains a central challenge in particle physics today. A natural possibility is that EWSB is driven by the dynamics of some new, strongly interacting force. Strong interactions invalidate the standard analytical approach of perturbation theory, making these models difficult to study. Lattice gauge theory is the premier method for obtaining quantitatively reliable, nonperturbative predictions from strongly interacting theories. In this approach, we replace spacetime by a regular, finite grid of discrete sites connected by links. The fields and interactions described by the theory are likewise discretized, and defined on the lattice so that we recover the original theory in continuous spacetime on an infinitely large lattice with sites infinitesimally close together. The finite number of degrees of freedom in the discretized system lets us simulate the lattice theory using high-performance computing. Lattice gauge theory has long been applied to quantum chromodynamics, the theory of strong nuclear interactions. Using lattice gauge theory to study dynamical EWSB, as I do in this dissertation, is a new and exciting application of these methods. Of particular interest is the nonperturbative lattice calculation of the electroweak S parameter. Experimentally S ≈ -0.15(10), which tightly constrains dynamical EWSB. On the lattice, I extract S from the momentum dependence of vector and axial-vector current correlators. I created and applied computer programs to calculate these correlators and analyze them to determine S.
I also calculated the masses and other properties of the new particles predicted by these theories. I find S ≳ 0.1 in the specific theories I study. Although this result still disagrees with experiment, it is much closer to the experimental value than the conventional wisdom of S ≳ 0.3. These results encourage further lattice studies to search for experimentally viable strongly interacting theories of EWSB.
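The final step described above, extracting S from the momentum dependence of the current correlators, amounts to measuring the small-Q² slope of the vector-minus-axial correlator difference. A minimal, normalization-free sketch of such a slope extraction (the function names and synthetic data are illustrative assumptions, not the dissertation's programs):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def vma_slope(q2, pi_v, pi_a, n_low=4):
    """Slope of Pi_V - Pi_A at small Q^2, estimated from the n_low
    lowest-momentum points; S is proportional to this slope
    (scheme-dependent normalization omitted)."""
    pts = sorted(zip(q2, [v - a for v, a in zip(pi_v, pi_a)]))[:n_low]
    xs, ys = zip(*pts)
    return linear_fit(xs, ys)[1]
```

On synthetic correlator data that differ by a term linear in Q², the routine recovers the injected slope.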
The New Physical Optics Notebook: Tutorials in Fourier Optics.
ERIC Educational Resources Information Center
Reynolds, George O.; And Others
This is a textbook of Fourier optics for the classroom or self-study. Major topics included in the 38 chapters are: Huygens' principle and Fourier transforms; image formation; optical coherence theory; coherent imaging; image analysis; coherent noise; interferometry; holography; communication theory techniques; analog optical computing; phase…
THE CURRENT STATUS OF RESEARCH AND THEORY IN HUMAN PROBLEM SOLVING.
ERIC Educational Resources Information Center
DAVIS, GARY A.
Problem-solving theories in three areas - traditional (stimulus-response) learning, cognitive-Gestalt approaches, and computer and mathematical models - were summarized. Recent empirical studies (1960-65) on problem solving were categorized according to the type of behavior elicited by particular problem-solving tasks. Anagram,…
ERIC Educational Resources Information Center
Singh, Gurmukh
2012-01-01
The present article is primarily targeted at advanced college/university undergraduate students of chemistry/physics education, computational physics/chemistry, and computer science. The recent software system MS Visual Studio .NET version 2010 is employed to perform computer simulations for modeling Bohr's quantum theory of…
The Mechanics of Embodiment: A Dialog on Embodiment and Computational Modeling
Pezzulo, Giovanni; Barsalou, Lawrence W.; Cangelosi, Angelo; Fischer, Martin H.; McRae, Ken; Spivey, Michael J.
2011-01-01
Embodied theories are increasingly challenging traditional views of cognition by arguing that conceptual representations that constitute our knowledge are grounded in sensory and motor experiences, and processed at this sensorimotor level, rather than being represented and processed abstractly in an amodal conceptual system. Given the established empirical foundation, and the relatively underspecified theories to date, many researchers are extremely interested in embodied cognition but are clamoring for more mechanistic implementations. What is needed at this stage is a push toward explicit computational models that implement sensorimotor grounding as intrinsic to cognitive processes. In this article, six authors from varying backgrounds and approaches address issues concerning the construction of embodied computational models, and illustrate what they view as the critical current and next steps toward mechanistic theories of embodiment. The first part has the form of a dialog between two fictional characters: Ernest, the “experimenter,” and Mary, the “computational modeler.” The dialog consists of an interactive sequence of questions, requests for clarification, challenges, and (tentative) answers, and touches on the most important aspects of grounded theories that should inform computational modeling and, conversely, the impact that computational modeling could have on embodied theories. The second part of the article discusses the most important open challenges for embodied computational modeling. PMID:21713184
Intersecting surface defects and instanton partition functions
NASA Astrophysics Data System (ADS)
Pan, Yiwen; Peelaers, Wolfger
2017-07-01
We analyze intersecting surface defects inserted in interacting four-dimensional N=2 supersymmetric quantum field theories. We employ the realization of a class of such systems as the infrared fixed points of renormalization group flows from larger theories, triggered by perturbed Seiberg-Witten monopole-like configurations, to compute their partition functions. These results are cast into the form of a partition function of 4d/2d/0d coupled systems. Our computations provide concrete expressions for the instanton partition function in the presence of intersecting defects and we study the corresponding ADHM model.
Kinematics of a vertical axis wind turbine with a variable pitch angle
NASA Astrophysics Data System (ADS)
Jakubowski, Mateusz; Starosta, Roman; Fritzkowski, Pawel
2018-01-01
A computational model for the kinematics of a vertical axis wind turbine (VAWT) is presented. An H-type rotor turbine with a controlled pitch angle is considered. The aim of this solution is to improve VAWT productivity. The method belongs to a computational branch based on Blade Element Momentum (BEM) theory. The paper can be regarded as a theoretical basis and an introduction to further studies applying BEM. The obtained torque values show the main advantage of using a variable pitch angle.
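The blade kinematics underlying such BEM analyses can be sketched in a few lines (a simplified geometric relation that ignores induced velocities; the function and parameter names are illustrative, not the authors' code): for tip-speed ratio λ and azimuth θ, the flow angle at the blade is arctan(sin θ / (λ + cos θ)), and the controlled pitch angle subtracts from it.

```python
import math

def angle_of_attack(theta_deg, tsr, pitch_deg=0.0):
    """Geometric angle of attack of a VAWT blade (no induction correction).

    theta_deg : azimuthal position of the blade in degrees
    tsr       : tip-speed ratio (blade speed / free-stream wind speed)
    pitch_deg : controlled pitch offset in degrees
    """
    theta = math.radians(theta_deg)
    flow_angle = math.atan2(math.sin(theta), tsr + math.cos(theta))
    return math.degrees(flow_angle) - pitch_deg

# Sweeping one rotation shows how pitching shifts the angle-of-attack extremes.
for theta in (0, 90, 180, 270):
    print(theta, round(angle_of_attack(theta, tsr=3.0, pitch_deg=2.0), 2))
```

Varying `pitch_deg` over the rotation is exactly the lever a variable-pitch controller uses to keep the blade near its best angle of attack.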
Laminar fMRI and computational theories of brain function.
Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J
2017-11-02
Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yaqun
2017-03-01
The authors are to be congratulated for a thought-provoking article [1], which reviews the epigenetic game theory (epiGame) that utilizes differential equations to study the epigenetic control of embryo development. It is a novel application of evolutionary game theory and provides biology researchers with useful methodologies to address scientific questions related to biological coordination of competition and cooperation.
Quantum market games: implementing tactics via measurements
NASA Astrophysics Data System (ADS)
Pakula, I.; Piotrowski, E. W.; Sladkowski, J.
2006-02-01
Major developments in applying the quantum mechanical formalism to various fields have been made during the last few years. Quantum counterparts of game theory and economics, as well as diverse approaches to quantum information theory, have been found and are currently being explored. Using connections between quantum game theory and quantum computation, an application of the universality of measurement-based computation in quantum market theory is presented.
NASA Technical Reports Server (NTRS)
Parzen, Benjamin
1992-01-01
The theory of oscillator analysis in the immittance domain should be read in conjunction with the additional theory presented here. The combined theory enables the computer simulation of the steady state oscillator. The simulation makes the calculation of the oscillator total steady state performance practical, including noise at all oscillator locations. Some specific precision oscillators are analyzed.
Implementing Computer Technologies: Teachers' Perceptions and Practices
ERIC Educational Resources Information Center
Wozney, Lori; Venkatesh, Vivek; Abrami, Philip
2006-01-01
This study investigates personal and setting characteristics, teacher attitudes, and current computer technology practices among 764 elementary and secondary teachers from both private and public school sectors in Quebec. Using expectancy-value theory, the Technology Implementation Questionnaire (TIQ) was developed; it consists of 33 belief items…
Human Resource Management, Computers, and Organization Theory.
ERIC Educational Resources Information Center
Garson, G. David
In an attempt to provide a framework for research and theory building in public management information systems (PMIS), state officials responsible for computing in personnel operations were surveyed. The data were used to test hypotheses arising from a recent model by Bozeman and Bretschneider that attempts to relate organization theory to management…
Computational-Model-Based Analysis of Context Effects on Harmonic Expectancy.
Morimoto, Satoshi; Remijn, Gerard B; Nakajima, Yoshitaka
2016-01-01
Expectancy for an upcoming musical chord, harmonic expectancy, is supposedly based on automatic activation of tonal knowledge. Since previous studies implicitly relied on interpretations based on Western music theory, the underlying computational processes involved in harmonic expectancy and how it relates to tonality need further clarification. In particular, short chord sequences which cannot lead to unique keys are difficult to interpret in music theory. In this study, we examined effects of preceding chords on harmonic expectancy from a computational perspective, using stochastic modeling. We conducted a behavioral experiment, in which participants listened to short chord sequences and evaluated the subjective relatedness of the last chord to the preceding ones. Based on these judgments, we built stochastic models of the computational process underlying harmonic expectancy. Following this, we compared the explanatory power of the models. Our results imply that, even when listening to short chord sequences, internally constructed and updated tonal assumptions determine the expectancy of the upcoming chord.
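The model-comparison step can be illustrated with a toy sketch (not the authors' models; the chord sequence and smoothing scheme are invented for illustration): fit a zeroth-order (frequency) model and a first-order (transition) model to a chord sequence and compare log-likelihoods as a crude measure of explanatory power.

```python
import math
from collections import Counter, defaultdict

def unigram_loglik(seq):
    """Zeroth-order model: each chord predicted from overall frequencies."""
    counts = Counter(seq)
    total = len(seq)
    return sum(math.log(counts[c] / total) for c in seq)

def bigram_loglik(seq, alpha=1.0):
    """First-order model: each chord predicted from the preceding chord,
    with add-alpha smoothing over the observed vocabulary."""
    vocab = set(seq)
    trans = defaultdict(Counter)
    for prev, cur in zip(seq, seq[1:]):
        trans[prev][cur] += 1
    ll = 0.0
    for prev, cur in zip(seq, seq[1:]):
        row = trans[prev]
        ll += math.log((row[cur] + alpha) / (sum(row.values()) + alpha * len(vocab)))
    return ll

# Toy Roman-numeral chord sequence; a higher (less negative) log-likelihood
# means the model explains the sequence better.  Here the first-order model
# wins, mirroring the finding that context shapes expectancy.
seq = ["I", "IV", "V", "I", "IV", "V", "I", "vi", "IV", "V", "I"]
print(unigram_loglik(seq), bigram_loglik(seq))
```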
Implementation of an Improved Adaptive Testing Theory
ERIC Educational Resources Information Center
Al-A'ali, Mansoor
2007-01-01
Computer adaptive testing is the study of scoring tests and questions based on assumptions concerning the mathematical relationship between examinees' ability and the examinees' responses. Adaptive student tests, which are based on item response theory (IRT), have many advantages over conventional tests. We use the least square method, a…
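The paper's exact estimator is not given here; a hypothetical sketch of the idea it names, combining a one-parameter logistic (Rasch) item response model with a least-squares ability estimate (the item difficulties and grid search are illustrative assumptions):

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) probability of a correct response to an item of difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, lo=-4.0, hi=4.0, steps=801):
    """Least-squares ability estimate: grid-search theta minimizing
    sum_i (x_i - P(theta, b_i))^2 over the answered items."""
    best_theta, best_sse = lo, float("inf")
    for k in range(steps):
        theta = lo + (hi - lo) * k / (steps - 1)
        sse = sum((x - p_correct(theta, b)) ** 2
                  for x, b in zip(responses, difficulties))
        if sse < best_sse:
            best_theta, best_sse = theta, sse
    return best_theta

# Three easy items answered correctly, two hard ones missed: the estimate
# lands between the easy and hard difficulty clusters.
theta_hat = estimate_ability([1, 1, 1, 0, 0], [-1.0, -0.5, 0.0, 1.5, 2.0])
print(round(theta_hat, 2))
```

An adaptive test would then pick the next item with difficulty near `theta_hat`, which is what makes such tests shorter than conventional ones.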
Teacher-Education Students' Views about Knowledge Building Theory and Practice
ERIC Educational Resources Information Center
Hong, Huang-Yao; Chen, Fei-Ching; Chai, Ching Sing; Chan, Wen-Ching
2011-01-01
This study investigated the effects of engaging students to collectively learn and work with knowledge in a computer-supported collaborative learning environment called Knowledge Forum on their views about knowledge building theory and practice. Participants were 24 teacher-education students who took a required course titled "Integrating Theory…
Semantics vs. World Knowledge in Prefrontal Cortex
ERIC Educational Resources Information Center
Pylkkanen, Liina; Oliveri, Bridget; Smart, Andrew J.
2009-01-01
Humans have knowledge about the properties of their native language at various levels of representation; sound, structure, and meaning computation constitute the core components of any linguistic theory. Although the brain sciences have engaged with representational theories of sound and syntactic structure, the study of the neural bases of…
ERIC Educational Resources Information Center
Blau, Ina; Benolol, Nurit
2016-01-01
Creative computing is one of the rapidly growing educational trends around the world. Previous studies have shown that creative computing can empower disadvantaged children and youth. At-risk youth tend to hold a negative view of self and perceive their abilities as inferior compared to "normative" pupils. The Implicit Theories of…
ERIC Educational Resources Information Center
Kordaki, Maria
2010-01-01
This paper presents both the design and the pilot formative evaluation study of a computer-based problem-solving environment (named LECGO: Learning Environment for programming using C using Geometrical Objects) for the learning of computer programming using C by beginners. In its design, constructivist and social learning theories were taken into…
ERIC Educational Resources Information Center
Nikolaidou, Georgia N.
2012-01-01
This exploratory work describes and analyses the collaborative interactions that emerge during computer-based music composition in the primary school. The study draws on socio-cultural theories of learning, originated within Vygotsky's theoretical context, and proposes a new model, namely Computer-mediated Praxis and Logos under Synergy (ComPLuS).…
NASA Astrophysics Data System (ADS)
Nazarov, Anton
2012-11-01
In this paper we present Affine.m, a program for computations in the representation theory of finite-dimensional and affine Lie algebras, and describe the implemented algorithms. The algorithms are based on the properties of weights and Weyl symmetry. Computation of weight multiplicities in irreducible and Verma modules, branching of representations, and tensor product decomposition are the most important problems for us. These problems have numerous applications in physics, and we provide some examples of these applications. The program is implemented in the popular computer algebra system Mathematica and works with finite-dimensional and affine Lie algebras.
Catalogue identifier: AENA_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENB_v1_0.html
Program obtainable from: CPC Program Library, Queen’s University, Belfast, UK
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 24 844
No. of bytes in distributed program, including test data, etc.: 1 045 908
Distribution format: tar.gz
Programming language: Mathematica
Computer: i386-i686, x86_64
Operating system: Linux, Windows, Mac OS, Solaris
RAM: 5-500 Mb
Classification: 4.2, 5
Nature of problem: The representation theory of finite-dimensional Lie algebras has many applications in different branches of physics, including elementary particle physics, molecular physics, and nuclear physics. Representations of affine Lie algebras appear in string theories and in the two-dimensional conformal field theory used for the description of critical phenomena in two-dimensional systems. Lie symmetries also play a major role in the study of quantum integrable systems.
Solution method: We work with the weights and roots of finite-dimensional and affine Lie algebras and use Weyl symmetry extensively. The central problems, namely the computation of weight multiplicities, branching and fusion coefficients, are solved using one general recurrent algorithm based on a generalization of the Weyl character formula. We also offer an alternative implementation based on the Freudenthal multiplicity formula, which can be faster in some cases.
Restrictions: Computational complexity grows rapidly with the rank of an algebra, so computations for algebras of rank greater than 8 are not practical.
Unusual features: If Affine.m is used in the Mathematica notebook interface, traditional mathematical notation can be used for the objects of the representation theory of Lie algebras.
Running time: From seconds to days, depending on the rank of the algebra and the complexity of the representation.
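As a taste of the kind of computation such a package automates (a hand-rolled sketch, not Affine.m itself): the Weyl dimension formula, specialized to sl(3), says that the irreducible representation with Dynkin labels (p, q) has dimension (p+1)(q+1)(p+q+2)/2.

```python
def dim_sl3(p, q):
    """Weyl dimension formula specialized to sl(3): the product over the
    three positive roots of <lambda+rho, alpha> / <rho, alpha>, which
    collapses to (p+1)(q+1)(p+q+2)/2 for Dynkin labels (p, q)."""
    return (p + 1) * (q + 1) * (p + q + 2) // 2

# Familiar irreps: fundamental (1,0) -> 3, adjoint (1,1) -> 8, decuplet (3,0) -> 10.
print(dim_sl3(1, 0), dim_sl3(1, 1), dim_sl3(3, 0))
```

Affine.m handles the far harder general cases (weight multiplicities, branching, fusion) that have no such closed form.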
Vertex operator algebras of Argyres-Douglas theories from M5-branes
NASA Astrophysics Data System (ADS)
Song, Jaewon; Xie, Dan; Yan, Wenbin
2017-12-01
We study aspects of the vertex operator algebra (VOA) corresponding to Argyres-Douglas (AD) theories engineered using the 6d N=(2, 0) theory of type J on a punctured sphere. We denote the AD theories as (J_b[k], Y), where J_b[k] and Y represent an irregular and a regular singularity, respectively. We restrict to the `minimal' case, where J_b[k] has no associated mass parameters and the theory does not admit any exactly marginal deformations. The VOA corresponding to the AD theory is conjectured to be the W-algebra W^{k_2d}(J, Y), where k_2d = -h + b/(b + k), with h being the dual Coxeter number of J. We verify this conjecture by showing that the Schur index of the AD theory is identical to the vacuum character of the corresponding VOA, and that the Hall-Littlewood index computes the Hilbert series of the Higgs branch. We also find that the Schur and Hall-Littlewood indices of the AD theory can be written in a simple closed form for b = h. We also test the conjecture that the associated variety of such a VOA is identical to the Higgs branch. The M5-brane construction of these theories and the corresponding TQFT structure of the index play a crucial role in our computations.
ERIC Educational Resources Information Center
Reutzel, D. Ray; Mohr, Kathleen A. J.
2014-01-01
In this response to "Measuring Students' Writing Ability on a Computer Analytic Developmental Scale: An Exploratory Validity Study," the authors agree that assessments should seek parsimony in both theory and application wherever possible. Doing so allows maximal dissemination and implementation while minimizing costs. The Writing…
How robotics programs influence young women's career choices : a grounded theory model
NASA Astrophysics Data System (ADS)
Craig, Cecilia Dosh-Bluhm
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced young women's career decisions and the program's effect on engineering, physics, and computer science career interests. To test this, a study was mounted to explore how the FIRST (For Inspiration and Recognition of Science and Technology) Robotics Competition (FRC) program influenced young women's college major and career choices. Career theories suggested that experiential programs coupled with supportive relationships strongly influence career decisions, especially for science, technology, engineering, and mathematics careers. The study explored how and when young women made career decisions and how the experiential program and its mentors and role models influenced career choice. Online focus groups and interviews (online and face-to-face) with 10 female FRC alumnae and GT processes (inductive analysis, open coding, categorizations using mind maps and content clouds) were used to generate a general systems theory style model of the career decision process for these young women. The study identified gender stereotypes and other career obstacles for women. The study's conclusions include recommendations to foster connections to real-world challenges, to develop training programs for mentors, and to nurture social cohesion, a mostly untapped area. Implementing these recommendations could help grow a critical mass of women in engineering, physics, and computer science careers, a social change worth pursuing.
2D problems of surface growth theory with applications to additive manufacturing
NASA Astrophysics Data System (ADS)
Manzhirov, A. V.; Mikhin, M. N.
2018-04-01
We study 2D problems of surface growth theory of deformable solids and their applications to the analysis of the stress-strain state of AM fabricated products and structures. Statements of the problems are given, and a solution method based on the approaches of the theory of functions of a complex variable is suggested. Computations are carried out for model problems. Qualitative and quantitative results are discussed.
NASA Astrophysics Data System (ADS)
Cao, Siqin; Zhu, Lizhe; Huang, Xuhui
2018-04-01
The 3D reference interaction site model (3DRISM) is a powerful tool to study the thermodynamic and structural properties of liquids. However, for hydrophobic solutes, the inhomogeneity of the solvent density around them poses a great challenge to the 3DRISM theory. To address this issue, we have previously introduced the hydrophobic-induced density inhomogeneity theory (HI) for purely hydrophobic solutes. To further consider the complex hydrophobic solutes containing partial charges, here we propose the D2MSA closure to incorporate the short-range and long-range interactions with the D2 closure and the mean spherical approximation, respectively. We demonstrate that our new theory can compute the solvent distributions around real hydrophobic solutes in water and complex organic solvents that agree well with the explicit solvent molecular dynamics simulations.
Theoretical models for Computing VLF wave amplitude and phase and their applications
NASA Astrophysics Data System (ADS)
Pal, Sujay; Chakrabarti, S. K.
2010-10-01
We present a review of the present theoretical models for computing the amplitude and phase of the VLF signal at any given point on Earth. We present the basics of the wave hop theory and the mode theory. We compute the signal amplitudes as a function of distance from a transmitter using both theories and compare them. We also repeat a similar exercise for the diurnal signal. We note that wave hop theory gives more detailed information about the signal variation in the daytime. As an example of using the LWPC code, we compute the variation of the effective height h' and steepness β parameters for a solar flare and obtain the time dependence of the electron number density along both the VTX-Kolkata and NWC-Kolkata propagation paths.
Theory-Guided Technology in Computer Science.
ERIC Educational Resources Information Center
Ben-Ari, Mordechai
2001-01-01
Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…
Implications of Whole-Brained Theories of Learning and Thinking for Computer-Based Instruction.
ERIC Educational Resources Information Center
Torrance, E. Paul
1981-01-01
Discusses the implications of theories of hemispheric dominance for computer-assisted instruction, highlights some of the computer's instructional uses, lists specialized functions of the cerebral hemispheres, and lists recommended solutions to CBI program problems which were submitted by gifted children. Thirty-five sources are listed. (FM)
Perturbation theory in light-cone quantization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langnau, A.
1992-01-01
A thorough investigation of light-cone properties which are characteristic for higher dimensions is very important. The easiest way of addressing these issues is by analyzing the perturbative structure of light-cone field theories first. Perturbative studies cannot be substituted for an analysis of problems related to a nonperturbative approach. However, in order to lay down groundwork for upcoming nonperturbative studies, it is indispensable to validate the renormalization methods at the perturbative level, i.e., to gain control over the perturbative treatment first. A clear understanding of divergences in perturbation theory, as well as their numerical treatment, is a necessary first step towards formulating such a program. The first objective of this dissertation is to clarify this issue, at least in second and fourth-order in perturbation theory. The work in this dissertation can provide guidance for the choice of counterterms in Discrete Light-Cone Quantization or the Tamm-Dancoff approach. A second objective of this work is the study of light-cone perturbation theory as a competitive tool for conducting perturbative Feynman diagram calculations. Feynman perturbation theory has become the most practical tool for computing cross sections in high energy physics and other physical properties of field theory. Although this standard covariant method has been applied to a great range of problems, computations beyond one-loop corrections are very difficult. Because of the algebraic complexity of the Feynman calculations in higher-order perturbation theory, it is desirable to automatize Feynman diagram calculations so that algebraic manipulation programs can carry out almost the entire calculation. This thesis presents a step in this direction. The technique we are elaborating on here is known as light-cone perturbation theory.
DOT National Transportation Integrated Search
2001-02-01
A new version of the CRCP computer program, CRCP-9, has been developed in this study. The numerical model of the CRC pavements was developed using finite element theories, the crack spacing prediction model was developed using the Monte Carlo method,...
NASA Astrophysics Data System (ADS)
Zhao, Yinjian
2017-09-01
Aiming at high simulation accuracy, a Particle-Particle (PP) Coulombic molecular dynamics model is implemented to study the electron-ion temperature relaxation. In this model, Coulomb's law is directly applied in a bounded system with two cutoffs at both short and long length scales. By increasing the range between the two cutoffs, it is found that the relaxation rate deviates from the BPS theory and approaches the LS theory and the GMS theory. Also, the effective minimum and maximum impact parameters (bmin* and bmax*) are obtained. For the simulated plasma condition, bmin* is about 6.352 times smaller than the Landau length (bC), and bmax* is about 2 times larger than the Debye length (λD), where bC and λD are used in the LS theory. Surprisingly, the effective relaxation time obtained from the PP model is very close to the LS theory and the GMS theory, even though the effective Coulomb logarithm is two times greater than the one used in the LS theory. In addition, this work shows that the PP model (commonly regarded as computationally expensive) is becoming practicable via GPU parallel computing techniques.
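The core ingredient of the PP model, the pairwise Coulomb interaction with cutoffs at both length scales, can be sketched as follows (a 1D illustrative sketch; the actual simulations were 3D and GPU-parallel, and all names here are assumptions):

```python
COULOMB_K = 8.9875517923e9  # Coulomb constant, N m^2 / C^2

def coulomb_force(q1, q2, r, r_min, r_max):
    """Pairwise Coulomb force magnitude with two cutoffs: interactions
    closer than r_min or farther than r_max are excluded."""
    if r < r_min or r > r_max:
        return 0.0
    return COULOMB_K * q1 * q2 / r**2

def pair_forces(charges, positions, r_min, r_max):
    """Sum cutoff-limited pairwise forces on each particle (1D, O(N^2))."""
    n = len(charges)
    forces = [0.0] * n
    for i in range(n):
        for j in range(i + 1, n):
            r = abs(positions[i] - positions[j])
            f = coulomb_force(charges[i], charges[j], r, r_min, r_max)
            direction = 1.0 if positions[i] > positions[j] else -1.0
            forces[i] += f * direction   # like charges repel
            forces[j] -= f * direction
    return forces
```

Widening the window between `r_min` and `r_max` is the knob the abstract describes: the all-pairs sum becomes more expensive but captures more of the interaction, which is where the GPU parallelism pays off.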
Gambini, R; Pullin, J
2000-12-18
We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism invariant field theory. This theory is the lambda → infinity limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at a quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.
Group Theory, Computational Thinking, and Young Mathematicians
ERIC Educational Resources Information Center
Gadanidis, George; Clements, Erin; Yiu, Chris
2018-01-01
In this article, we investigate the artistic puzzle of designing mathematics experiences (MEs) to engage young children with ideas of group theory, using a combination of hands-on and computational thinking (CT) tools. We elaborate on: (1) group theory and why we chose it as a context for young mathematicians' experiences with symmetry and…
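The group-theory ideas the article targets can be made computational in a few lines (a hypothetical illustration, not the authors' classroom activities): represent each symmetry of a square as a permutation of its corner labels and compose them.

```python
# Each symmetry of the square permutes corners 0..3; perm[i] is where corner i goes.
def compose(f, g):
    """Apply g first, then f (composition of permutations)."""
    return tuple(f[g[i]] for i in range(len(g)))

identity = (0, 1, 2, 3)
rot90 = (1, 2, 3, 0)   # rotate the corners one step
flip = (1, 0, 3, 2)    # reflect across the vertical axis

# Generate the dihedral group D4 by closing {rot90, flip} under composition.
group = {identity}
frontier = [rot90, flip]
while frontier:
    g = frontier.pop()
    if g not in group:
        group.add(g)
        frontier.extend(compose(g, h) for h in (rot90, flip))
print(len(group))  # D4 has 8 elements
```

Children exploring which sequences of moves return the square to its starting position are, in effect, discovering the identities of this group.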
Toward a Script Theory of Guidance in Computer-Supported Collaborative Learning
ERIC Educational Resources Information Center
Fischer, Frank; Kollar, Ingo; Stegmann, Karsten; Wecker, Christof
2013-01-01
This article presents an outline of a script theory of guidance for computer-supported collaborative learning (CSCL). With its 4 types of components of internal and external scripts (play, scene, role, and scriptlet) and 7 principles, this theory addresses the question of how CSCL practices are shaped by dynamically reconfigured internal…
ERIC Educational Resources Information Center
Schwonke, Rolf
2015-01-01
Instructional design theories such as the "cognitive load theory" (CLT) or the "cognitive theory of multimedia learning" (CTML) usually explain learning difficulties in (computer-based) learning as a result of design deficiencies that hinder effective schema construction. However, learners often struggle even in well-designed…
ERIC Educational Resources Information Center
van den Bogaart, Antoine C. M.; Bilderbeek, Richel J. C.; Schaap, Harmen; Hummel, Hans G. K.; Kirschner, Paul A.
2016-01-01
This article introduces a dedicated, computer-supported method to construct and formatively assess open, annotated concept maps of Personal Professional Theories (PPTs). These theories are internalised, personal bodies of formal and practical knowledge, values, norms and convictions that professionals use as a reference to interpret and acquire…
Determination of partial molar volumes from free energy perturbation theory.
Vilseck, Jonah Z; Tirado-Rives, Julian; Jorgensen, William L
2015-04-07
Partial molar volume is an important thermodynamic property that gives insights into molecular size and intermolecular interactions in solution. Theoretical frameworks for determining the partial molar volume (V°) of a solvated molecule generally apply Scaled Particle Theory or Kirkwood-Buff theory. With the current abilities to perform long molecular dynamics and Monte Carlo simulations, more direct methods are gaining popularity, such as computing V° directly as the difference in computed volume from two simulations, one with a solute present and another without. Thermodynamically, V° can also be determined as the pressure derivative of the free energy of solvation in the limit of infinite dilution. Both approaches are considered herein with the use of free energy perturbation (FEP) calculations to compute the necessary free energies of solvation at elevated pressures. Absolute and relative partial molar volumes are computed for benzene and benzene derivatives using the OPLS-AA force field. The mean unsigned error for all molecules is 2.8 cm(3) mol(-1). The present methodology should find use in many contexts such as the development and testing of force fields for use in computer simulations of organic and biomolecular systems, as a complement to related experimental studies, and to develop a deeper understanding of solute-solvent interactions.
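The pressure-derivative route described above reduces numerically to a central finite difference of the solvation free energy; a hedged sketch with illustrative numbers (not the paper's data or its FEP machinery):

```python
def partial_molar_volume(dG_of_P, p0, dp=50.0):
    """Central finite-difference estimate of V = (d dG_solv / dP)_T at p0.
    With dG in J/mol and P in Pa, the result is in m^3/mol."""
    return (dG_of_P(p0 + dp) - dG_of_P(p0 - dp)) / (2.0 * dp)

# Illustrative check on a linear model dG(P) = dG0 + V_true * (P - P0),
# for which the central difference recovers V_true.
V_true = 89.4e-6  # m^3/mol (~89.4 cm^3/mol, an illustrative magnitude)

def dG(p):
    return -3700.0 + V_true * (p - 101325.0)

print(partial_molar_volume(dG, 101325.0) * 1e6, "cm^3/mol")
```

In practice each dG value comes from a separate FEP simulation at elevated pressure, so the statistical noise of those simulations, not the finite-difference step, dominates the error budget.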
Wing Shape Sensing from Measured Strain
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi
2015-01-01
A new two-step theory is investigated for predicting the deflection and slope of an entire structure using strain measurements at discrete locations. In the first step, a measured strain is fitted using a piecewise least-squares curve fitting method together with the cubic spline technique. These fitted strains are integrated twice to obtain deflection data along the fibers. In the second step, the computed deflections along the fibers are combined with a finite element model of the structure in order to interpolate and extrapolate the deflection and slope of the entire structure through the use of the System Equivalent Reduction and Expansion Process. The theory is first validated on a computational model, a cantilevered rectangular plate wing. The theory is then applied to test data from a cantilevered swept-plate wing model. Computed results are compared with finite element results, results using another strain-based method, and photogrammetry data. For the computational model under an aeroelastic load, maximum deflection errors in the fore and aft, lateral, and vertical directions are -3.2 percent, 0.28 percent, and 0.09 percent, respectively; and maximum slope errors in roll and pitch directions are 0.28 percent and -3.2 percent, respectively. For the experimental model, deflection results at the tip are shown to be accurate to within 3.8 percent of the photogrammetry data and are accurate to within 2.2 percent in most cases. In general, excellent matching between target and computed values is accomplished in this study. Future refinement of this theory will allow it to monitor the deflection and health of an entire aircraft in real time, allowing for aerodynamic load computation, active flexible motion control, and active induced drag reduction.
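The first step of the two-step theory, turning fitted strain into deflection by integrating twice, can be sketched for a cantilever beam, where the surface strain ε(x) at distance c from the neutral axis gives curvature ε(x)/c (a simplified trapezoidal sketch, not the paper's piecewise least-squares/cubic-spline implementation):

```python
def strain_to_deflection(x, strain, c):
    """Integrate curvature strain/c twice (trapezoidal rule) with
    cantilever boundary conditions: zero slope and deflection at x[0]."""
    n = len(x)
    slope = [0.0] * n
    defl = [0.0] * n
    for i in range(1, n):
        dx = x[i] - x[i - 1]
        kappa_avg = (strain[i] / c + strain[i - 1] / c) / 2.0
        slope[i] = slope[i - 1] + kappa_avg * dx                 # first integral
        defl[i] = defl[i - 1] + (slope[i] + slope[i - 1]) / 2.0 * dx  # second
    return slope, defl

# Uniform strain (constant curvature kappa): beam theory gives w(L) = kappa*L^2/2.
L, n, c, eps = 1.0, 101, 0.01, 1e-4
xs = [L * i / (n - 1) for i in range(n)]
slope, defl = strain_to_deflection(xs, [eps] * n, c)
print(defl[-1])  # close to (eps/c) * L**2 / 2 = 5e-3
```

The paper's second step then maps such fiber deflections onto the full finite element model; the sketch above covers only the strain-to-deflection integration.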
Beam and Plasma Physics Research
1990-06-01
...in high power microwave computations and theory and high energy plasma computations and theory. The HPM computations concentrated on...2.1 REPORT INDEX 7 2.2 TASK AREA 2: HIGH-POWER RF EMISSION AND CHARGED-PARTICLE BEAM PHYSICS COMPUTATION, MODELING AND THEORY 10 2.2.1 Subtask 02-01...Vulnerability of Space Assets 22 2.2.6 Subtask 02-06, Microwave Computer Program Enhancements 22 2.2.7 Subtask 02-07, High-Power Microwave Transvertron Design 23
Localization in abelian Chern-Simons theory
NASA Astrophysics Data System (ADS)
McLellan, B. D. K.
2013-02-01
Chern-Simons theory on a closed contact three-manifold is studied when the Lie group for gauge transformations is compact, connected, and abelian. The abelian Chern-Simons partition function is derived using the Faddeev-Popov gauge fixing method. The partition function is then formally computed using the technique of non-abelian localization. This study leads to a natural identification of the abelian Reidemeister-Ray-Singer torsion as a specific multiple of the natural unit symplectic volume form on the moduli space of flat abelian connections for the class of Sasakian three-manifolds. The torsion part of the abelian Chern-Simons partition function is computed explicitly in terms of Seifert data for a given Sasakian three-manifold.
Urbic, Tomaz
2016-01-01
In this paper we applied an analytical theory for the two-dimensional dimerising fluid. We applied Wertheim's thermodynamic perturbation theory (TPT) and integral equation theory (IET) for associative liquids to the dimerising model with arbitrary positions of the dimerising points relative to the centers of the particles. The theory was used to study thermodynamic and structural properties. To check the accuracy of the theories, we compared theoretical results with corresponding results obtained by Monte Carlo computer simulations. The theories are accurate for the different patch positions of the model at all values of the temperature and density studied. IET correctly predicts the pair correlation function of the model. Both TPT and IET are in good agreement with the Monte Carlo values of the energy, pressure, chemical potential, compressibility and ratios of free and bonded particles. PMID:28529396
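The ratio of free to bonded particles mentioned in the abstract has a closed form in first-order Wertheim TPT for a one-site (dimerising) fluid; a minimal sketch, where the association-strength integral Δ is an arbitrary illustrative value rather than one computed for the paper's model:

```python
import math

# Hedged sketch: in first-order Wertheim TPT for a one-site associating
# (dimerising) fluid, the fraction X of particles NOT bonded obeys the
# mass-action relation  X = 1 / (1 + rho*Delta*X),  whose physical root is
#     X = (sqrt(1 + 4*rho*Delta) - 1) / (2*rho*Delta).
# rho is the number density; Delta (association strength) is illustrative.

def monomer_fraction(rho, delta):
    a = rho * delta
    return (math.sqrt(1.0 + 4.0 * a) - 1.0) / (2.0 * a)

rho, delta = 0.6, 5.0
X = monomer_fraction(rho, delta)   # fraction of free particles
bonded = 1.0 - X                   # fraction bound up in dimers
```

The returned X satisfies the mass-action relation exactly, which is a convenient self-check when wiring this into an equation-of-state calculation.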
Kleene Monads: Handling Iteration in a Framework of Generic Effects
NASA Astrophysics Data System (ADS)
Goncharov, Sergey; Schröder, Lutz; Mossakowski, Till
Monads are a well-established tool for modelling various computational effects. They form the semantic basis of Moggi’s computational metalanguage, the metalanguage of effects for short, which made its way into modern functional programming in the shape of Haskell’s do-notation. Standard computational idioms call for specific classes of monads that support additional control operations. Here, we introduce Kleene monads, which additionally feature nondeterministic choice and Kleene star, i.e. nondeterministic iteration, and we provide a metalanguage and a sound calculus for Kleene monads, the metalanguage of control and effects, which is the natural joint extension of Kleene algebra and the metalanguage of effects. This provides a framework for studying abstract program equality focussing on iteration and effects. These aspects are known to have decidable equational theories when studied in isolation. However, it is well known that decidability breaks easily; e.g. the Horn theory of continuous Kleene algebras fails to be recursively enumerable. Here, we prove several negative results for the metalanguage of control and effects; in particular, already the equational theory of the unrestricted metalanguage of control and effects over continuous Kleene monads fails to be recursively enumerable. We proceed to identify a fragment of this language which still contains both Kleene algebra and the metalanguage of effects and for which the natural axiomatisation is complete, and indeed the equational theory is decidable.
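The two operations the abstract adds to the metalanguage of effects, nondeterministic choice and Kleene star, can be illustrated concretely. The sketch below is not the paper's metalanguage: it uses the finite-powerset monad in Python to model nondeterminism and computes Kleene star as a least fixed point, which terminates because the toy state space is finite:

```python
# Hedged sketch: the finite-powerset monad models nondeterministic
# choice; the Kleene star of a step s : A -> Set[A] is computed as a
# least fixed point over a finite state space.  This illustrates the
# algebraic operations only, not the paper's calculus.

def unit(x):
    return frozenset([x])

def bind(m, f):                      # Kleisli extension
    return frozenset(y for x in m for y in f(x))

def plus(m, n):                      # nondeterministic choice
    return m | n

def star(f, x):
    """All states reachable from x by 0..n steps of f (least fixed
    point; terminates because the state set is finite)."""
    reached = unit(x)
    while True:
        nxt = plus(reached, bind(reached, f))
        if nxt == reached:
            return reached
        reached = nxt

# Example: nondeterministically stay put or advance on Z mod 5.
step = lambda i: frozenset({i, (i + 1) % 5})
closure = star(step, 0)              # reaches every residue mod 5
```

Over infinite state spaces this fixed-point computation need not terminate, which is one concrete face of the undecidability phenomena the paper studies.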
Challenges in Computational Social Modeling and Simulation for National Security Decision Making
2011-06-01
This study is grounded within a system-activity theory, a logico-philosophical model of interdisciplinary research [13, 14], the concepts of social...often a difficult challenge. Ironically, social science research methods, such as ethnography, may be tremendously helpful in designing these...social sciences. Moreover, CSS projects draw on knowledge and methods from other fields of study, including graph theory, information visualization
Number crunching vs. number theory: computers and FLT, from Kummer to SWAC (1850-1960), and beyond
NASA Astrophysics Data System (ADS)
Corry, Leo
2008-07-01
The article discusses the computational tools (both conceptual and material) used in various attempts to deal with individual cases of FLT [Fermat's Last Theorem], the changing historical contexts in which these tools were developed and used, and how they affected research. It also explores the changing conceptions about the role of computations within the overall disciplinary picture of number theory, how they influenced research on the theorem, and the kinds of general insights thus achieved. After an overview of Kummer's contributions and their immediate influence, the author presents work that favored intensive computations of particular cases of FLT as a legitimate, fruitful, and worth-pursuing number-theoretical endeavor, and that was part of a coherent and active, but essentially low-profile, tradition within nineteenth-century number theory. This work was related to table-making activity that was encouraged by institutions and individuals whose motivations came mainly from applied mathematics, astronomy, and engineering, and seldom from number theory proper. A main section of the article is devoted to the fruitful collaboration between Harry S. Vandiver and Emma and Dick Lehmer. The author shows how their early work led to the hesitant introduction of electronic computers for research related to FLT. Their joint work became a milestone for computer-assisted activity in number theory at large.
Practical and Theoretical Requirements for Controlling Rater Stringency in Peer Review.
ERIC Educational Resources Information Center
Cason, Gerald J.; Cason, Carolyn L.
This study describes a computer-based performance-rating information processing system, a performance rating theory, and programs for the application of the theory to obtain ratings free from the effects of reviewer stringency in reviewing abstracts of conference papers. Originally, the Performance Rating (PR) System was used to evaluate the…
How Robotics Programs Influence Young Women's Career Choices: A Grounded Theory Model
ERIC Educational Resources Information Center
Craig, Cecilia Dosh-Bluhm
2014-01-01
The fields of engineering, computer science, and physics have a paucity of women despite decades of intervention by universities and organizations. Women's graduation rates in these fields continue to stagnate, posing a critical problem for society. This qualitative grounded theory (GT) study sought to understand how robotics programs influenced…
Enhancing Student Explanations of Evolution: Comparing Elaborating and Competing Theory Prompts
ERIC Educational Resources Information Center
Donnelly, Dermot F.; Namdar, Bahadir; Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.
2016-01-01
In this study, we explore how two different prompt types within an online computer-based inquiry learning environment enhance 392 7th grade students' explanations of evolution with three teachers. In the "elaborating" prompt condition, students are prompted to write explanations that support the accepted theory of evolution. In the…
The Role and Design of Screen Images in Software Documentation.
ERIC Educational Resources Information Center
van der Meij, Hans
2000-01-01
Discussion of learning a new computer software program focuses on how to support the joint handling of a manual, input devices, and screen display. Describes a study that examined three design styles for manuals that included screen images to reduce split-attention problems and discusses theory versus practice and cognitive load theory.…
Recursive renormalization group theory based subgrid modeling
NASA Technical Reports Server (NTRS)
Zhou, YE
1991-01-01
Advancing the knowledge and understanding of turbulence theory is addressed. Specific problems to be addressed will include studies of subgrid models to understand the effects of unresolved small scale dynamics on the large scale motion which, if successful, might substantially reduce the number of degrees of freedom that need to be computed in turbulence simulation.
Graviton propagator from background-independent quantum gravity.
Rovelli, Carlo
2006-10-13
We study the graviton propagator in Euclidean loop quantum gravity. We use spin foam, boundary-amplitude, and group-field-theory techniques. We compute a component of the propagator to first order, under some approximations, obtaining the correct large-distance behavior. This indicates a way for deriving conventional spacetime quantities from a background-independent theory.
Computational Methods for Inviscid and Viscous Two-and-Three-Dimensional Flow Fields.
1975-01-01
Difference Equations Over a Network, Watson Sci. Comput. Lab. Report, 1949. 173. Isaacson, E. and Keller, H. B., Analysis of Numerical Methods...element method has given a new impulse to the old mathematical theory of multivariate interpolation. We first study the one-dimensional case, which
Computer-Assisted Instruction to Avert Teen Pregnancy.
ERIC Educational Resources Information Center
Starn, Jane Ryburn; Paperny, David M.
Teenage pregnancy has become a major public health problem in the United States. A study was conducted to assess an intervention based upon computer-assisted instruction (CAI) to avert teenage pregnancy. Social learning and decision theory were applied to mediate the adolescent environment through CAI so that adolescent development would be…
Computational Chemistry Studies on the Carbene Hydroxymethylene
ERIC Educational Resources Information Center
Marzzacco, Charles J.; Baum, J. Clayton
2011-01-01
A density functional theory computational chemistry exercise on the structure and vibrational spectrum of the carbene hydroxymethylene is presented. The potential energy curve for the decomposition reaction of the carbene to formaldehyde and the geometry of the transition state are explored. The results are in good agreement with recent…
ERIC Educational Resources Information Center
Jackson, Karen Latrice Terrell
2014-01-01
Students' perceptions influence their expectations and values. According to Expectations and Values Theory of Achievement Motivation (EVT-AM), students' expectations and values impact their behaviors (Eccles & Wigfield, 2002). This study seeks to find students' perceptions of developmental mathematics in a mastery learning computer-based…
Patterns of Computer-Mediated Interaction in Small Writing Groups Using Wikis
ERIC Educational Resources Information Center
Li, Mimi; Zhu, Wei
2013-01-01
Informed by sociocultural theory and guided especially by "collective scaffolding", this study investigated the nature of computer-mediated interaction of three groups of English as a Foreign Language students when they performed collaborative writing tasks using wikis. Nine college students from a Chinese university participated in the…
Elementary Teachers' Simulation Adoption and Inquiry-Based Use Following Professional Development
ERIC Educational Resources Information Center
Gonczi, Amanda; Maeng, Jennifer; Bell, Randy
2017-01-01
The purpose of this study was to characterize and compare 64 elementary science teachers' computer simulation use prior to and following professional development (PD) aligned with Innovation Adoption Theory. The PD highlighted computer simulation affordances that elementary teachers might find particularly useful. Qualitative and quantitative…
Quantum Electrodynamics in d=3 from the ε Expansion.
Di Pietro, Lorenzo; Komargodski, Zohar; Shamir, Itamar; Stamou, Emmanuel
2016-04-01
We study quantum electrodynamics in d=3 coupled to N_{f} flavors of fermions. The theory flows to an IR fixed point for N_{f} larger than some critical number N_{f}^{c}. For N_{f}≤N_{f}^{c}, chiral-symmetry breaking is believed to take place. In analogy with the Wilson-Fisher description of the critical O(N) models in d=3, we make use of the existence of a fixed point in d=4-2ε to study the three-dimensional conformal theory. We compute, in perturbation theory, the IR dimensions of fermion bilinear and quadrilinear operators. For small N_{f}, a quadrilinear operator can become relevant in the IR and destabilize the fixed point. Therefore, the epsilon expansion can be used to estimate N_{f}^{c}. An interesting novelty compared to the O(N) models is that the theory in d=3 has an enhanced symmetry due to the structure of 3D spinors. We identify the operators in d=4-2ε that correspond to the additional conserved currents at d=3 and compute their infrared dimensions.
The cyclotron maser theory of AKR and Z-mode radiation. [Auroral Kilometric Radiation
NASA Technical Reports Server (NTRS)
Wu, C. S.
1985-01-01
The cyclotron maser mechanism which may be responsible for the generation of auroral kilometric radiation and Z-mode radiation is discussed. Emphasis is placed on the basic concepts of the cyclotron maser theory, particularly the relativistic effect of the cyclotron resonance condition. Recent development of the theory is reviewed. Finally, the results of a computer simulation study which helps to understand the nonlinear saturation of the maser instability are reported.
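The relativistic effect in the resonance condition that the abstract emphasizes can be written out explicitly (standard maser-theory notation, not quoted from the paper):

```latex
% Relativistic cyclotron resonance condition (standard form), for wave
% frequency omega, parallel wavenumber k_parallel, harmonic number n,
% and electron gyrofrequency Omega_e:
\omega - k_{\parallel} v_{\parallel} \;=\; \frac{n\,\Omega_{e}}{\gamma},
\qquad
\gamma = \Bigl(1 - \frac{v_{\parallel}^{2} + v_{\perp}^{2}}{c^{2}}\Bigr)^{-1/2}.
% Because gamma depends on the particle speed, the resonant velocities
% trace a closed curve (the "resonance ellipse") in the
% (v_parallel, v_perp) plane; this relativistic correction is what
% allows a weakly relativistic population inversion to drive the maser.
```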
ERIC Educational Resources Information Center
Aldalalah, Osamah Ahmad; Fong, Soon Fook
2010-01-01
The purpose of this study was to investigate the effects of modality and redundancy principles on the attitude and learning of music theory among primary pupils of different aptitudes in Jordan. The lesson of music theory was developed in three different modes, audio and image (AI), text with image (TI) and audio with image and text (AIT). The…
ERIC Educational Resources Information Center
DeVillar, Robert A.; Faltis, Christian J.
This book offers an alternative conceptual framework for effectively incorporating computer use within the heterogeneous classroom. The framework integrates Vygotskian social-learning theory with Allport's contact theory and the principles of cooperative learning. In Part 1 an essential element is identified for each of these areas. These are, in…
A Review of Humor for Computer Games: Play, Laugh and More
ERIC Educational Resources Information Center
Dormann, Claire; Biddle, Robert
2009-01-01
Computer games are now becoming ways to communicate, teach, and influence attitudes and behavior. In this article, we address the role of humor in computer games, especially in support of serious purposes. We begin with a review of the main theories of humor, including superiority, incongruity, and relief. These theories and their…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kok Yan Chan, G.; Sclavounos, P. D.; Jonkman, J.
2015-04-02
A hydrodynamics computer module was developed for the evaluation of the linear and nonlinear loads on floating wind turbines using a new fluid-impulse formulation for coupling with the FAST program. The recently developed formulation allows the computation of linear and nonlinear loads on floating bodies in the time domain and avoids the computationally intensive evaluation of temporal and nonlinear free-surface problems, and efficient methods are derived for its computation. The body instantaneous wetted surface is approximated by a panel mesh and the discretization of the free surface is circumvented by using the Green function. The evaluation of the nonlinear loads is based on explicit expressions derived by the fluid-impulse theory, which can be computed efficiently. Computations are presented of the linear and nonlinear loads on the MIT/NREL tension-leg platform. Comparisons were carried out with frequency-domain linear and second-order methods. Emphasis was placed on modeling accuracy of the magnitude of nonlinear low- and high-frequency wave loads in a sea state. Although fluid-impulse theory is applied to floating wind turbines in this paper, the theory is applicable to other offshore platforms as well.
Applications of Ergodic Theory to Coverage Analysis
NASA Technical Reports Server (NTRS)
Lo, Martin W.
2003-01-01
The study of differential equations, or dynamical systems in general, has two fundamentally different approaches. We are most familiar with the construction of solutions to differential equations. Another approach is to study the statistical behavior of the solutions. Ergodic Theory is one of the most developed methods to study the statistical behavior of the solutions of differential equations. In the theory of satellite orbits, the statistical behavior of the orbits is used to produce 'Coverage Analysis' or how often a spacecraft is in view of a site on the ground. In this paper, we consider the use of Ergodic Theory for Coverage Analysis. This allows us to greatly simplify the computation of quantities such as the total time for which a ground station can see a satellite without ever integrating the trajectory (see Lo 1,2). Moreover, for any quantity which is an integrable function of the ground track, its average may be computed similarly without the integration of the trajectory. For example, the data rate for a simple telecom system is a function of the distance between the satellite and the ground station. We show that such a function may be averaged using the Ergodic Theorem.
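The core idea, that for an ergodic system the time average of an observable equals its space average, can be seen in a toy model. Below, "satellite visible" is an indicator function on an arc of the circle, and the dynamics is an irrational rotation (ergodic with respect to Lebesgue measure); this is an illustration of the ergodic theorem, not the paper's orbit model:

```python
import math

# Hedged sketch: TIME average of an observable along an ergodic orbit
# converges to its SPACE average, so "fraction of time in view" can be
# read off as the measure of the in-view region without propagating the
# trajectory indefinitely.  Toy dynamics: irrational rotation of the
# unit circle; observable: 1 on an arc of length 0.3 ("in view").

alpha = math.sqrt(2) - 1                 # irrational rotation step
in_view = lambda t: 1.0 if t < 0.3 else 0.0

t, hits, N = 0.0, 0.0, 200_000
for _ in range(N):
    hits += in_view(t)
    t = (t + alpha) % 1.0

time_average = hits / N                  # converges to 0.3, the arc length
```

The time average converges to the arc's measure (0.3) with no trajectory stored, which is exactly the economy the coverage-analysis application exploits.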
Entanglement negativity bounds for fermionic Gaussian states
NASA Astrophysics Data System (ADS)
Eisert, Jens; Eisler, Viktor; Zimborás, Zoltán
2018-04-01
The entanglement negativity is a versatile measure of entanglement that has numerous applications in quantum information and in condensed matter theory. Not only can it be computed efficiently in the Hilbert space dimension; for noninteracting bosonic systems, one can compute the negativity efficiently in the number of modes. However, such an efficient computation does not carry over to the fermionic realm, the ultimate reason for this being that the partial transpose of a fermionic Gaussian state is no longer Gaussian. To provide a remedy for this state of affairs, in this work, we introduce efficiently computable and rigorous upper and lower bounds to the negativity, making use of techniques of semidefinite programming, building upon the Lagrangian formulation of fermionic linear optics, and exploiting suitable products of Gaussian operators. We discuss examples in quantum many-body theory and hint at applications in the study of topological properties at finite temperature.
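The quantity being bounded is standard: the negativity is the absolute sum of the negative eigenvalues of the partial transpose. The fermionic difficulty the abstract describes does not arise in a tiny Hilbert space, so a direct computation for a two-qubit Bell state makes a useful reference point; this sketch computes the exact value (1/2 for a maximally entangled pair), not the paper's Gaussian-state bounds:

```python
import numpy as np

# Hedged sketch: negativity = |sum of negative eigenvalues| of the
# partial transpose.  Exact brute-force computation in a 2x2-qubit
# Hilbert space; the paper's contribution concerns bounds for fermionic
# Gaussian states, where the partial transpose is no longer Gaussian.

def partial_transpose(rho, dA=2, dB=2):
    """Transpose on subsystem A of a dA*dB x dA*dB density matrix."""
    r = rho.reshape(dA, dB, dA, dB)
    return r.transpose(2, 1, 0, 3).reshape(dA * dB, dA * dB)

def negativity(rho):
    eig = np.linalg.eigvalsh(partial_transpose(rho))
    return -eig[eig < 0].sum()

bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)       # (|00> + |11>) / sqrt(2)
rho = np.outer(bell, bell)
N = negativity(rho)                      # 1/2 for a maximally entangled pair
```

This exact eigenvalue route scales with the Hilbert-space dimension, which is precisely why efficiently computable bounds in the number of modes are valuable.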
A computer architecture for intelligent machines
NASA Technical Reports Server (NTRS)
Lefebvre, D. R.; Saridis, G. N.
1991-01-01
The Theory of Intelligent Machines proposes a hierarchical organization for the functions of an autonomous robot based on the Principle of Increasing Precision With Decreasing Intelligence. An analytic formulation of this theory using information-theoretic measures of uncertainty for each level of the intelligent machine has been developed in recent years. A computer architecture that implements the lower two levels of the intelligent machine is presented. The architecture supports an event-driven programming paradigm that is independent of the underlying computer architecture and operating system. Details of Execution Level controllers for motion and vision systems are addressed, as well as the Petri net transducer software used to implement Coordination Level functions. Extensions to UNIX and VxWorks operating systems which enable the development of a heterogeneous, distributed application are described. A case study illustrates how this computer architecture integrates real-time and higher-level control of manipulator and vision systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pop, N., E-mail: nicolina.pop@upt.ro; Ilie, S.; Motapon, O.
2014-11-24
The present work is aimed at performing the computation of cross sections and Maxwell rate coefficients in the framework of the stepwise version of the Multichannel Quantum Defect Theory (MQDT). Cross sections and rate coefficients suitable for the modelling of the kinetics of HD⁺ and BeH⁺ in fusion plasmas and in stellar atmospheres are presented and discussed. A very good agreement is found between our results for rotational transitions of HD⁺ and other computations, as well as with experiment.
Intersecting surface defects and instanton partition functions
Pan, Yiwen; Peelaers, Wolfger
2017-07-14
We analyze intersecting surface defects inserted in interacting four-dimensional N = 2 supersymmetric quantum field theories. We employ the realization of a class of such systems as the infrared fixed points of renormalization group flows from larger theories, triggered by perturbed Seiberg-Witten monopole-like configurations, to compute their partition functions. These results are cast into the form of a partition function of 4d/2d/0d coupled systems. In conclusion, our computations provide concrete expressions for the instanton partition function in the presence of intersecting defects and we study the corresponding ADHM model.
Sound propagation through a variable area duct - Experiment and theory
NASA Technical Reports Server (NTRS)
Silcox, R. J.; Lester, H. C.
1981-01-01
A comparison of experiment and theory has been made for the propagation of sound through a variable area axisymmetric duct with zero mean flow. Measurement of the acoustic pressure field on both sides of the constricted test section was resolved on a modal basis for various spinning mode sources. Transmitted and reflected modal amplitudes and phase angles were compared with finite element computations. Good agreement between experiment and computation was obtained over a wide range of frequencies and modal transmission variations. The study suggests that modal transmission through a variable area duct is governed by the throat modal cut-off ratio.
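The modal cut-off ratio governing transmission has a simple closed form for a hard-walled circular duct with no mean flow: the (m, n) spinning mode cuts off at f_c = c₀·j'ₘₙ/(2πa), where j'ₘₙ is the n-th zero of the Bessel derivative J'ₘ and a is the duct radius. A minimal sketch with tabulated Bessel zeros; the radius and frequency are illustrative, not the test duct's:

```python
import math

# Hedged sketch: cut-off frequency of the (m, n) spinning mode in a
# hard-walled circular duct, f_c = c0 * j'_mn / (2*pi*a).  The "modal
# cut-off ratio" of the abstract is then f / f_c.  Bessel-derivative
# zeros j'_mn from standard tables; duct radius is illustrative.

JP = {(1, 1): 1.84118,     # first zero of J_1'
      (2, 1): 3.05424}     # first zero of J_2'

def cutoff_frequency(m, n, radius, c0=343.0):
    return c0 * JP[(m, n)] / (2 * math.pi * radius)

f_c = cutoff_frequency(m=1, n=1, radius=0.15)   # first spinning mode, Hz
ratio = 2000.0 / f_c                            # cut-off ratio at 2 kHz
```

Modes with ratio > 1 propagate; near ratio = 1 the mode is close to cut-off, the regime where transmission through an area variation changes most rapidly.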
Sandow, M J; Fisher, T J; Howard, C Q; Papas, S
2014-05-01
This study was part of a larger project to develop a (kinetic) theory of carpal motion based on computationally derived isometric constraints. Three-dimensional models were created from computed tomography scans of the wrists of ten normal subjects and carpal spatial relationships at physiological motion extremes were assessed. Specific points on the surface of the various carpal bones and the radius that remained isometric through range of movement were identified. Analysis of the isometric constraints and intercarpal motion suggests that the carpus functions as a stable central column (lunate-capitate-hamate-trapezoid-trapezium) with a supporting lateral column (scaphoid), which behaves as a 'two gear four bar linkage'. The triquetrum functions as an ulnar translation restraint, as well as controlling lunate flexion. The 'trapezoid'-shaped trapezoid places the trapezium anterior to the transverse plane of the radius and ulna, and thus rotates the principal axis of the central column to correspond to that used in the 'dart thrower's motion'. This study presents a forward kinematic analysis of the carpus that provides the basis for the development of a unifying kinetic theory of wrist motion based on isometric constraints and rules-based motion.
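The notion of "computationally derived isometric constraints" can be sketched directly: given surface points tracked across several poses, flag the point pairs whose mutual distance stays nearly constant through the range of motion. The data below are synthetic; the study works from CT-derived carpal models:

```python
import numpy as np

# Hedged sketch: identify ISOMETRIC point pairs, i.e. pairs whose
# inter-point distance varies by less than a relative tolerance across
# all recorded poses.  Synthetic stand-in for CT-derived carpal surfaces.

def isometric_pairs(poses, tol=0.02):
    """poses: array (n_poses, n_points, 3).  Returns index pairs whose
    distance varies by less than tol (relative) across poses."""
    n = poses.shape[1]
    pairs = []
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(poses[:, i] - poses[:, j], axis=1)
            if (d.max() - d.min()) / d.mean() < tol:
                pairs.append((i, j))
    return pairs

# Synthetic poses: points 0 and 1 keep a fixed mutual distance (as if on
# one rigid body); point 2 moves independently.
rng = np.random.default_rng(0)
poses = rng.normal(size=(5, 3, 3))
poses[:, 1] = poses[:, 0] + np.array([1.0, 0.0, 0.0])
result = isometric_pairs(poses)      # contains the pair (0, 1)
```

In the study's setting, such pairs act as virtual ligamentous tethers and become the constraints from which the forward kinematic model is built.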
NASA Astrophysics Data System (ADS)
Gobithaasan, R. U.; Miura, Kenjiro T.; Hassan, Mohamad Nor
2014-07-01
Computer Aided Geometric Design (CAGD), which surpasses the underlying theories of Computer Aided Design (CAD) and Computer Graphics (CG), has been taught in a number of Malaysian universities under the umbrella of the Mathematical Sciences faculty/department. On the other hand, CAD/CG is taught under either the Engineering or the Computer Science faculty. Even though CAGD researchers/educators/students (denoted as contributors) have been enriching this field of study by means of article/journal publication, many fail to convert the idea into constructive innovation due to the gap that occurs between CAGD contributors and practitioners (engineers/product designers/architects/artists). This paper addresses this issue by advocating a number of technologies that can be used to transform CAGD contributors into innovators where immediate impact in terms of practical application can be experienced by the CAD/CG practitioners. The underlying principle of solving this issue is twofold. First would be to expose the CAGD contributors on ways to turn mathematical ideas into plug-ins, and second is to impart relevant CAGD theories to CAD/CG practitioners. Both cases are discussed in detail and the final section shows examples to illustrate the importance of turning mathematical knowledge into innovations.
NASA Astrophysics Data System (ADS)
Assadi, Amir H.
2001-11-01
Perceptual geometry is an emerging field of interdisciplinary research whose objectives focus on study of geometry from the perspective of visual perception, and in turn, apply such geometric findings to the ecological study of vision. Perceptual geometry attempts to answer fundamental questions in perception of form and representation of space through synthesis of cognitive and biological theories of visual perception with geometric theories of the physical world. Perception of form and space are among fundamental problems in vision science. In recent cognitive and computational models of human perception, natural scenes are used systematically as preferred visual stimuli. Among key problems in perception of form and space, we have examined perception of geometry of natural surfaces and curves, e.g. as in the observer's environment. Besides a systematic mathematical foundation for a remarkably general framework, the advantages of the Gestalt theory of natural surfaces include a concrete computational approach to simulate or recreate images whose geometric invariants and quantities might be perceived and estimated by an observer. The latter is at the very foundation of understanding the nature of perception of space and form, and the (computer graphics) problem of rendering scenes to visually invoke virtual presence.
Linguistics, Computers, and the Language Teacher. A Communicative Approach.
ERIC Educational Resources Information Center
Underwood, John H.
This analysis of the state of the art of computer programs and programming for language teaching has two parts. In the first part, an overview of the theory and practice of language teaching, Noam Chomsky's view of language, and the implications and problems of generative theory are presented. The theory behind the input model of language…
The Handicap Principle for Trust in Computer Security, the Semantic Web and Social Networking
NASA Astrophysics Data System (ADS)
Ma, Zhanshan (Sam); Krings, Axel W.; Hung, Chih-Cheng
Communication is a fundamental function of life, and it exists in almost all living things: from single-cell bacteria to human beings. Communication, competition, and cooperation are three fundamental processes in nature. Computer scientists are familiar with the study of competition or 'struggle for life' through Darwin's evolutionary theory, or even evolutionary computing. They may be equally familiar with the study of cooperation or altruism through the Prisoner's Dilemma (PD) game. However, they are likely to be less familiar with the theory of animal communication. The objective of this article is three-fold: (i) To suggest that the study of animal communication, especially the honesty (reliability) of animal communication, in which some significant advances in behavioral biology have been achieved in the last three decades, should be on the verge to spawn important cross-disciplinary research similar to that generated by the study of cooperation with the PD game. One of the far-reaching advances in the field is marked by the publication of "The Handicap Principle: a Missing Piece of Darwin's Puzzle" by Zahavi (1997). The 'Handicap' principle [34][35], which states that communication signals must be costly in some proper way to be reliable (honest), is best elucidated with evolutionary games, e.g., the Sir Philip Sidney (SPS) game [23]. Accordingly, we suggest that the Handicap principle may serve as a fundamental paradigm for trust research in computer science. (ii) To suggest to computer scientists that their expertise in modeling computer networks may help behavioral biologists in their study of the reliability of animal communication networks. This is largely due to the historical reason that, until the last decade, animal communication was studied with the dyadic paradigm (sender-receiver) rather than with the network paradigm.
(iii) To pose several open questions, the answers to which may bear some refreshing insights to trust research in computer science, especially secure and resilient computing, the semantic web, and social networking. One important thread unifying the three aspects is the evolutionary game theory modeling or its extensions with survival analysis and agreement algorithms [19][20], which offer powerful game models for describing time-, space-, and covariate-dependent frailty (uncertainty and vulnerability) and deception (honesty).
"MONSTROUS MOONSHINE" and Physics
NASA Astrophysics Data System (ADS)
Pushkin, A. V.
The report presents some results obtained by the author on the theory of quantum gravitation. The algebraic structure of this theory proves to be related to the commutative nonassociative Griess algebra. The symmetry of the theory is the automorphism group of the Griess algebra: the "Monster" simple group. Knowledge of the theory's symmetry allows one to compute observed physical values in the 'zero' approximation. The report presents such computed results for the values m_p/m_c and α; for the latter, the accuracy of the 'zero' approximation, controlled by the theory, is one order of magnitude higher than the accuracy of modern measurements.
Nonlinear constitutive theory for turbine engine structural analysis
NASA Technical Reports Server (NTRS)
Thompson, R. L.
1982-01-01
A number of viscoplastic constitutive theories and a conventional constitutive theory are evaluated and compared in their ability to predict nonlinear stress-strain behavior in gas turbine engine components at elevated temperatures. Specific application of these theories is directed towards the structural analysis of combustor liners undergoing transient, cyclic, thermomechanical load histories. The combustor liner material considered in this study is Hastelloy X. The material constants for each of the theories (as a function of temperature) are obtained from existing, published experimental data. The viscoplastic theories and a conventional theory are incorporated into a general purpose, nonlinear, finite element computer program. Several numerical examples of combustor liner structural analysis using these theories are given to demonstrate their capabilities. Based on the numerical stress-strain results, the theories are evaluated and compared.
Results of Computing Amplitude and Phase of the VLF Wave Using Wave Hop Theory
NASA Astrophysics Data System (ADS)
Pal, Sujay; Basak, Tamal; Chakrabarti, Sandip K.
2011-07-01
We present the basics of wave-hop theory for computing the amplitude and phase of VLF signals. We use the Indian Navy VTX transmitter at 18.2 kHz as an example source and compute the VLF propagation characteristics for several propagation paths using wave-hop theory. We find the signal amplitudes as a function of distance from the transmitter at different bearing angles and compare them with those obtained from the Long Wave Propagation Capability (LWPC) code, which uses mode theory. We repeat a similar exercise for the diurnal and seasonal behavior. We note that the signal variation computed by wave-hop theory gives more detailed information in the daytime. We further present the spatial variation of the signal amplitude over the whole of India at a given time, including the effect of the sunrise and sunset terminator, and compare it with that from mode theory. We point out that the terminator effect is more clearly visible in the wave-hop results than in those from mode theory.
Computer Series, 114: MO Theory Made Visible.
ERIC Educational Resources Information Center
Mealli, Carlo; Proserpio, Davide M.
1990-01-01
A collection of Molecular Orbital (MO) programs that have been integrated into routines and programs to illustrate MO theory are presented. Included are discussions of Computer Aided Composition of Atomic Orbitals (CACAO) and Walsh diagrams. (CW)
Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.
ERIC Educational Resources Information Center
Jungck, John R.; Soderberg, Patti
1995-01-01
Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)
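One of the re-representations the article advocates is treating a pedigree as a graph. A hypothetical sketch (the family data and helper below are invented, not the authors' software) shows how parent links become a directed graph that can be traversed:

```python
# Pedigree as a directed graph: individual -> set of parents.
# Example data is invented for illustration.

PEDIGREE = {
    "child":   {"mother", "father"},
    "mother":  {"grandma", "grandpa"},
    "father":  set(),
    "grandma": set(),
    "grandpa": set(),
}

def ancestors(individual, pedigree):
    """Return every ancestor reachable through parent links (depth-first)."""
    seen = set()
    stack = list(pedigree.get(individual, ()))
    while stack:
        person = stack.pop()
        if person not in seen:
            seen.add(person)
            stack.extend(pedigree.get(person, ()))
    return seen

print(sorted(ancestors("child", PEDIGREE)))
# -> ['father', 'grandma', 'grandpa', 'mother']
```

The same graph structure supports the kinds of queries genetic case studies need (shared ancestors, inbreeding loops) without any change to the data representation.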
2010-03-01
functionality and plausibility distinguishes this research from most research in computational linguistics and computational psycholinguistics. ... Psycholinguistic Theory: There is extensive psycholinguistic evidence that human language processing is essentially incremental and interactive. ... One of the challenges of psycholinguistic research is to explain how humans can process language effortlessly and accurately given the complexity and ambiguity that is...
Teaching and Learning Methodologies Supported by ICT Applied in Computer Science
ERIC Educational Resources Information Center
Capacho, Jose
2016-01-01
The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…
Theory and Programs for Dynamic Modeling of Tree Rings from Climate
Paul C. van Deusen; Jennifer Koretz
1988-01-01
Computer programs written in GAUSS(TM) for IBM compatible personal computers are described that perform dynamic tree ring modeling with climate data; the underlying theory is also described. The programs and a separate users manual are available from the authors, although users must have the GAUSS software package on their personal computer. An example application of...
Computer vision in cell biology.
Danuser, Gaudenz
2011-11-23
Computer vision refers to the theory and implementation of artificial systems that extract information from images to understand their content. Although computers are widely used by cell biologists for visualization and measurement, interpretation of image content, i.e., the selection of events worth observing and the definition of what they mean in terms of cellular mechanisms, is mostly left to human intuition. This Essay attempts to outline roles computer vision may play and should play in image-based studies of cellular life.
NASA Astrophysics Data System (ADS)
Borah, Mukunda Madhab; Devi, Th. Gomti
2018-06-01
The vibrational spectral analysis of Serotonin and its dimer was carried out using Fourier Transform Infrared (FTIR) and Raman techniques. The equilibrium geometrical parameters, harmonic vibrational wavenumbers, Frontier orbitals, Mulliken atomic charges, Natural Bond orbitals, first-order hyperpolarizability, and some optimized energy parameters were computed by density functional theory with the 6-31G(d,p) basis set. The detailed analysis of the vibrational spectra has been carried out by computing the Potential Energy Distribution (PED, %) with the help of the Vibrational Energy Distribution Analysis (VEDA) program. The second-order delocalization energies E(2) confirm the occurrence of intramolecular Charge Transfer (ICT) within the molecule. The computed wavenumbers of the Serotonin monomer and dimer were found to be in good agreement with the experimental Raman and IR values.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elnabawy, Ahmed O.; Rangarajan, Srinivas; Mavrikakis, Manos
2015-06-05
Computational chemistry, especially density functional theory, has experienced remarkable growth in application over the last few decades. This is attributed to the improvements in theory and computing infrastructure that enable the analysis of systems of unprecedented size and detail at an affordable computational expense. In this perspective, we discuss recent progress and current challenges facing electronic structure theory in the context of heterogeneous catalysis. We specifically focus on the impact of computational chemistry in elucidating and designing catalytic systems in three topics of interest to Haldor Topsøe: ammonia synthesis, hydrotreating, and NOx reduction. We then discuss the common tools and concepts in computational catalysis that underlie these topics and provide a perspective on the challenges and future directions of research in this area of catalysis.
The application of the integral equation theory to study the hydrophobic interaction
Mohorič, Tomaž; Urbic, Tomaz; Hribar-Lee, Barbara
2014-01-01
Wertheim's integral equation theory was tested against newly obtained Monte Carlo computer simulations describing the potential of mean force between two hydrophobic particles. Excellent agreement was obtained between the theoretical and simulation results. Further, Wertheim's integral equation theory with the polymer Percus-Yevick closure qualitatively correctly describes the solvation structure (with respect to the experimental data) under conditions where simulation results are difficult to obtain with sufficient accuracy. PMID:24437891
Gamut relativity: a new computational approach to brightness and lightness perception.
Vladusich, Tony
2013-01-09
This article deconstructs the conventional theory that "brightness" and "lightness" constitute perceptual dimensions corresponding to the physical dimensions of luminance and reflectance, and builds in its place the theory that brightness and lightness correspond to computationally defined "modes," rather than dimensions, of perception. According to the theory, called gamut relativity, "blackness" and "whiteness" constitute the perceptual dimensions (forming a two-dimensional "blackness-whiteness" space) underlying achromatic color perception (black, white, and gray shades). These perceptual dimensions are postulated to be related to the neural activity levels in the ON and OFF channels of vision. The theory unifies and generalizes a number of extant concepts in the brightness and lightness literature, such as simultaneous contrast, anchoring, and scission, and quantitatively simulates several challenging perceptual phenomena, including the staircase Gelb effect and the effects of task instructions on achromatic color-matching behavior, all with a single free parameter. The theory also provides a new conception of achromatic color constancy in terms of the relative distances between points in blackness-whiteness space. The theory suggests a host of striking conclusions, the most important of which is that the perceptual dimensions of vision should be generically specified according to the computational properties of the brain, rather than in terms of "reified" physical dimensions. This new approach replaces the computational goal of estimating absolute physical quantities ("inverse optics") with the goal of computing object properties relatively.
20 CFR 901.11 - Enrollment procedures.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Examples include economics, computer programs, pension accounting, investment and finance, risk theory ... Columbia responsible for the issuance of a license in the field of actuarial science, insurance, accounting ...
Embick, David; Poeppel, David
2015-05-01
We outline what an integrated approach to language research that connects experimental, theoretical, and neurobiological domains of inquiry would look like, and ask to what extent unification is possible across domains. At the center of the program is the idea that computational/representational (CR) theories of language must be used to investigate its neurobiological (NB) foundations. We consider different ways in which CR and NB might be connected. These are (1) A Correlational way, in which NB computation is correlated with the CR theory; (2) An Integrated way, in which NB data provide crucial evidence for choosing among CR theories; and (3) an Explanatory way, in which properties of NB explain why a CR theory is the way it is. We examine various questions concerning the prospects for Explanatory connections in particular, including to what extent it makes sense to say that NB could be specialized for particular computations.
Importance of elastic finite-size effects: Neutral defects in ionic compounds
NASA Astrophysics Data System (ADS)
Burr, P. A.; Cooper, M. W. D.
2017-09-01
Small system sizes are a well-known source of error in density functional theory (DFT) calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite-size effects have been well characterized, but self-interaction of charge-neutral defects is often discounted or assumed to follow an asymptotic behavior and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behavior predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground-state structure of (charge-neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768, and 1500 atoms), and careful analysis determines that elastic, not electrostatic, effects are responsible. The spurious self-interaction was also observed in nonoxide ionic compounds irrespective of the computational method used, thereby resolving long-standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g., hybrid functionals) or when modeling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells: greater than 96 atoms.
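The convergence test the authors recommend amounts to computing the defect energy in progressively larger supercells and checking the extrapolation. A hedged sketch (the energies below are invented, not the paper's data) fits E(N) = E_inf + k/N, the asymptotic scaling expected from linear elastic theory; the paper's point is precisely that small cells can deviate from this line.

```python
# Supercell-size convergence check for a defect formation energy.
# Data is invented for illustration; the 1/N fit assumes the asymptotic
# linear-elastic scaling E(N) = E_inf + k/N.

sizes = [96, 324, 768, 1500]          # atoms per supercell
energies = [5.80, 5.35, 5.22, 5.16]   # assumed formation energies (eV)

xs = [1.0 / n for n in sizes]
m = len(xs)
mean_x = sum(xs) / m
mean_y = sum(energies) / m

# Ordinary least-squares slope and intercept of E vs 1/N
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, energies))
         / sum((x - mean_x) ** 2 for x in xs))
e_inf = mean_y - slope * mean_x       # extrapolated dilute-limit energy

print(round(e_inf, 2))
```

A large residual for the smallest cell relative to this fit would signal the kind of non-asymptotic elastic self-interaction the paper reports for 96-atom supercells.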
Orbital theory in terms of KS elements with luni-solar perturbations
NASA Astrophysics Data System (ADS)
Sellamuthu, Harishkumar; Sharma, Ram
2016-07-01
Precise orbit computation of Earth orbiting satellites is essential for efficient mission planning of planetary exploration, navigation and satellite geodesy. The third-body perturbations of the Sun and the Moon predominantly affect the satellite motion in the high altitude and elliptical orbits, where the effect of atmospheric drag is negligible. The physics of the luni-solar gravity effect on Earth satellites have been studied extensively over the years. The combined luni-solar gravitational attraction will induce a cumulative effect on the dynamics of satellite orbits, which mainly oscillates the perigee altitude. Though accurate orbital parameters are computed by numerical integration with respect to complex force models, analytical theories are highly valued for the manifold of solutions restricted to relatively simple force models. During close approach, the classical equations of motion in celestial mechanics are almost singular and they are unstable for long-term orbit propagation. A new singularity-free analytical theory in terms of KS (Kustaanheimo and Stiefel) regular elements with respect to luni-solar perturbation is developed. These equations are regular everywhere and eccentric anomaly is the independent variable. Plataforma Solar de Almería (PSA) algorithm and a Fourier series algorithm are used to compute the accurate positions of the Sun and the Moon, respectively. Numerical studies are carried out for wide range of initial parameters and the analytical solutions are found to be satisfactory when compared with numerically integrated values. The symmetrical nature of the equations allows only two of the nine equations to be solved for computing the state vectors and the time. Only a change in the initial conditions is required to solve the other equations. This theory will find multiple applications including on-board software packages and for mission analysis purposes.
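For readers unfamiliar with KS elements, the map from the regularizing 4-vector u to Cartesian position is compact enough to state in code. This is the standard KS transformation only, a sketch that does not reproduce the authors' full luni-solar theory:

```python
# Standard Kustaanheimo-Stiefel (KS) map: regularizing 4-vector u -> 3d
# Cartesian position x, with the radial distance r = |u|^2 = |x|.

def ks_to_cartesian(u):
    u1, u2, u3, u4 = u
    x = u1*u1 - u2*u2 - u3*u3 + u4*u4
    y = 2.0 * (u1*u2 - u3*u4)
    z = 2.0 * (u1*u3 + u2*u4)
    r = u1*u1 + u2*u2 + u3*u3 + u4*u4   # radial distance
    return (x, y, z), r

pos, r = ks_to_cartesian((1.0, 2.0, 3.0, 4.0))
print(pos, r)  # -> (4.0, -20.0, 22.0) 30.0
```

The identity |x| = |u|^2 is what removes the 1/r singularity of the classical equations of motion; in the full theory the eccentric anomaly then serves as the independent variable.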
Media Effects: Theory and Research.
Valkenburg, Patti M; Peter, Jochen; Walther, Joseph B
2016-01-01
This review analyzes trends and commonalities among prominent theories of media effects. On the basis of exemplary meta-analyses of media effects and bibliometric studies of well-cited theories, we identify and discuss five features of media effects theories as well as their empirical support. Each of these features specifies the conditions under which media may produce effects on certain types of individuals. Our review ends with a discussion of media effects in newer media environments. This includes theories of computer-mediated communication, the development of which appears to share a similar pattern of reformulation from unidirectional, receiver-oriented views, to theories that recognize the transactional nature of communication. We conclude by outlining challenges and promising avenues for future research.
On mini-superspace limit of boundary three-point function in Liouville field theory
NASA Astrophysics Data System (ADS)
Apresyan, Elena; Sarkissian, Gor
2017-12-01
We study the mini-superspace semiclassical limit of the boundary three-point function in the Liouville field theory. We compute also matrix elements for the Morse potential quantum mechanics. An exact agreement between the former and the latter is found. We show that both of them are given by the generalized hypergeometric functions.
A Comparison of Parallelism in Interface Designs for Computer-Based Learning Environments
ERIC Educational Resources Information Center
Min, Rik; Yu, Tao; Spenkelink, Gerd; Vos, Hans
2004-01-01
In this paper we discuss an experiment that was carried out with a prototype, designed in conformity with the concept of parallelism and the Parallel Instruction theory (the PI theory). We designed this prototype with five different interfaces, and ran an empirical study in which 18 participants completed an abstract task. The five basic designs…
ERIC Educational Resources Information Center
Burk, Robin K.
2010-01-01
Computational natural language understanding and generation have been a goal of artificial intelligence since McCarthy, Minsky, Rochester and Shannon first proposed to spend the summer of 1956 studying this and related problems. Although statistical approaches dominate current natural language applications, two current research trends bring…
Five-dimensional fermionic Chern-Simons theory
NASA Astrophysics Data System (ADS)
Bak, Dongsu; Gustavsson, Andreas
2018-02-01
We study 5d fermionic CS theory with a fermionic 2-form gauge potential. This theory can be obtained from 5d maximally supersymmetric YM theory by performing the maximal topological twist. We put the theory on a five-manifold and compute the partition function. We find that it is a topological quantity, which involves the Ray-Singer torsion of the five-manifold. For abelian gauge group we consider the uplift to the 6d theory and find a mismatch between the 5d partition function and the 6d index, due to the nontrivial dimensional reduction of a self-dual two-form gauge field on a circle. We also discuss an application of the 5d theory to generalized knots made of 2d sheets embedded in 5d.
Politeness Theory in Computer Mediated Communication: Face Threatening Acts in a "Faceless" Medium.
ERIC Educational Resources Information Center
Simmons, Thomas L.
A study of distinctive characteristics of the style in which people communicate in computer-mediated communication (CMC), focusing on use of politeness conventions, is reported. Aspects of the concept of "face" and politeness in social interaction are first reviewed, and threats to speaker's and hearer's face are outlined. The…
Collaborative Dialogue in Synchronous Computer-Mediated Communication and Face-to-Face Communication
ERIC Educational Resources Information Center
Zeng, Gang
2017-01-01
Previous research has documented that collaborative dialogue promotes L2 learning in both face-to-face (F2F) and synchronous computer-mediated communication (SCMC) modalities. However, relatively little research has explored modality effects on collaborative dialogue. Thus, motivated by sociocultual theory, this study examines how F2F compares…
ERIC Educational Resources Information Center
Wu, Min Lun
2018-01-01
This qualitative case study reports descriptive findings of digital game-based learning involving 15 Taiwanese middle school students' use of computational thinking skills elicited through programmed activities in a game design workshop. Situated learning theory is utilized as framework to evaluate novice game designers' individual advancement in…
Development of a Computer-Based Visualised Quantitative Learning System for Playing Violin Vibrato
ERIC Educational Resources Information Center
Ho, Tracy Kwei-Liang; Lin, Huann-shyang; Chen, Ching-Kong; Tsai, Jih-Long
2015-01-01
Traditional methods of teaching music are largely subjective, with the lack of objectivity being particularly challenging for violin students learning vibrato because of the existence of conflicting theories. By using a computer-based analysis method, this study found that maintaining temporal coincidence between the intensity peak and the target…
Information Prosthetics for the Handicapped. Artificial Intelligence Memo No. 496.
ERIC Educational Resources Information Center
Papert, Seymour A.; Weir, Sylvia
The proposal outlines a study to assess the role of computers in assessing and instructing students with severe cerebral palsy in spatial and communication skills. The computer's capacity to make learning interesting and challenging to the severely disabled student is noted, along with its use as a diagnostic tool. Implications for theories on…
Analysis of Computer Algebra System Tutorials Using Cognitive Load Theory
ERIC Educational Resources Information Center
May, Patricia
2004-01-01
Most research in the area of Computer Algebra Systems (CAS) has been designed to compare the effectiveness of instructional technology to traditional lecture-based formats. While results are promising, research also indicates evidence of the steep learning curve imposed by the technology. Yet no studies have been conducted to investigate this…
Application of Game Theory to Improve the Defense of the Smart Grid
2012-03-01
Computer Systems and Networks ... Trust Models ... systems. In this environment, developers assumed deterministic communications mediums rather than the "best effort" models provided in most modern ... models or computational models to validate the SPSs design. Finally, the study reveals concerns about the performance of load rejection schemes.
Tying Theory To Practice: Cognitive Aspects of Computer Interaction in the Design Process.
ERIC Educational Resources Information Center
Mikovec, Amy E.; Dake, Dennis M.
The new medium of computer-aided design requires changes to the creative problem-solving methodologies typically employed in the development of new visual designs. Most theoretical models of creative problem-solving suggest a linear progression from preparation and incubation to some type of evaluative study of the "inspiration." These…
Development and Validation of a Computer Adaptive EFL Test
ERIC Educational Resources Information Center
He, Lianzhen; Min, Shangchao
2017-01-01
The first aim of this study was to develop a computer adaptive EFL test (CALT) that assesses test takers' listening and reading proficiency in English with dichotomous items and polytomous testlets. We reported in detail on the development of the CALT, including item banking, determination of suitable item response theory (IRT) models for item…
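Adaptive tests like the one described typically score items with logistic IRT models. As a hedged illustration (the two-parameter logistic model is a representative choice; the parameters below are invented, not from the CALT item bank):

```python
# Two-parameter logistic (2PL) IRT item response function.
# a: discrimination, b: difficulty, theta: test-taker ability.
# Parameters are illustrative only.

import math

def p_correct(theta, a, b):
    """Probability that a test taker of ability theta answers correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the probability is exactly 0.5, regardless of a.
print(p_correct(0.0, a=1.2, b=0.0))  # -> 0.5
```

An adaptive algorithm uses such curves to pick the next item whose difficulty best matches the current ability estimate, which is what lets a CALT measure proficiency with fewer items than a fixed-form test.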
ERIC Educational Resources Information Center
Anderson, Bodi; Horn, Robert
2012-01-01
Computer literacy is increasingly important in higher education, and many educational technology experts propose a more prominent integration of technology into pedagogy. Empirical evidence is needed to support these theories. This study examined community college students planning to transfer to 4-year universities and estimated the relationship…
ERIC Educational Resources Information Center
Erdogan, Ahmet
2010-01-01
Based on Social Cognitive Career Theory (SCCT) (Lent, Brown, & Hackett, 1994, 2002), this study tested the effects of mathematics teacher candidates' self-efficacy in, outcome expectations from, and interest in Computer-Assisted Mathematics Education (CAME) on their intentions to integrate CAME. While mathematics teacher candidates' outcome…
NASA Technical Reports Server (NTRS)
Reuther, James; Alonso, Juan Jose; Rimlinger, Mark J.; Jameson, Antony
1996-01-01
This work describes the application of a control theory-based aerodynamic shape optimization method to the problem of supersonic aircraft design. The design process is greatly accelerated through the use of both control theory and a parallel implementation on distributed memory computers. Control theory is employed to derive the adjoint differential equations whose solution allows for the evaluation of design gradient information at a fraction of the computational cost required by previous design methods. The resulting problem is then implemented on parallel distributed memory architectures using a domain decomposition approach, an optimized communication schedule, and the MPI (Message Passing Interface) Standard for portability and efficiency. The final result achieves very rapid aerodynamic design based on higher order computational fluid dynamics methods (CFD). In our earlier studies, the serial implementation of this design method was shown to be effective for the optimization of airfoils, wings, wing-bodies, and complex aircraft configurations using both the potential equation and the Euler equations. In our most recent paper, the Euler method was extended to treat complete aircraft configurations via a new multiblock implementation. Furthermore, during the same conference, we also presented preliminary results demonstrating that this basic methodology could be ported to distributed memory parallel computing architectures. In this paper, our concern will be to demonstrate that the combined power of these new technologies can be used routinely in an industrial design environment by applying it to the case study of the design of typical supersonic transport configurations. A particular difficulty of this test case is posed by the propulsion/airframe integration.
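The central idea, one adjoint solve yielding the full design gradient, can be shown on a toy problem. This is a hedged sketch with an invented linear "flow" residual R(w, x) = A w - b(x) = 0 and cost I = c^T w, not the paper's CFD formulation; the gradient dI/dx = psi^T db/dx follows from a single adjoint solve A^T psi = c, independent of the number of design variables.

```python
# Adjoint gradient on a toy linear system, checked against finite differences.
# A, c, b(x) are invented for illustration.

import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])     # assumed state operator
c = np.array([1.0, 2.0])                   # assumed cost weights

def b(x):                                  # forcing term depends on design x
    return np.array([x[0] + x[1], 2.0 * x[0]])

def cost(x):
    w = np.linalg.solve(A, b(x))           # the "flow solve"
    return c @ w

x0 = np.array([0.5, -1.0])
psi = np.linalg.solve(A.T, c)              # single adjoint solve
dbdx = np.array([[1.0, 1.0],               # db/dx: rows = residuals,
                 [2.0, 0.0]])              #        cols = design variables
grad_adjoint = psi @ dbdx

# Finite-difference verification of the adjoint gradient
eps = 1e-6
grad_fd = np.array([(cost(x0 + eps * e) - cost(x0 - eps * e)) / (2 * eps)
                    for e in np.eye(2)])
print(np.allclose(grad_adjoint, grad_fd))  # -> True
```

The cost of the gradient is one extra linear solve regardless of how many design variables there are, which is the economy that makes adjoint-based shape optimization of full aircraft configurations tractable.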
Reis, H; Rasulev, B; Papadopoulos, M G; Leszczynski, J
2015-01-01
Fullerene and its derivatives are currently among the most intensively investigated species in nanomedicine and nanochemistry. Various unique properties of fullerenes are responsible for their wide range of applications in industry, biology, and medicine. A large pool of functionalized C60 and C70 fullerenes is investigated theoretically at different levels of quantum-mechanical theory. The semiempirical PM6 method, density functional theory with the B3LYP functional, and the correlated ab initio MP2 method are employed to compute the optimized structures and an array of properties for the considered species. In addition to the calculations for isolated molecules, results of solution calculations are also reported at the DFT level, using the polarizable continuum model (PCM). Ionization potentials (IPs) and electron affinities (EAs) are computed by means of Koopmans' theorem as well as with the more accurate but computationally expensive ΔSCF method. Both procedures yield comparable values, while comparison of IPs and EAs computed with different quantum-mechanical methods shows surprisingly large differences. Harmonic vibrational frequencies are computed at the PM6 and B3LYP levels of theory and compared with each other. A possible application of the frequencies as 3D descriptors in the EVA (EigenVAlues) method is shown. All the computed data are made available and may be used to replace experimental data in routine applications where large amounts of data are required, e.g., in structure-activity relationship studies of the toxicity of fullerene derivatives.
Nasrabad, Afshin Eskandari; Laghaei, Rozita; Eu, Byung Chan
2005-04-28
In previous work on the density fluctuation theory of transport coefficients of liquids, it was necessary to use empirical self-diffusion coefficients to calculate the transport coefficients (e.g., the shear viscosity of carbon dioxide). In this work, the need for empirical input of the self-diffusion coefficients in the calculation of shear viscosity is removed, and the theory is thus made a self-contained molecular theory of transport coefficients of liquids, although it contains an empirical parameter in the subcritical regime. The required self-diffusion coefficients of liquid carbon dioxide are calculated by using the modified free volume theory, for which the generic van der Waals equation of state and Monte Carlo simulations are combined to accurately compute the mean free volume by means of statistical mechanics. These coefficients have been computed as a function of density along four different isotherms and isobars. A Lennard-Jones site-site interaction potential was used to model the molecular carbon dioxide interaction. The density and temperature dependence of the theoretical self-diffusion coefficients is shown to be in excellent agreement with experimental data when the minimum critical free volume is identified with the molecular volume. The self-diffusion coefficients thus computed are then used to compute the density and temperature dependence of the shear viscosity of liquid carbon dioxide by employing the density fluctuation theory formula for shear viscosity reported in an earlier paper (J. Chem. Phys. 2000, 112, 7118). The theoretical shear viscosity is shown to be robust and yields excellent density and temperature dependence for carbon dioxide. The pair correlation function appearing in the theory has been computed by Monte Carlo simulations.
Moncho, Salvador; Autschbach, Jochen
2010-01-12
A benchmark study for relativistic density functional calculations of NMR spin-spin coupling constants has been performed. The test set contained 47 complexes with heavy metal atoms (W, Pt, Hg, Tl, Pb) with a total of 88 coupling constants involving one or two heavy metal atoms. One-, two-, three-, and four-bond spin-spin couplings have been computed at different levels of theory (nonhybrid vs hybrid DFT, scalar vs two-component relativistic). The computational model was based on geometries fully optimized at the BP/TZP scalar relativistic zeroth-order regular approximation (ZORA) and the conductor-like screening model (COSMO) to include solvent effects. The NMR computations also employed the continuum solvent model. Computations in the gas phase were performed in order to assess the importance of the solvation model. The relative median deviations between various computational models and experiment were found to range between 13% and 21%, with the highest-level computational model (hybrid density functional computations including scalar plus spin-orbit relativistic effects, the COSMO solvent model, and a Gaussian finite-nucleus model) performing best.
A Cohomological Perspective on Algebraic Quantum Field Theory
NASA Astrophysics Data System (ADS)
Hawkins, Eli
2018-05-01
Algebraic quantum field theory is considered from the perspective of the Hochschild cohomology bicomplex. This is a framework for studying deformations and symmetries. Deformation is a possible approach to the fundamental challenge of constructing interacting QFT models. Symmetry is the primary tool for understanding the structure and properties of a QFT model. This perspective leads to a generalization of the algebraic quantum field theory framework, as well as a more general definition of symmetry. This means that some models may have symmetries that were not previously recognized or exploited. To first order, a deformation of a QFT model is described by a Hochschild cohomology class. A deformation could, for example, correspond to adding an interaction term to a Lagrangian. The cohomology class for such an interaction is computed here. However, the result is more general and does not require the undeformed model to be constructed from a Lagrangian. This computation leads to a more concrete version of the construction of perturbative algebraic quantum field theory.
Computational Nuclear Physics and Post Hartree-Fock Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietz, Justin; Sam, Novario; Hjorth-Jensen, M.
We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing an eventual reader to start writing his or her own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
On the effective field theory of heterotic vacua.
McOrist, Jock
2018-01-01
The effective field theory of heterotic vacua that realise [Formula: see text] preserving [Formula: see text] supersymmetry is studied. The vacua in question admit large radius limits taking the form [Formula: see text], with [Formula: see text] a smooth threefold with vanishing first Chern class and a stable holomorphic gauge bundle [Formula: see text]. In a previous paper we calculated the kinetic terms for moduli, deducing the moduli metric and Kähler potential. In this paper, we compute the remaining couplings in the effective field theory, correct to first order in [Formula: see text]. In particular, we compute the contribution of the matter sector to the Kähler potential and derive the Yukawa couplings and other quadratic fermionic couplings. From this we write down a Kähler potential [Formula: see text] and superpotential [Formula: see text].
Computer calculation of Witten's 3-manifold invariant
NASA Astrophysics Data System (ADS)
Freed, Daniel S.; Gompf, Robert E.
1991-10-01
Witten's 2+1 dimensional Chern-Simons theory is exactly solvable. We compute the partition function, a topological invariant of 3-manifolds, on generalized Seifert spaces. Thus we test the path integral using the theory of 3-manifolds. In particular, we compare the exact solution with the asymptotic formula predicted by perturbation theory. We conclude that this path integral works as advertised and gives an effective topological invariant.
Bao, Junwei Lucas; Zhang, Xin; Truhlar, Donald G
2016-11-29
Bond dissociation is a fundamental chemical reaction, and the first principles modeling of the kinetics of dissociation reactions with a monotonically increasing potential energy along the dissociation coordinate presents a challenge not only for modern electronic structure methods but also for kinetics theory. In this work, we use multifaceted variable-reaction-coordinate variational transition-state theory (VRC-VTST) to compute the high-pressure limit dissociation rate constant of tetrafluoroethylene (C2F4), in which the potential energies are computed by direct dynamics with the M08-HX exchange correlation functional. To treat the pressure dependence of the unimolecular rate constants, we use the recently developed system-specific quantum Rice–Ramsperger–Kassel theory. The calculations are carried out by direct dynamics using an exchange correlation functional validated against calculations that go beyond coupled-cluster theory with single, double, and triple excitations. Our computed dissociation rate constants agree well with the recent experimental measurements. PMID:27834727
Theory-of-mind development influences suggestibility and source monitoring.
Bright-Paul, Alexandra; Jarrold, Christopher; Wright, Daniel B
2008-07-01
According to the mental-state reasoning model of suggestibility, 2 components of theory of mind mediate reductions in suggestibility across the preschool years. The authors examined whether theory-of-mind performance may be legitimately separated into 2 components and explored the memory processes underlying the associations between theory of mind and suggestibility, independent of verbal ability. Children 3 to 6 years old completed 6 theory-of-mind tasks and a postevent misinformation procedure. Contrary to the model's prediction, a single latent theory-of-mind factor emerged, suggesting a single-component rather than a dual-component conceptualization of theory-of-mind performance. This factor provided statistical justification for computing a single composite theory-of-mind score. Improvements in theory of mind predicted reductions in suggestibility, independent of verbal ability (Study 1, n = 72). Furthermore, once attribution biases were controlled (Study 2, n = 45), there was also a positive relationship between theory of mind and source memory, but not recognition performance. The findings suggest a substantial, and possibly causal, association between theory-of-mind development and resistance to suggestion, driven specifically by improvements in source monitoring.
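Given a single latent factor, a composite theory-of-mind score can be formed in the usual way, by standardizing each task and averaging the z-scores. A sketch with made-up task names and data (the paper does not specify its exact compositing formula, so this is one standard choice):

```python
# Hedged sketch: composite score from several tasks via standardization.
# Task names and scores are invented for illustration.

def composite_score(task_scores):
    """task_scores: dict mapping task name -> list of scores, children in
    the same order in every list. Returns one composite z-score per child."""
    names = list(task_scores)
    n = len(task_scores[names[0]])
    z = {}
    for name in names:
        xs = task_scores[name]
        mean = sum(xs) / len(xs)
        # Sample standard deviation (n - 1 denominator).
        sd = (sum((x - mean) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5
        z[name] = [(x - mean) / sd for x in xs]
    # Composite = mean of the task z-scores for each child.
    return [sum(z[name][i] for name in names) / len(names) for i in range(n)]

scores = {
    "false_belief": [0, 1, 1, 2],
    "appearance_reality": [1, 1, 2, 2],
}
print([round(c, 2) for c in composite_score(scores)])  # [-1.05, -0.43, 0.43, 1.05]
```

Standardizing first keeps a task with a wider raw-score range from dominating the composite.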
Examination and Implementation of a Proposal for a Ph.D. Program in Administrative Sciences
1992-03-01
Review of two proposals recently approved by the Academic Council (i.e., Computer Science and Mathematics Departments). C. SCOPE OF THE STUDY Since WWII...and through the computer age, the application of administrative science theory and methodologies from the behavioral sciences and quantitative...roles in the U.S. Navy and DoD, providing people who firmly understand the technical and organizational aspects of computer-based systems which support
Chiral phase transition from string theory.
Parnachev, Andrei; Sahakyan, David A
2006-09-15
The low energy dynamics of a certain D-brane configuration in string theory is described at weak 't Hooft coupling by a nonlocal version of the Nambu-Jona-Lasinio model. We study this system at finite temperature and strong 't Hooft coupling, using the string theory dual. We show that for sufficiently low temperatures chiral symmetry is broken, while for temperatures larger than the critical value it is restored. We compute the latent heat and observe that the phase transition is first order.
NASA Technical Reports Server (NTRS)
Isaacson, D.; Marchesin, D.; Paes-Leme, P. J.
1980-01-01
This paper is an expanded version of a talk given at the 1979 T.I.C.O.M. conference. It is a self-contained introduction, for applied mathematicians and numerical analysts, to quantum mechanics and quantum field theory. It also contains a brief description of the authors' numerical approach to the problems of quantum field theory, which may best be summarized by the question: Can we compute the eigenvalues and eigenfunctions of Schrödinger operators in infinitely many variables?
Modelling of Surfaces. Part 1: Monatomic Metallic Surfaces Using Equivalent Crystal Theory
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo; Ferrante, John; Rodriguez, Agustin M.
1994-01-01
We present a detailed description of equivalent crystal theory, focusing on its application to the study of surface structure. While the emphasis is on the structure of the algorithm and its computational aspects, we also present a comprehensive discussion of the calculation of surface energies of metallic systems with equivalent crystal theory and other approaches. Our results are compared to experiment and to other semiempirical as well as first-principles calculations for a variety of fcc and bcc metals.
Semiempirical methods for computing turbulent flows
NASA Technical Reports Server (NTRS)
Belov, I. A.; Ginzburg, I. P.
1986-01-01
Two semiempirical theories which provide a basis for determining the turbulent friction and heat exchange near a wall are presented: (1) the Prandtl-Karman theory, and (2) the theory utilizing an equation for the energy of turbulent pulsations. A comparison is made between exact numerical methods and approximate integral methods for computing turbulent boundary layers in the presence of pressure gradients, blowing, or suction. Using the turbulent flow around a plate as an example, it is shown that, when computing turbulent flows with external turbulence, it is preferable to construct a turbulence model based on the equation for the energy of turbulent pulsations.
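The Prandtl-Karman theory centers on the logarithmic law of the wall. A minimal sketch using the conventional constants κ ≈ 0.41 and B ≈ 5.0 (textbook defaults, not values taken from this paper):

```python
import math

# Hedged sketch of the Prandtl-Karman logarithmic law of the wall:
#     u+ = (1/kappa) * ln(y+) + B
# with the conventional constants below (assumed, not from this paper).

KAPPA, B = 0.41, 5.0

def u_plus(y_plus):
    """Dimensionless mean velocity at dimensionless wall distance y+."""
    return math.log(y_plus) / KAPPA + B

# Velocity increases logarithmically with distance from the wall.
print(round(u_plus(100.0), 2))  # 16.23
```

Semiempirical closures of this kind supply the wall friction that the integral boundary-layer methods compared in the abstract require.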
Computational Psychiatry and the Challenge of Schizophrenia.
Krystal, John H; Murray, John D; Chekroud, Adam M; Corlett, Philip R; Yang, Genevieve; Wang, Xiao-Jing; Anticevic, Alan
2017-05-01
Schizophrenia research is plagued by enormous challenges in integrating and analyzing large datasets and difficulties developing formal theories related to the etiology, pathophysiology, and treatment of this disorder. Computational psychiatry provides a path to enhance analyses of these large and complex datasets and to promote the development and refinement of formal models for features of this disorder. This presentation introduces the reader to the notion of computational psychiatry and describes discovery-oriented and theory-driven applications to schizophrenia involving machine learning, reinforcement learning theory, and biophysically-informed neural circuit models. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center 2017.
Perspective: Ring-polymer instanton theory
NASA Astrophysics Data System (ADS)
Richardson, Jeremy O.
2018-05-01
Since the earliest explorations of quantum mechanics, it has been a topic of great interest that quantum tunneling allows particles to penetrate classically insurmountable barriers. Instanton theory provides a simple description of these processes in terms of dominant tunneling pathways. Using a ring-polymer discretization, an efficient computational method is obtained for applying this theory to compute reaction rates and tunneling splittings in molecular systems. Unlike other quantum-dynamics approaches, the method scales well with the number of degrees of freedom, and for many polyatomic systems, the method may provide the most accurate predictions which can be practically computed. Instanton theory thus has the capability to produce useful data for many fields of low-temperature chemistry including spectroscopy, atmospheric and astrochemistry, as well as surface science. There is however still room for improvement in the efficiency of the numerical algorithms, and new theories are under development for describing tunneling in nonadiabatic transitions.
NASA Technical Reports Server (NTRS)
Goldstein, M. L.
1976-01-01
The propagation of charged particles through interstellar and interplanetary space has often been described as a random process in which the particles are scattered by ambient electromagnetic turbulence. In general, this changes both the magnitude and direction of the particles' momentum. Some situations for which scattering in direction (pitch angle) is of primary interest were studied. A perturbed orbit, resonant scattering theory for pitch-angle diffusion in magnetostatic turbulence was slightly generalized and then utilized to compute the diffusion coefficient for spatial propagation parallel to the mean magnetic field, Kappa. All divergences inherent in the quasilinear formalism when the power spectrum of the fluctuation field falls off as K to the minus Q power (Q less than 2) were removed. Various methods of computing Kappa were compared and limits on the validity of the theory discussed. For Q less than 1 or 2, the various methods give roughly comparable values of Kappa, but use of perturbed orbits systematically results in a somewhat smaller Kappa than can be obtained from quasilinear theory.
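The parallel diffusion coefficient follows from the standard quasilinear quadrature over the pitch-angle diffusion coefficient D_μμ. A sketch using midpoint quadrature and a toy isotropic-scattering D_μμ (an assumed model form, not the perturbed-orbit result of the paper):

```python
# Hedged sketch of the standard quasilinear expression for the parallel
# spatial diffusion coefficient,
#     kappa_par = (v**2 / 8) * integral_{-1}^{+1} (1 - mu**2)**2 / D_mumu dmu,
# evaluated by midpoint quadrature. The D_mumu below (D0 * (1 - mu**2),
# isotropic scattering) is a toy model chosen so the integral is analytic.

def kappa_parallel(v, d_mumu, n=20000):
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        mu = -1.0 + (i + 0.5) * h  # midpoints avoid the zeros at mu = +-1
        total += (1.0 - mu * mu) ** 2 / d_mumu(mu) * h
    return v * v / 8.0 * total

D0 = 0.5
kappa = kappa_parallel(v=1.0, d_mumu=lambda mu: D0 * (1.0 - mu * mu))
# For this toy D_mumu the result is analytic: kappa = v**2 / (6 * D0).
print(abs(kappa - 1.0 / (6.0 * D0)) < 1e-6)  # True
```

The divergences discussed in the abstract arise when a more realistic D_μμ vanishes too fast near μ = 0, making this integrand blow up; the perturbed-orbit theory regularizes exactly that behavior.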
Jasim Mohammed, M; Ibrahim, Rabha W; Ahmad, M Z
2017-03-01
In this paper, we consider a low initial population model. Our aim is to study the periodicity of this model computationally by using neutral differential equations, which arise in various fields including biology. We generalize the neutral Rayleigh equation to third order by exploiting fractional calculus, in particular the Riemann-Liouville differential operator. We establish the existence and uniqueness of a periodic computational outcome; the technique depends on the continuation theorem of coincidence degree theory. An example is presented to demonstrate the finding.
Physics Computing '92: Proceedings of the 4th International Conference
NASA Astrophysics Data System (ADS)
de Groot, Robert A.; Nadrchal, Jaroslav
1993-04-01
The Table of Contents for the book is as follows: * Preface * INVITED PAPERS * Ab Initio Theoretical Approaches to the Structural, Electronic and Vibrational Properties of Small Clusters and Fullerenes: The State of the Art * Neural Multigrid Methods for Gauge Theories and Other Disordered Systems * Multicanonical Monte Carlo Simulations * On the Use of the Symbolic Language Maple in Physics and Chemistry: Several Examples * Nonequilibrium Phase Transitions in Catalysis and Population Models * Computer Algebra, Symmetry Analysis and Integrability of Nonlinear Evolution Equations * The Path-Integral Quantum Simulation of Hydrogen in Metals * Digital Optical Computing: A New Approach of Systolic Arrays Based on Coherence Modulation of Light and Integrated Optics Technology * Molecular Dynamics Simulations of Granular Materials * Numerical Implementation of a K.A.M. Algorithm * Quasi-Monte Carlo, Quasi-Random Numbers and Quasi-Error Estimates * What Can We Learn from QMC Simulations * Physics of Fluctuating Membranes * Plato, Apollonius, and Klein: Playing with Spheres * Steady States in Nonequilibrium Lattice Systems * CONVODE: A REDUCE Package for Differential Equations * Chaos in Coupled Rotators * Symplectic Numerical Methods for Hamiltonian Problems * Computer Simulations of Surfactant Self Assembly * High-dimensional and Very Large Cellular Automata for Immunological Shape Space * A Review of the Lattice Boltzmann Method * Electronic Structure of Solids in the Self-interaction Corrected Local-spin-density Approximation * Dedicated Computers for Lattice Gauge Theory Simulations * Physics Education: A Survey of Problems and Possible Solutions * Parallel Computing and Electronic-Structure Theory * High Precision Simulation Techniques for Lattice Field Theory * CONTRIBUTED PAPERS * Case Study of Microscale Hydrodynamics Using Molecular Dynamics and Lattice Gas Methods * Computer Modelling of the Structural and Electronic Properties of the Supported Metal Catalysis * 
Ordered Particle Simulations for Serial and MIMD Parallel Computers * "NOLP" -- Program Package for Laser Plasma Nonlinear Optics * Algorithms to Solve Nonlinear Least Square Problems * Distribution of Hydrogen Atoms in Pd-H Computed by Molecular Dynamics * A Ray Tracing of Optical System for Protein Crystallography Beamline at Storage Ring-SIBERIA-2 * Vibrational Properties of a Pseudobinary Linear Chain with Correlated Substitutional Disorder * Application of the Software Package Mathematica in Generalized Master Equation Method * Linelist: An Interactive Program for Analysing Beam-foil Spectra * GROMACS: A Parallel Computer for Molecular Dynamics Simulations * GROMACS Method of Virial Calculation Using a Single Sum * The Interactive Program for the Solution of the Laplace Equation with the Elimination of Singularities for Boundary Functions * Random-Number Generators: Testing Procedures and Comparison of RNG Algorithms * Micro-TOPIC: A Tokamak Plasma Impurities Code * Rotational Molecular Scattering Calculations * Orthonormal Polynomial Method for Calibrating of Cryogenic Temperature Sensors * Frame-based System Representing Basis of Physics * The Role of Massively Data-parallel Computers in Large Scale Molecular Dynamics Simulations * Short-range Molecular Dynamics on a Network of Processors and Workstations * An Algorithm for Higher-order Perturbation Theory in Radiative Transfer Computations * Hydrostochastics: The Master Equation Formulation of Fluid Dynamics * HPP Lattice Gas on Transputers and Networked Workstations * Study on the Hysteresis Cycle Simulation Using Modeling with Different Functions on Intervals * Refined Pruning Techniques for Feed-forward Neural Networks * Random Walk Simulation of the Motion of Transient Charges in Photoconductors * The Optical Hysteresis in Hydrogenated Amorphous Silicon * Diffusion Monte Carlo Analysis of Modern Interatomic Potentials for He * A Parallel Strategy for Molecular Dynamics Simulations of Polar Liquids on 
Transputer Arrays * Distribution of Ions Reflected on Rough Surfaces * The Study of Step Density Distribution During Molecular Beam Epitaxy Growth: Monte Carlo Computer Simulation * Towards a Formal Approach to the Construction of Large-scale Scientific Applications Software * Correlated Random Walk and Discrete Modelling of Propagation through Inhomogeneous Media * Teaching Plasma Physics Simulation * A Theoretical Determination of the Au-Ni Phase Diagram * Boson and Fermion Kinetics in One-dimensional Lattices * Computational Physics Course on the Technical University * Symbolic Computations in Simulation Code Development and Femtosecond-pulse Laser-plasma Interaction Studies * Computer Algebra and Integrated Computing Systems in Education of Physical Sciences * Coordinated System of Programs for Undergraduate Physics Instruction * Program Package MIRIAM and Atomic Physics of Extreme Systems * High Energy Physics Simulation on the T_Node * The Chapman-Kolmogorov Equation as Representation of Huygens' Principle and the Monolithic Self-consistent Numerical Modelling of Lasers * Authoring System for Simulation Developments * Molecular Dynamics Study of Ion Charge Effects in the Structure of Ionic Crystals * A Computational Physics Introductory Course * Computer Calculation of Substrate Temperature Field in MBE System * Multimagnetical Simulation of the Ising Model in Two and Three Dimensions * Failure of the CTRW Treatment of the Quasicoherent Excitation Transfer * Implementation of a Parallel Conjugate Gradient Method for Simulation of Elastic Light Scattering * Algorithms for Study of Thin Film Growth * Algorithms and Programs for Physics Teaching in Romanian Technical Universities * Multicanonical Simulation of 1st order Transitions: Interface Tension of the 2D 7-State Potts Model * Two Numerical Methods for the Calculation of Periodic Orbits in Hamiltonian Systems * Chaotic Behavior in a Probabilistic Cellular Automata? 
* Wave Optics Computing by a Networked-based Vector Wave Automaton * Tensor Manipulation Package in REDUCE * Propagation of Electromagnetic Pulses in Stratified Media * The Simple Molecular Dynamics Model for the Study of Thermalization of the Hot Nucleon Gas * Electron Spin Polarization in PdCo Alloys Calculated by KKR-CPA-LSD Method * Simulation Studies of Microscopic Droplet Spreading * A Vectorizable Algorithm for the Multicolor Successive Overrelaxation Method * Tetragonality of the CuAu I Lattice and Its Relation to Electronic Specific Heat and Spin Susceptibility * Computer Simulation of the Formation of Metallic Aggregates Produced by Chemical Reactions in Aqueous Solution * Scaling in Growth Models with Diffusion: A Monte Carlo Study * The Nucleus as the Mesoscopic System * Neural Network Computation as Dynamic System Simulation * First-principles Theory of Surface Segregation in Binary Alloys * Data Smooth Approximation Algorithm for Estimating the Temperature Dependence of the Ice Nucleation Rate * Genetic Algorithms in Optical Design * Application of 2D-FFT in the Study of Molecular Exchange Processes by NMR * Advanced Mobility Model for Electron Transport in P-Si Inversion Layers * Computer Simulation for Film Surfaces and its Fractal Dimension * Parallel Computation Techniques and the Structure of Catalyst Surfaces * Educational SW to Teach Digital Electronics and the Corresponding Text Book * Primitive Trinomials (Mod 2) Whose Degree is a Mersenne Exponent * Stochastic Modelisation and Parallel Computing * Remarks on the Hybrid Monte Carlo Algorithm for the φ4 Model * An Experimental Computer Assisted Workbench for Physics Teaching * A Fully Implicit Code to Model Tokamak Plasma Edge Transport * EXPFIT: An Interactive Program for Automatic Beam-foil Decay Curve Analysis * Mapping Technique for Solving General, 1-D Hamiltonian Systems * Freeway Traffic, Cellular Automata, and Some (Self-Organizing) Criticality * Photonuclear Yield Analysis by Dynamic 
Programming * Incremental Representation of the Simply Connected Planar Curves * Self-convergence in Monte Carlo Methods * Adaptive Mesh Technique for Shock Wave Propagation * Simulation of Supersonic Coronal Streams and Their Interaction with the Solar Wind * The Nature of Chaos in Two Systems of Ordinary Nonlinear Differential Equations * Considerations of a Window-shopper * Interpretation of Data Obtained by RTP 4-Channel Pulsed Radar Reflectometer Using a Multi Layer Perceptron * Statistics of Lattice Bosons for Finite Systems * Fractal Based Image Compression with Affine Transformations * Algorithmic Studies on Simulation Codes for Heavy-ion Reactions * An Energy-Wise Computer Simulation of DNA-Ion-Water Interactions Explains the Abnormal Structure of Poly[d(A)]:Poly[d(T)] * Computer Simulation Study of Kosterlitz-Thouless-Like Transitions * Problem-oriented Software Package GUN-EBT for Computer Simulation of Beam Formation and Transport in Technological Electron-Optical Systems * Parallelization of a Boundary Value Solver and its Application in Nonlinear Dynamics * The Symbolic Classification of Real Four-dimensional Lie Algebras * Short, Singular Pulses Generation by a Dye Laser at Two Wavelengths Simultaneously * Quantum Monte Carlo Simulations of the Apex-Oxygen-Model * Approximation Procedures for the Axial Symmetric Static Einstein-Maxwell-Higgs Theory * Crystallization on a Sphere: Parallel Simulation on a Transputer Network * FAMULUS: A Software Product (also) for Physics Education * MathCAD vs. 
FAMULUS -- A Brief Comparison * First-principles Dynamics Used to Study Dissociative Chemisorption * A Computer Controlled System for Crystal Growth from Melt * A Time Resolved Spectroscopic Method for Short Pulsed Particle Emission * Green's Function Computation in Radiative Transfer Theory * Random Search Optimization Technique for One-criteria and Multi-criteria Problems * Hartley Transform Applications to Thermal Drift Elimination in Scanning Tunneling Microscopy * Algorithms of Measuring, Processing and Interpretation of Experimental Data Obtained with Scanning Tunneling Microscope * Time-dependent Atom-surface Interactions * Local and Global Minima on Molecular Potential Energy Surfaces: An Example of N3 Radical * Computation of Bifurcation Surfaces * Symbolic Computations in Quantum Mechanics: Energies in Next-to-solvable Systems * A Tool for RTP Reactor and Lamp Field Design * Modelling of Particle Spectra for the Analysis of Solid State Surface * List of Participants
Computation of magnetic suspension of maglev systems using dynamic circuit theory
NASA Technical Reports Server (NTRS)
He, J. L.; Rote, D. M.; Coffey, H. T.
1992-01-01
Dynamic circuit theory is applied to several magnetic suspensions associated with maglev systems. These suspension systems are the loop-shaped coil guideway, the figure-eight-shaped null-flux coil guideway, and the continuous sheet guideway. Mathematical models, which can be used for the development of computer codes, are provided for each of these suspension systems. The differences and similarities of the models in using dynamic circuit theory are discussed in the paper. The paper emphasizes the transient and dynamic analysis and computer simulation of maglev systems. In general, the method discussed here can be applied to many electrodynamic suspension system design concepts. It is also suited for the computation of the performance of maglev propulsion systems. Numerical examples are presented in the paper.
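The dynamic-circuit picture reduces each guideway coil to a lumped R-L loop driven by the changing mutual inductance with the moving vehicle magnet. A sketch with an assumed Gaussian coupling profile and illustrative parameters (not a model of any specific guideway from the paper):

```python
import math

# Hedged sketch of the dynamic-circuit idea: a single guideway loop obeys
#     L * dI/dt + R * I = -I_m * (dM/dx) * v,
# where M(x) is the mutual inductance between the loop and the vehicle
# magnet carrying current I_m moving at speed v. The Gaussian M(x) and all
# parameter values are illustrative assumptions.

def simulate_loop(v, I_m, L, R, x0=-0.5, x1=0.5, dt=1e-5):
    """Forward-Euler integration of the induced loop current as the magnet
    passes; returns the peak current magnitude."""
    M0, w = 1e-4, 0.1  # assumed peak mutual inductance [H], coupling width [m]

    def dM_dx(x):
        return M0 * (-2.0 * x / w ** 2) * math.exp(-((x / w) ** 2))

    x, current, peak = x0, 0.0, 0.0
    while x < x1:
        emf = -I_m * dM_dx(x) * v
        current += dt * (emf - R * current) / L
        peak = max(peak, abs(current))
        x += v * dt
    return peak

# A faster vehicle produces a faster flux change and a larger induced current.
print(simulate_loop(v=50.0, I_m=5e5, L=1e-3, R=1.0) >
      simulate_loop(v=10.0, I_m=5e5, L=1e-3, R=1.0))  # True
```

The suspension systems in the paper differ mainly in the circuit topology (loop, null-flux figure-eight, continuous sheet), i.e., in how many such coupled equations are solved and how M(x) is shaped.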
Quantum equivalence of f (R) gravity and scalar-tensor theories in the Jordan and Einstein frames
NASA Astrophysics Data System (ADS)
Ohta, Nobuyoshi
2018-03-01
The f(R) gravity and scalar-tensor theory are known to be equivalent at the classical level. We study if this equivalence is valid at the quantum level. There are two descriptions of the scalar-tensor theory in the Jordan and Einstein frames. It is shown that these three formulations of the theories give the same determinant or effective action on shell, and thus they are equivalent at the quantum one-loop level on shell in arbitrary dimensions. We also compute the one-loop divergence in f(R) gravity on an Einstein space.
Theory for the solvation of nonpolar solutes in water
NASA Astrophysics Data System (ADS)
Urbic, T.; Vlachy, V.; Kalyuzhnyi, Yu. V.; Dill, K. A.
2007-11-01
We recently developed an angle-dependent Wertheim integral equation theory (IET) of the Mercedes-Benz (MB) model of pure water [Silverstein et al., J. Am. Chem. Soc. 120, 3166 (1998)]. Our approach treats explicitly the coupled orientational constraints within water molecules. The analytical theory offers the advantage of being less computationally expensive than Monte Carlo simulations by two orders of magnitude. Here we apply the angle-dependent IET to studying the hydrophobic effect, the transfer of a nonpolar solute into MB water. We find that the theory reproduces the Monte Carlo results qualitatively for cold water and quantitatively for hot water.
Fuzzy logic of Aristotelian forms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perlovsky, L.I.
1996-12-31
Model-based approaches to pattern recognition and machine vision have been proposed to overcome the exorbitant training requirements of earlier computational paradigms. However, uncertainties in data were found to lead to a combinatorial explosion of the computational complexity. This issue is related here to the roles of a priori knowledge vs. adaptive learning. What is the a priori knowledge representation that supports learning? I introduce Modeling Field Theory (MFT), a model-based neural network whose adaptive learning is based on a priori models. These models combine deterministic, fuzzy, and statistical aspects to account for a priori knowledge, its fuzzy nature, and data uncertainties. In the process of learning, a priori fuzzy concepts converge to crisp or probabilistic concepts. The MFT is a convergent dynamical system of only linear computational complexity. Fuzzy logic turns out to be essential for reducing the combinatorial complexity to a linear one. I will discuss the relationship of the new computational paradigm to two theories due to Aristotle: the theory of Forms and logic. While the theory of Forms argued that the mind cannot be based on ready-made a priori concepts, Aristotelian logic operated with just such concepts. I discuss an interpretation of MFT suggesting that its fuzzy logic, combining a priori knowledge and adaptivity, implements the Aristotelian theory of Forms (theory of mind). Thus, 2300 years after Aristotle, a logic is developed suitable for his theory of mind.
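The fuzzy-to-crisp convergence can be illustrated with a toy annealed soft-assignment dynamic: memberships start fuzzy, prototype models adapt to the weighted data, and shrinking the fuzziness drives the memberships crisp. This is only a caricature of the idea, not Perlovsky's MFT equations:

```python
import math

# Hedged toy sketch: data points are softly associated with two scalar
# prototype "concept-models"; prototypes adapt, and the fuzziness sigma is
# annealed down so memberships converge toward crisp ones. All data and
# parameters are invented for illustration.

def mft_step(data, protos, sigma):
    weights = []
    for x in data:
        # Fuzzy association weights: normalized Gaussian similarities.
        sims = [math.exp(-((x - p) ** 2) / (2.0 * sigma ** 2)) for p in protos]
        s = sum(sims)
        weights.append([w / s for w in sims])
    new_protos = []
    for k in range(len(protos)):
        # Each prototype moves to the weighted mean of its associated data.
        num = sum(w[k] * x for w, x in zip(weights, data))
        den = sum(w[k] for w in weights)
        new_protos.append(num / den)
    return new_protos, weights

data = [0.0, 0.2, 0.1, 5.0, 5.2, 4.9]
protos, sigma = [1.0, 4.0], 4.0
for _ in range(30):
    protos, weights = mft_step(data, protos, sigma)
    sigma = max(0.3, 0.8 * sigma)  # anneal fuzziness toward crispness

# After annealing, memberships are essentially crisp (near 0 or 1).
print(all(max(w) > 0.99 for w in weights))  # True
```

The soft-then-crisp schedule is what avoids the combinatorial search over all data-to-concept assignments: each iteration is linear in the number of data points and models.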
Correlation tracking study for meter-class solar telescope on space shuttle. [solar granulation
NASA Technical Reports Server (NTRS)
Smithson, R. C.; Tarbell, T. D.
1977-01-01
The theory and expected performance level of correlation trackers used to control the pointing of a solar telescope in space, using white-light granulation as a target, were studied. Three specific trackers were modeled and their performance levels predicted for telescopes of various apertures. The performance of the computer-modeled trackers on computer-enhanced granulation photographs was evaluated. Parametric equations for predicting tracker performance are presented.
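The correlation-tracking idea in the abstract above, locating the displacement of a live frame relative to a reference frame at the peak of their cross-correlation, can be sketched with synthetic data (the image size, shift, and random "granulation" field are invented for the sketch):

```python
import numpy as np

# Toy correlation tracker: estimate the (dy, dx) shift between a reference
# frame and a displaced frame via FFT-based cross-correlation.
rng = np.random.default_rng(1)
ref = rng.standard_normal((64, 64))        # stand-in for a granulation image
live = np.roll(ref, (3, -5), axis=(0, 1))  # "live" frame shifted by (3, -5)

# Cross-correlation theorem: corr = IFFT( conj(FFT(ref)) * FFT(live) )
corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(live)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)

# Unwrap the circular peak position to signed shifts in (-32, 32]:
shift = [d if d <= 32 else d - 64 for d in (dy, dx)]
```

A real tracker would interpolate around the correlation peak for sub-pixel pointing error, but the peak location alone already recovers the integer shift.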
ERIC Educational Resources Information Center
Mermillod, Martial; Bonin, Patrick; Meot, Alain; Ferrand, Ludovic; Paindavoine, Michel
2012-01-01
According to the age-of-acquisition hypothesis, words acquired early in life are processed faster and more accurately than words acquired later. Connectionist models have begun to explore the influence of the age/order of acquisition of items (and also their frequency of encounter). This study attempts to reconcile two different methodological and…
Impact of Media Richness and Flow on E-Learning Technology Acceptance
ERIC Educational Resources Information Center
Liu, Su-Houn; Liao, Hsiu-Li; Pratt, Jean A.
2009-01-01
Advances in e-learning technologies parallel a general increase in sophistication by computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of…
ERIC Educational Resources Information Center
Garg, Deepti; Garg, Ajay K.
2007-01-01
This study applied the Theory of Reasoned Action and the Technology Acceptance Model to measure outcomes of general education courses (GECs) under the University of Botswana Computer and Information Skills (CIS) program. An exploratory model was validated for responses from 298 students. The results suggest that resources currently committed to…
Error Monitoring in Speech Production: A Computational Test of the Perceptual Loop Theory.
ERIC Educational Resources Information Center
Hartsuiker, Robert J.; Kolk, Herman H. J.
2001-01-01
Tested whether an elaborated version of the perceptual loop theory (W. Levelt, 1983) and the main interruption rule was consistent with existing time course data (E. Blackmer and E. Mitton, 1991; C. Oomen and A. Postma, in press). The study suggests that including an inner loop through the speech comprehension system generates predictions that fit…
ERIC Educational Resources Information Center
Sorebo, Oystein; Haehre, Reidar
2012-01-01
The purpose of this study is to explain students' perceived relevance of playing an educational game as a means for development of discipline competence. Based on self-determination theory and the concept of personal interest, we propose that: Satisfying students' basic needs for competence, autonomy, and relatedness when playing educational games…
Using Activity Theory to Understand Intergenerational Play: The Case of Family Quest
ERIC Educational Resources Information Center
Siyahhan, Sinem; Barab, Sasha A.; Downton, Michael P.
2010-01-01
We implemented a five-week family program called "Family Quest" where parents and children ages 9 to 13 played Quest Atlantis, a multiuser 3D educational computer game, at a local after-school club for 90-minute sessions. We used activity theory as a conceptual and an analytical framework to study the nature of intergenerational play, the…
Influence matrix program for aerodynamic lifting surface theory. [in subsonic flows
NASA Technical Reports Server (NTRS)
Medan, R. T.; Ray, K. S.
1973-01-01
A user's manual is presented for a USA FORTRAN 4 computer program which computes an aerodynamic influence matrix and is one of several computer programs used to analyze lifting, thin wings in steady, subsonic flow according to a kernel-function-method lifting surface theory. The most significant features of the program are that it can treat unsymmetrical wings, control points can be placed on the leading and/or trailing edges, and a stable, efficient algorithm is used to compute the influence matrix.
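The influence-matrix idea in the record above can be illustrated, in much simplified form, with classical Prandtl lifting-line collocation rather than the kernel-function method itself: an influence matrix relates circulation coefficients to the flow-tangency condition at control points, and a linear solve yields the loading. The wing geometry and discretization below are invented for the sketch:

```python
import numpy as np

# Prandtl lifting-line collocation for a rectangular wing.
# Assumed illustrative numbers: aspect ratio 6, alpha = 5 deg, a0 = 2*pi.
AR = 6.0
alpha = np.radians(5.0)
a0 = 2 * np.pi

N = 10                                       # number of odd Fourier modes
theta = np.pi / 2 * np.arange(1, N + 1) / N  # collocation stations, half-span
ns = 2 * np.arange(N) + 1                    # odd harmonics 1, 3, 5, ...

# Influence matrix: sum_n A_n sin(n*theta) * (4b/(a0*c) + n/sin(theta)) = alpha,
# with b/c = AR for a rectangular wing.
M = np.zeros((N, N))
for j, t in enumerate(theta):
    for k, n in enumerate(ns):
        M[j, k] = np.sin(n * t) * (4 * AR / a0 + n / np.sin(t))

A = np.linalg.solve(M, np.full(N, alpha))    # circulation coefficients
CL = np.pi * AR * A[0]                       # lift coefficient from A_1
```

The kernel-function program in the abstract plays the same structural role (build an influence matrix, then solve), but for full lifting-surface theory with a far more elaborate kernel.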
DOE Office of Scientific and Technical Information (OSTI.GOV)
Semrau, P.
The purpose of this study was to analyze selected cognitive theories in the areas of artificial intelligence (A.I.) and psychology to determine the role of emotions in the cognitive or intellectual processes. Understanding the relationship of emotions to processes of intelligence has implications for constructing theories of aesthetic response and A.I. systems in art. Psychological theories were examined that demonstrated the changing nature of the research in emotion related to cognition. The basic techniques in A.I. were reviewed and the A.I. research was analyzed to determine the process of cognition and the role of emotion. The A.I. research emphasized the digital, quantifiable character of the computer and associated cognitive models and programs. In conclusion, the cognitive-emotive research in psychology and the cognitive research in A.I. emphasized quantification methods over analog and qualitative characteristics required for a holistic explanation of cognition. Further A.I. research needs to examine the qualitative aspects of values, attitudes, and beliefs in influencing the creative thinking processes. Inclusion of research related to qualitative problem solving in art provides a more comprehensive base of study for examining the area of intelligence in computers.
Computational predictions of zinc oxide hollow structures
NASA Astrophysics Data System (ADS)
Tuoc, Vu Ngoc; Huan, Tran Doan; Thao, Nguyen Thi
2018-03-01
Nanoporous materials are emerging as potential candidates for a wide range of technological applications in the environment, electronics, and optoelectronics, to name just a few. Within this active research area, experimental work predominates, while theoretical/computational prediction and study of these materials face some intrinsic challenges, one of which is how to predict porous structures. We propose a computationally and technically feasible approach for predicting zinc oxide structures with hollows at the nanoscale. The designed zinc oxide hollow structures are studied with computations using the density functional tight binding and conventional density functional theory methods, revealing a variety of promising mechanical and electronic properties that may find realistic applications in the future.
NASA Technical Reports Server (NTRS)
Barger, R. L.
1980-01-01
A general procedure for computing the region of influence of a maneuvering vehicle is described. Basic differential geometric relations, including the use of a general trajectory parameter and the introduction of auxiliary variables in the envelope theory are presented. To illustrate the application of the method, the destruct region for a maneuvering fighter firing missiles is computed.
Towards a theory of automated elliptic mesh generation
NASA Technical Reports Server (NTRS)
Cordova, J. Q.
1992-01-01
The theory of elliptic mesh generation is reviewed and the fundamental problem of constructing computational space is discussed. It is argued that the construction of computational space is an NP-complete problem and therefore requires a nonstandard approach for its solution. This leads to the development of graph-theoretic, combinatorial optimization, and integer programming algorithms. Methods for the construction of two-dimensional computational space are presented.
Effect of Computer-Aided Instruction on Attitude and Achievement of Fifth Grade Math Students
ERIC Educational Resources Information Center
Shoemaker, Traci L.
2013-01-01
The purpose of this quasi-experimental non-equivalent control group study was to test theories of constructivism and motivation, along with research-based teaching practices of differentiating instruction and instructing within a child's Zone of Proximal Development, in measuring the effect of computer-aided instruction on fifth grade students'…
ERIC Educational Resources Information Center
Kim, Jieun; Ryu, Hokyoung; Katuk, Norliza; Wang, Ruili; Choi, Gyunghyun
2014-01-01
The present study aims to show if a skill-challenge balancing (SCB) instruction strategy can assist learners to motivationally engage in computer-based learning. Csikszentmihalyi's flow theory (self-control, curiosity, focus of attention, and intrinsic interest) was applied to an account of the optimal learning experience in SCB-based learning…
Balancing Act: The Struggle between Orality and Linearity in Computer-Mediated Communication.
ERIC Educational Resources Information Center
Metz, J Michel
1996-01-01
Asks whether computer-mediated communication allows creation of isolated virtual communities for its interlocutors or brings closer the ideal of a global community. Examines the question from the standpoint of R. Cathcart and G. Gumpert's medium theory. Uses data derived from an ethnographic study to illustrate how a new approach is required. (PA)
Games as Artistic Medium: Interfacing Complexity Theory in Game-Based Art Pedagogy
ERIC Educational Resources Information Center
Patton, Ryan Matthew
2011-01-01
Having computer skills, let alone access to a personal computer, has become a necessary component of contemporary Western society and many parts of the world. Digital media literacy involves youth being able to view, participate in, and make creative works with technologies in personal and meaningful ways. Games, defined in this study as…
ERIC Educational Resources Information Center
Suppes, P.; And Others
From some simple and schematic assumptions about information processing, a stochastic differential equation is derived for the motion of a student through a computer-assisted elementary mathematics curriculum. The mathematics strands curriculum of the Institute for Mathematical Studies in the Social Sciences is used to test: (1) the theory and (2)…
The Impact of the Digital Divide on First-Year Community College Students
ERIC Educational Resources Information Center
Mansfield, Malinda
2017-01-01
Some students do not possess the learning management system (LMS) and basic computer skills needed for success in first-year experience (FYE) courses. The purpose of this quantitative study, based on the Integrative Learning Design Framework and theory of transactional distance, was to identify what basic computer skills and LMS skills are needed…
A Computational Experiment of the Endo versus Exo Preference in a Diels-Alder Reaction
ERIC Educational Resources Information Center
Rowley, Christopher N.; Woo, Tom K.
2009-01-01
We have developed and tested a computational laboratory that investigates an endo versus exo Diels-Alder cycloaddition. This laboratory employed density functional theory (DFT) calculations to study the cycloaddition of N-phenylmaleimide to furan. The endo and exo stereoisomers of the product were distinguished by building the two isomers in a…
Relativity in a Rock Field: A Study of Physics Learning with a Computer Game
ERIC Educational Resources Information Center
Carr, David; Bossomaier, Terry
2011-01-01
The "Theory of Special Relativity" is widely regarded as a difficult topic for learners in physics to grasp, as it reformulates fundamental conceptions of space, time and motion, and predominantly deals with situations outside of everyday experience. In this paper, we describe embedding the physics of relativity into a computer game, and…
ERIC Educational Resources Information Center
Nguyen, ThuyUyen H.; Charity, Ian; Robson, Andrew
2016-01-01
This study investigates students' perceptions of computer-based learning environments, their attitude towards business statistics, and their academic achievement in higher education. Guided by learning environments concepts and attitudinal theory, a theoretical model was proposed with two instruments, one for measuring the learning environment and…
The orbifolder: A tool to study the low-energy effective theory of heterotic orbifolds
NASA Astrophysics Data System (ADS)
Nilles, H. P.; Ramos-Sánchez, S.; Vaudrevange, P. K. S.; Wingerter, A.
2012-06-01
The orbifolder is a program developed in C++ that computes and analyzes the low-energy effective theory of heterotic orbifold compactifications. The program includes routines to compute the massless spectrum, to identify the allowed couplings in the superpotential, to automatically generate large sets of orbifold models, to identify phenomenologically interesting models (e.g. MSSM-like models) and to analyze their vacuum configurations. Program summary: Program title: orbifolder. Catalogue identifier: AELR_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELR_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License version 3. No. of lines in distributed program, including test data, etc.: 145 572. No. of bytes in distributed program, including test data, etc.: 930 517. Distribution format: tar.gz. Programming language: C++. Computer: Personal computer. Operating system: Tested on Linux (Fedora 15, Ubuntu 11, SuSE 11). Word size: 32 bits or 64 bits. Classification: 11.1. External routines: Boost (http://www.boost.org/), GSL (http://www.gnu.org/software/gsl/). Nature of problem: Calculating the low-energy spectrum of heterotic orbifold compactifications. Solution method: Quadratic equations on a lattice; representation theory; polynomial algebra. Running time: Less than a second per model.
COMPUTATIONAL ELECTROCHEMISTRY: AQUEOUS ONE-ELECTRON OXIDATION POTENTIALS FOR SUBSTITUTED ANILINES
Semiempirical molecular orbital theory and density functional theory are used to compute one-electron oxidation potentials for aniline and a set of 21 mono- and di-substituted anilines in aqueous solution. Linear relationships between theoretical predictions and experiment are co...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Błaziak, Kacper; Panek, Jarosław J.; Jezierska, Aneta, E-mail: aneta.jezierska@chem.uni.wroc.pl
2015-07-21
Quinoline derivatives are interesting objects to study internal reorganizations due to the observed excited-state-induced intramolecular proton transfer (ESIPT). Here, we report on computations for selected 12 quinoline derivatives possessing three kinds of intramolecular hydrogen bonds. Density functional theory was employed for the current investigations. The metric and electronic structure simulations were performed for the ground state and first excited singlet and triplet states. The computed potential energy profiles do not show a spontaneous proton transfer in the ground state, whereas excited states exhibit this phenomenon. Atoms in Molecules (AIM) theory was applied to study the nature of hydrogen bonding, whereas the Harmonic Oscillator Model of aromaticity index (HOMA) provided data on aromaticity evolution as a function of the bridge proton position. The AIM-based topological analysis confirmed the presence of the intramolecular hydrogen bonding. In addition, using the theory, we were able to provide a quantitative illustration of bonding transformation: from covalent bonding to hydrogen bonding. On the basis of HOMA analysis, we showed that the aromaticity of both rings is dependent on the location of the bridge proton. Further, the computed results were compared with available experimental data. Finally, ESIPT occurrence was compared for the three investigated kinds of hydrogen bridges, and competition between two bridges in one molecule was studied.
Mind the gap: an attempt to bridge computational and neuroscientific approaches to study creativity
Wiggins, Geraint A.; Bhattacharya, Joydeep
2014-01-01
Creativity is the hallmark of human cognition and is behind every innovation, scientific discovery, piece of music, artwork, and idea that have shaped our lives, from ancient times till today. Yet scientific understanding of creative processes is quite limited, mostly due to the traditional belief that considers creativity as a mysterious puzzle, a paradox, defying empirical enquiry. Recently, there has been an increasing interest in revealing the neural correlates of human creativity. Though many of these studies, pioneering in nature, help demystify creativity, the field is still dominated by popular beliefs in associating creativity with “right brain thinking”, “divergent thinking”, “altered states” and so on (Dietrich and Kanso, 2010). In this article, we discuss a computational framework for creativity based on Baars’ Global Workspace Theory (GWT; Baars, 1988) enhanced with mechanisms based on information theory. Next we propose a neurocognitive architecture of creativity with a strong focus on various facets (i.e., unconscious thought theory, mind wandering, spontaneous brain states) of un/pre-conscious brain responses. Our principal argument is that pre-conscious creativity happens prior to conscious creativity and the proposed computational model may provide a mechanism by which this transition is managed. This integrative approach, albeit unconventional, will hopefully stimulate future neuroscientific studies of the inscrutable phenomenon of creativity. PMID:25104930
NASA Astrophysics Data System (ADS)
Rodriguez, Sarah L.; Lehman, Kathleen
2017-10-01
This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.
NASA Astrophysics Data System (ADS)
Bonezzi, Roberto; Boulanger, Nicolas; De Filippi, David; Sundell, Per
2017-11-01
We first prove that, in Vasiliev’s theory, the zero-form charges studied in Sezgin E and Sundell P 2011 (arXiv:1103.2360 [hep-th]) and Colombo N and Sundell P 2012 (arXiv:1208.3880 [hep-th]) are twisted open Wilson lines in the noncommutative Z space. This is shown by mapping Vasiliev’s higher-spin model onto noncommutative Yang-Mills theory. We then prove that, prior to Bose-symmetrising, the cyclically-symmetric higher-spin invariants given by the leading order of these n-point zero-form charges are equal to corresponding cyclically-invariant building blocks of n-point correlation functions of bilinear operators in free conformal field theories (CFT) in three dimensions. On the higher spin gravity side, our computation reproduces the results of Didenko V and Skvortsov E 2013 J. High Energy Phys. JHEP04(2013)158 using an alternative method amenable to the computation of subleading corrections obtained by perturbation theory in normal order. On the free CFT side, our proof involves the explicit computation of the separate cyclic building blocks of the correlation functions of n conserved currents in arbitrary dimension d>2 using polarization vectors, which is an original result. It is shown to agree, for d=3, with the results obtained in Gelfond O A and Vasiliev M A 2013 Nucl. Phys. B 876 871-917 in various dimensions and where polarization spinors were used.
Visualization of x-ray computer tomography using computer-generated holography
NASA Astrophysics Data System (ADS)
Daibo, Masahiro; Tayama, Norio
1998-09-01
A theory for converting x-ray projection data directly into a hologram, obtained by combining computed tomography (CT) with the computer-generated hologram (CGH), is proposed. The purpose of this study is to offer a theory for realizing an all-electronic, high-speed, see-through 3D visualization system for application to medical diagnosis and non-destructive testing. First, the CT is expressed using the pseudo-inverse matrix, which is obtained by the singular value decomposition. The CGH is expressed in matrix form. Next, the 'projection to hologram conversion' (PTHC) matrix is calculated by multiplying the phase matrix of the CGH with the pseudo-inverse matrix of the CT. Finally, the projection vector is converted directly to the hologram vector by multiplying the PTHC matrix with the projection vector. By incorporating holographic analog computation into CT reconstruction, the amount of calculation is drastically reduced. We demonstrate a CT cross section reconstructed in 3D space by a He-Ne laser from real x-ray projection data acquired with x-ray television equipment, using our direct conversion technique.
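The matrix pipeline described in this abstract can be sketched with toy linear algebra (all matrix sizes and contents are invented; a real CGH phase matrix and CT projection matrix would replace the random stand-ins):

```python
import numpy as np

# Toy projection-to-hologram conversion (PTHC), per the abstract's recipe:
# hologram = H @ pinv(A) @ projections, with no explicit CT reconstruction step.
rng = np.random.default_rng(0)
A = rng.standard_normal((24, 16))   # stand-in CT projection matrix (24 rays, 16 px)
H = rng.standard_normal((32, 16))   # stand-in CGH phase matrix (32 hologram samples)

A_pinv = np.linalg.pinv(A)          # pseudo-inverse via singular value decomposition
PTHC = H @ A_pinv                   # precomputed projection-to-hologram matrix

x = rng.standard_normal(16)         # unknown cross-section
p = A @ x                           # measured projection data
hologram = PTHC @ p                 # direct conversion in one matrix-vector product
```

Precomputing `PTHC` is what makes the conversion a single matrix-vector product per frame; the SVD cost is paid once, offline.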
Role of Statistical Random-Effects Linear Models in Personalized Medicine.
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-03-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random-effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; in the computation of the minimum number of blood samples from a patient that are necessary for calculating an optimal individualized drug dosage in therapeutic drug monitoring; in measuring the clinical importance of clinical, demographic, environmental or genetic covariates; in the study of drug-drug interactions in clinical settings; in the implementation of computational tools for web-site-based evidence farming; in the design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization.
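The empirical Bayesian step mentioned in this record, estimating an individual patient's parameter by shrinking the patient's own mean toward the population mean, has a closed form in the simplest normal random-effects model. All numbers below are invented for illustration; real applications use full mixed-model fits:

```python
import numpy as np

# Normal-normal shrinkage: patient-level parameter theta_i ~ N(mu, tau2),
# observations y_ij ~ N(theta_i, sigma2). Posterior mean of theta_i is a
# precision-weighted blend of the patient mean and the population mean.
mu, tau2 = 10.0, 4.0        # population mean and between-patient variance
sigma2 = 9.0                # within-patient (residual) variance
y = np.array([14.0, 13.0, 15.0])   # one patient's observed values
n = len(y)

w = (n / sigma2) / (n / sigma2 + 1 / tau2)   # weight on the patient's own data
theta_hat = w * y.mean() + (1 - w) * mu      # empirical Bayes estimate
```

With few samples (small n) the estimate stays close to the population mean; as n grows, w approaches 1 and the estimate trusts the individual's data, which is why such models can describe "populations as a whole and patients as individuals" at once.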
Generalized recursion relations for correlators in the gauge-gravity correspondence.
Raju, Suvrat
2011-03-04
We show that a generalization of the Britto-Cachazo-Feng-Witten recursion relations gives a new and efficient method of computing correlation functions of the stress tensor or conserved currents in conformal field theories with a (d+1)-dimensional anti-de Sitter space dual, for d≥4, in the limit where the bulk theory is approximated by tree-level Yang-Mills theory or gravity. In supersymmetric theories, additional correlators of operators that live in the same multiplet as a conserved current or stress tensor can be computed by these means.
Quantum Information Theory - an Invitation
NASA Astrophysics Data System (ADS)
Werner, Reinhard F.
Quantum information and quantum computers have received a lot of public attention recently. Quantum computers have been advertised as a kind of warp drive for computing, and indeed the promise of the algorithms of Shor and Grover is to perform computations which are extremely hard or even provably impossible on any merely ``classical'' computer. In this article I give an account of the basic concepts of quantum information theory, staying as much as possible in the area of general agreement. The article is divided into two parts. The first (up to the end of Sect. 2.5) is mostly in plain English, centered around the exploration of what can or cannot be done with quantum systems as information carriers. The second part, Sect. 2.6, then gives a description of the mathematical structures and of some of the tools needed to develop the theory.
Use of a database for managing qualitative research data.
Ross, B A
1994-01-01
In this article, a process for handling text data in qualitative research projects by using existing word-processing and database programs is described. When qualitative data are managed using this method, the information is more readily available and the coding and organization of the data are enhanced. Furthermore, the narrative always remains intact regardless of how it is arranged or re-arranged, and there is a concomitant time savings and increased accuracy. The author hopes that this article will inspire some readers to explore additional methods and processes for computer-aided, nonstatistical data management. The study referred to in this article (Ross, 1991) was a qualitative research project which sought to find out how teaching faculty in nursing and education used computers in their professional work. Ajzen and Fishbein's (1980) Theory of Reasoned Action formed the theoretical basis for this work. This theory proposes that behavior, in this study the use of computers, is the result of intentions and that intentions are the result of attitudes and social norms. The study found that although computer use was sometimes the result of attitudes, more often it seemed to be the result of subjective (perceived) norms or intervening variables. Teaching faculty apparently did not initially make reasoned judgments about the computers or the programs they used, but chose to use whatever was required or available.
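A present-day sketch of the database idea described in this record, keeping each narrative segment intact while codes are stored and queried separately, might look as follows (the schema, table names, and sample codes are hypothetical; the original study used contemporary word-processing and database programs):

```python
import sqlite3

# Minimal qualitative-coding schema: narrative segments stay whole in one
# table; codes reference them from another, so recoding never alters the text.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE segment (id INTEGER PRIMARY KEY, text TEXT)")
con.execute("CREATE TABLE code (segment_id INTEGER, label TEXT)")

con.execute("INSERT INTO segment VALUES (1, 'I use the computer because it is required.')")
con.execute("INSERT INTO segment VALUES (2, 'I enjoy exploring new software.')")
con.executemany("INSERT INTO code VALUES (?, ?)",
                [(1, 'subjective-norm'), (2, 'attitude')])

# Retrieve every narrative segment coded 'subjective-norm':
rows = con.execute("""SELECT s.text FROM segment s
                      JOIN code c ON c.segment_id = s.id
                      WHERE c.label = 'subjective-norm'""").fetchall()
```

Because codes live in their own table, a segment can carry any number of codes and be re-arranged freely, which is exactly the property the article highlights: the narrative remains intact regardless of how it is organized.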
Mechanisms of Developmental Change in Infant Categorization
ERIC Educational Resources Information Center
Westermann, Gert; Mareschal, Denis
2012-01-01
Computational models are tools for testing mechanistic theories of learning and development. Formal models allow us to instantiate theories of cognitive development in computer simulations. Model behavior can then be compared to real performance. Connectionist models, loosely based on neural information processing, have been successful in…
Target Information Processing: A Joint Decision and Estimation Approach
2012-03-29
ground targets (track-before-detect) using computer clusters and graphics processing units. Estimation and filtering theory is one of the most important…
Situated Learning in Computer Science Education
ERIC Educational Resources Information Center
Ben-Ari, Mordechai
2004-01-01
Sociocultural theories of learning such as Wenger and Lave's situated learning have been suggested as alternatives to cognitive theories of learning like constructivism. This article examines situated learning within the context of computer science (CS) education. Situated learning accurately describes some CS communities like open-source software…
Towards topological quantum computer
NASA Astrophysics Data System (ADS)
Melnikov, D.; Mironov, A.; Mironov, S.; Morozov, A.; Morozov, An.
2018-01-01
Quantum R-matrices, the entangling deformations of non-entangling (classical) permutations, provide a distinguished basis in the space of unitary evolutions and, consequently, a natural choice for a minimal set of basic operations (universal gates) for quantum computation. They also play a special role in group theory, integrable systems and the modern theory of non-perturbative calculations in quantum field and string theory. Despite recent developments in those fields, the idea of topological quantum computing and the use of R-matrices, in particular, practically reduce to the reinterpretation of standard sets of quantum gates, and subsequently algorithms, in terms of available topological ones. In this paper we summarize a modern view on quantum R-matrix calculus and propose to look at the R-matrices acting in the space of irreducible representations, which are unitary for the real-valued couplings in Chern-Simons theory, as the fundamental set of universal gates for a topological quantum computer. Such an approach calls for a more thorough investigation of the relation between topological invariants of knots and quantum algorithms.
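As a numerical illustration of an entangling R-matrix, one can check the Bell-basis matrix of Kauffman and Lomonaco, a unitary solution of the braid (Yang-Baxter) relation that is also an entangling two-qubit gate. This is a standard textbook example, not the Chern-Simons representation-space R-matrices the abstract proposes:

```python
import numpy as np

# Bell-basis R-matrix: unitary, satisfies the braid relation, and entangling.
R = np.array([[ 1, 0, 0, 1],
              [ 0, 1,-1, 0],
              [ 0, 1, 1, 0],
              [-1, 0, 0, 1]]) / np.sqrt(2)

I2 = np.eye(2)
R12 = np.kron(R, I2)                 # R on qubits 1,2 of a 3-qubit system
R23 = np.kron(I2, R)                 # R on qubits 2,3
braid_lhs = R12 @ R23 @ R12          # braid relation: both sides must agree
braid_rhs = R23 @ R12 @ R23

psi = R @ np.array([1.0, 0, 0, 0])   # R|00> = (|00> - |11>)/sqrt(2), a Bell state
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)
rho_A = np.einsum('ijkj->ik', rho)   # reduced state of qubit 1
purity = np.trace(rho_A @ rho_A)     # 1/2 signals maximal entanglement
```

The deformation structure is visible here too: R = (1 + i X⊗Y)/sqrt(2) is a unitary "rotation away" from the identity, exactly the kind of entangling deformation of a classical operation the abstract describes.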
Chen, Rong; Chung, Shin-Ho
2013-01-01
The discovery of new drugs that selectively block or modulate ion channels has great potential to provide new treatments for a host of conditions. One promising avenue revolves around modifying or mimicking certain naturally occurring ion channel modulator toxins. This strategy appears to offer the prospect of designing drugs that are both potent and specific. The use of computational modeling is crucial to this endeavor, as it has the potential to provide lower cost alternatives for exploring the effects of new compounds on ion channels. In addition, computational modeling can provide structural information and theoretical understanding that is not easily derivable from experimental results. In this review, we look at the theory and computational methods that are applicable to the study of ion channel modulators. The first section provides an introduction to various theoretical concepts, including force-fields and the statistical mechanics of binding. We then look at various computational techniques available to the researcher, including molecular dynamics, Brownian dynamics, and molecular docking systems. The latter section of the review explores applications of these techniques, concentrating on pore blocker and gating modifier toxins of potassium and sodium channels. After first discussing the structural features of these channels, and their modes of block, we provide an in-depth review of past computational work that has been carried out. Finally, we discuss prospects for future developments in the field. PMID:23589832
NASA Astrophysics Data System (ADS)
Cantero, Francisco; Castro-Orgaz, Oscar; Garcia-Marín, Amanda; Ayuso, José Luis; Dey, Subhasish
2015-10-01
Is the energy equation for gradually-varied flow the best approximation for free surface profile computations in river flows? Determination of flood inundation in rivers and natural waterways is based on the hydraulic computation of flow profiles. This is usually done using energy-based gradually-varied flow models, like HEC-RAS, which adopts a vertical division method for discharge prediction in compound channel sections. However, this discharge prediction method has not kept pace with the advancements made over the last three decades. This paper first presents a study of the impact of discharge prediction on gradually-varied flow computations by comparing thirteen different methods for compound channels, where both energy and momentum equations are applied. The discharge, velocity distribution coefficients, specific energy, momentum and flow profiles are determined. After the study of gradually-varied flow predictions, a new theory is developed to produce higher-order energy and momentum equations for rapidly-varied flow in compound channels. These generalized equations enable the flow profiles to be described with more generality than the gradually-varied flow computations. As an outcome, results for gradually-varied flow provide realistic conclusions for computations of flow in compound channels, showing that momentum-based models are in general more accurate; whereas the new theory developed for rapidly-varied flow opens a new research direction, so far not investigated in flows through compound channels.
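The gradually-varied flow computation at the heart of this abstract can be sketched for the simplest case, a wide rectangular channel with Manning friction, by marching the GVF equation dy/dx = (S0 - Sf)/(1 - Fr^2) upstream from a downstream control (an M1 backwater curve). All numerical values are invented for the sketch; compound-channel sections require the discharge-prediction methods the paper compares:

```python
import numpy as np

# Wide rectangular channel, Manning friction; illustrative numbers.
g = 9.81                 # gravity (m/s^2)
n = 0.03                 # Manning roughness
q = 2.0                  # discharge per unit width (m^2/s)
S0 = 0.001               # bed slope
yn = (n * q / np.sqrt(S0)) ** 0.6   # normal depth from Manning's equation
yc = (q ** 2 / g) ** (1 / 3)        # critical depth

def dydx(y):
    Sf = n ** 2 * q ** 2 / y ** (10 / 3)  # friction slope (Manning)
    Fr2 = q ** 2 / (g * y ** 3)           # Froude number squared
    return (S0 - Sf) / (1 - Fr2)          # gradually-varied flow equation

y = 2.5                  # downstream control depth (above normal depth: M1 curve)
for _ in range(500):
    y += dydx(y) * (-10.0)   # explicit Euler step, marching upstream, dx = -10 m
```

Marching upstream, the depth relaxes monotonically toward the normal depth, the asymptotic behavior expected of an M1 profile; production codes use implicit standard-step iterations rather than explicit Euler.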
NASA Technical Reports Server (NTRS)
Reuther, James; Alonso, Juan Jose; Rimlinger, Mark J.; Jameson, Antony
1996-01-01
This work describes the application of a control theory-based aerodynamic shape optimization method to the problem of supersonic aircraft design. The design process is greatly accelerated through the use of both control theory and a parallel implementation on distributed memory computers. Control theory is employed to derive the adjoint differential equations whose solution allows for the evaluation of design gradient information at a fraction of the computational cost required by previous design methods (13, 12, 44, 38). The resulting problem is then implemented on parallel distributed memory architectures using a domain decomposition approach, an optimized communication schedule, and the MPI (Message Passing Interface) Standard for portability and efficiency. The final result achieves very rapid aerodynamic design based on higher-order computational fluid dynamics (CFD) methods. In our earlier studies, the serial implementation of this design method (19, 20, 21, 23, 39, 25, 40, 41, 42, 43, 9) was shown to be effective for the optimization of airfoils, wings, wing-bodies, and complex aircraft configurations using both the potential equation and the Euler equations (39, 25). In our most recent paper, the Euler method was extended to treat complete aircraft configurations via a new multiblock implementation. Furthermore, during the same conference, we also presented preliminary results demonstrating that the basic methodology could be ported to distributed memory parallel computing architectures [24]. In this paper, our concern will be to demonstrate that the combined power of these new technologies can be used routinely in an industrial design environment by applying it to the case study of the design of typical supersonic transport configurations. A particular difficulty of this test case is posed by the propulsion/airframe integration.
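The cost advantage of the adjoint approach mentioned above can be illustrated on a toy problem: for a linear state equation A(α)u = b and objective J = cᵀu, one state solve plus one adjoint solve yields the gradient with respect to all design variables at once. The matrices and sensitivities below are random stand-ins, not an aerodynamic model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A0 = np.eye(n) * 4.0 + rng.normal(size=(n, n)) * 0.1   # baseline operator
E = [rng.normal(size=(n, n)) * 0.1 for _ in range(3)]  # dA/dalpha_i stand-ins
b = rng.normal(size=n)
c = rng.normal(size=n)

def J(alpha):
    """Objective J = c^T u subject to the state equation A(alpha) u = b."""
    A = A0 + sum(a * Ei for a, Ei in zip(alpha, E))
    u = np.linalg.solve(A, b)
    return c @ u

def grad_adjoint(alpha):
    """Gradient via the adjoint: dJ/dalpha_i = -lambda^T (dA/dalpha_i) u."""
    A = A0 + sum(a * Ei for a, Ei in zip(alpha, E))
    u = np.linalg.solve(A, b)        # one state solve
    lam = np.linalg.solve(A.T, c)    # one adjoint solve, regardless of len(alpha)
    return np.array([-(lam @ (Ei @ u)) for Ei in E])

alpha = np.array([0.3, -0.2, 0.1])
g = grad_adjoint(alpha)

# Central finite differences for verification (3 design vars -> 6 extra solves)
eps = 1e-6
g_fd = np.array([(J(alpha + eps * np.eye(3)[i]) - J(alpha - eps * np.eye(3)[i]))
                 / (2 * eps) for i in range(3)])
```

The finite-difference check needs two solves per design variable; the adjoint route needs two solves total, which is the scaling the paper exploits for many shape parameters.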
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can simulate quantum systems efficiently, with resource requirements that scale only polynomially. Traditionally, such simulations are carried out numerically on classical computers, which inevitably face exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theory and experiment. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry, surpassing classical computations.
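As a point of contrast with the quantum approach, the classical evaluation of a static molecular eigenenergy reduces, in a finite basis, to diagonalizing a Hamiltonian matrix; it is this step whose cost grows exponentially with system size. A minimal sketch with an invented two-level Hamiltonian (illustrative numbers, not a real molecule):

```python
import numpy as np

# Toy "molecular" Hamiltonian in a minimal two-state basis: two diagonal
# energies coupled by an off-diagonal term (all values invented).
eps0, eps1, t = -1.0, -0.5, 0.2
H = np.array([[eps0, t],
              [t,    eps1]])

# Classical route: exact diagonalization. For N orbitals/qubits the matrix
# dimension is 2**N, which is the exponential wall quantum simulation
# (e.g., phase estimation) aims to avoid.
evals, evecs = np.linalg.eigh(H)   # eigenvalues in ascending order
ground_energy = evals[0]
```

For this 2x2 case the exact answer is -0.75 - sqrt(0.1025), which the diagonalization reproduces; the point is only that the classical matrix dimension, and hence the cost, doubles with every added qubit.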
The Psychology of Mathematics Learning: Past and Present.
ERIC Educational Resources Information Center
Education and Urban Society, 1985
1985-01-01
Reviews trends in applying psychology to mathematics learning. Discusses the influence of behaviorism and other functionalist theories, Gestalt theory, Piagetian theory, and the "new functionalism" evident in computer-oriented theories of information processing. (GC)
Can computational goals inform theories of vision?
Anderson, Barton L
2015-04-01
One of the most lasting contributions of Marr's posthumous book is his articulation of the different "levels of analysis" that are needed to understand vision. Although a variety of work has examined how these different levels are related, there is comparatively little examination of the assumptions on which his proposed levels rest, or the plausibility of the approach Marr articulated given those assumptions. Marr placed particular significance on computational level theory, which specifies the "goal" of a computation, its appropriateness for solving a particular problem, and the logic by which it can be carried out. The structure of computational level theory is inherently teleological: What the brain does is described in terms of its purpose. I argue that computational level theory, and the reverse-engineering approach it inspires, requires understanding the historical trajectory that gave rise to functional capacities that can be meaningfully attributed with some sense of purpose or goal, that is, a reconstruction of the fitness function on which natural selection acted in shaping our visual abilities. I argue that this reconstruction is required to distinguish abilities shaped by natural selection-"natural tasks" -from evolutionary "by-products" (spandrels, co-optations, and exaptations), rather than merely demonstrating that computational goals can be embedded in a Bayesian model that renders a particular behavior or process rational. Copyright © 2015 Cognitive Science Society, Inc.
On the effective field theory of heterotic vacua
NASA Astrophysics Data System (ADS)
McOrist, Jock
2018-04-01
The effective field theory of heterotic vacua that realise [InlineEquation not available: see fulltext.] preserving N = 1 supersymmetry is studied. The vacua in question admit large radius limits taking the form [InlineEquation not available: see fulltext.], with [InlineEquation not available: see fulltext.] a smooth threefold with vanishing first Chern class and a stable holomorphic gauge bundle [InlineEquation not available: see fulltext.]. In a previous paper we calculated the kinetic terms for moduli, deducing the moduli metric and Kähler potential. In this paper, we compute the remaining couplings in the effective field theory, correct to first order in α′. In particular, we compute the contribution of the matter sector to the Kähler potential and derive the Yukawa couplings and other quadratic fermionic couplings. From this we write down a Kähler potential [InlineEquation not available: see fulltext.] and superpotential [InlineEquation not available: see fulltext.].
Is gross moist stability a useful quantity for studying the moisture mode theory?
NASA Astrophysics Data System (ADS)
Inoue, K.; Back, L. E.
2016-12-01
The idea that the Madden-Julian Oscillation (MJO) is a moisture mode is gaining acceptance. Along with the emergence of the moisture mode theory, a conceptual quantity called the gross moist stability (GMS) has gained increasing attention. However, the GMS is a vexing quantity because it can be interpreted in different ways, depending on the size of the spatial domain over which the GMS is computed and on the computation methodology. We present a few different illustrations of the GMS using satellite observations. We first show GMS variability as a phase transition on a phase plane that we refer to as the GMS plane. Second, we demonstrate that GMS variability shown as a time series, as presented in much of the past literature, is most likely not relevant to the moisture mode theory. In this talk, we present a protocol for moisture-mode-oriented GMS analyses with satellite observations.
Crutchfield, James P; Ditto, William L; Sinha, Sudeshna
2010-09-01
How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws--that predicted the inexorable improvement in digital circuitry--to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.
Advanced Small Perturbation Potential Flow Theory for Unsteady Aerodynamic and Aeroelastic Analyses
NASA Technical Reports Server (NTRS)
Batina, John T.
2005-01-01
An advanced small perturbation (ASP) potential flow theory has been developed to improve upon the classical transonic small perturbation (TSP) theories that have been used in various computer codes. These computer codes are typically used for unsteady aerodynamic and aeroelastic analyses in the nonlinear transonic flight regime. The codes exploit the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP theory was developed methodically by first determining the essential elements required to produce full-potential-like solutions with a small perturbation approach on the requisite Cartesian grid. This level of accuracy required a higher-order streamwise mass flux and a mass conserving surface boundary condition. The ASP theory was further developed by determining the essential elements required to produce results that agreed well with Euler solutions. This level of accuracy required mass conserving entropy and vorticity effects, and second-order terms in the trailing wake boundary condition. Finally, an integral boundary layer procedure, applicable to both attached and shock-induced separated flows, was incorporated for viscous effects. The resulting ASP potential flow theory, including entropy, vorticity, and viscous effects, is shown to be mathematically more appropriate and computationally more accurate than the classical TSP theories. The formulaic details of the ASP theory are described fully and the improvements are demonstrated through careful comparisons with accepted alternative results and experimental data. The new theory has been used as the basis for a new computer code called ASP3D (Advanced Small Perturbation - 3D), which also is briefly described with representative results.
INHYD: Computer code for intraply hybrid composite design. A users manual
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1983-01-01
A computer program (INHYD) was developed for intraply hybrid composite design. A user's manual for INHYD is presented. INHYD embodies several composite micromechanics theories, intraply hybrid composite theories, and an integrated hygrothermomechanical theory. INHYD can be run in both interactive and batch modes. It has considerable flexibility and capability, which the user can exercise through several options. These options are demonstrated through appropriate INHYD runs in the manual.
Dorazio, R.M.; Johnson, F.A.
2003-01-01
Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.
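The Markov chain Monte Carlo machinery mentioned above can be illustrated with a minimal random-walk Metropolis sampler. The data and model here (a normal mean with known spread and a flat prior) are hypothetical stand-ins, not the waterfowl-habitat example of the paper.

```python
import math
import random

random.seed(42)
data = [2.1, 1.9, 2.3, 2.6, 1.8]   # hypothetical observations
sigma = 0.5                        # known observation standard deviation

def log_post(mu):
    # Flat prior on mu, Gaussian likelihood -> log-posterior up to a constant
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

mu, samples = 0.0, []
for _ in range(20000):
    prop = mu + random.gauss(0.0, 0.3)   # random-walk proposal
    # Metropolis acceptance: accept with probability min(1, ratio)
    if math.log(random.random()) < log_post(prop) - log_post(mu):
        mu = prop
    samples.append(mu)

# Discard burn-in, then summarize the posterior
posterior_mean = sum(samples[5000:]) / len(samples[5000:])
```

With a flat prior the posterior mean should sit near the sample mean (2.14 here); in a management application the same chain of posterior draws is what feeds the expected-consequence calculations for alternative actions.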
General results for higher spin Wilson lines and entanglement in Vasiliev theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hegde, Ashwin; Kraus, Per; Perlmutter, Eric
2016-01-28
Here, we develop tools for the efficient evaluation of Wilson lines in 3D higher spin gravity, and use these to compute entanglement entropy in the hs[λ] Vasiliev theory that governs the bulk side of the duality proposal of Gaberdiel and Gopakumar. Our main technical advance is the determination of SL(N) Wilson lines for arbitrary N, which, in suitable cases, enables us to analytically continue to hs[λ] via N → -λ. We then apply this result to compute various quantities of interest, including entanglement entropy expanded perturbatively in the background higher spin charge, chemical potential, and interval size. This includes a computation of entanglement entropy in the higher spin black hole of the Vasiliev theory. Our results are consistent with conformal field theory calculations. We also provide an alternative derivation of the Wilson line, by showing how it arises naturally from earlier work on scalar correlators in higher spin theory. The general picture that emerges is consistent with the statement that the SL(N) Wilson line computes the semiclassical W_N vacuum block, and our results provide an explicit result for this object.
Higgs Amplitudes from N=4 Supersymmetric Yang-Mills Theory.
Brandhuber, Andreas; Kostacińska, Martyna; Penante, Brenda; Travaglini, Gabriele
2017-10-20
Higgs plus multigluon amplitudes in QCD can be computed in an effective Lagrangian description. In the infinite top-mass limit, an amplitude with a Higgs boson and n gluons is computed by the form factor of the operator TrF^{2}. Up to two loops and for three gluons, its maximally transcendental part is captured entirely by the form factor of the protected stress tensor multiplet operator T_{2} in N=4 supersymmetric Yang-Mills theory. The next order correction involves the calculation of the form factor of the higher-dimensional, trilinear operator TrF^{3}. We present explicit results at two loops for three gluons, including the subleading transcendental terms derived from a particular descendant of the Konishi operator that contains TrF^{3}. These are expressed in terms of a few universal building blocks already identified in earlier calculations. We show that the maximally transcendental part of this quantity, computed in nonsupersymmetric Yang-Mills theory, is identical to the form factor of another protected operator, T_{3}, in the maximally supersymmetric theory. Our results suggest that the maximally transcendental part of Higgs amplitudes in QCD can be entirely computed through N=4 super Yang-Mills theory.
NASA Astrophysics Data System (ADS)
Marzari, Nicola
The last 30 years have seen the steady and exhilarating development of powerful quantum-simulation engines for extended systems, dedicated to the solution of the Kohn-Sham equations of density-functional theory, often augmented by density-functional perturbation theory, many-body perturbation theory, time-dependent density-functional theory, dynamical mean-field theory, and quantum Monte Carlo. Their implementation on massively parallel architectures, now leveraging also GPUs and accelerators, has started a massive effort in the prediction from first principles of many or of complex materials properties, leading the way to the exascale through the combination of HPC (high-performance computing) and HTC (high-throughput computing). Challenges and opportunities abound: complementing hardware and software investments and design; developing the materials' informatics infrastructure needed to encode knowledge into complex protocols and workflows of calculations; managing and curating data; resisting the complacency that we have already reached the predictive accuracy needed for materials design, or a robust level of verification of the different quantum engines. In this talk I will provide an overview of these challenges, with the ultimate prize being the computational understanding, prediction, and design of properties and performance for novel or complex materials and devices.
ERIC Educational Resources Information Center
Burroughs-Lange, Sue G.; Lange, John
This paper evaluates the effects of using the NUDIST (Non-numerical, Unstructured Data Indexing, Searching and Theorising) computer program to organize coded, qualitative data. The use of the software is discussed within the context of the study for which it was used: an Australian study that aimed to develop a theoretical understanding of the…
Electromagnetic Showers at High Energy
ERIC Educational Resources Information Center
Loos, J. S.; Dawson, S. L.
1978-01-01
Some of the properties of electromagnetic showers observed in an experimental study are illustrated. Experimental data and results from quantum electrodynamics are discussed. Data and theory are compared using computer simulation. (BB)
The Laboratory-Based Economics Curriculum.
ERIC Educational Resources Information Center
King, Paul G.; LaRoe, Ross M.
1991-01-01
Describes the liberal arts, computer laboratory-based economics program at Denison University (Ohio). Includes as goals helping students to (1) understand deductive arguments, (2) learn to apply theory in real-world situations, and (3) test and modify theory when necessary. Notes that the program combines computer laboratory experiments for…
ERIC Educational Resources Information Center
Wheeler, David L.
1988-01-01
Scientists feel that progress in artificial intelligence and the availability of thousands of experimental results make this the right time to build and test theories on how people think and learn, using the computer to model minds. (MSE)
Friederichs, Stijn A H; Oenema, Anke; Bolman, Catherine; Lechner, Lilian
2015-08-18
Our main objective in the current study was to evaluate the long-term effectiveness (12 months from baseline) of I Move (a web-based computer tailored physical activity intervention, based on self-determination theory and motivational interviewing). To this end, we compared I Move to a web-based computer tailored physical activity intervention based on traditional health behavior theories (Active Plus), and to a no-intervention control group. As a secondary objective, the present study aimed to identify participant characteristics that moderate the long term effects of I Move and Active Plus. A randomized controlled trial was conducted, comparing three research conditions: 1) the I Move condition, participants in this condition received I Move; 2) the Active Plus condition, participants in this condition received Active Plus; 3) the control condition; participants in this condition received no intervention and were placed on a waiting list. Main outcome measures were weekly minutes of moderate to vigorous physical activity and weekly days with minimal 30 min of physical activity. All measurements were taken by web-based questionnaires via the study website. Intervention effects were analyzed using multilevel linear regression analyses. At 12 months from baseline, I Move was found to be effective in increasing weekly minutes of moderate to vigorous physical activity (ES = .13), while Active Plus was not. In contrast, Active Plus was found to be effective in increasing weekly days with ≥ 30 min PA at 12 months (ES = .11), while I Move was not. No moderators of the effects of I Move were found. The results suggest that web-based computer tailored physical activity interventions might best include elements based on both self-determination theory/motivational interviewing and traditional health behavioral theories. 
To be more precise, it is arguable that the focus of the theoretical foundations, used in new web-based PA interventions should depend on the intended program outcome. In order to draw firm conclusions, however, more research on the effects of self-determination theory and motivational interviewing in web-based physical activity promotion is needed. Dutch Trial Register NTR4129.
Datta, Subhra; Ghosal, Sandip; Patankar, Neelesh A
2006-02-01
Electroosmotic flow in a straight micro-channel of rectangular cross-section is computed numerically for several situations where the wall zeta-potential is not constant but has a specified spatial variation. The results of the computation are compared with an earlier published asymptotic theory based on the lubrication approximation: the assumption that any axial variations take place on a long length scale compared to a characteristic channel width. The computational results are found to be in excellent agreement with the theory even when the scale of axial variations is comparable to the channel width. In the opposite limit when the wavelength of fluctuations is much shorter than the channel width, the lubrication theory fails to describe the solution either qualitatively or quantitatively. In this short wave limit the solution is well described by Ajdari's theory for electroosmotic flow between infinite parallel plates (Ajdari, A., Phys. Rev. E 1996, 53, 4996-5005.) The infinitely thin electric double layer limit is assumed in the theory as well as in the simulation.
Neural Computation and the Computational Theory of Cognition
ERIC Educational Resources Information Center
Piccinini, Gualtiero; Bahar, Sonya
2013-01-01
We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…
ERIC Educational Resources Information Center
Gegenfurtner, Andreas; Veermans, Koen; Vauras, Marja
2013-01-01
This meta-analysis (29 studies, k = 33, N = 4158) examined the longitudinal development of the relationship between performance self-efficacy and transfer before and after training. A specific focus was on training programs that afforded varying degrees of computer-supported collaborative learning (CSCL). Consistent with social cognitive theory,…
An application of modern control theory to jet propulsion systems. [considering onboard computer
NASA Technical Reports Server (NTRS)
Merrill, W. C.
1975-01-01
The control of an airbreathing turbojet engine by an onboard digital computer is studied. The approach taken is to model the turbojet engine as a linear, multivariable system whose parameters vary with engine operating environment. From this model adaptive closed-loop or feedback control laws are designed and applied to the acceleration of the turbojet engine.
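The closed-loop design described above can be sketched for a discrete-time linear model: a feedback gain K is chosen so that the closed-loop matrix A − BK is stable, and the state decays under the feedback law. The two-state model and gain below are illustrative numbers, not the turbojet model of the paper.

```python
import numpy as np

# Hypothetical two-state linear plant (illustrative values, not engine data):
# x[k+1] = A x[k] + B u[k], with state feedback u[k] = -K x[k].
A = np.array([[1.1, 0.2],
              [0.0, 0.9]])          # open loop is unstable (eigenvalue 1.1)
B = np.array([[0.0],
              [1.0]])
K = np.array([[2.1, 1.1]])          # places closed-loop eigenvalues at 0.5, 0.4

Acl = A - B @ K                     # closed-loop dynamics matrix
spectral_radius = max(abs(np.linalg.eigvals(Acl)))

x = np.array([[1.0], [1.0]])        # initial disturbance
for _ in range(50):
    x = Acl @ x                     # state decays under the feedback law
```

An adaptive scheme such as the one in the paper would re-derive K as the linearized model's parameters change with operating condition; the onboard computer's job each sample period is essentially the `Acl @ x` update above plus the gain-scheduling logic.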
Recursive Techniques for Computing Gluon Scattering in Anti-de-Sitter Space
NASA Astrophysics Data System (ADS)
Shyaka, Claude; Kharel, Savan
2016-03-01
The anti-de Sitter/conformal field theory correspondence is a relationship between two kinds of physical theories. On one side of the duality is a special type of quantum (conformal) field theory known as Yang-Mills theory. These quantum field theories are known to be equivalent to theories of gravity in anti-de Sitter (AdS) space. The physical observables in the theory are the correlation functions that live on the boundary of AdS space. In general, correlation functions are computed in configuration space and the resulting expressions are extremely complicated. Using a momentum basis and recursive techniques developed by Raju, we extend tree-level results to four- and five-point correlation functions in Yang-Mills theory in anti-de Sitter space. In addition, we show that for certain external helicities, the correlation functions have a simple analytic structure. Finally, we discuss how one can generalize these results to n-point functions. Hendrix College Odyssey Grant.
Theoretical Investigation of oxides for batteries and fuel cell applications
NASA Astrophysics Data System (ADS)
Ganesh, Panchapakesan; Lubimtsev, Andrew A.; Balachandran, Janakiraman
I will present theoretical studies of Li-ion and proton-conducting oxides using a combination of theory and computations involving Density Functional Theory based atomistic modeling, cluster-expansion based studies, global optimization, high-throughput computations, and machine learning based investigation of ionic transport in oxide materials. In Li-ion intercalated oxides, we explain the experimentally observed (Nature Materials 12, 518-522 (2013)) 'intercalation pseudocapacitance' phenomenon, and explain why Nb2O5 is special in showing this behavior when Li-ions are intercalated (J. Mater. Chem. A, 2013, 1, 14951-14956) but not when Na-ions are used. In addition, we explore Li-ion intercalation theoretically in the VO2 (B) phase, which is somewhat structurally similar to Nb2O5, and predict an interesting role of site-trapping on the voltage and capacity of the material, validated by ongoing experiments. Computations on proton-conducting oxides explain why Y-doped BaZrO3, one of the fastest proton-conducting oxides, shows a decrease in conductivity above 20% Y-doping. Further, using high-throughput computations and machine learning tools, we discover general principles to improve proton conductivity. Acknowledgements: LDRD at ORNL and CNMS at ORNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harmark, Troels; Orselli, Marta
We match the Hagedorn/deconfinement temperature of planar N=4 super Yang-Mills (SYM) on R×S^3 to the Hagedorn temperature of string theory on AdS_5×S^5. The match is done in a near-critical region where both gauge theory and string theory are weakly coupled. The near-critical region is near a point with zero temperature and critical chemical potential. On the gauge-theory side we are taking a decoupling limit found in Ref. 7 in which the physics of planar N=4 SYM is given exactly by the ferromagnetic XXX_{1/2} Heisenberg spin chain. We find moreover a general relation between the Hagedorn/deconfinement temperature and the thermodynamics of the Heisenberg spin chain and we use this to compute it in two distinct regimes. On the string-theory side, we identify the dual limit for which the string tension and string coupling go to zero. This limit is taken of string theory on a maximally supersymmetric pp-wave background with a flat direction, obtained from a Penrose limit of AdS_5×S^5. We compute the Hagedorn temperature of the string theory and find agreement with the Hagedorn/deconfinement temperature computed on the gauge-theory side.
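The ferromagnetic XXX_{1/2} Heisenberg spin chain appearing on the gauge-theory side can be made concrete with a small exact diagonalization. This toy computation (six sites, nothing like the paper's thermodynamic analysis) merely confirms the ferromagnetic ground-state energy −N/4 for H = −Σᵢ Sᵢ·Sᵢ₊₁ with periodic boundaries:

```python
import numpy as np

# Spin-1/2 operators S = sigma / 2 from the Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sy = np.array([[0, -1j], [1j, 0]]) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
I2 = np.eye(2, dtype=complex)

def site_op(op, i, N):
    """Embed a single-site operator at site i of an N-site chain."""
    mats = [op if j == i else I2 for j in range(N)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

N = 6  # small chain; Hilbert space dimension 2**N = 64
H = np.zeros((2 ** N, 2 ** N), dtype=complex)
for i in range(N):
    j = (i + 1) % N                     # periodic boundary: site N wraps to 0
    for s in (sx, sy, sz):
        H -= site_op(s, i, N) @ site_op(s, j, N)   # ferromagnetic coupling

E0 = np.linalg.eigvalsh(H)[0]           # ground-state energy, here -N/4
```

The fully polarized state saturates each bond at its maximum eigenvalue 1/4, so the ferromagnetic ground energy is exactly −N/4; the thermodynamics relevant to the Hagedorn matching involves the full spectrum at large N, which brute-force diagonalization cannot reach.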
CSM research: Methods and application studies
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.
1989-01-01
Computational mechanics is that discipline of applied science and engineering devoted to the study of physical phenomena by means of computational methods based on mathematical modeling and simulation, utilizing digital computers. The discipline combines theoretical and applied mechanics, approximation theory, numerical analysis, and computer science. Computational mechanics has had a major impact on engineering analysis and design. When applied to structural mechanics, the discipline is referred to herein as computational structural mechanics. Complex structures being considered by NASA for the 1990's include composite primary aircraft structures and the space station. These structures will be much more difficult to analyze than today's structures and necessitate a major upgrade in computerized structural analysis technology. NASA has initiated a research activity in structural analysis called Computational Structural Mechanics (CSM). The broad objective of the CSM activity is to develop advanced structural analysis technology that will exploit modern and emerging computers, such as those with vector and/or parallel processing capabilities. Here, the current research directions for the Methods and Application Studies Team of the Langley CSM activity are described.
MaRIE theory, modeling and computation roadmap executive summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lookman, Turab
The confluence of MaRIE (Matter-Radiation Interactions in Extreme) and extreme (exascale) computing timelines offers a unique opportunity in co-designing the elements of materials discovery, with theory and high performance computing, itself co-designed by constrained optimization of hardware and software, and experiments. MaRIE's theory, modeling, and computation (TMC) roadmap efforts have paralleled 'MaRIE First Experiments' science activities in the areas of materials dynamics, irradiated materials and complex functional materials in extreme conditions. The documents that follow this executive summary describe in detail, for each of these areas, the current state of the art, the gaps that exist, and the road map to MaRIE and beyond. Here we integrate the various elements to articulate an overarching theme related to the role and consequences of heterogeneities, which manifest as competing states in a complex energy landscape. MaRIE experiments will locate, measure and follow the dynamical evolution of these heterogeneities. Our TMC vision spans the various pillar science areas and highlights the key theoretical and experimental challenges. We also present a theory, modeling and computation roadmap of the path to and beyond MaRIE in each of the science areas.
Computational theory of line drawing interpretation
NASA Technical Reports Server (NTRS)
Witkin, A. P.
1981-01-01
The recovery of the three-dimensional structure of visible surfaces depicted in an image was studied, emphasizing the role of geometric cues present in line drawings. Three key components are line classification, line interpretation, and surface interpolation. A model for three-dimensional line interpretation and surface orientation was refined, and a theory for the recovery of surface shape from surface marking geometry was developed. A new approach to the classification of edges was developed and implemented; signatures were deduced for each of several edge types, expressed in terms of correlational properties of the image intensities in the vicinity of the edge. A computer program was developed that evaluates image edges by comparing them with these prototype signatures.
NASA Astrophysics Data System (ADS)
Sharma, Abhiraj; Suryanarayana, Phanish
2018-05-01
We present an accurate and efficient real-space Density Functional Theory (DFT) framework for the ab initio study of non-orthogonal crystal systems. Specifically, employing a local reformulation of the electrostatics, we develop a novel Kronecker product formulation of the real-space kinetic energy operator that significantly reduces the number of operations associated with the Laplacian-vector multiplication, the dominant cost in practical computations. In particular, we reduce the scaling with respect to finite-difference order from quadratic to linear, thereby significantly bridging the gap in computational cost between non-orthogonal and orthogonal systems. We verify the accuracy and efficiency of the proposed methodology through selected examples.
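The Kronecker-product structure exploited above can be illustrated in two dimensions: for L = Dx⊗Iy + Ix⊗Dy on a tensor-product grid, the Laplacian-vector product reduces to two small per-axis matrix products instead of one large matvec. This sketch uses a plain second-order finite-difference stencil with Dirichlet boundaries, not the paper's formulation.

```python
import numpy as np

def fd_matrix(n, h=1.0):
    """1D second-derivative finite-difference matrix (2nd order, Dirichlet)."""
    return (np.diag(np.full(n, -2.0)) +
            np.diag(np.ones(n - 1), 1) +
            np.diag(np.ones(n - 1), -1)) / h ** 2

nx, ny = 8, 7
Dx, Dy = fd_matrix(nx), fd_matrix(ny)

# Explicit Kronecker-product Laplacian: O((nx*ny)^2) storage and matvec cost
L = np.kron(Dx, np.eye(ny)) + np.kron(np.eye(nx), Dy)

rng = np.random.default_rng(1)
V = rng.normal(size=(nx, ny))   # field stored in its natural grid shape

# Structured apply: (Dx x Iy) v -> Dx @ V and (Ix x Dy) v -> V @ Dy^T,
# i.e. two small matrix products in place of one large matrix-vector product
LV_fast = Dx @ V + V @ Dy.T

# Reference: the full matrix acting on the row-major flattened field
LV_ref = (L @ V.ravel()).reshape(nx, ny)
```

The same identity extends to three dimensions and to higher-order stencils; the per-axis factors stay small while the explicit matrix would grow quadratically with the grid size, which is the cost reduction the abstract describes.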
NASA Astrophysics Data System (ADS)
Jain, Amber; Herman, Michael F.; Ouyang, Wenjun; Subotnik, Joseph E.
2015-10-01
We provide an in-depth investigation of transmission coefficients as computed using the augmented-fewest switches surface hopping algorithm in the low energy regime. Empirically, microscopic reversibility is shown to hold approximately. Furthermore, we show that, in some circumstances, including decoherence on top of surface hopping calculations can help recover (as opposed to destroy) oscillations in the transmission coefficient as a function of energy; these oscillations can be studied analytically with semiclassical scattering theory. Finally, in the spirit of transition state theory, we also show that transmission coefficients can be calculated rather accurately starting from the curve crossing point and running trajectories forwards and backwards.
Prediction of aircraft sideline noise attenuation
NASA Technical Reports Server (NTRS)
Zorumski, W. E.
1978-01-01
A computational study is made using the recommended ground effect theory by Pao, Wenzel, and Oncley. It is shown that this theory adequately predicts the measured ground attenuation data by Parkin and Scholes, which is the only available large data set. It is also shown, however, that the ground effect theory does not predict the measured lateral attenuations from actual aircraft flyovers. There remain one or more important lateral effects on aircraft noise, such as sideline shielding of sources, which must be incorporated in the prediction methods. Experiments at low elevation angles (0 deg to 10 deg) and low-to-intermediate frequencies are recommended to further validate the ground effect theory.
One-dimensional analysis of filamentary composite beam columns with thin-walled open sections
NASA Technical Reports Server (NTRS)
Lo, Patrick K.-L.; Johnson, Eric R.
1986-01-01
Vlasov's one-dimensional structural theory for thin-walled open section bars was originally developed and used for metallic elements. The theory was recently extended to laminated bars fabricated from advanced composite materials. The purpose of this research is to provide a study and assessment of the extended theory. The focus is on flexural and torsional-flexural buckling of thin-walled, open section, laminated composite columns. Buckling loads are computed from the theory using a linear bifurcation analysis and a geometrically nonlinear beam column analysis by the finite element method. Results from the analyses are compared to available test data.
Abilities and Affordances: Factors Influencing Successful Child-Tablet Communication
ERIC Educational Resources Information Center
Dubé, Adam K.; McEwen, Rhonda N.
2017-01-01
Using Luhmann's communication theory and affordance theories, we develop a framework to examine how kindergarten-grade 2 students interact with tablet computers. We assessed whether cognitive ability and device configuration influence how successfully children use tablet computers. We found that children's limited ability to direct their cognitive…
Educational Research and Theory Perspectives on Intelligent Computer-Assisted Instruction.
ERIC Educational Resources Information Center
Tennyson, Robert D.; Christensen, Dean L.
This paper defines the next generation of intelligent computer-assisted instructional systems (ICAI) by depicting the elaborations and extensions offered by educational research and theory perspectives to enhance the ICAI environment. The first section describes conventional ICAI systems, which use expert systems methods and have three modules: a…
Overview of a Linguistic Theory of Design. AI Memo 383A.
ERIC Educational Resources Information Center
Miller, Mark L.; Goldstein, Ira P.
The SPADE theory, which uses linguistic formalisms to model the planning and debugging processes of computer programming, was simultaneously developed and tested in three separate contexts--computer uses in education, automatic programming (a traditional artificial intelligence arena), and protocol analysis (the domain of information processing…
Coalescent: an open-science framework for importance sampling in coalescent theory.
Tewari, Susanta; Spouge, John L
2015-01-01
Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve the statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks for importance sampling, researchers often struggle to translate new sampling schemes computationally or to benchmark them against different schemes in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time, following the infinite-sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions.
In coalescent theory, many studies of computational efficiency consider only effective sample size. Here, we evaluate proposals in the coalescent literature, to discover that the order of efficiency among the three importance sampling schemes changes when one considers running time as well as effective sample size. We also describe a computational technique called "just-in-time delegation" available to improve the trade-off between running time and precision by constructing improved importance sampling schemes from existing ones. Thus, our systems approach is a potential solution to the "2^8 programs problem" highlighted by Felsenstein, because it provides the flexibility to include or exclude various features of similar coalescent models or importance sampling schemes.
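The trade-off discussed in the conclusions can be made concrete with a toy sketch. This is not the Java framework's API; it is a generic NumPy illustration of likelihood estimation by importance sampling together with Kish's effective sample size, the efficiency measure the authors argue should be weighed against running time:

```python
import numpy as np

def is_estimate(target_logpdf, proposal_logpdf, draws):
    # Importance-sampling estimate of the target's normalising constant
    # (a likelihood, in the coalescent setting) from proposal draws,
    # plus Kish's effective sample size (ESS) as the usual diagnostic.
    logw = target_logpdf(draws) - proposal_logpdf(draws)
    m = logw.max()
    w = np.exp(logw - m)                    # stabilised weights
    estimate = np.exp(m) * w.mean()
    ess = w.sum() ** 2 / (w ** 2).sum()     # Kish ESS
    return estimate, ess

# Toy check: integrate exp(-x^2/2), true value sqrt(2*pi) ~ 2.507,
# using a deliberately wider N(0, 2^2) proposal.
rng = np.random.default_rng(1)
xs = rng.normal(0.0, 2.0, size=200_000)
target = lambda x: -0.5 * x ** 2
proposal = lambda x: -0.5 * (x / 2.0) ** 2 - np.log(2.0 * np.sqrt(2.0 * np.pi))
Z, ess = is_estimate(target, proposal, xs)
```

A proposal that matches the target poorly would show up here as a small ESS even if the run were fast, which is exactly why ESS alone cannot rank sampling schemes.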
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hannon, Kevin P.; Li, Chenyang; Evangelista, Francesco A., E-mail: francesco.evangelista@emory.edu
2016-05-28
We report an efficient implementation of a second-order multireference perturbation theory based on the driven similarity renormalization group (DSRG-MRPT2) [C. Li and F. A. Evangelista, J. Chem. Theory Comput. 11, 2097 (2015)]. Our implementation employs factorized two-electron integrals to avoid storage of large four-index intermediates. It also exploits the block structure of the reference density matrices to reduce the computational cost to that of second-order Møller–Plesset perturbation theory. Our new DSRG-MRPT2 implementation is benchmarked on ten naphthyne isomers using basis sets up to quintuple-ζ quality. We find that the singlet-triplet splittings (Δ_ST) of the naphthyne isomers strongly depend on the equilibrium structures. For a consistent set of geometries, the Δ_ST values predicted by the DSRG-MRPT2 are in good agreement with those computed by the reduced multireference coupled cluster theory with singles, doubles, and perturbative triples.
Computational predictions of energy materials using density functional theory
NASA Astrophysics Data System (ADS)
Jain, Anubhav; Shin, Yongwoo; Persson, Kristin A.
2016-01-01
In the search for new functional materials, quantum mechanics is an exciting starting point. The fundamental laws that govern the behaviour of electrons have the possibility, at the other end of the scale, to predict the performance of a material for a targeted application. In some cases, this is achievable using density functional theory (DFT). In this Review, we highlight DFT studies predicting energy-related materials that were subsequently confirmed experimentally. The attributes and limitations of DFT for the computational design of materials for lithium-ion batteries, hydrogen production and storage materials, superconductors, photovoltaics and thermoelectric materials are discussed. In the future, we expect that the accuracy of DFT-based methods will continue to improve and that growth in computing power will enable millions of materials to be virtually screened for specific applications. Thus, these examples represent a first glimpse of what may become a routine and integral step in materials discovery.
Computational Relativistic Astrophysics Using the Flow Field-Dependent Variation Theory
NASA Technical Reports Server (NTRS)
Richardson, G. A.; Chung, T. J.
2002-01-01
We present our method for solving general relativistic nonideal hydrodynamics. Relativistic effects become pronounced in such cases as jet formation from black hole magnetized accretion disks which may lead to the study of gamma-ray bursts. Nonideal flows are present where radiation, magnetic forces, viscosities, and turbulence play an important role. Our concern in this paper is to reexamine existing numerical simulation tools as to the accuracy and efficiency of computations and introduce a new approach known as the flow field-dependent variation (FDV) method. The main feature of the FDV method consists of accommodating discontinuities of shock waves and high gradients of flow variables such as occur in turbulence and unstable motions. In this paper, the physics involved in the solution of relativistic hydrodynamics and solution strategies of the FDV theory are elaborated. The general relativistic astrophysical flow and shock solver (GRAFSS) is introduced, and some simple example problems for computational relativistic astrophysics (CRA) are demonstrated.
Shadows, signals, and stability in Einsteinian cubic gravity
NASA Astrophysics Data System (ADS)
Hennigar, Robie A.; Jahani Poshteh, Mohammad Bagher; Mann, Robert B.
2018-03-01
We conduct a preliminary investigation into the phenomenological implications of Einsteinian cubic gravity (ECG), a four-dimensional theory of gravity cubic in curvature of interest for its unique formulation and properties. We find an analytic approximation for a spherically symmetric black hole solution to this theory using a continued fraction ansatz. This approximate solution is valid everywhere outside of the horizon and we use it to study the orbit of massive test bodies near a black hole, specifically computing the innermost stable circular orbit. We compute constraints on the ECG coupling parameter imposed by Shapiro time delay. We then compute the shadow of an ECG black hole and find it to be larger than its Einsteinian counterpart in general relativity for the same value of the mass. Applying our results to Sgr A*, we find that departures from general relativity are small but in principle distinguishable.
Numerical simulation of supersonic inlets using a three-dimensional viscous flow analysis
NASA Technical Reports Server (NTRS)
Anderson, B. H.; Towne, C. E.
1980-01-01
A three dimensional fully viscous computer analysis was evaluated to determine its usefulness in the design of supersonic inlets. This procedure takes advantage of physical approximations to limit the high computer time and storage associated with complete Navier-Stokes solutions. Computed results are presented for a Mach 3.0 supersonic inlet with bleed and a Mach 7.4 hypersonic inlet. Good agreement was obtained between theory and data for both inlets. Results of a mesh sensitivity study are also shown.
Intention, emotion, and action: a neural theory based on semantic pointers.
Schröder, Tobias; Stewart, Terrence C; Thagard, Paul
2014-06-01
We propose a unified theory of intentions as neural processes that integrate representations of states of affairs, actions, and emotional evaluation. We show how this theory provides answers to philosophical questions about the concept of intention, psychological questions about human behavior, computational questions about the relations between belief and action, and neuroscientific questions about how the brain produces actions. Our theory of intention ties together biologically plausible mechanisms for belief, planning, and motor control. The computational feasibility of these mechanisms is shown by a model that simulates psychologically important cases of intention.
Why use DFT methods in the study of carbohydrates?
USDA-ARS?s Scientific Manuscript database
The recent advances in density functional theory (DFT) and computer technology allow us to study systems with more than 100 atoms routinely. This makes it feasible to study large carbohydrate molecules via quantum mechanical methods, whereas in the past, studies of carbohydrates were restricted to ...
Bloomfield, Jacqueline; Roberts, Julia; While, Alison
2010-03-01
High quality health care demands a nursing workforce with sound clinical skills. However, the clinical competency of newly qualified nurses continues to stimulate debate about the adequacy of current methods of clinical skills education and emphasises the need for innovative teaching strategies. Despite the increasing use of e-learning within nurse education, evidence to support its use for clinical skills teaching is limited and inconclusive. This study tested whether nursing students could learn and retain the theory and skill of handwashing more effectively when taught using computer-assisted learning compared with conventional face-to-face methods. The study employed a two group randomised controlled design. The intervention group used an interactive, multimedia, self-directed computer-assisted learning module. The control group was taught by an experienced lecturer in a clinical skills room. Data were collected over a 5-month period between October 2004 and February 2005. Knowledge was tested at four time points and handwashing skills were assessed twice. Two-hundred and forty-two first year nursing students of mixed gender; age; educational background and first language studying at one British university were recruited to the study. Participant attrition increased during the study. Knowledge scores increased significantly from baseline in both groups and no significant differences were detected between the scores of the two groups. Skill performance scores were similar in both groups at the 2-week follow-up with significant differences emerging at the 8-week follow-up in favour of the intervention group, however, this finding must be interpreted with caution in light of sample size and attrition rates. The computer-assisted learning module was an effective strategy for teaching both the theory and practice of handwashing to nursing students and in this study was found to be at least as effective as conventional face-to-face teaching methods. 
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969), are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young, Individual Strategy and Social Structure: An Evolutionary Theory of Institutions, Princeton University Press, 1998). A series of recent papers (Brandt et al., STOC 2012; Barmpalias et al., FOCS 2014; Barmpalias et al., J Stat Phys 158:806-852, 2015) has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. (STOC 2012).
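A minimal sketch of the kind of unperturbed dynamics analysed here may help fix ideas. The grid size, tolerance threshold, and move rule below are illustrative choices, not the paper's exact model (the rigorous results concern specific asymptotic regimes):

```python
import random

def neighbours(i, j, n):
    # Moore neighbourhood on a torus.
    return [((i + di) % n, (j + dj) % n)
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def unhappy(grid, i, j, n, tau):
    # An agent is unhappy if fewer than a fraction tau of its occupied
    # neighbouring sites hold agents of its own type.
    me = grid[i][j]
    occupied = [grid[a][b] for a, b in neighbours(i, j, n) if grid[a][b] != 0]
    if not occupied:
        return False
    return sum(1 for v in occupied if v == me) / len(occupied) < tau

def run_schelling(n=20, tau=0.5, steps=5000, seed=0):
    # Unperturbed dynamics: only unhappy agents move (to a random vacant
    # site); there is no noise term that lets content agents relocate.
    rng = random.Random(seed)
    cells = [1] * 150 + [-1] * 150 + [0] * (n * n - 300)
    rng.shuffle(cells)
    grid = [cells[r * n:(r + 1) * n] for r in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if grid[i][j] != 0 and unhappy(grid, i, j, n, tau):
            vacancies = [(a, b) for a in range(n) for b in range(n)
                         if grid[a][b] == 0]
            a, b = rng.choice(vacancies)
            grid[a][b], grid[i][j] = grid[i][j], 0
    return grid
```

Because no noise term is present, content agents never move; that absence of perturbation is precisely what makes such models resistant to the standard statistical-mechanics toolkit mentioned above.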
Research on application of intelligent computation based LUCC model in urbanization process
NASA Astrophysics Data System (ADS)
Chen, Zemin
2007-06-01
Global change study is an interdisciplinary, comprehensive research activity carried out through international cooperation; it arose in the 1980s and has the largest scope of any such effort. The interaction between land use and cover change, as a research field at the crossing of natural science and social science, has become one of the core subjects of global change study as well as one of its leading frontiers and hot spots. It is necessary to develop research on land use and cover change in the urbanization process and to build an analog model of urbanization in order to describe, simulate and analyze the dynamic behaviors of urban development change and to understand the basic characteristics and rules of the urbanization process. This has positive practical and theoretical significance for formulating urban and regional sustainable development strategies. The effect of urbanization on land use and cover change is mainly embodied in changes to the quantity structure and spatial structure of urban space, and the LUCC model of the urbanization process has become an important research subject in urban geography and urban planning. In this paper, building upon previous research achievements, the author systematically analyzes research on land use/cover change in the urbanization process using the theories of complexity science and intelligent computation; builds a model for simulating and forecasting the dynamic evolution of urban land use and cover change on the basis of the cellular automata model of complexity science and multi-agent theory; and expands the Markov model, traditional CA model and agent model, introducing complexity science theory and intelligent computation theory into the LUCC research model to build an intelligent computation-based LUCC model for analog research on land use and cover change in urbanization, and performs case research. The concrete contents are as follows: 1. Complexity of LUCC research in the urbanization process.
Analyze urbanization process in combination with the contents of complexity science research and the conception of complexity feature to reveal the complexity features of LUCC research in urbanization process. Urban space system is a complex economic and cultural phenomenon as well as a social process, is the comprehensive characterization of urban society, economy and culture, and is a complex space system formed by society, economy and nature. It has dissipative structure characteristics, such as opening, dynamics, self-organization, non-balance etc. Traditional model cannot simulate these social, economic and natural driving forces of LUCC including main feedback relation from LUCC to driving force. 2. Establishment of Markov extended model of LUCC analog research in urbanization process. Firstly, use traditional LUCC research model to compute change speed of regional land use through calculating dynamic degree, exploitation degree and consumption degree of land use; use the theory of fuzzy set to rewrite the traditional Markov model, establish structure transfer matrix of land use, forecast and analyze dynamic change and development trend of land use, and present noticeable problems and corresponding measures in urbanization process according to research results. 3. Application of intelligent computation research and complexity science research method in LUCC analog model in urbanization process. On the basis of detailed elaboration of the theory and the model of LUCC research in urbanization process, analyze the problems of existing model used in LUCC research (namely, difficult to resolve many complexity phenomena in complex urban space system), discuss possible structure realization forms of LUCC analog research in combination with the theories of intelligent computation and complexity science research. 
Perform application analysis on BP artificial neural network and genetic algorithms of intelligent computation and CA model and MAS technology of complexity science research, discuss their theoretical origins and their own characteristics in detail, elaborate the feasibility of them in LUCC analog research, and bring forward improvement methods and measures on existing problems of this kind of model. 4. Establishment of LUCC analog model in urbanization process based on theories of intelligent computation and complexity science. Based on the research on abovementioned BP artificial neural network, genetic algorithms, CA model and multi-agent technology, put forward improvement methods and application assumption towards their expansion on geography, build LUCC analog model in urbanization process based on CA model and Agent model, realize the combination of learning mechanism of BP artificial neural network and fuzzy logic reasoning, express the regulation with explicit formula, and amend the initial regulation through self study; optimize network structure of LUCC analog model and methods and procedures of model parameters with genetic algorithms. In this paper, I introduce research theory and methods of complexity science into LUCC analog research and presents LUCC analog model based upon CA model and MAS theory. Meanwhile, I carry out corresponding expansion on traditional Markov model and introduce the theory of fuzzy set into data screening and parameter amendment of improved model to improve the accuracy and feasibility of Markov model in the research on land use/cover change.
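The Markov component of the approach can be sketched compactly. The transition matrix and current shares below are hypothetical illustrative numbers, not data from the study; the paper further rewrites this model with fuzzy-set theory, which is not shown:

```python
import numpy as np

# Hypothetical one-period transition matrix between three land-use classes
# (rows: from urban, agricultural, forest; columns: to the same classes).
P = np.array([[0.95, 0.03, 0.02],
              [0.10, 0.85, 0.05],
              [0.05, 0.05, 0.90]])

x0 = np.array([0.20, 0.50, 0.30])   # current area shares (assumed)

def project_shares(x, P, k):
    # k-step Markov forecast of land-use structure: x_k = x_0 @ P^k.
    return x @ np.linalg.matrix_power(P, k)

x10 = project_shares(x0, P, 10)
```

Each row of P must sum to one; the forecast simply iterates the structure transfer matrix, which is the piece the extended Markov model refines with fuzzy-set data screening.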
Developing and validating an instrument for measuring mobile computing self-efficacy.
Wang, Yi-Shun; Wang, Hsiu-Yuan
2008-08-01
IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.
Computing organic stereoselectivity - from concepts to quantitative calculations and predictions.
Peng, Qian; Duarte, Fernanda; Paton, Robert S
2016-11-07
Advances in theory and processing power have established computation as a valuable interpretative and predictive tool in the discovery of new asymmetric catalysts. This tutorial review outlines the theory and practice of modeling stereoselective reactions. Recent examples illustrate how an understanding of the fundamental principles and the application of state-of-the-art computational methods may be used to gain mechanistic insight into organic and organometallic reactions. We highlight the emerging potential of this computational toolbox in providing meaningful predictions for the rational design of asymmetric catalysts. We present an accessible account of the field to encourage future synergy between computation and experiment.
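A small worked example shows the quantitative link between computed energetics and predicted selectivity. Under transition-state theory with Curtin-Hammett conditions (a standard assumption in this field; the review itself covers much more), the enantiomeric excess follows directly from the relative free energies of the competing diastereomeric transition states:

```python
import math

R = 8.314462618e-3   # gas constant, kJ/(mol K)

def ee_from_ddg(ddg_kj_per_mol, T=298.15):
    # Curtin-Hammett / transition-state-theory estimate: a computed
    # free-energy gap ddG between competing diastereomeric transition
    # states gives a minor/major rate ratio r = exp(-ddG/RT), hence
    # ee = (1 - r) / (1 + r).
    r = math.exp(-ddg_kj_per_mol / (R * T))
    return (1.0 - r) / (1.0 + r)
```

A gap of about 11.4 kJ/mol (roughly 2.7 kcal/mol) at 298 K corresponds to approximately 98% ee, which is why errors of even 2-3 kJ/mol in computed barriers matter so much for quantitative predictions.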
Quantum to classical transition in quantum field theory
NASA Astrophysics Data System (ADS)
Lombardo, Fernando C.
1998-12-01
We study the quantum to classical transition process in the context of quantum field theory. Extending the influence functional formalism of Feynman and Vernon, we study the decoherence process for self-interacting quantum fields in flat space. We also use this formalism for arbitrary geometries to analyze the quantum to classical transition in quantum gravity. After summarizing the main results known for the quantum Brownian motion, we consider a self-interacting field theory in Minkowski spacetime. We compute a coarse grained effective action by integrating out the field modes with wavelength shorter than a critical value. From this effective action we obtain the evolution equation for the reduced density matrix (master equation). We compute the diffusion coefficients for this equation and analyze the decoherence induced on the long-wavelength modes. We generalize the results to the case of a conformally coupled scalar field in de Sitter spacetime. We show that the decoherence is effective as long as the critical wavelength is taken to be not shorter than the Hubble radius. On the other hand, we study the classical limit for scalar-tensorial models in two dimensions. We consider different couplings between the dilaton and the scalar field. We discuss the Hawking radiation process and, from an exact evaluation of the influence functional, we study the conditions by which decoherence ensures the validity of the semiclassical approximation in cosmological metrics. Finally we consider four dimensional models with massive scalar fields, arbitrarily coupled to the geometry. We compute the Einstein-Langevin equations in order to study the effect of the fluctuations induced by the quantum fields on the classical geometry.
Holographic studies of thermal gauge theories with flavour
NASA Astrophysics Data System (ADS)
Thomson, Rowan F. M.
The AdS/CFT correspondence and its extensions to more general gauge/gravity dualities have provided a powerful framework for the study of strongly coupled gauge theories. This thesis explores properties of a large class of thermal strongly coupled gauge theories using the gravity dual. In order to bring the holographic framework closer to Quantum Chromodynamics (QCD), we study theories with matter in the fundamental representation. In particular, we focus on the holographic dual of SU(N_c) supersymmetric Yang-Mills theory coupled to N_f = N_c flavours of fundamental matter at finite temperature, which is realised as N_f Dq-brane probes in the near horizon (black hole) geometry of N_c black Dp-branes. We explore many aspects of these Dp/Dq brane systems, often focussing on the D3/D7 brane system which is dual to a four dimensional gauge theory. We study the thermodynamics of the Dq-brane probes in the black hole geometry. At low temperature, the branes sit outside the black hole and the meson spectrum is discrete and possesses a mass gap. As the temperature increases, the branes approach a critical solution. Eventually, they fall into the horizon and a phase transition occurs. At large N_c and large 't Hooft coupling, we show that this phase transition is always first order. We calculate the free energy, entropy and energy densities, as well as the speed of sound in these systems. We compute the meson spectrum for brane embeddings outside the horizon and find that tachyonic modes appear where this phase is expected to be unstable from thermodynamic considerations. We study the system at non-zero baryon density n_b and find that there is a line of phase transitions for small n_b, terminating at a critical point with finite n_b. We demonstrate that, to leading order in N_f/N_c, the viscosity to entropy density ratio in these theories saturates the conjectured universal bound η/s ≥ 1/4π.
Finally, we compute spectral functions and diffusion constants for fundamental matter in the high temperature phase of the D3/D7 theory.
Towards timelike singularity via AdS dual
NASA Astrophysics Data System (ADS)
Bhowmick, Samrat; Chatterjee, Soumyabrata
2017-07-01
It is well known that Kasner geometry with spacelike singularity can be extended to bulk AdS-like geometry, furthermore, one can study field theory on this Kasner space via its gravity dual. In this paper, we show that there exists a Kasner-like geometry with timelike singularity for which one can construct a dual gravity description. We then study various extremal surfaces including spacelike geodesics in the dual gravity description. Finally, we compute correlators of highly massive operators in the boundary field theory with a geodesic approximation.
Exact results in 3d N = 2 Spin(7) gauge theories with vector and spinor matters
NASA Astrophysics Data System (ADS)
Nii, Keita
2018-05-01
We study three-dimensional N = 2 Spin(7) gauge theories with N_S spinorial matters and with N_f vectorial matters. The quantum Coulomb branch on the moduli space of vacua is one- or two-dimensional depending on the matter contents. For particular values of (N_f, N_S), we find s-confinement phases and derive exact superpotentials. The 3d dynamics of Spin(7) is connected to the 4d dynamics via KK-monopoles. Along the Higgs branch of the Spin(7) theories, we obtain 3d N = 2 G_2 or SU(4) theories and some of them lead to new s-confinement phases. As a check of our analysis we compute superconformal indices for these theories.
Computational Relativistic Astrophysics Using the Flowfield-Dependent Variation Theory
NASA Technical Reports Server (NTRS)
Richardson, G. A.; Chung, T. J.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
Theoretical models, observations and measurements have preoccupied astrophysicists for many centuries. Only in recent years, has the theory of relativity as applied to astrophysical flows met the challenges of how the governing equations can be solved numerically with accuracy and efficiency. Even without the effects of relativity, the physics of magnetohydrodynamic flow instability, turbulence, radiation, and enhanced transport in accretion disks has not been completely resolved. Relativistic effects become pronounced in such cases as jet formation from black hole magnetized accretion disks and also in the study of Gamma-Ray bursts (GRB). Thus, our concern in this paper is to reexamine existing numerical simulation tools as to the accuracy and efficiency of computations and introduce a new approach known as the flowfield-dependent variation (FDV) method. The main feature of the FDV method consists of accommodating discontinuities of shock waves and high gradients of flow variables such as occur in turbulence and unstable motions. In this paper, the physics involved in the solution of relativistic hydrodynamics and solution strategies of the FDV theory are elaborated. The general relativistic astrophysical flow and shock solver (GRAFSS) is introduced, and some simple example problems for Computational Relativistic Astrophysics (CRA) are demonstrated.
Quantum field theory and coalgebraic logic in theoretical computer science.
Basti, Gianfranco; Capolupo, Antonio; Vitiello, Giuseppe
2017-11-01
We suggest that in the framework of Category Theory it is possible to demonstrate the mathematical and logical dual equivalence between the category of the q-deformed Hopf Coalgebras and the category of the q-deformed Hopf Algebras in quantum field theory (QFT), interpreted as a thermal field theory. Each pair algebra-coalgebra characterizes a QFT system and its mirroring thermal bath, respectively, so as to model dissipative quantum systems in far-from-equilibrium conditions, with an evident significance also for biological sciences. Our study is in fact inspired by applications to neuroscience, where the brain memory capacity, for instance, has been modeled by using the QFT unitarily inequivalent representations. The q-deformed Hopf Coalgebras and the q-deformed Hopf Algebras constitute two dual categories because they are characterized by the same functor T, related to the Bogoliubov transform, and by its contravariant application T^op, respectively. The q-deformation parameter is related to the Bogoliubov angle, and it is effectively a thermal parameter. Therefore, the different values of q identify univocally, and label, the vacua appearing in the foliation process of the quantum vacuum. This means that, in the framework of Universal Coalgebra, as a general theory of dynamic and computing systems ("labelled state-transition systems"), the so-labelled infinitely many quantum vacua can be interpreted as the Final Coalgebra of an "Infinite State Black-Box Machine". All this opens the way to the possibility of designing a new class of universal quantum computing architectures based on this coalgebraic QFT formulation, as its ability to naturally generate a Fibonacci progression demonstrates.
Bayesian learning for spatial filtering in an EEG-based brain-computer interface.
Zhang, Haihong; Yang, Huijuan; Guan, Cuntai
2013-07-01
Spatial filtering for EEG feature extraction and classification is an important tool in brain-computer interface. However, there is generally no established theory that links spatial filtering directly to Bayes classification error. To address this issue, this paper proposes and studies a Bayesian analysis theory for spatial filtering in relation to Bayes error. Following the maximum entropy principle, we introduce a gamma probability model for describing single-trial EEG power features. We then formulate and analyze the theoretical relationship between Bayes classification error and the so-called Rayleigh quotient, which is a function of spatial filters and basically measures the ratio in power features between two classes. This paper also reports our extensive study that examines the theory and its use in classification, using three publicly available EEG data sets and state-of-the-art spatial filtering techniques and various classifiers. Specifically, we validate the positive relationship between Bayes error and Rayleigh quotient in real EEG power features. Finally, we demonstrate that the Bayes error can be practically reduced by applying a new spatial filter with lower Rayleigh quotient.
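The Rayleigh quotient the abstract links to Bayes error can be made concrete: for class-conditional EEG power (covariance) matrices C1 and C2, a spatial filter w scores (wᵀC1w)/(wᵀC2w), and the filter minimizing it comes from a generalized eigenproblem, as in the common spatial patterns construction. A minimal numerical sketch (the covariance values are hypothetical, not from the paper's data sets):

```python
import numpy as np

def rayleigh_quotient(w, c1, c2):
    """Ratio of filtered power between two classes for spatial filter w."""
    return (w @ c1 @ w) / (w @ c2 @ w)

def min_quotient_filter(c1, c2):
    """Spatial filter minimizing the Rayleigh quotient, obtained from the
    generalized eigenproblem C1 w = lambda C2 w (a CSP-style construction)."""
    vals, vecs = np.linalg.eig(np.linalg.solve(c2, c1))
    return np.real(vecs[:, np.argmin(vals.real)])

# hypothetical class covariances for a 2-channel recording
c1 = np.array([[1.0, 0.2], [0.2, 4.0]])
c2 = np.array([[4.0, 0.1], [0.1, 1.0]])
w = min_quotient_filter(c1, c2)
q = rayleigh_quotient(w, c1, c2)
```

By construction, q is no larger than the quotient of any single-channel filter, which is the sense in which lowering the Rayleigh quotient can lower classification error.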
Semiclassical Path Integral Calculation of Nonlinear Optical Spectroscopy.
Provazza, Justin; Segatta, Francesco; Garavelli, Marco; Coker, David F
2018-02-13
Computation of nonlinear optical response functions allows for an in-depth connection between theory and experiment. Experimentally recorded spectra provide a high density of information, but to objectively disentangle overlapping signals and to reach a detailed and reliable understanding of the system dynamics, measurements must be integrated with theoretical approaches. Here, we present a new, highly accurate and efficient trajectory-based semiclassical path integral method for computing higher order nonlinear optical response functions for non-Markovian open quantum systems. The approach is, in principle, applicable to general Hamiltonians and does not require any restrictions on the form of the intrasystem or system-bath couplings. This method is systematically improvable and is shown to be valid in parameter regimes where perturbation theory-based methods qualitatively break down. As a test of the methodology presented here, we study a system-bath model for a coupled dimer for which we compare against numerically exact results and standard approximate perturbation theory-based calculations. Additionally, we study a monomer with discrete vibronic states that serves as the starting point for future investigation of vibronic signatures in nonlinear electronic spectroscopy.
Aids to Computer-Based Multimedia Learning.
ERIC Educational Resources Information Center
Mayer, Richard E.; Moreno, Roxana
2002-01-01
Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)
Role of Statistical Random-Effects Linear Models in Personalized Medicine
Diaz, Francisco J; Yeh, Hung-Wen; de Leon, Jose
2012-01-01
Some empirical studies and recent developments in pharmacokinetic theory suggest that statistical random-effects linear models are valuable tools that allow describing simultaneously patient populations as a whole and patients as individuals. This remarkable characteristic indicates that these models may be useful in the development of personalized medicine, which aims at finding treatment regimes that are appropriate for particular patients, not just appropriate for the average patient. In fact, published developments show that random-effects linear models may provide a solid theoretical framework for drug dosage individualization in chronic diseases. In particular, individualized dosages computed with these models by means of an empirical Bayesian approach may produce better results than dosages computed with some methods routinely used in therapeutic drug monitoring. This is further supported by published empirical and theoretical findings that show that random-effects linear models may provide accurate representations of phase III and IV steady-state pharmacokinetic data, and may be useful for dosage computations. These models have applications in the design of clinical algorithms for drug dosage individualization in chronic diseases; in the computation of dose correction factors; in the computation of the minimum number of blood samples from a patient needed to calculate an optimal individualized drug dosage in therapeutic drug monitoring; in measuring the clinical importance of clinical, demographic, environmental, or genetic covariates; in the study of drug-drug interactions in clinical settings; in the implementation of computational tools for web-site-based evidence farming; in the design of pharmacogenomic studies; and in the development of a pharmacological theory of dosage individualization. PMID:23467392
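The empirical Bayesian dosage computations mentioned above rest on the shrinkage property of random-effects models: a patient's individual level is estimated as a precision-weighted compromise between the population mean and that patient's own measurements. A minimal sketch for the simplest such model, y_ij = mu + b_i + e_ij (all numbers hypothetical; the published algorithms involve full covariate models):

```python
def eb_patient_level(y_i, mu, tau2, sigma2):
    """Empirical-Bayes (BLUP) estimate of one patient's level under the
    random-intercept model y_ij = mu + b_i + e_ij,
    with b_i ~ N(0, tau2) and e_ij ~ N(0, sigma2)."""
    n = len(y_i)
    ybar = sum(y_i) / n
    # weight on the patient's own mean grows with the number of samples n
    shrink = n * tau2 / (n * tau2 + sigma2)
    return mu + shrink * (ybar - mu)

# a patient measured three times, against a population mean of 10
est = eb_patient_level([14.0, 15.0, 13.0], mu=10.0, tau2=4.0, sigma2=1.0)
```

With more measurements the estimate moves away from the population mean toward the patient's own mean, which is the mechanism behind computing how many blood samples suffice for an individualized dosage.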
Reliable semiclassical computations in QCD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dine, Michael; Department of Physics, Stanford University Stanford, California 94305-4060; Festuccia, Guido
We revisit the question of whether or not one can perform reliable semiclassical QCD computations at zero temperature. We study correlation functions with no perturbative contributions, and organize the problem by means of the operator product expansion, establishing a precise criterion for the validity of a semiclassical calculation. For N_f > N, a systematic computation is possible; for N_f
Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory
NASA Astrophysics Data System (ADS)
Bozkaya, Uǧur
2013-09-01
Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011); doi:10.1063/1.3665134] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations, however for OMP3 one does not since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.
Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory.
Bozkaya, Uğur
2013-09-14
Analytic energy gradients for the orbital-optimized third-order Møller-Plesset perturbation theory (OMP3) [U. Bozkaya, J. Chem. Phys. 135, 224103 (2011)] are presented. The OMP3 method is applied to problematic chemical systems with challenging electronic structures. The performance of the OMP3 method is compared with those of canonical second-order Møller-Plesset perturbation theory (MP2), third-order Møller-Plesset perturbation theory (MP3), coupled-cluster singles and doubles (CCSD), and coupled-cluster singles and doubles with perturbative triples [CCSD(T)] for investigating equilibrium geometries, vibrational frequencies, and open-shell reaction energies. For bond lengths, the performance of OMP3 is in between those of MP3 and CCSD. For harmonic vibrational frequencies, the OMP3 method significantly eliminates the singularities arising from the abnormal response contributions observed for MP3 in case of symmetry-breaking problems, and provides noticeably improved vibrational frequencies for open-shell molecules. For open-shell reaction energies, OMP3 exhibits a better performance than MP3 and CCSD as in case of barrier heights and radical stabilization energies. As discussed in previous studies, the OMP3 method is several times faster than CCSD in energy computations. Further, in analytic gradient computations for the CCSD method one needs to solve λ-amplitude equations, however for OMP3 one does not since λ_{ab}^{ij(1)} = t_{ij}^{ab(1)} and λ_{ab}^{ij(2)} = t_{ij}^{ab(2)}. Additionally, one needs to solve orbital Z-vector equations for CCSD, but for OMP3 orbital response contributions are zero owing to the stationary property of OMP3. Overall, for analytic gradient computations the OMP3 method is several times less expensive than CCSD (roughly ~4-6 times). Considering the balance of computational cost and accuracy we conclude that the OMP3 method emerges as a very useful tool for the study of electronically challenging chemical systems.
Al-Harbi, L M; El-Mossalamy, E H; Obaid, A Y; Al-Jedaani, A H
2014-01-01
Charge transfer complexes of substituted aryl Schiff bases as donors with picric acid and m-dinitrobenzene as acceptors were investigated computationally, using Configuration Interaction Singles Hartree-Fock (CIS-HF) at the standard 6-31G* basis set and Time-Dependent Density-Functional Theory (TD-DFT) at the standard 6-31G** basis set, together with infrared, visible, and nuclear magnetic resonance spectra. The optimized geometries and vibrational frequencies were evaluated. The energies and oscillator strengths were calculated from the CIS-HF and TD-DFT results. Electronic properties, such as HOMO and LUMO energies and band gaps of the CTC set, were studied by TD-DFT with the Becke-Lee-Yang-Parr (B3LYP) composite exchange-correlation functional and by CIS-HF. The ionization potential Ip and electron affinity EA were calculated by the PM3, HF, and DFT methods. The Coulombic force was calculated theoretically using the CIS-HF and TD-DFT methods. This study confirms that the theoretical calculation of vibrational frequencies for the (aryl Schiff base)-(m-dinitrobenzene and picric acid) complexes is quite useful for vibrational assignment and for predicting new vibrational frequencies. Copyright © 2013 Elsevier B.V. All rights reserved.
(*ARTIFICIAL INTELLIGENCE, RECURSIVE FUNCTIONS), (*RECURSIVE FUNCTIONS, ARTIFICIAL INTELLIGENCE), (*MATHEMATICAL LOGIC, ARTIFICIAL INTELLIGENCE), METAMATHEMATICS, AUTOMATA, NUMBER THEORY, INFORMATION THEORY, COMBINATORIAL ANALYSIS
ERIC Educational Resources Information Center
Clark, William M.; Jackson, Yaminah Z.; Morin, Michael T.; Ferraro, Giacomo P.
2011-01-01
Laboratory experiments and computer models for studying the mass transfer process of removing CO2 from air using water or dilute NaOH solution as absorbent are presented. Models tie experiment to theory and give a visual representation of concentration profiles and also illustrate the two-film theory and the relative importance of various…
On stability of the structure of implicit personality theory over situations.
Hochwälder, J
1995-12-01
In the present study, the following (hitherto unaddressed) question was posed: "Is the structure of implicit personality theory (IPT) stable over situations?" To answer this question, correlation coefficients were computed between different aspects of two trait structures obtained under different situational conditions. The results seem to indicate that the structure of IPT is stable over situations. The results are discussed in the light of some methodological considerations.
The ϱ-ππ coupling constant in lattice gauge theory
NASA Astrophysics Data System (ADS)
Gottlieb, Steven; MacKenzie, Paul B.; Thacker, H. B.; Weingarten, Don
1984-01-01
We present a method for studying hadronic transitions in lattice gauge theory which requires computer time comparable to that required by recent hadron spectrum calculations. This method is applied to a calculation of the decay ϱ → ππ.
False-vacuum decay in generalized extended inflation
NASA Technical Reports Server (NTRS)
Holman, Richard; Kolb, Edward W.; Vadas, Sharon L.; Wang, Yun
1990-01-01
False-vacuum decay was studied in the context of generalized extended inflationary theories, and the bubble nucleation rates were computed for these theories in the limit G_N → 0. It was found that the time dependence of the nucleation rate can be exponentially strong through the time dependence of the Jordan-Brans-Dicke field. This can have a pronounced effect on whether extended inflation can be successfully implemented.
Sadasivam, Rajani Shankar; Cutrona, Sarah L; Kinney, Rebecca L; Marlin, Benjamin M; Mazor, Kathleen M; Lemon, Stephenie C; Houston, Thomas K
2016-03-07
What is the next frontier for computer-tailored health communication (CTHC) research? In current CTHC systems, study designers who have expertise in behavioral theory and mapping theory into CTHC systems select the variables and develop the rules that specify how the content should be tailored, based on their knowledge of the targeted population, the literature, and health behavior theories. In collective-intelligence recommender systems (hereafter recommender systems) used by Web 2.0 companies (eg, Netflix and Amazon), machine learning algorithms combine user profiles and continuous feedback ratings of content (from themselves and other users) to empirically tailor content. Augmenting current theory-based CTHC with empirical recommender systems could be evaluated as the next frontier for CTHC. The objective of our study was to uncover barriers and challenges to using recommender systems in health promotion. We conducted a focused literature review, interviewed subject experts (n=8), and synthesized the results. We describe (1) limitations of current CTHC systems, (2) advantages of incorporating recommender systems to move CTHC forward, and (3) challenges to incorporating recommender systems into CTHC. Based on the evidence presented, we propose a future research agenda for CTHC systems. We promote discussion of ways to move CTHC into the 21st century by incorporation of recommender systems.
Cutrona, Sarah L; Kinney, Rebecca L; Marlin, Benjamin M; Mazor, Kathleen M; Lemon, Stephenie C; Houston, Thomas K
2016-01-01
Background What is the next frontier for computer-tailored health communication (CTHC) research? In current CTHC systems, study designers who have expertise in behavioral theory and mapping theory into CTHC systems select the variables and develop the rules that specify how the content should be tailored, based on their knowledge of the targeted population, the literature, and health behavior theories. In collective-intelligence recommender systems (hereafter recommender systems) used by Web 2.0 companies (eg, Netflix and Amazon), machine learning algorithms combine user profiles and continuous feedback ratings of content (from themselves and other users) to empirically tailor content. Augmenting current theory-based CTHC with empirical recommender systems could be evaluated as the next frontier for CTHC. Objective The objective of our study was to uncover barriers and challenges to using recommender systems in health promotion. Methods We conducted a focused literature review, interviewed subject experts (n=8), and synthesized the results. Results We describe (1) limitations of current CTHC systems, (2) advantages of incorporating recommender systems to move CTHC forward, and (3) challenges to incorporating recommender systems into CTHC. Based on the evidence presented, we propose a future research agenda for CTHC systems. Conclusions We promote discussion of ways to move CTHC into the 21st century by incorporation of recommender systems. PMID:26952574
Steerability Analysis of Tracked Vehicles: Theory and User’s Guide for Computer Program TVSTEER
1986-08-01
Baladi, George Y.; Barnes, Donald E.; Berger, Rebecca P. Structures Laboratory, Department of the Army, Waterways Experiment Station, Corps of Engineers, P.O. Box... The mathematical model was formulated by Drs. George Y. Baladi and Behzad Rohani. The logic and computer programming were accomplished by Dr. Baladi and...
Vollmer Dahlke, Deborah; Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G
2015-03-27
Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to consider design considerations based in human computer interface and health behavior change theories. This study is designed to assess the presence of and manner in which health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. The research team selected a set of criteria-based health apps for mobile phones and assessed each app using qualitative coding methods to assess the application of health behavior change and communication theories. Each app was assessed using a coding derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie with a few important changes based on the characteristics of mHealth apps that are specific to information processing and human computer interaction such as control theory and feedback systems. A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, with 65 being unique. Using a Cohen's kappa analysis statistic, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements as compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements as compared to 16% (5/32) of the Android apps. Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. 
As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human computer interface design and health behavior and communication theory and practice.
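The inter-rater reliability figures reported in this abstract are Cohen's kappa values, which correct raw coder agreement for the agreement expected by chance alone. A minimal sketch of the statistic (the toy ratings below are illustrative, not the study's coding data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    # expected agreement if both raters coded independently at their base rates
    expected = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (observed - expected) / (1 - expected)

# toy binary codes from two raters over eight app features
kappa = cohens_kappa([1, 1, 0, 1, 0, 0, 1, 1],
                     [1, 1, 0, 1, 0, 1, 1, 1])  # -> 5/7, about 0.714
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, which is why it is preferred over raw percent agreement for coding studies like this one.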
Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G
2015-01-01
Background Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to consider design considerations based in human computer interface and health behavior change theories. Objective This study is designed to assess the presence of and manner in which health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. Methods The research team selected a set of criteria-based health apps for mobile phones and assessed each app using qualitative coding methods to assess the application of health behavior change and communication theories. Each app was assessed using a coding derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie with a few important changes based on the characteristics of mHealth apps that are specific to information processing and human computer interaction such as control theory and feedback systems. Results A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, with 65 being unique. Using a Cohen’s kappa analysis statistic, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements as compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements as compared to 16% (5/32) of the Android apps. Conclusions Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. 
As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human computer interface design and health behavior and communication theory and practice. PMID:25830810
Kawano, Tomonori; Bouteau, François; Mancuso, Stefano
2012-11-01
Automata theory is the mathematical study of abstract machines, pursued in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as a chemical and biological basis for natural computing or informatics, some plants, plant cells, and plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes becomes obvious. Finally, their putative roles as parts of plant-based computing or robotic systems are discussed.
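The sequential machines the review classifies plants into have a compact formal definition: a Mealy machine emits an output on every transition, as a function of the current state and the input symbol. A minimal sketch (the two-state "change detector" below is a generic illustration, not one of the plant systems from the review):

```python
class MealyMachine:
    """Each transition maps (state, input symbol) to (next state, output)."""

    def __init__(self, transitions, start):
        self.transitions = transitions  # {(state, symbol): (next_state, output)}
        self.state = start

    def step(self, symbol):
        self.state, output = self.transitions[(self.state, symbol)]
        return output

    def run(self, symbols):
        return [self.step(s) for s in symbols]

# outputs 1 exactly when the input differs from the previous input
detector = MealyMachine(
    {("saw0", 0): ("saw0", 0), ("saw0", 1): ("saw1", 1),
     ("saw1", 0): ("saw0", 1), ("saw1", 1): ("saw1", 0)},
    start="saw0",
)
```

Running `detector.run([0, 1, 1, 0])` yields `[0, 1, 0, 1]`: the machine signals each change of input, the kind of state-plus-transition description the review applies to plant signaling.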
Kawano, Tomonori; Bouteau, François; Mancuso, Stefano
2012-01-01
Automata theory is the mathematical study of abstract machines, pursued in theoretical computer science and in highly interdisciplinary fields that combine the natural sciences with theoretical computer science. In the present review article, as a chemical and biological basis for natural computing or informatics, some plants, plant cells, and plant-derived molecules involved in signaling are listed and classified as natural sequential machines (namely, Mealy machines or Moore machines) or finite state automata. By defining the actions (states and transition functions) of these natural automata, the similarity between computational data processing and plant decision-making processes becomes obvious. Finally, their putative roles as parts of plant-based computing or robotic systems are discussed. PMID:23336016
Computer Series 41: Potential-Energy Surfaces and Transition-State Theory.
ERIC Educational Resources Information Center
Moss, S. J.; Coady, C. J.
1983-01-01
Describes computer programs based on the London-Eyring-Polanyi-Sato (LEPS) method. The programs provide a valuable means of introducing students to potential energy surfaces and to the foundations of transition state theory. Program listings (with copies of student scripts) or programs on DOS 3.3 disc are available from the authors. (JN)
Coping with the Stigma of Mental Illness: Empirically-Grounded Hypotheses from Computer Simulations
ERIC Educational Resources Information Center
Kroska, Amy; Har, Sarah K.
2011-01-01
This research demonstrates how affect control theory and its computer program, "Interact", can be used to develop empirically-grounded hypotheses regarding the connection between cultural labels and behaviors. Our demonstration focuses on propositions in the modified labeling theory of mental illness. According to the MLT, negative societal…
Towards a Theory-Based Design Framework for an Effective E-Learning Computer Programming Course
ERIC Educational Resources Information Center
McGowan, Ian S.
2016-01-01
Built on Dabbagh (2005), this paper presents a four component theory-based design framework for an e-learning session in introductory computer programming. The framework, driven by a body of exemplars component, emphasizes the transformative interaction between the knowledge building community (KBC) pedagogical model, a mixed instructional…
Learning Style Theory and Computer Mediated Communication.
ERIC Educational Resources Information Center
Atkins, Hilary; Moore, David; Sharpe, Simon; Hobbs, Dave
This paper looks at the low participation rates in computer mediated conferences (CMC) and argues that one of the causes of this may be an incompatibility between students' learning styles and the style adopted by CMC. Curry's Onion Model provides a well-established framework within which to view the main learning style theories (Riding and…
Observations in the Computer Room: L2 Output and Learner Behaviour
ERIC Educational Resources Information Center
Leahy, Christine
2004-01-01
This article draws on second language theory, particularly output theory as defined by Swain (1995), in order to conceptualise observations made in a computer-assisted language learning setting. It investigates second language output and learner behaviour within an electronic role-play setting, based on a subject-specific problem solving task and…
THREE-PEE SAMPLING THEORY and program 'THRP' for computer generation of selection criteria
L. R. Grosenbaugh
1965-01-01
Theory necessary for sampling with probability proportional to prediction ('three-pee,' or '3P,' sampling) is first developed and then exemplified by numerical comparisons of several estimators. Program 'THRP' for computer generation of appropriate 3P-sample-selection criteria is described, and convenient random integer dispensers are...
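The 3P selection rule itself is simple: each unit (e.g., a tree) gets a predicted value p, an integer is drawn uniformly from 1..K, and the unit is measured iff the draw does not exceed p, so its inclusion probability is p/K. A minimal sketch (the parameterization is illustrative, not the actual THRP selection-criteria logic):

```python
import random

def select_3p(predictions, K, rng=None):
    """3P sampling: unit i is measured iff a uniform integer draw from
    1..K is <= predictions[i]; its inclusion probability is predictions[i]/K."""
    rng = rng or random.Random(0)  # seeded here only for reproducibility
    return [i for i, p in enumerate(predictions) if rng.randint(1, K) <= p]

# units predicted at K are always taken, units predicted at 0 never are
chosen = select_3p([100, 0, 100, 0], K=100)  # -> [0, 2]
```

Because inclusion probability tracks the prediction, measurement effort concentrates on the units expected to contribute most to the total, which is the point of 3P sampling.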
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Izen, Steven H.
1992-01-01
A theory to determine the properties of a fluid from measurements of its projections was developed and tested. Viewing cones as small as 10 degrees were evaluated, with the only assumption being that the property was space limited. The results of applying the theory to numerical and actual interferograms of a spherical discontinuity of refractive index are presented. The theory was developed to test the practicality and limits of using three dimensional computer tomography in internal fluid dynamics.
NASA Technical Reports Server (NTRS)
Decker, Arthur J.; Izen, Steven H.
1991-01-01
A theory to determine the properties of a fluid from measurements of its projections was developed and tested. Viewing cones as small as 10 degrees were evaluated, with the only assumption being that the property was space limited. The results of applying the theory to numerical and actual interferograms of a spherical discontinuity of refractive index are presented. The theory was developed to test the practicality and limits of using three-dimensional computer tomography in internal fluid dynamics.
ERIC Educational Resources Information Center
Schmitt, T. A.; Sass, D. A.; Sullivan, J. R.; Walker, C. M.
2010-01-01
Imposed time limits on computer adaptive tests (CATs) can result in examinees having difficulty completing all items, thus compromising the validity and reliability of ability estimates. In this study, the effects of speededness were explored in a simulated CAT environment by varying examinee response patterns to end-of-test items. Expectedly,…
A queueing model of pilot decision making in a multi-task flight management situation
NASA Technical Reports Server (NTRS)
Walden, R. S.; Rouse, W. B.
1977-01-01
Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
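A queueing model of this kind treats the pilot as a single server working through a stream of tasks. The simplest instance, an M/M/1 queue with Poisson task arrivals at rate lam and exponential service at rate mu, already yields the quantities of interest in closed form (this is a generic textbook sketch, not the paper's fitted model):

```python
def mm1_metrics(lam, mu):
    """Steady-state M/M/1 metrics: Poisson arrivals (rate lam),
    exponential service (rate mu); requires lam < mu for stability."""
    assert lam < mu, "utilization must be below 1"
    rho = lam / mu  # fraction of time the server (pilot) is busy
    return {
        "utilization": rho,
        "tasks_in_system": rho / (1 - rho),  # mean number of tasks L
        "time_in_system": 1 / (mu - lam),    # mean sojourn time W
        "wait_before_service": rho / (mu - lam),
    }

m = mm1_metrics(lam=2.0, mu=4.0)  # pilot busy half the time
```

The metrics satisfy Little's law, L = lam * W; in a pilot-computer allocation study, such formulas indicate how offloading tasks (reducing lam) shortens the pilot's decision backlog.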
2014-01-01
Background Evidence indicates that post-stroke rehabilitation improves function, independence and quality of life. A key aspect of rehabilitation is the provision of appropriate information and feedback to the learner. Advances in information and communications technology (ICT) have allowed for the development of various systems to complement stroke rehabilitation that could be used in the home setting. These systems may increase the provision of rehabilitation a stroke survivor receives and carries out, as well as providing a learning platform that facilitates long-term self-managed rehabilitation and behaviour change. This paper describes the application of an innovative evaluative methodology to explore the utilisation of feedback for post-stroke upper-limb rehabilitation in the home. Methods Using the principles of realistic evaluation, this study aimed to test and refine intervention theories by exploring the complex interactions of contexts, mechanisms and outcomes that arise from technology deployment in the home. Methods included focus groups followed by multi-method case studies (n = 5) before, during and after the use of computer-based equipment. Data were analysed in relation to the context-mechanism-outcome hypotheses case by case. This was followed by a synthesis of the findings to answer the question, ‘what works for whom and in what circumstances and respects?’ Results Data analysis reveals that to achieve desired outcomes through the use of ICT, key elements of computer feedback, such as accuracy, measurability, rewarding feedback, adaptability, and knowledge of results feedback, are required to trigger the theory-driven mechanisms underpinning the intervention. In addition, the pre-existing context and the personal and environmental contexts, such as previous experience of service delivery, personal goals, trust in the technology, and social circumstances may also enable or constrain the underpinning theory-driven mechanisms. 
Conclusions Findings suggest that the theory-driven mechanisms underpinning the utilisation of feedback from computer-based technology for home-based upper-limb post-stroke rehabilitation are dependent on key elements of computer feedback and the personal and environmental context. The identification of these elements may therefore inform the development of technology; therapy education and the subsequent adoption of technology and a self-management paradigm; long-term self-managed rehabilitation; and importantly, improvements in the physical and psychosocial aspects of recovery. PMID:24903401
Parker, Jack; Mawson, Susan; Mountain, Gail; Nasr, Nasrin; Zheng, Huiru
2014-06-05
An experiment for determining the Euler load by direct computation
NASA Technical Reports Server (NTRS)
Thurston, Gaylen A.; Stein, Peter A.
1986-01-01
A direct algorithm is presented for computing the Euler load of a column from experimental data. The method is based on exact inextensional theory for imperfect columns, which predicts two distinct deflected shapes at loads near the Euler load. The bending stiffness of the column appears in the expression for the Euler load along with the column length; therefore, the experimental data allow a direct computation of the bending stiffness. Experiments on graphite-epoxy columns of rectangular cross-section are reported in the paper. The bending stiffness of each composite column computed from experiment is compared with predictions from laminated plate theory.
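At its core, the direct computation described above exploits the classical relation between the Euler load and the bending stiffness. The Python sketch below is a generic illustration of that relation only (not the paper's inextensional-theory algorithm); the pinned-pinned boundary condition and the function names are assumptions.

```python
import math

def euler_load(EI, L):
    """Euler buckling load of a pinned-pinned column: P_E = pi^2 * EI / L^2.

    EI is the bending stiffness, L the column length (consistent units assumed).
    """
    return math.pi ** 2 * EI / L ** 2

def bending_stiffness_from_euler(P_E, L):
    """Invert the same relation: recover EI from a measured Euler load."""
    return P_E * L ** 2 / math.pi ** 2
```

Because the relation is a one-line algebraic identity, a measured Euler load immediately yields the bending stiffness, which is what allows the experimental comparison with laminated plate theory.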
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100 - 1000 opens up new vistas for scientific discovery: long term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Higher-order adaptive finite-element methods for Kohn–Sham density functional theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Motamarri, P.; Nowak, M.R.; Leiter, K.
2013-11-15
We present an efficient computational approach to perform real-space electronic structure calculations using an adaptive higher-order finite-element discretization of Kohn–Sham density-functional theory (DFT). To this end, we develop an a priori mesh-adaption technique to construct a close to optimal finite-element discretization of the problem. We further propose an efficient solution strategy for solving the discrete eigenvalue problem by using spectral finite-elements in conjunction with Gauss–Lobatto quadrature, and a Chebyshev acceleration technique for computing the occupied eigenspace. The proposed approach has been observed to provide a staggering 100–200-fold computational advantage over the solution of a generalized eigenvalue problem. Using the proposed solution procedure, we investigate the computational efficiency afforded by higher-order finite-element discretizations of the Kohn–Sham DFT problem. Our studies suggest that staggering computational savings—of the order of 1000-fold—relative to linear finite-elements can be realized, for both all-electron and local pseudopotential calculations, by using higher-order finite-element discretizations. On all the benchmark systems studied, we observe diminishing returns in computational savings beyond the sixth-order for accuracies commensurate with chemical accuracy, suggesting that the hexic spectral-element may be an optimal choice for the finite-element discretization of the Kohn–Sham DFT problem. A comparative study of the computational efficiency of the proposed higher-order finite-element discretizations suggests that the finite-element basis is competitive with the plane-wave discretization for non-periodic local pseudopotential calculations, and compares to the Gaussian basis for all-electron calculations to within an order of magnitude.
Further, we demonstrate the capability of the proposed approach to compute the electronic structure of a metallic system containing 1688 atoms using modest computational resources, and good scalability of the present implementation up to 192 processors.
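The Chebyshev acceleration mentioned above is commonly realized as a Chebyshev-filtered subspace iteration. The NumPy sketch below is a generic illustration of that technique, not the authors' finite-element implementation; for brevity the spectral bounds are taken from a dense eigensolve, whereas in practice they would be estimated cheaply (e.g., with a few Lanczos steps).

```python
import numpy as np

def chebyshev_filter(H, X, m, a, b):
    """Apply a degree-m Chebyshev polynomial of H to the block X, scaled so
    that the unwanted spectrum [a, b] is damped and the wanted part amplified."""
    e = (b - a) / 2.0          # half-width of the damped interval
    c = (b + a) / 2.0          # centre of the damped interval
    Y_prev = X
    Y = (H @ X - c * X) / e    # degree-1 term of the three-term recurrence
    for _ in range(2, m + 1):
        Y_new = 2.0 * (H @ Y - c * Y) / e - Y_prev
        Y_prev, Y = Y, Y_new
    return Y

def filtered_subspace_iteration(H, k, m=8, iters=20, seed=0):
    """Compute the k lowest eigenpairs of a symmetric H by repeated
    filtering, orthonormalisation, and a final Rayleigh-Ritz projection."""
    rng = np.random.default_rng(seed)
    n = H.shape[0]
    X = rng.standard_normal((n, k))
    evals = np.linalg.eigvalsh(H)        # bounds for illustration only;
    a, b = evals[k], evals[-1]           # in practice estimate these cheaply
    for _ in range(iters):
        X = chebyshev_filter(H, X, m, a, b)
        X, _ = np.linalg.qr(X)           # keep the block well conditioned
    Hs = X.T @ H @ X                     # Rayleigh-Ritz in the filtered space
    w, V = np.linalg.eigh(Hs)
    return w, X @ V
```

The payoff the abstract reports comes from the same structural fact this sketch shows: each iteration needs only matrix-vector products with H, never a factorization of a generalized eigenvalue problem.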
Parsing partial molar volumes of small molecules: a molecular dynamics study.
Patel, Nisha; Dubins, David N; Pomès, Régis; Chalikian, Tigran V
2011-04-28
We used molecular dynamics (MD) simulations in conjunction with the Kirkwood-Buff theory to compute the partial molar volumes for a number of small solutes of various chemical natures. We repeated our computations using modified pair potentials, first, in the absence of the Coulombic term and, second, in the absence of the Coulombic and the attractive Lennard-Jones terms. Comparison of our results with experimental data and the volumetric results of Monte Carlo simulation with hard sphere potentials and scaled particle theory-based computations led us to conclude that, for small solutes, the partial molar volume computed with the Lennard-Jones potential in the absence of the Coulombic term nearly coincides with the cavity volume. On the other hand, MD simulations carried out with the pair interaction potentials containing only the repulsive Lennard-Jones term produce unrealistically large partial molar volumes of solutes that are close to their excluded volumes. Our simulation results are in good agreement with the reported schemes for parsing partial molar volume data on small solutes. In particular, our determined interaction volumes and the thickness of the thermal volume for individual compounds are in good agreement with empirical estimates. This work is the first computational study that supports and lends credence to the practical algorithms of parsing partial molar volume data that are currently in use for molecular interpretations of volumetric data.
Holographic Rényi entropy in AdS3/LCFT2 correspondence
NASA Astrophysics Data System (ADS)
Chen, Bin; Song, Feng-yan; Zhang, Jia-ju
2014-03-01
The recent study in AdS3/CFT2 correspondence shows that the tree level contribution and 1-loop correction of holographic Rényi entanglement entropy (HRE) exactly match the direct CFT computation in the large central charge limit. This allows the Rényi entanglement entropy to be a new window to study the AdS/CFT correspondence. In this paper we generalize the study of Rényi entanglement entropy in pure AdS3 gravity to the massive gravity theories at the critical points. For the cosmological topological massive gravity (CTMG), the dual conformal field theory (CFT) could be a chiral conformal field theory or a logarithmic conformal field theory (LCFT), depending on the asymptotic boundary conditions imposed. In both cases, by studying the short interval expansion of the Rényi entanglement entropy of two disjoint intervals with small cross ratio x, we find that the classical and 1-loop HRE exactly match the CFT results, up to order x^6. To this order, the difference between the massless graviton and logarithmic mode can be seen clearly. Moreover, for the cosmological new massive gravity (CNMG) at critical point, which could be dual to a logarithmic CFT as well, we find similar agreement in the CNMG/LCFT correspondence. Furthermore, we read the 2-loop correction of graviton and logarithmic mode to HRE from the CFT computation. This differs distinctly from the result in pure AdS3 gravity.
Nonunitary Lagrangians and Unitary Non-Lagrangian Conformal Field Theories.
Buican, Matthew; Laczko, Zoltan
2018-02-23
In various dimensions, we can sometimes compute observables of interacting conformal field theories (CFTs) that are connected to free theories via the renormalization group (RG) flow by computing protected quantities in the free theories. On the other hand, in two dimensions, it is often possible to algebraically construct observables of interacting CFTs using free fields without the need to explicitly construct an underlying RG flow. In this Letter, we begin to extend this idea to higher dimensions by showing that one can compute certain observables of an infinite set of unitary strongly interacting four-dimensional N=2 superconformal field theories (SCFTs) by performing simple calculations involving sets of nonunitary free four-dimensional hypermultiplets. These free fields are distant cousins of the Majorana fermion underlying the two-dimensional Ising model and are not obviously connected to our interacting theories via an RG flow. Rather surprisingly, this construction gives us Lagrangians for particular observables in certain subsectors of many "non-Lagrangian" SCFTs by sacrificing unitarity while preserving the full N=2 superconformal algebra. As a by-product, we find relations between characters in unitary and nonunitary affine Kac-Moody algebras. We conclude by commenting on possible generalizations of our construction.
Nonunitary Lagrangians and Unitary Non-Lagrangian Conformal Field Theories
NASA Astrophysics Data System (ADS)
Buican, Matthew; Laczko, Zoltan
2018-02-01
In various dimensions, we can sometimes compute observables of interacting conformal field theories (CFTs) that are connected to free theories via the renormalization group (RG) flow by computing protected quantities in the free theories. On the other hand, in two dimensions, it is often possible to algebraically construct observables of interacting CFTs using free fields without the need to explicitly construct an underlying RG flow. In this Letter, we begin to extend this idea to higher dimensions by showing that one can compute certain observables of an infinite set of unitary strongly interacting four-dimensional N=2 superconformal field theories (SCFTs) by performing simple calculations involving sets of nonunitary free four-dimensional hypermultiplets. These free fields are distant cousins of the Majorana fermion underlying the two-dimensional Ising model and are not obviously connected to our interacting theories via an RG flow. Rather surprisingly, this construction gives us Lagrangians for particular observables in certain subsectors of many "non-Lagrangian" SCFTs by sacrificing unitarity while preserving the full N=2 superconformal algebra. As a by-product, we find relations between characters in unitary and nonunitary affine Kac-Moody algebras. We conclude by commenting on possible generalizations of our construction.
NASA Astrophysics Data System (ADS)
Grinberg, Horacio; Freed, Karl F.; Williams, Carl J.
1997-08-01
The analytical infinite order sudden (IOS) quantum theory of triatomic photodissociation, developed in paper I, is applied to study the indirect photodissociation of NOCl through a real or virtual intermediate state. The theory uses the IOS approximation for the dynamics in the final dissociative channels and an Airy function approximation for the continuum functions. The transition is taken as polarized in the plane of the molecule; symmetric top wave functions are used for both the initial and intermediate bound states; and simple semiempirical model potentials are employed for each state. The theory provides analytical expressions for the photofragment yield spectrum for producing particular final fragment ro-vibrational states as a function of the photon excitation energy. Computations are made of the photofragment excitation spectrum of NOCl in the region of the T1(13A″)←S0(11A') transition for producing the NO fragment in the vibrational states nNO=0, 1, and 2. The computed spectra for the unexcited nNO=0 and excited nNO=2 states are in reasonable agreement with experiment. However, some discrepancies are observed for the singly excited nNO=1 vibrational state, indicating deficiencies in the semiempirical potential energy surface. Computations for two different orientations of the in-plane transition dipole moment produce very similar excitation spectra. Calculations of fragment rotational distributions are performed for high values of the total angular momentum J, a feature that would be very difficult to perform with close-coupled methods. Computations are also made of the thermally averaged rotational energy distributions to simulate the conditions in actual supersonic jet experiments.
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach
Cheung, Mike W.-L.; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists—and probably the most crucial one—is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study. PMID:27242639
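The split/analyze/meta-analyze idea can be illustrated with a minimal sketch. The authors demonstrate their procedures in R; the Python/NumPy version below is a generic illustration under assumed choices (a simple regression slope as the per-split statistic, fixed-effect inverse-variance pooling), not the paper's code.

```python
import numpy as np

def split_analyze_meta(y, x, n_splits=10, seed=1):
    """Split a large dataset, estimate a regression slope in each split,
    then pool the estimates with fixed-effect (inverse-variance) weights."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))                 # random, non-overlapping splits
    estimates, variances = [], []
    for part in np.array_split(idx, n_splits):
        xs, ys = x[part], y[part]
        X = np.column_stack([np.ones(len(xs)), xs])
        beta, *_ = np.linalg.lstsq(X, ys, rcond=None)
        resid = ys - X @ beta
        sigma2 = resid @ resid / (len(ys) - 2)    # residual variance
        cov = sigma2 * np.linalg.inv(X.T @ X)     # sampling covariance of beta
        estimates.append(beta[1])
        variances.append(cov[1, 1])
    w = 1.0 / np.asarray(variances)               # inverse-variance weights
    pooled = np.sum(w * np.asarray(estimates)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se
```

Each split fits comfortably in memory, and the meta-analytic pooling step recovers an overall estimate with a standard error, which is exactly the division of labour the approach proposes for psychologists.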
Analyzing Big Data in Psychology: A Split/Analyze/Meta-Analyze Approach.
Cheung, Mike W-L; Jak, Suzanne
2016-01-01
Big data is a field that has traditionally been dominated by disciplines such as computer science and business, where mainly data-driven analyses have been performed. Psychology, a discipline in which a strong emphasis is placed on behavioral theories and empirical research, has the potential to contribute greatly to the big data movement. However, one challenge to psychologists-and probably the most crucial one-is that most researchers may not have the necessary programming and computational skills to analyze big data. In this study we argue that psychologists can also conduct big data research and that, rather than trying to acquire new programming and computational skills, they should focus on their strengths, such as performing psychometric analyses and testing theories using multivariate analyses to explain phenomena. We propose a split/analyze/meta-analyze approach that allows psychologists to easily analyze big data. Two real datasets are used to demonstrate the proposed procedures in R. A new research agenda related to the analysis of big data in psychology is outlined at the end of the study.
Conceptual strategies and inter-theory relations: The case of nanoscale cracks
NASA Astrophysics Data System (ADS)
Bursten, Julia R.
2018-05-01
This paper introduces a new account of inter-theory relations in physics, which I call the conceptual strategies account. Using the example of a multiscale computer simulation model of nanoscale crack propagation in silicon, I illustrate this account and contrast it with existing reductive, emergent, and handshaking approaches. The conceptual strategies account develops the notion that relations among physical theories, and among their models, are constrained but not dictated by limitations from physics, mathematics, and computation, and that conceptual reasoning within those limits is required both to generate and to understand the relations between theories. Conceptual strategies result in a variety of types of relations between theories and models. These relations are themselves epistemic objects, like theories and models, and as such are an under-recognized part of the epistemic landscape of science.
Bai, Xiao-ping; Zhang, Xi-wei
2013-01-01
Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
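The information-entropy step of such an evaluation can be sketched generically: indexes whose scores vary more across candidate schemes carry more information and therefore receive more weight. The Python sketch below uses a hypothetical score matrix and is an illustration of the standard entropy-weight technique, not the paper's specific index system.

```python
import numpy as np

def entropy_weights(scores):
    """Information-entropy weights for an (n_schemes x n_indexes) score
    matrix of positive values: low-entropy (more varied) columns get
    more weight, constant columns get none."""
    P = scores / scores.sum(axis=0)              # normalise each index column
    n = scores.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    E = -(P * logs).sum(axis=0) / np.log(n)      # entropy per index, in [0, 1]
    d = 1.0 - E                                  # degree of diversification
    return d / d.sum()

def synthesis_scores(scores):
    """Synthesis score of each scheme: entropy-weighted sum of its indexes."""
    return scores @ entropy_weights(scores)
```

Sorting the schemes by their synthesis scores then yields the ranking used for the final decision.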
Theoretical studies on bimolecular reaction dynamics
Clary, David C.
2008-01-01
This perspective discusses progress in the theory of bimolecular reaction dynamics in the gas phase. The examples selected show that definitive quantum dynamical computations are providing insights into the detailed mechanisms of chemical reactions. PMID:18626015
NNLO QCD predictions for fully-differential top-quark pair production at the Tevatron
NASA Astrophysics Data System (ADS)
Czakon, Michal; Fiedler, Paul; Heymes, David; Mitov, Alexander
2016-05-01
We present a comprehensive study of differential distributions for Tevatron top-pair events at the level of stable top quarks. All calculations are performed in NNLO QCD with the help of a fully differential partonic Monte Carlo and are exact at this order in perturbation theory. We present predictions for all kinematic distributions for which data exist. Particular attention is paid to the top-quark forward-backward asymmetry, which we study in detail. We compare the NNLO results with existing approximate NNLO predictions as well as differential distributions computed with different parton distribution sets. Theory errors are significantly smaller than current experimental ones, with overall agreement between theory and data.
The complete process of large elastic-plastic deflection of a cantilever
NASA Astrophysics Data System (ADS)
Wu, Xiaoqiang; Yu, Tongxi
1986-11-01
An extension of the Elastica theory is developed to study the large deflection of an elastic-perfectly plastic horizontal cantilever beam subjected to a vertical concentrated force at its tip. The entire process is divided into four stages: I. elastic in the whole cantilever; II. loading and developing of the plastic region; III. unloading in the plastic region; and IV. reverse loading. Solutions for stages I and II are presented in a closed form. A combination of closed-form solution and numerical integration is presented for stage III. Finally, stage IV is qualitatively studied. Computed results are given and compared with those from small-deflection theory and from the Elastica theory.
Deformation of extremal black holes from stringy interactions
NASA Astrophysics Data System (ADS)
Chen, Baoyi; Stein, Leo C.
2018-04-01
Black holes are a powerful setting for studying general relativity and theories beyond GR. However, analytical solutions for rotating black holes in beyond-GR theories are difficult to find because of the complexity of such theories. In this paper, we solve for the deformation to the near-horizon extremal Kerr metric due to two example string-inspired beyond-GR theories: Einstein-dilaton-Gauss-Bonnet and dynamical Chern-Simons theory. We accomplish this by making use of the enhanced symmetry group of NHEK and the weak-coupling limit of EdGB and dCS. We find that the EdGB metric deformation has a curvature singularity, while the dCS metric is regular. From these solutions, we compute orbital frequencies, horizon areas, and entropies. This sets the stage for analytically understanding the microscopic origin of black hole entropy in beyond-GR theories.
Nonperturbative study of dynamical SUSY breaking in N=(2,2) Yang-Mills theory
NASA Astrophysics Data System (ADS)
Catterall, Simon; Jha, Raghav G.; Joseph, Anosh
2018-03-01
We examine the possibility of dynamical supersymmetry breaking in two-dimensional N=(2,2) supersymmetric Yang-Mills theory. The theory is discretized on a Euclidean spacetime lattice using a supersymmetric lattice action. We compute the vacuum energy of the theory at finite temperature and take the zero-temperature limit. Supersymmetry will be spontaneously broken in this theory if the measured ground-state energy is nonzero. By performing simulations on a range of lattices up to 96×96 we are able to perform a careful extrapolation to the continuum limit for a wide range of temperatures. Subsequent extrapolations to the zero-temperature limit yield an upper bound on the ground-state energy density. We find the energy density to be statistically consistent with zero in agreement with the absence of dynamical supersymmetry breaking in this theory.
Multiconfiguration pair-density functional theory investigation of the electronic spectrum of MnO4-
NASA Astrophysics Data System (ADS)
Sharma, Prachi; Truhlar, Donald G.; Gagliardi, Laura
2018-03-01
The electronic spectrum of permanganate ions contains various highly multiconfigurational ligand-to-metal charge transfer states and is notorious for being one of the most challenging systems to be treated by quantum-chemical methods. Here we studied the lowest nine vertical excitation energies using restricted active space second-order perturbation theory (RASPT2) and multiconfiguration pair-density functional theory (MC-PDFT) to test and compare these two theories in computing such a challenging spectrum. The results are compared to literature data, including time-dependent density functional theory, completely renormalized equation-of-motion coupled-cluster theory with single and double excitations, symmetry-adapted-cluster configuration interaction, and experimental spectra in the gas phase and solution. Our results show that MC-PDFT accurately predicts the spectrum at a significantly reduced cost as compared to RASPT2.
Multiconfiguration pair-density functional theory investigation of the electronic spectrum of MnO4-.
Sharma, Prachi; Truhlar, Donald G; Gagliardi, Laura
2018-03-28
The electronic spectrum of permanganate ions contains various highly multiconfigurational ligand-to-metal charge transfer states and is notorious for being one of the most challenging systems to be treated by quantum-chemical methods. Here we studied the lowest nine vertical excitation energies using restricted active space second-order perturbation theory (RASPT2) and multiconfiguration pair-density functional theory (MC-PDFT) to test and compare these two theories in computing such a challenging spectrum. The results are compared to literature data, including time-dependent density functional theory, completely renormalized equation-of-motion coupled-cluster theory with single and double excitations, symmetry-adapted-cluster configuration interaction, and experimental spectra in the gas phase and solution. Our results show that MC-PDFT accurately predicts the spectrum at a significantly reduced cost as compared to RASPT2.
Electronic structure, chemical bonding, and geometry of pure and Sr-doped CaCO3.
Stashans, Arvids; Chamba, Gaston; Pinto, Henry
2008-02-01
The electronic structure, chemical bonding, geometry, and effects produced by Sr-doping in CaCO3 have been studied on the basis of density-functional theory using the VASP simulation package and molecular-orbital theory utilizing the CLUSTERD computer code. Two calcium carbonate structures which occur naturally in anhydrous crystalline forms, calcite and aragonite, were considered in the present investigation. The obtained diagrams of density of states show similar patterns for both materials. The spatial structures are computed and analyzed in comparison to the available experimental data. The electronic properties and atomic displacements because of the trace element Sr-incorporation are discussed in a comparative manner for the two crystalline structures.
NASA Technical Reports Server (NTRS)
Mehra, R. K.; Washburn, R. B.; Sajan, S.; Carroll, J. V.
1979-01-01
A hierarchical real time algorithm for optimal three dimensional control of aircraft is described. Systematic methods are developed for real time computation of nonlinear feedback controls by means of singular perturbation theory. The results are applied to a six state, three control variable, point mass model of an F-4 aircraft. Nonlinear feedback laws are presented for computing the optimal control of throttle, bank angle, and angle of attack. Real-time capability is assessed on a TI 9900 microcomputer. The breakdown of the singular perturbation approximation near the terminal point is examined. Continuation methods are examined to obtain exact optimal trajectories starting from the singular perturbation solutions.
Entanglement of purification in free scalar field theories
NASA Astrophysics Data System (ADS)
Bhattacharyya, Arpan; Takayanagi, Tadashi; Umemoto, Koji
2018-04-01
We compute the entanglement of purification (EoP) in a 2d free scalar field theory with various masses. This quantity measures correlations between two subsystems and is reduced to the entanglement entropy when the total system is pure. We obtain explicit numerical values by assuming minimal Gaussian wave functionals for the purified states. We find that when the distance between the subsystems is large, the EoP behaves like the mutual information. However, when the distance is small, the EoP shows a characteristic behavior which qualitatively agrees with the conjectured holographic computation and which is different from that of the mutual information. We also study behaviors of mutual information in purified spaces and violations of monogamy/strong superadditivity.
Applying IRSS Theory: The Clark Atlanta University Exemplar
ERIC Educational Resources Information Center
Payton, Fay Cobb; Suarez-Brown, Tiki L.; Smith Lamar, Courtney
2012-01-01
The percentage of underrepresented minorities (African-American, Hispanic, Native Americans) that have obtained graduate level degrees within computing disciplines (computer science, computer information systems, computer engineering, and information technology) is dismal at best. Despite the fact that academia, the computing workforce,…
Curchod, Basile F E; Penfold, Thomas J; Rothlisberger, Ursula; Tavernelli, Ivano
2013-01-01
The implementation of local control theory using nonadiabatic molecular dynamics within the framework of linear-response time-dependent density functional theory is discussed. The method is applied to study the photoexcitation of lithium fluoride, for which we demonstrate that this approach can efficiently generate a pulse, on-the-fly, able to control the population transfer between two selected electronic states. Analysis of the computed control pulse yields insights into the photophysics of the process identifying the relevant frequencies associated to the curvature of the initial and final state potential energy curves and their energy differences. The limitations inherent to the use of the trajectory surface hopping approach are also discussed.
Theory of low frequency noise transmission through turbines
NASA Technical Reports Server (NTRS)
Matta, R. K.; Mani, R.
1979-01-01
Improvements of the existing theory of low frequency noise transmission through turbines and development of a working prediction tool are described. The existing actuator-disk model and a new finite-chord model were utilized in an analytical study. The interactive effect of adjacent blade rows, higher order spinning modes, blade-passage shocks, and duct area variations were considered separately. The improved theory was validated using the data acquired in an earlier NASA program. Computer programs incorporating the improved theory were produced for transmission loss prediction purposes. The programs were exercised parametrically and charts constructed to define approximately the low frequency noise transfer through turbines. The loss through the exhaust nozzle and flow(s) was also considered.
Cosmic ray diffusion: Report of the Workshop in Cosmic Ray Diffusion Theory
NASA Technical Reports Server (NTRS)
Birmingham, T. J.; Jones, F. C.
1975-01-01
A workshop in cosmic ray diffusion theory was held at Goddard Space Flight Center on May 16-17, 1974. Topics discussed and summarized are: (1) cosmic ray measurements as related to diffusion theory; (2) quasi-linear theory, nonlinear theory, and computer simulation of cosmic ray pitch-angle diffusion; and (3) magnetic field fluctuation measurements as related to diffusion theory.
NASA Technical Reports Server (NTRS)
Hanson, Donald B.; Parzych, David J.
1993-01-01
This report presents the derivation of a frequency domain theory and working equations for radiation of propeller harmonic noise in the presence of angular inflow. In applying the acoustic analogy, integration over the tangential coordinate of the source region is performed numerically, permitting the equations to be solved without approximation for any degree of angular inflow. Inflow angle is specified in terms of yaw, pitch, and roll angles of the aircraft. Since these can be arbitrarily large, the analysis applies with equal accuracy to propellers and helicopter rotors. For thickness and loading, the derivation is given in complete detail with working equations for near and far field. However, the quadrupole derivation has been carried only far enough to show feasibility of the numerical approach. Explicit formulas are presented for computation of source elements, evaluation of Green's functions, and location of observer points in various visual and retarded coordinate systems. The resulting computer program, called WOBBLE, has been written in FORTRAN and follows the notation of this report very closely. The new theory is explored to establish the effects of varying inflow angle on axial and circumferential directivity. Also, parametric studies were performed to evaluate various phenomena outside the capabilities of earlier theories, such as an unsteady thickness effect. Validity of the theory was established by comparison with test data from conventional propellers and Prop Fans in flight and in wind tunnels under a variety of operating conditions and inflow angles.
Black, Nicola; Mullan, Barbara; Sharpe, Louise
2016-09-01
The current aim was to examine the effectiveness of behaviour change techniques (BCTs), theory and other characteristics in increasing the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme and general characteristics such as length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention and theory development. Intervention developers should focus on including specific, effective techniques rather than on many techniques or more elaborate approaches.
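The random-effects pooling described in this abstract can be sketched with the standard DerSimonian-Laird estimator; the per-study effect sizes and variances below are invented illustrative numbers, not values from the review.

```python
import numpy as np

def dersimonian_laird(d, var):
    """Random-effects pooled effect size via the DerSimonian-Laird tau^2 estimator."""
    w = 1.0 / var                               # fixed-effect (inverse-variance) weights
    d_fixed = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fixed) ** 2)          # Cochran's heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(d) - 1)) / c)     # between-study variance, floored at zero
    w_re = 1.0 / (var + tau2)                   # random-effects weights
    d_pooled = np.sum(w_re * d) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return d_pooled, se, tau2

# Hypothetical standardized mean differences and their sampling variances
d = np.array([0.10, 0.15, 0.05, 0.30])
var = np.array([0.02, 0.03, 0.01, 0.05])
d_pooled, se, tau2 = dersimonian_laird(d, var)
```

The pooled estimate lies between the smallest and largest study effects, with a standard error that shrinks as studies are added.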
D-region blunt probe data analysis using hybrid computer techniques
NASA Technical Reports Server (NTRS)
Burkhard, W. J.
1973-01-01
The feasibility of performing data reduction techniques with a hybrid computer was studied. The data was obtained from the flight of a parachute-borne probe through the D-region of the ionosphere. A presentation of the theory of blunt probe operation is included with emphasis on the equations necessary to perform the analysis. This is followed by a discussion of computer program development. Included in this discussion is a comparison of computer and hand reduction results for the blunt probe launched on 31 January 1972. The comparison showed that it was both feasible and desirable to use the computer for data reduction. The results of computer data reduction performed on flight data acquired from five blunt probes are also presented.
Simulating Serious Games: A Discrete-Time Computational Model Based on Cognitive Flow Theory
ERIC Educational Resources Information Center
Westera, Wim
2018-01-01
This paper presents a computational model for simulating how people learn from serious games. While avoiding the combinatorial explosion of a game's micro-states, the model offers a meso-level pathfinding approach, which is guided by cognitive flow theory and various concepts from learning sciences. It extends a basic, existing model by exposing…
ERIC Educational Resources Information Center
Pange, Jenny; Kontozisis, Dimitrios
2001-01-01
Greek preschoolers' level of knowledge about computers was examined as they participated in a classroom project to introduce them to new technologies. The project was based on Vygotsky's theory of socio-cultural learning. Findings suggest that this approach is a successful way to introduce new technologies to young children. (JPB)
Optical levitation experiments to assess the validity of the generalized Lorenz-Mie theory.
Guilloteau, F; Gréhan, G; Gouesbet, G
1992-05-20
Experimental near-forward-scattering diagrams obtained with one particle in optical levitation are recorded and compared with scattering diagrams computed by using the generalized Lorenz-Mie theory. Comparisons concern the particular case of an off-axis location of the particle. Agreement between experimental and computed diagrams is found to be satisfactory.
ERIC Educational Resources Information Center
Kelderman, Henk
1992-01-01
Describes algorithms used in the computer program LOGIMO for obtaining maximum likelihood estimates of the parameters in loglinear models. These algorithms are also useful for the analysis of loglinear item-response theory models. Presents modified versions of the iterative proportional fitting and Newton-Raphson algorithms. Simulated data…
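The iterative proportional fitting step mentioned in this record can be sketched for a simple two-way contingency table; the starting table and target margins here are invented for illustration, not taken from LOGIMO.

```python
import numpy as np

def ipf(table, row_targets, col_targets, iters=1000, tol=1e-10):
    """Iterative proportional fitting: alternately rescale a positive table
    until its row and column margins match the target margins."""
    x = table.astype(float).copy()
    for _ in range(iters):
        x *= (row_targets / x.sum(axis=1))[:, None]   # match row margins
        x *= (col_targets / x.sum(axis=0))[None, :]   # match column margins
        if np.allclose(x.sum(axis=1), row_targets, atol=tol):
            break
    return x

fitted = ipf(np.array([[1.0, 2.0], [3.0, 4.0]]),
             row_targets=np.array([5.0, 5.0]),
             col_targets=np.array([6.0, 4.0]))
```

The fitted table preserves the interaction structure (odds ratio) of the starting table while matching both sets of margins.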
The Concept of Energy in Psychological Theory. Cognitive Science Program, Technical Report No. 86-2.
ERIC Educational Resources Information Center
Posner, Michael I.; Rothbart, Mary Klevjord
This paper describes a basic framework for integration of computational and energetic concepts in psychological theory. The framework is adapted from a general effort to understand the neural systems underlying cognition. The element of the cognitive system that provides the best basis for attempting to relate energetic and computational ideas is…
Phenomenography and Grounded Theory as Research Methods in Computing Education Research Field
ERIC Educational Resources Information Center
Kinnunen, Paivi; Simon, Beth
2012-01-01
This paper discusses two qualitative research methods, phenomenography and grounded theory. We introduce both methods' data collection and analysis processes and the type of results you may get at the end, using examples from computing education research. We highlight some of the similarities and differences between the aim, data collection and…
Linking Pedagogical Theory of Computer Games to their Usability
ERIC Educational Resources Information Center
Ang, Chee Siang; Avni, Einav; Zaphiris, Panayiotis
2008-01-01
This article reviews a range of literature of computer games and learning theories and attempts to establish a link between them by proposing a typology of games which we use as a new usability measure for the development of guidelines for game-based learning. First, we examine game literature in order to understand the key elements that…
NASA Astrophysics Data System (ADS)
Reuter, Matthew; Tschudi, Stephen
When investigating the electrical response properties of molecules, experiments often measure conductance whereas computation predicts transmission probabilities. Although the Landauer-Büttiker theory relates the two in the limit of coherent scattering through the molecule, a direct comparison between experiment and computation can still be difficult: experimental data (specifically from break junctions) are statistical, whereas computational results are deterministic. Many studies compare the most probable experimental conductance with computation, but such an analysis discards almost all of the experimental statistics. In this work we develop tools to decipher the Landauer-Büttiker transmission function directly from experimental statistics and then apply them to enable a fairer comparison between experimental and computational results.
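At zero bias the Landauer-Büttiker relation for a single channel is simply G = G₀T. The sketch below, with an invented log-normal transmission distribution standing in for break-junction statistics, shows how deterministic transmissions map onto the kind of conductance histogram such experiments report.

```python
import numpy as np

G0 = 7.748091729e-5  # conductance quantum 2e^2/h, in siemens

def conductance_from_transmission(T):
    """Zero-bias Landauer conductance for a single-channel transmission probability."""
    return G0 * T

# Hypothetical break-junction ensemble: log-normally distributed transmissions
rng = np.random.default_rng(0)
T_samples = np.exp(rng.normal(np.log(1e-3), 0.5, size=10_000))
G_samples = conductance_from_transmission(T_samples)

# Conductance histogram on a log scale, in units of G0, as commonly plotted
hist, edges = np.histogram(np.log10(G_samples / G0), bins=50)
```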
The application of the thermodynamic perturbation theory to study the hydrophobic hydration.
Mohoric, Tomaz; Urbic, Tomaz; Hribar-Lee, Barbara
2013-07-14
The thermodynamic perturbation theory was tested against newly obtained Monte Carlo computer simulations to describe the major features of the hydrophobic effect in a simple 3D-Mercedes-Benz water model: the temperature and hydrophobe size dependence on entropy, enthalpy, and free energy of transfer of a simple hydrophobic solute into water. An excellent agreement was obtained between the theoretical and simulation results. Further, the thermodynamic perturbation theory qualitatively correctly (with respect to the experimental data) describes the solvation thermodynamics under conditions where the simulation results are difficult to obtain with good enough accuracy, e.g., at high pressures.
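A minimal statement of the perturbation idea being tested here is the Zwanzig free-energy relation (the TPT expressions actually used for the Mercedes-Benz water model are more elaborate; this is only the underlying identity):

```latex
\Delta A = A_1 - A_0
         = -k_B T \,\ln \left\langle e^{-\beta\,(U_1 - U_0)} \right\rangle_0 ,
\qquad \beta = \frac{1}{k_B T},
```

where the average is taken over configurations sampled from the reference system with potential energy $U_0$, e.g., by Monte Carlo simulation, as in the comparison reported above.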
NASA Astrophysics Data System (ADS)
Zhang, Le; Zhang, Shaoxiang
2017-03-01
A body of research [1-7] has already shown that epigenetic reprogramming plays a critical role in maintaining the normal development of embryos. However, the mechanistic quantitation of the epigenetic interactions between sperms and oocytes and the related impact on embryo development are still not clear [6,7]. In this study, Wang et al. [8] develop a modeling framework that addresses this question by integrating game theory and the latest discoveries of the epigenetic control of embryo development.
NMR and NQR parameters of ethanol crystal
NASA Astrophysics Data System (ADS)
Milinković, M.; Bilalbegović, G.
2012-04-01
Electric field gradients and chemical shielding tensors of the stable monoclinic crystal phase of ethanol are computed. The projector-augmented wave (PAW) and gauge-including projector-augmented wave (GIPAW) models in the periodic plane-wave density functional theory are used. The crystal data from X-ray measurements, as well as the structures where either all atomic, or only hydrogen atom positions are optimized in the density functional theory are analyzed. These structural models are also studied by including the semi-empirical van der Waals correction to the density functional theory. Infrared spectra of these five crystal models are calculated.
NASA Technical Reports Server (NTRS)
Worm, Jeffrey A.; Culas, Donald E.
1991-01-01
Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide the possible optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of belief in each rule is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
Five-dimensional gauge theory and compactification on a torus
NASA Astrophysics Data System (ADS)
Haghighat, Babak; Vandoren, Stefan
2011-09-01
We study five-dimensional minimally supersymmetric gauge theory compactified on a torus down to three dimensions, and its embedding into string/M-theory using geometric engineering. The moduli space on the Coulomb branch is hyperkähler equipped with a metric with modular transformation properties. We determine the one-loop corrections to the metric and show that they can be interpreted as worldsheet and D1-brane instantons in type IIB string theory. Furthermore, we analyze instanton corrections coming from the solitonic BPS magnetic string wrapped over the torus. In particular, we show how to compute the path-integral for the zero-modes from the partition function of the M5 brane, or, using a 2d/4d correspondence, from the partition function of N=4 SYM theory on a Hirzebruch surface.
Acceleration and Velocity Sensing from Measured Strain
NASA Technical Reports Server (NTRS)
Pak, Chan-Gi; Truax, Roger
2016-01-01
A simple approach for computing acceleration and velocity of a structure from the strain is proposed in this study. First, deflection and slope of the structure are computed from the strain using a two-step theory. Frequencies of the structure are computed from the time histories of strain using a parameter estimation technique together with an Autoregressive Moving Average model. From deflection, slope, and frequencies of the structure, acceleration and velocity of the structure can be obtained using the proposed approach. Keywords: shape sensing; fiber optic strain sensor; system equivalent reduction and expansion process.
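As a rough sketch of the final step: once a deflection time history has been recovered from strain, velocity and acceleration can be estimated by differentiating it numerically. The report's actual approach uses the identified modal frequencies; the finite-difference version below is a simplified stand-in applied to a synthetic 5 Hz deflection signal.

```python
import numpy as np

def velocity_acceleration(deflection, dt):
    """Estimate velocity and acceleration from a sampled deflection time
    history using second-order central differences."""
    v = np.gradient(deflection, dt)   # first time derivative
    a = np.gradient(v, dt)            # second time derivative
    return v, a

t = np.linspace(0.0, 1.0, 1001)
w = np.sin(2.0 * np.pi * 5.0 * t)     # synthetic 5 Hz deflection, unit amplitude
v, a = velocity_acceleration(w, t[1] - t[0])
```

For the unit-amplitude sinusoid the peak velocity approaches 2π·5 and the peak acceleration (2π·5)², up to small discretization error.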
ERIC Educational Resources Information Center
Zhang, Li-Fang; He, Yunfeng
2003-01-01
In the present study, the thinking styles as defined in Sternberg's theory of mental self-government are tested against yet another domain relevant to student learning. This domain is students' knowledge and use of as well as their attitudes toward the use of computing and information technology (CIT) in education. One hundred and ninety-three (75…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saad, Yousef
2014-03-19
The master project under which this work is funded had as its main objective to develop computational methods for modeling electronic excited-state and optical properties of various nanostructures. The specific goals of the computer science group were primarily to develop effective numerical algorithms in Density Functional Theory (DFT) and Time Dependent Density Functional Theory (TDDFT). There were essentially four distinct stated objectives. The first objective was to study and develop effective numerical algorithms for solving large eigenvalue problems such as those that arise in Density Functional Theory (DFT) methods. The second objective was to explore so-called linear scaling methods, or methods that avoid diagonalization. The third was to develop effective approaches for Time-Dependent DFT (TDDFT). Our fourth and final objective was to examine effective solution strategies for other problems in electronic excitations, such as the GW/Bethe-Salpeter method, and quantum transport problems.
Ab initio calculation of resonant Raman intensities of transition metal dichalcogenides
NASA Astrophysics Data System (ADS)
Miranda, Henrique; Reichardt, Sven; Molina-Sanchez, Alejandro; Wirtz, Ludger
Raman spectroscopy is used to characterize optical and vibrational properties of materials. Its computational simulation is important for the interpretation of experimental results. Two common approaches are the bond polarizability model and density functional perturbation theory. However, both are known not to capture resonance effects. These resonances and quantum interference effects are important to correctly reproduce the intensities as a function of laser energy, as reported, e.g., for the case of multi-layer MoTe2 [1]. We present two fully ab initio approaches that overcome this limitation. In the first, we calculate finite-difference derivatives of the dielectric susceptibility with respect to the phonon displacements [2]. In the second, we calculate electron-light and electron-phonon matrix elements from density functional theory and use them to evaluate expressions for the Raman intensity derived from time-dependent perturbation theory. These expressions are implemented in a computer code that performs the calculations as a post-processing step. We compare both methods and study the case of triple-layer MoTe2. Supported by the Luxembourg National Research Fund (FNR).
RG flow from Φ⁴ theory to the 2D Ising model
Anand, Nikhil; Genest, Vincent X.; Katz, Emanuel; ...
2017-08-16
We study 1+1 dimensional Φ⁴ theory using the recently proposed method of conformal truncation. Starting in the UV CFT of free field theory, we construct a complete basis of states with definite conformal Casimir, C. We use these states to express the Hamiltonian of the full interacting theory in lightcone quantization. After truncating to states with C ≤ C_max, we numerically diagonalize the Hamiltonian at strong coupling and study the resulting IR dynamics. We compute non-perturbative spectral densities of several local operators, which are equivalent to real-time, infinite-volume correlation functions. These spectral densities, which include the Zamolodchikov C-function along the full RG flow, are calculable at any value of the coupling. Near criticality, our numerical results reproduce correlation functions in the 2D Ising model.
Toward a domain theory in English as a second language.
Strong-Krause, Diane
2009-01-01
This paper demonstrates how domain theory development is enhanced by using both theoretical data and empirical data. The study explored the domain of speaking English as a second language (ESL) comparing hypothetical data on speaking tasks provided by an experienced teacher and by a certified ACTFL oral proficiency interview rater with observed data from scores on a computer-delivered speaking exam. While the hypothetical data and observed data showed similar patterns in task difficulty in general, some tasks were identified as being much easier or harder than expected. These differences raise questions not only about test task design but also about the theoretical underpinnings of the domain. The results of the study suggest that this approach, where theory and data are examined together, will improve test design as well as benefit domain theory development.
NASA Astrophysics Data System (ADS)
Hamed, Samia; Sharifzadeh, Sahar; Neaton, Jeffrey
2014-03-01
Elucidation of the energy transfer mechanism in natural photosynthetic systems remains an exciting challenge. In particular, biomimetic protein-pigment complexes provide a unique study space in which individual parameters are adjusted and the impact of those changes captured. Here, we compute the excited state properties of a group of xanthene-derivative chromophores to be employed in the construction of new biomimetic light harvesting frameworks. Excitation energies, transition dipoles, and natural transition orbitals for the low-lying singlet and triplet states of these experimentally-relevant chromophores are obtained from first-principles density functional theory. The performance of several exchange-correlation functionals, including an optimally-tuned range-separated hybrid, are evaluated and compared with many body perturbation theory and experiment. Finally, we will discuss the implication of our results for the bottom-up design of new chromophores. This work is supported by the DOE and computational resources are provided by NERSC.
Singh, Gurpreet; Mohanty, B P; Saini, G S S
2016-02-15
Structure, vibrational and nuclear magnetic resonance spectra, and antioxidant action of ascorbic acid towards hydroxyl radicals have been studied computationally and in vitro by ultraviolet-visible, nuclear magnetic resonance and vibrational spectroscopic techniques. Time-dependent density functional theory calculations have been employed to specify various electronic transitions in ultraviolet-visible spectra. Observed chemical shifts and vibrational bands in nuclear magnetic resonance and vibrational spectra, respectively, have been assigned with the help of calculations. Changes in the structure of ascorbic acid in aqueous phase have been examined computationally and experimentally by recording Raman spectra in aqueous medium. Theoretical calculations of the interaction between ascorbic acid molecule and hydroxyl radical predicted the formation of dehydroascorbic acid as the first product, which has been confirmed by comparing its simulated spectra with the corresponding spectra of ascorbic acid in the presence of hydrogen peroxide.
Methods of training the graduate level and professional geologist in remote sensing technology
NASA Technical Reports Server (NTRS)
Kolm, K. E.
1981-01-01
Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.
NASA Astrophysics Data System (ADS)
Kumar, S. Anil; Bhaskar, BL
2018-02-01
An ab initio computational study of the antihemorrhage drug molecule diethylammonium 2,5-dihydroxybenzene sulfonate, popularly known as ethamsylate, has been carried out using Gaussian 09. The molecular geometry was optimized using the density functional theory method at the B3LYP/6-311 level. Geometrical parameters such as bond lengths and bond angles were computed and compared against the experimental results available in the literature. Fourier transform infrared scanning of the title molecule was performed and vibrational frequencies were also computed using the Gaussian software. The presence of O-H---O hydrogen bonds between C6H5O5S- anions and N-H---O hydrogen bonds between anion and cation is evident in the computational studies as well. In general, satisfactory agreement has been observed between computational and experimental results.
Infrared computations of defect Schur indices
Córdova, Clay; Gaiotto, Davide; Shao, Shu-Heng
2016-11-18
We conjecture a formula for the Schur index of four-dimensional N = 2 theories in the presence of boundary conditions and/or line defects, in terms of the low-energy effective Seiberg-Witten description of the system together with massive BPS excitations. We test our proposal in a variety of examples for SU(2) gauge theories, either conformal or asymptotically free. We use the conjecture to compute these defect-enriched Schur indices for theories which lack a Lagrangian description, such as Argyres-Douglas theories. We demonstrate in various examples that line defect indices can be expressed as sums of characters of the associated two-dimensional chiral algebra and that for Argyres-Douglas theories the line defect OPE reduces in the index to the Verlinde algebra.
Wilbraham, Liam; Verma, Pragya; Truhlar, Donald G; Gagliardi, Laura; Ciofini, Ilaria
2017-05-04
The spin-state orderings in nine Fe(II) and Fe(III) complexes with ligands of diverse ligand-field strength were investigated with multiconfiguration pair-density functional theory (MC-PDFT). The performance of this method was compared to that of complete active space second-order perturbation theory (CASPT2) and Kohn-Sham density functional theory. We also investigated the dependence of CASPT2 and MC-PDFT results on the size of the active-space. MC-PDFT reproduces the CASPT2 spin-state ordering, the dependence on the ligand field strength, and the dependence on active space at a computational cost that is significantly reduced as compared to CASPT2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Svendsen, Harald G.
In this paper we study a solution of heterotic string theory corresponding to a rotating Kerr-Taub-NUT spacetime. It has an exact CFT description as a heterotic coset model, and a Lagrangian formulation as a gauged WZNW model. It is a generalization of a recently discussed stringy Taub-NUT solution, and is interesting as another laboratory for studying the fate of closed timelike curves and cosmological singularities in string theory. We extend the computation of the exact metric and dilaton to this rotating case, and then discuss some properties of the metric, with particular emphasis on the curvature singularities.
The mathematical theory of signal processing and compression-designs
NASA Astrophysics Data System (ADS)
Feria, Erlan H.
2006-05-01
The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory-space compression, while processor coding deals with signal processor computational-time compression. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.
Non-polynomial closed string field theory: loops and conformal maps
NASA Astrophysics Data System (ADS)
Hua, Long; Kaku, Michio
1990-11-01
Recently, we proposed the complete classical action for the non-polynomial closed string field theory, which successfully reproduced all closed string tree amplitudes. (The action was simultaneously proposed by the Kyoto group.) In this paper, we analyze the structure of the theory. We (a) compute the explicit conformal map for all g-loop, p-puncture diagrams, (b) compute all one-loop, two-puncture maps in terms of hyper-elliptic functions, and (c) analyze their modular structure. We analyze, but do not resolve, the question of modular invariance.
Real-time dynamics of lattice gauge theories with a few-qubit quantum computer
NASA Astrophysics Data System (ADS)
Martinez, Esteban A.; Muschik, Christine A.; Schindler, Philipp; Nigg, Daniel; Erhard, Alexander; Heyl, Markus; Hauke, Philipp; Dalmonte, Marcello; Monz, Thomas; Zoller, Peter; Blatt, Rainer
2016-06-01
Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. This has recently stimulated theoretical effort, using Feynman’s idea of a quantum simulator, to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realizing (1 + 1)-dimensional quantum electrodynamics (the Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which can be directly and efficiently implemented on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulation of high-energy theories using atomic physics experiments—the long-term intention is to extend this approach to real-time quantum simulations of non-Abelian lattice gauge theories.
Importance of elastic finite-size effects: Neutral defects in ionic compounds
Burr, P. A.; Cooper, M. W. D.
2017-09-15
Small system sizes are a well-known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite-size effects have been well characterised, but self-interaction of charge-neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effects are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequately small supercells are used; moreover, the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge-neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds and irrespective of the computational method used, thereby resolving long-standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects is a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g., hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studied oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for supercells greater than 96 atoms.
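A simple diagnostic of the finite-size behaviour discussed in this abstract is to fit defect formation energies from several supercell sizes against 1/N, the asymptotic scaling expected from linear elastic theory (which the authors show can fail for small cells). The energies below are synthetic numbers constructed to decay exactly as 1/N; real data would deviate from such a fit when the elastic self-interaction is non-asymptotic.

```python
import numpy as np

def extrapolate_formation_energy(n_atoms, e_form):
    """Fit E_f(N) = E_inf + a/N and return the infinite-size limit E_inf
    together with the slope a (a crude finite-size self-interaction check)."""
    slope, e_inf = np.polyfit(1.0 / np.asarray(n_atoms, float),
                              np.asarray(e_form, float), 1)
    return e_inf, slope

sizes = [96, 324, 768, 1500]                  # supercell sizes used in the abstract
energies = [4.00 + 12.0 / n for n in sizes]   # synthetic, illustrative energies (eV)
e_inf, slope = extrapolate_formation_energy(sizes, energies)
```

A large residual from this linear fit, or a strong drift in `e_inf` as small cells are dropped, signals exactly the non-asymptotic elastic self-interaction the paper warns about.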
Importance of elastic finite-size effects: Neutral defects in ionic compounds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, P. A.; Cooper, M. W. D.
Small system sizes are a well known source of error in DFT calculations, yet computational constraints frequently dictate the use of small supercells, often as small as 96 atoms in oxides and compound semiconductors. In ionic compounds, electrostatic finite size effects have been well characterised, but self-interaction of charge neutral defects is often discounted or assumed to follow an asymptotic behaviour and thus easily corrected with linear elastic theory. Here we show that elastic effect are also important in the description of defects in ionic compounds and can lead to qualitatively incorrect conclusions if inadequatly small supercells are used; moreover,more » the spurious self-interaction does not follow the behaviour predicted by linear elastic theory. Considering the exemplar cases of metal oxides with fluorite structure, we show that numerous previous studies, employing 96-atom supercells, misidentify the ground state structure of (charge neutral) Schottky defects. We show that the error is eliminated by employing larger cells (324, 768 and 1500 atoms), and careful analysis determines that elastic effects, not electrostatic, are responsible. The spurious self-interaction was also observed in non-oxide ionic compounds and irrespective of the computational method used, thereby resolving long standing discrepancies between DFT and force-field methods, previously attributed to the level of theory. The surprising magnitude of the elastic effects are a cautionary tale for defect calculations in ionic materials, particularly when employing computationally expensive methods (e.g. hybrid functionals) or when modelling large defect clusters. We propose two computationally practicable methods to test the magnitude of the elastic self-interaction in any ionic system. In commonly studies oxides, where electrostatic effects would be expected to be dominant, it is the elastic effects that dictate the need for larger supercells | greater than 96 atoms.« less
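The abstract's two proposed diagnostic methods are not detailed here, but the basic convergence test it implies can be sketched: fit defect formation energies against inverse supercell size and inspect how far the smallest cell deviates from the asymptotic 1/N form predicted by linear elastic theory. The energies below are invented placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical defect formation energies (eV) for the supercell sizes
# used in the study (number of atoms). Values are illustrative only.
n_atoms = np.array([96, 324, 768, 1500])
e_form = np.array([5.42, 5.21, 5.14, 5.11])

# Linear elastic theory predicts the spurious strain self-interaction
# decays as 1/V, i.e. 1/N at fixed density: E(N) ~ E_inf + a/N.
coeffs = np.polyfit(1.0 / n_atoms, e_form, 1)
a, e_inf = coeffs
print(f"extrapolated dilute-limit formation energy: {e_inf:.2f} eV")

# A large residual at N = 96 relative to the fit would signal that the
# self-interaction there does not follow the asymptotic 1/N form.
residuals = e_form - np.polyval(coeffs, 1.0 / n_atoms)
print("residuals (eV):", np.round(residuals, 3))
```

Plotting E against 1/N and checking linearity by eye, or comparing the extrapolated intercept against the largest-cell value, are the usual quick checks before trusting a small-cell result.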
A Review of Computer-Based Human Behavior Representations and Their Relation to Military Simulations
2003-08-01
described by Emery and Trist (1960), activity theory introduced by Vygotsky in the 1930s and formalized by Leont’ev (1979), and situated cognition theory. [Table-of-contents fragments of the report also list Adaptive Resonance Theory (ART) and Cognitive Complexity Theory (CCT).]
Research on a Frame-Based Model of Reading Comprehension. Final Report.
ERIC Educational Resources Information Center
Goldstein, Ira
This report summarizes computational investigations of language comprehension based on Marvin Minsky's theory of frames, a recent advance in artificial intelligence theories about the representation of knowledge. The investigations discussed explored frame theory as a basis for text comprehension by implementing models of the theory and developing…
Concepts as Semantic Pointers: A Framework and Computational Model
ERIC Educational Resources Information Center
Blouw, Peter; Solodkin, Eugene; Thagard, Paul; Eliasmith, Chris
2016-01-01
The reconciliation of theories of concepts based on prototypes, exemplars, and theory-like structures is a longstanding problem in cognitive science. In response to this problem, researchers have recently tended to adopt either hybrid theories that combine various kinds of representational structure, or eliminative theories that replace concepts…
Theoretical studies of electronically excited states
DOE Office of Scientific and Technical Information (OSTI.GOV)
Besley, Nicholas A.
2014-10-06
Time-dependent density functional theory is the most widely used quantum chemical method for studying molecules in electronically excited states. However, excited states can also be computed within Kohn-Sham density functional theory by exploiting methods that converge the self-consistent field equations to excited-state solutions. The usefulness of single-reference, self-consistent-field-based approaches for studying excited states is demonstrated by considering the calculation of several types of spectroscopy, including the infrared spectroscopy of molecules in an electronically excited state, the rovibrational spectrum of the NO-Ar complex, core-electron binding energies and the emission spectroscopy of BODIPY in water.
ERIC Educational Resources Information Center
Alty, James L.
Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…
NASA Technical Reports Server (NTRS)
Gokoglu, S. A.; Chen, B. K.; Rosner, D. E.
1984-01-01
The computer program based on multicomponent chemically frozen boundary layer (CFBL) theory for calculating vapor and/or small-particle deposition rates is documented. A specific application to perimeter-averaged Na2SO4 deposition-rate calculations on a cylindrical collector is demonstrated. The manual includes a typical program input and output for users.
ERIC Educational Resources Information Center
Mousikou, Petroula; Rastle, Kathleen; Besner, Derek; Coltheart, Max
2015-01-01
Dual-route theories of reading posit that a sublexical reading mechanism that operates serially and from left to right is involved in the orthography-to-phonology computation. These theories attribute the masked onset priming effect (MOPE) and the phonological Stroop effect (PSE) to the serial left-to-right operation of this mechanism. However,…
ERIC Educational Resources Information Center
Impelluso, Thomas J.
2009-01-01
Cognitive Load Theory (CLT) was used as a foundation to redesign a computer programming class for mechanical engineers, in which content was delivered with hybrid/distance technology. The effort confirmed the utility of CLT in course design. And it demonstrates that hybrid/distance learning is not merely a tool of convenience, but one, which, when…
ERIC Educational Resources Information Center
Litofsky, Joshua; Viswanathan, Rama
2015-01-01
Matrix diagonalization, the key technique at the heart of modern computational chemistry for the numerical solution of the Schrödinger equation, can be easily introduced in the physical chemistry curriculum in a pedagogical context using simple Hückel molecular orbital theory for π bonding in molecules. We present details and results of…
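The exercise the abstract describes can be sketched for benzene's π system: build the 6×6 Hückel matrix and diagonalize it numerically. The use of NumPy and the convention α = 0, β = −1 (energies in units of |β|) are our choices for illustration, not necessarily the authors'.

```python
import numpy as np

# Hückel matrix for the six 2p_z orbitals of benzene (a 6-membered ring):
# alpha on the diagonal, beta between bonded neighbours, zero elsewhere.
alpha, beta = 0.0, -1.0   # working in units of |beta|
n = 6
H = np.zeros((n, n))
for i in range(n):
    H[i, i] = alpha
    H[i, (i + 1) % n] = beta
    H[(i + 1) % n, i] = beta

# eigvalsh returns eigenvalues in ascending order for a symmetric matrix
energies = np.linalg.eigvalsh(H)
print(energies)  # alpha+2beta, alpha+beta (x2), alpha-beta (x2), alpha-2beta
```

The doubly degenerate pairs that fall out of the diagonalization are exactly the degenerate e1g/e2u π levels students meet in the textbook treatment of benzene.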
NASA Technical Reports Server (NTRS)
Amiet, R. K.
1991-01-01
A unified theory for the aerodynamics and noise of advanced turboprops is presented, together with a computer code developed to evaluate the shielding benefits that might be expected from an aircraft wing in a wing-mounted propeller installation. Several computed directivity patterns are presented to demonstrate the theory. With the advent of the concept of using the wing of an aircraft for noise shielding, the case of diffraction by a surface in a flow has recently received attention. The present analysis is based on the case of diffraction with no flow: by combining a Galilean and a Lorentz transform, the wave equation with a mean flow can be reduced to the ordinary wave equation. Allowance is also made in the analysis for the case of a swept wing; the same combination of Galilean and Lorentz transforms leads to a problem with no flow but a different sweep. The solution procedures for the cases of leading and trailing edges are basically the same. Two normalizations of the solution are given by the computer program. FORTRAN computer programs are presented with detailed documentation, and their output compares favorably with the results of other investigators.
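As a minimal consistency check on the convected wave equation underlying this kind of analysis (not the paper's transform itself), substituting a plane wave p = exp(i(kx - wt)) into (d/dt + U d/dx)^2 p = c^2 d^2p/dx^2 gives the dispersion relation (w - Uk)^2 = (ck)^2, i.e. waves travel at c relative to the moving medium. The numerical values below are illustrative.

```python
# Dispersion-relation check for the 1-D convected wave equation:
# a plane wave exp(i(kx - wt)) satisfies (d/dt + U d/dx)^2 p = c^2 p_xx
# exactly when (w - U k)^2 = (c k)^2, i.e. w = (U + c) k or w = (U - c) k.
c = 340.0   # speed of sound (m/s), illustrative
U = 100.0   # mean-flow speed (m/s), illustrative
k = 2.0     # wavenumber (1/m), illustrative

for w in ((U + c) * k, (U - c) * k):
    lhs = (w - U * k) ** 2   # from the convected time derivative
    rhs = (c * k) ** 2       # from the spatial Laplacian
    assert abs(lhs - rhs) < 1e-6

print("downstream- and upstream-running roots both satisfy the relation")
```

The Galilean part of the paper's transform removes the U k term, after which a Lorentz-type stretching recovers the no-flow form; this check only confirms that the starting equation behaves as expected.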
Dispersion interactions between neighboring Bi atoms in (BiH3)2 and Te(BiR2)2.
Haack, Rebekka; Schulz, Stephan; Jansen, Georg
2018-03-13
Triggered by the observation of a short Bi⋯Bi distance and a BiTeBi bond angle of only 86.6° in the crystal structure of bis(diethylbismuthanyl)tellurane, quantum chemical computations on interactions between neighboring Bi atoms in Te(BiR2)2 molecules (R = H, Me, Et) and in (BiH3)2 were undertaken. Bi⋯Bi distances were found to shorten significantly upon inclusion of the d shells of the heavy-metal atoms in the electron-correlation treatment, and it was confirmed that interaction energies from spin-component-scaled second-order Møller-Plesset theory (SCS-MP2) agree well with coupled-cluster singles and doubles theory including perturbative triples (CCSD(T)). Density functional theory-based symmetry-adapted perturbation theory (DFT-SAPT) was used to study the anisotropy of the interplay of dispersion attraction and steric repulsion between the Bi atoms. Finally, geometries and relative stabilities of syn-syn and syn-anti conformers of Te(BiR2)2 (R = H, Me, Et), and interconversion barriers between them, were computed. © 2018 Wiley Periodicals, Inc.
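The SCS-MP2 method named in the abstract rescales the opposite-spin and same-spin MP2 pair-correlation energies separately; with Grimme's original factors (6/5 and 1/3) the combination is a one-liner. The pair energies in the example are invented placeholders, not values from this study.

```python
def scs_mp2_energy(e_os, e_ss, c_os=6.0 / 5.0, c_ss=1.0 / 3.0):
    """Spin-component-scaled MP2 correlation energy.

    e_os, e_ss: opposite-spin and same-spin MP2 pair-correlation energies.
    Defaults are Grimme's original scaling factors (6/5 and 1/3).
    """
    return c_os * e_os + c_ss * e_ss

# Illustrative (made-up) pair energies in hartree:
e_corr = scs_mp2_energy(e_os=-0.250, e_ss=-0.060)
print(f"SCS-MP2 correlation energy: {e_corr:.4f} hartree")
```

The unscaled MP2 result is recovered with c_os = c_ss = 1, which makes it easy to compare both flavours from a single pair-energy decomposition.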
Systematics of the cusp anomalous dimension
NASA Astrophysics Data System (ADS)
Henn, Johannes M.; Huber, Tobias
2012-11-01
We study the velocity-dependent cusp anomalous dimension in supersymmetric Yang-Mills theory. In a paper by Correa, Maldacena, Sever, and one of the present authors, a scaling limit was identified in which the ladder diagrams are dominant and are mapped onto a Schrödinger problem. We show how to solve the latter in perturbation theory and provide an algorithm to compute the solution at any loop order. The answer is written in terms of harmonic polylogarithms. Moreover, we give evidence for two curious properties of the result. Firstly, we observe that the result can be written using a subset of harmonic polylogarithms only, at least up to six loops. Secondly, we show that in a light-like limit, only single zeta values appear in the asymptotic expansion, again up to six loops. We then extend the analysis of the scaling limit to systematically include subleading terms. This leads to a Schrödinger-type equation, but with an inhomogeneous term. We show how its solution can be computed in perturbation theory, in a way similar to the leading order case. Finally, we analyze the strong coupling limit of these subleading contributions and compare them to the string theory answer. We find agreement between the two calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazante, Alexandre P., E-mail: abazante@chem.ufl.edu; Bartlett, Rodney J.; Davidson, E. R.
The benzene radical anion is studied with ab initio coupled-cluster theory in large basis sets. Contrary to the usual assumption, we find that, at the level of theory investigated, the minimum-energy geometry is non-planar, with tetrahedral distortion at two opposite carbon atoms. The anion is well known for its instability to auto-ionization, which poses computational challenges to determining its properties. Despite the importance of the benzene radical anion, the considerable attention it has received in the literature so far has failed to address the details of its structure and shape-resonance character at a high level of theory. Here, we examine the dynamic Jahn-Teller effect and its impact on the anion potential energy surface. We find that a minimum-energy geometry of C2 symmetry is located below one D2h stationary point on a C2h pseudo-rotation surface. The applicability of standard wave function methods to an unbound anion is assessed with the stabilization method. The isotropic hyperfine splitting constants (Aiso) are computed and compared to data from electron spin resonance experiments. Satisfactory agreement with experiment is obtained with coupled-cluster theory and large basis sets such as cc-pCVQZ.
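The stabilization method mentioned in the abstract can be illustrated with a two-level toy model (our construction, not the authors' calculation): a fixed "resonance" level coupled to a box-quantized continuum level whose energy falls as the box grows. Tracking the eigenvalues against box size, the resonance shows up as a branch that stays flat while the continuum level sweeps past it.

```python
import numpy as np

# Toy stabilization plot: one discrete "resonance" level coupled to a
# single particle-in-a-box continuum level. All parameters are made up.
e_res, coupling = 1.50, 0.05            # resonance energy, coupling (arb. units)
box_sizes = np.linspace(5.0, 20.0, 200)

lower, upper = [], []
for L in box_sizes:
    e_box = 50.0 * (np.pi / L) ** 2     # box level, falls off as 1/L^2
    H = np.array([[e_res, coupling], [coupling, e_box]])
    ev = np.linalg.eigvalsh(H)          # ascending eigenvalues
    lower.append(ev[0])
    upper.append(ev[1])

# Away from the avoided crossing the stabilized root sits near e_res;
# at the crossing the two branches approach no closer than 2*coupling.
print(f"lower root at smallest box: {lower[0]:.4f} (resonance at {e_res})")
```

In a real stabilization calculation the box (or basis-scaling) parameter plays the role of L, and the plateau energy and avoided-crossing gap estimate the resonance position and width.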
NASA Astrophysics Data System (ADS)
Chibani, Wael; Ren, Xinguo; Scheffler, Matthias; Rinke, Patrick
2016-04-01
We present an embedding scheme for periodic systems that facilitates the treatment of the physically important part (here a unit cell or a supercell) with advanced electronic-structure methods that are computationally too expensive for periodic systems. The rest of the periodic system is treated with computationally less demanding approaches, e.g., Kohn-Sham density-functional theory, in a self-consistent manner. Our scheme is based on the concept of dynamical mean-field theory formulated in terms of Green's functions. Our real-space dynamical mean-field embedding scheme features two nested Dyson equations, one for the embedded cluster and another for the periodic surrounding. The total energy is computed from the resulting Green's functions. The performance of our scheme is demonstrated by treating the embedded region with hybrid functionals and many-body perturbation theory in the GW approach for simple bulk systems. The total energy and the density of states converge rapidly with respect to the computational parameters and approach their bulk limit with increasing cluster (i.e., computational supercell) size.
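The self-consistent Dyson structure at the heart of such schemes can be caricatured by a single scalar fixed-point loop (a toy model of ours, not the authors' implementation): the Green's function determines a self-energy, which in turn redefines the Green's function, iterated until nothing changes.

```python
# Toy scalar Dyson self-consistency:
#   G(w) = 1 / (w - e0 - sigma(G)),   sigma(G) = v**2 * G
# mimicking how an embedding self-energy is rebuilt from the current G.
# All parameters are illustrative.
e0, v, w = 0.0, 0.5, 2.0    # level energy, coupling, (real) frequency

g = 1.0 / (w - e0)          # start from the bare propagator G0
for _ in range(100):
    sigma = v ** 2 * g      # self-energy built from the current G
    g_new = 1.0 / (w - e0 - sigma)
    if abs(g_new - g) < 1e-12:
        break
    g = g_new

print(f"converged G(w) = {g:.6f}")
```

In the paper's scheme this loop runs over matrix-valued Green's functions on a frequency grid, with one Dyson equation for the cluster and one for the periodic surrounding nested inside each other; the scalar version only shows the fixed-point character of the update.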
Dressing the post-Newtonian two-body problem and classical effective field theory
NASA Astrophysics Data System (ADS)
Kol, Barak; Smolkin, Michael
2009-12-01
We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.
Höfener, Sebastian; Trumm, Michael; Koke, Carsten; Heuser, Johannes; Ekström, Ulf; Skerencak-Frech, Andrej; Schimmelpfennig, Bernd; Panak, Petra J
2016-03-21
We report a combined computational and experimental study to investigate the UV/vis spectra of 2,6-bis(5,6-dialkyl-1,2,4-triazin-3-yl)pyridine (BTP) ligands in solution. In order to study molecules in solution using theoretical methods, force-field parameters for the ligand-water interaction are adjusted to ab initio quantum chemical calculations. Based on these parameters, molecular dynamics (MD) simulations are carried out, from which snapshots are extracted as input to quantum chemical excitation-energy calculations to obtain UV/vis spectra of BTP ligands in solution using time-dependent density functional theory (TDDFT) employing the Tamm-Dancoff approximation (TDA). The range-separated CAM-B3LYP functional is used to avoid large errors for the charge-transfer states occurring in the electronic spectra. In order to study environment effects with theoretical methods, the frozen-density embedding scheme is applied. This computational procedure allows electronic spectra to be obtained at the (range-separated) DFT level of theory in solution, revealing solvatochromic shifts upon solvation of up to about 0.6 eV. Comparison to experimental data shows significantly improved agreement over vacuum calculations and enables the analysis of the excitations relevant to the line shape in solution.
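The final step of such a snapshot-based protocol, turning per-snapshot excitation energies and oscillator strengths into an ensemble-averaged spectrum by Gaussian broadening, can be sketched as follows. All numbers, and the broadening width, are illustrative assumptions rather than data from the study.

```python
import numpy as np

# Per-snapshot TDDFT output as (excitation energy in eV, oscillator
# strength) pairs. These values are invented placeholders.
snapshots = [
    [(3.10, 0.42), (4.05, 0.10)],
    [(3.22, 0.38), (4.11, 0.12)],
    [(3.05, 0.45), (3.98, 0.09)],
]
sigma = 0.15                          # Gaussian width (eV), an assumption
grid = np.linspace(2.5, 4.5, 401)     # energy grid for the spectrum

spectrum = np.zeros_like(grid)
for snap in snapshots:
    for energy, strength in snap:
        spectrum += strength * np.exp(-((grid - energy) ** 2) / (2 * sigma ** 2))
spectrum /= len(snapshots)            # ensemble average over snapshots

peak = grid[np.argmax(spectrum)]
print(f"averaged band maximum near {peak:.2f} eV")
```

Because each snapshot samples a different solvent configuration, the averaged band is broader than any single-snapshot stick spectrum; comparing the band maximum between vacuum and embedded calculations is one way to read off the solvatochromic shift the abstract reports.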