Sample records for classical test theory

  1. The Examination of Reliability According to Classical Test and Generalizability on a Job Performance Scale

    ERIC Educational Resources Information Center

    Yelboga, Atilla; Tavsancil, Ezel

    2010-01-01

    In this research, classical test theory and generalizability theory analyses were carried out with data obtained from a job performance scale for the years 2005 and 2006. The reliability coefficients estimated from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…

  2. On the Relationship Between Classical Test Theory and Item Response Theory: From One to the Other and Back.

    PubMed

    Raykov, Tenko; Marcoulides, George A

    2016-04-01

    The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be obtained directly from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that yield item response models from corresponding classical test theory-based models. Similarly, classical test theory models can be obtained from corresponding item response models by the reverse application of either approach.

  3. On the Relationship between Classical Test Theory and Item Response Theory: From One to the Other and Back

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2016-01-01

    The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…

  4. Rasch-family models are more valuable than score-based approaches for analysing longitudinal patient-reported outcomes with missing data.

    PubMed

    de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique

    2016-10-01

    The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of time effect. The type I error was controlled for both classical test theory and the Rasch model, whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the value of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data.

  5. Demonstrating the Difference between Classical Test Theory and Item Response Theory Using Derived Test Data

    ERIC Educational Resources Information Center

    Magno, Carlo

    2009-01-01

    The present report demonstrates the difference between the classical test theory (CTT) and item response theory (IRT) approaches using actual test data from junior high school chemistry students. CTT and IRT were compared across two samples and two test forms on item difficulty, internal consistency, and measurement errors. The specific…
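
    As a concrete illustration of the CTT side of such comparisons, the item statistics named in this abstract (item difficulty, discrimination, internal consistency) can each be computed in a few lines. The response matrix below is invented for illustration and is not data from the study:

```python
import numpy as np

# Hypothetical 0/1 scored responses: 6 examinees x 4 items
# (illustrative data only, not from the cited study)
X = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
])

n_persons, n_items = X.shape
total = X.sum(axis=1)

# CTT item difficulty: proportion of correct responses per item
difficulty = X.mean(axis=0)

# CTT item discrimination: corrected item-total (point-biserial) correlation,
# correlating each item with the total score excluding that item
discrimination = np.array([
    np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(n_items)
])

# Cronbach's alpha (internal consistency)
item_var = X.var(axis=0, ddof=1).sum()
total_var = total.var(ddof=1)
alpha = n_items / (n_items - 1) * (1 - item_var / total_var)
```

    In the CTT framework these statistics are sample-dependent, which is exactly the point several of the records below contrast with IRT.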

  6. Relationships among Classical Test Theory and Item Response Theory Frameworks via Factor Analytic Models

    ERIC Educational Resources Information Center

    Kohli, Nidhi; Koran, Jennifer; Henn, Lisa

    2015-01-01

    There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…

  7. The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory

    ERIC Educational Resources Information Center

    Anil, Duygu

    2008-01-01

    In this study, the predictive power of item characteristics based on experts' judgments, for conditions in which try-out practices cannot be applied, was examined against item characteristics computed under classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…

  8. Overview of classical test theory and item response theory for the quantitative assessment of items in developing patient-reported outcomes measures.

    PubMed

    Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D

    2014-05-01

    The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which the hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has limited qualitative data and wants preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) also grows. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, classical test theory and/or IRT should be considered to help maximize the content validity of PRO measures.
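
    The descriptive CTT checks this abstract lists (response-category frequencies, floor and ceiling effects) are simple to compute. Below is a minimal sketch assuming a hypothetical 3-item PRO scale with five response categories; the data are made up for illustration:

```python
import numpy as np

# Hypothetical responses to a 3-item PRO scale, categories 0..4
# (illustrative data only; not from the cited study)
R = np.array([
    [0, 0, 0],
    [4, 4, 4],
    [2, 3, 1],
    [0, 1, 0],
    [4, 4, 3],
    [1, 2, 2],
])

n_items = R.shape[1]
max_cat = 4
scores = R.sum(axis=1)  # total scale scores, range 0..12

# Frequency of responses in each category, tabulated per item
freq = np.stack(
    [np.bincount(R[:, j], minlength=max_cat + 1) for j in range(n_items)]
)

# Floor/ceiling effects: share of respondents at the scale minimum/maximum
floor_pct = np.mean(scores == 0)
ceiling_pct = np.mean(scores == n_items * max_cat)
```

    Large floor or ceiling percentages signal that the scale cannot distinguish respondents at the extremes, one of the content-validity concerns the review discusses.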

  9. An Examination of the Flynn Effect in the National Intelligence Test in Estonia

    ERIC Educational Resources Information Center

    Shiu, William

    2012-01-01

    This study examined the Flynn Effect (FE; i.e., the rise in IQ scores over time) in Estonia from Scale B of the National Intelligence Test using both classical test theory (CTT) and item response theory (IRT) methods. Secondary data from two cohorts (1934, n = 890 and 2006, n = 913) of students were analyzed, using both classical test theory (CTT)…

  10. Studying Reliability of Open Ended Mathematics Items According to the Classical Test Theory and Generalizability Theory

    ERIC Educational Resources Information Center

    Guler, Nese; Gelbal, Selahattin

    2010-01-01

    In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a mathematics achievement instrument. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the 2007 spring semester. The internal consistency of the scores was found to be 0.92. For…

  11. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    ERIC Educational Resources Information Center

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…

  12. Influences on and Limitations of Classical Test Theory Reliability Estimates.

    ERIC Educational Resources Information Center

    Arnold, Margery E.

    It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…

  13. Assessing the Performance of Classical Test Theory Item Discrimination Estimators in Monte Carlo Simulations

    ERIC Educational Resources Information Center

    Bazaldua, Diego A. Luna; Lee, Young-Sun; Keller, Bryan; Fellers, Lauren

    2017-01-01

    The performance of various classical test theory (CTT) item discrimination estimators has been compared in the literature using both empirical and simulated data, resulting in mixed results regarding the preference of some discrimination estimators over others. This study analyzes the performance of various item discrimination estimators in CTT:…

  14. Louis Guttman's Contributions to Classical Test Theory

    ERIC Educational Resources Information Center

    Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald

    2005-01-01

    This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…

  15. Improving Measurement in Health Education and Health Behavior Research Using Item Response Modeling: Comparison with the Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wilson, Mark; Allen, Diane D.; Li, Jun Corser

    2006-01-01

    This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…

  16. The Reliability and Precision of Total Scores and IRT Estimates as a Function of Polytomous IRT Parameters and Latent Trait Distribution

    ERIC Educational Resources Information Center

    Culpepper, Steven Andrew

    2013-01-01

    A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…

  17. The Effects of Academic and Interpersonal Stress on Dating Violence among College Students: A Test of Classical Strain Theory

    ERIC Educational Resources Information Center

    Mason, Brandon; Smithey, Martha

    2012-01-01

    This study examines Merton's Classical Strain Theory (1938) as a causative factor in intimate partner violence among college students. We theorize that college students experience general life strain and cumulative strain as they pursue the goal of a college degree. We test this strain on the likelihood of using intimate partner violence. Strain…

  18. A Classical Test Theory Analysis of the Light and Spectroscopy Concept Inventory National Study Data Set

    ERIC Educational Resources Information Center

    Schlingman, Wayne M.; Prather, Edward E.; Wallace, Colin S.; Brissenden, Gina; Rudolph, Alexander L.

    2012-01-01

    This paper is the first in a series of investigations into the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI). In this paper, we use classical test theory to form a framework of results that will be used to evaluate individual item difficulties, item discriminations, and the overall reliability of the…

  19. An Analysis of Cross Racial Identity Scale Scores Using Classical Test Theory and Rasch Item Response Models

    ERIC Educational Resources Information Center

    Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie

    2013-01-01

    Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…

  20. Test Theories, Educational Priorities and Reliability of Public Examinations in England

    ERIC Educational Resources Information Center

    Baird, Jo-Anne; Black, Paul

    2013-01-01

    Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…

  1. Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.

    PubMed

    Lei, Wenwen; McKenzie, David R

    2016-07-21

    Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.

  2. Mathematical model of the SH-3G helicopter

    NASA Technical Reports Server (NTRS)

    Phillips, J. D.

    1982-01-01

    A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.

  3. An Alternative Approach to Identifying a Dimension in Second Language Proficiency.

    ERIC Educational Resources Information Center

    Griffin, Patrick E.; And Others

    Current practice in language testing has not yet integrated classical test theory with assessment of language skills. In addition, language testing needs to be part of theory development. Lack of sound testing procedures can lead to problems in research design and ultimately, inappropriate theory development. The debate over dimensionality of…

  4. Classical, Generalizability, and Multifaceted Rasch Detection of Interrater Variability in Large, Sparse Data Sets.

    ERIC Educational Resources Information Center

    MacMillan, Peter D.

    2000-01-01

    Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…

  5. Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components

    NASA Technical Reports Server (NTRS)

    Ko, William L.

    2003-01-01

    New aging theories have been developed to establish the safe life span of airborne critical structural components such as B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use the equivalent-constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to random loading cycling of the first flight is calculated using the half-cycle theory, and then extrapolated to all the crack growths of the subsequent flights. The predictions of the new aging theories (finite difference aging theory and closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and Ko second-order aging theories. The new aging theories predict the number of safe flights as considerably lower than that predicted by the classical aging theory, and slightly lower than those predicted by the Ko first- and Ko second-order aging theories due to the inclusion of all the higher order terms.

  6. Overview of Classical Test Theory and Item Response Theory for Quantitative Assessment of Items in Developing Patient-Reported Outcome Measures

    PubMed Central

    Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.

    2014-01-01

    Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753

  7. Selecting Items for Criterion-Referenced Tests.

    ERIC Educational Resources Information Center

    Mellenbergh, Gideon J.; van der Linden, Wim J.

    1982-01-01

    Three item selection methods for criterion-referenced tests are examined: the classical theory of item difficulty and item-test correlation; the latent trait theory of item characteristic curves; and a decision-theoretic approach for optimal item selection. Item contribution to the standardized expected utility of mastery testing is discussed. (CM)

  8. Extended Theories of Gravitation. Observation Protocols and Experimental Tests

    NASA Astrophysics Data System (ADS)

    Fatibene, Lorenzo; Ferraris, Marco; Francaviglia, Mauro; Magnano, Guido

    2013-09-01

    Within the framework of extended theories of gravitation we shall discuss physical equivalences among different formalisms and classical tests. As suggested by the Ehlers-Pirani-Schild framework, the conformal invariance will be preserved and its effect on observational protocols discussed. Accordingly, we shall review standard tests, showing how Palatini f(R)-theories naturally pass solar system tests. Observation protocols will be discussed in this wider framework.

  9. Principles of Work Sample Testing. Volume I: A Non-Empirical Taxonomy of Test Uses; Volume II: Evaluation of Personnel Testing Programs; Volume III: Construction and Evaluation of Work Sample Tests; Volume IV: Generalizability.

    ERIC Educational Resources Information Center

    Guion, Robert M.; Ironson, Gail H.

    Challenges to classical psychometric theory are examined in the context of a broader range of fundamental, derived, and intuitive measurements in psychology; the challenges include content-referenced testing, latent trait theory, and generalizability theory. A taxonomy of psychological measurement is developed, based on: (1) purposes of…

  10. The Cultural Context of Career Assessment.

    ERIC Educational Resources Information Center

    Blustein, David L.; Ellis, Michael V.

    2000-01-01

    Building on social constructivism, culturally affirming career assessment should take a unificationist perspective, which does not assume the validity of tests across cultural contexts. Generalizability and item response theory are better suited than classical test theory to the unificationist perspective. (SK)

  11. Basic Concepts in Classical Test Theory: Tests Aren't Reliable, the Nature of Alpha, and Reliability Generalization as a Meta-analytic Method.

    ERIC Educational Resources Information Center

    Helms, LuAnn Sherbeck

    This paper discusses the fact that reliability is about scores and not tests and how reliability limits effect sizes. The paper also explores the classical reliability coefficients of stability, equivalence, and internal consistency. Stability is concerned with how stable test scores will be over time, while equivalence addresses the relationship…

  12. The SIETTE Automatic Assessment Environment

    ERIC Educational Resources Information Center

    Conejo, Ricardo; Guzmán, Eduardo; Trella, Monica

    2016-01-01

    This article describes the evolution and current state of the domain-independent Siette assessment environment. Siette supports different assessment methods--including classical test theory, item response theory, and computer adaptive testing--and integrates them with multidimensional student models used by intelligent educational systems.…

  13. Plasmon mass scale and quantum fluctuations of classical fields on a real time lattice

    NASA Astrophysics Data System (ADS)

    Kurkela, Aleksi; Lappi, Tuomas; Peuron, Jarkko

    2018-03-01

    Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Above the Debye scale, the classical Yang-Mills (CYM) theory can be matched smoothly to kinetic theory. First we study the limits of the quasiparticle picture of the CYM fields by determining the plasmon mass of the system using three different methods. Then we argue that one needs a numerical calculation of a system of classical gauge fields and small linearized fluctuations, which correspond to quantum fluctuations, in a way that keeps the separation between the two manifest. We demonstrate and test an implementation of an algorithm with linearized fluctuations, showing that the linearization indeed works and that Gauss's law is conserved.

  14. Quantum theory for 1D X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Anisimov, Petr M.

    2018-06-01

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  15. Psychometric Properties of the Revised Purdue Spatial Visualization Tests: Visualization of Rotations (The Revised PSVT-R)

    ERIC Educational Resources Information Center

    Yoon, So Yoon

    2011-01-01

    Working under classical test theory (CTT) and item response theory (IRT) frameworks, this study investigated psychometric properties of the Revised Purdue Spatial Visualization Tests: Visualization of Rotations (Revised PSVT:R). The original version, the PSVT:R was designed by Guay (1976) to measure spatial visualization ability in…

  16. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
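
    The variance-partitioning idea behind G theory can be sketched for the simplest one-facet, fully crossed person-by-item design: ANOVA mean squares are solved for variance components, which then yield a generalizability coefficient. The score matrix and design below are illustrative assumptions, not drawn from the article:

```python
import numpy as np

# Hypothetical person x item score matrix (p x i crossed design)
X = np.array([
    [3., 4., 5.],
    [2., 3., 3.],
    [5., 5., 4.],
    [1., 2., 2.],
])
n_p, n_i = X.shape

grand = X.mean()
person_means = X.mean(axis=1)
item_means = X.mean(axis=0)

# ANOVA sums of squares for the two main effects and the residual
ss_p = n_i * np.sum((person_means - grand) ** 2)
ss_i = n_p * np.sum((item_means - grand) ** 2)
ss_res = np.sum((X - grand) ** 2) - ss_p - ss_i

ms_p = ss_p / (n_p - 1)
ms_i = ss_i / (n_i - 1)
ms_res = ss_res / ((n_p - 1) * (n_i - 1))

# Expected-mean-square solutions for the variance components
var_pi = ms_res                    # person-by-item interaction (plus error)
var_p = (ms_p - ms_res) / n_i      # universe-score (person) variance
var_i = (ms_i - ms_res) / n_p      # item variance

# Generalizability coefficient for relative decisions over n_i items
g_coef = var_p / (var_p + var_pi / n_i)
```

    In this one-facet case the generalizability coefficient reduces to the CTT reliability coefficient; G theory's advantage, as the abstract notes, is that additional sources of inconsistency (raters, occasions) can be added as further facets.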

  17. The Value of Item Response Theory in Clinical Assessment: A Review

    ERIC Educational Resources Information Center

    Thomas, Michael L.

    2011-01-01

    Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical…

  18. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classical test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  19. Comparison of IRT and CTT Using Secondary School Reading Comprehension Assessments

    ERIC Educational Resources Information Center

    Coggins, Joanne V.; Kim, Jwa K.; Briggs, Laura C.

    2017-01-01

    The Gates-MacGinitie Reading Comprehension Test, fourth edition (GMRT-4) and the ACT Reading Tests (ACT-R) were administered to 423 high school students in order to explore the similarities and dissimilarities of data produced through classical test theory (CTT) and item response theory (IRT) analysis. Despite the many advantages of IRT…

  20. Bayesian Estimation of Multi-Unidimensional Graded Response IRT Models

    ERIC Educational Resources Information Center

    Kuo, Tzu-Chun

    2015-01-01

    Item response theory (IRT) has gained an increasing popularity in large-scale educational and psychological testing situations because of its theoretical advantages over classical test theory. Unidimensional graded response models (GRMs) are useful when polytomous response items are designed to measure a unified latent trait. They are limited in…

  1. Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory

    ERIC Educational Resources Information Center

    Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya

    2015-01-01

    Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…

  2. Experimental contextuality in classical light

    NASA Astrophysics Data System (ADS)

    Li, Tao; Zeng, Qiang; Song, Xinbing; Zhang, Xiangdong

    2017-03-01

    The Klyachko, Can, Binicioglu, and Shumovsky (KCBS) inequality is an important contextuality inequality in a three-level system, which has been demonstrated experimentally using quantum states. Using the path and polarization degrees of freedom of classical optics fields, we have constructed the classical trit (cetrit) and tested the KCBS inequality and its geometrical form (Wright’s inequality) in this work. The projection measurement has been implemented, and clear violations of the KCBS inequality and its geometrical form have been observed. This means that the contextuality inequality, which is commonly used to test the conflict between quantum theory and noncontextual realism, may be used as a quantitative tool in classical optical coherence to describe correlation characteristics of classical fields.

  3. Item Response Modeling with Sum Scores

    ERIC Educational Resources Information Center

    Johnson, Timothy R.

    2013-01-01

    One of the distinctions between classical test theory and item response theory is that the former focuses on sum scores and their relationship to true scores, whereas the latter concerns item responses and their relationship to latent scores. Although item response theory is often viewed as the richer of the two theories, sum scores are still…

  4. Examining Differential Item Functions of Different Item Ordered Test Forms According to Item Difficulty Levels

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Gül, Emrah; Dogan-Gül, Çilem

    2016-01-01

    The study aims to examine whether differential item function is displayed in three different test forms that have item orders of random and sequential versions (easy-to-hard and hard-to-easy), based on Classical Test Theory (CTT) and Item Response Theory (IRT) methods and bearing item difficulty levels in mind. In the correlational research, the…

  5. Assessing the Accuracy and Consistency of Language Proficiency Classification under Competing Measurement Models

    ERIC Educational Resources Information Center

    Zhang, Bo

    2010-01-01

    This article investigates how measurement models and statistical procedures can be applied to estimate the accuracy of proficiency classification in language testing. The paper starts with a concise introduction of four measurement models: the classical test theory (CTT) model, the dichotomous item response theory (IRT) model, the testlet response…

  6. KIDMAP--A Diagnostic Tool for Teachers.

    ERIC Educational Resources Information Center

    Lee, Yew Jin; Linacre, John M.; Yeoh, Oon Chye

    While assessment is the bread and butter of the teaching profession, its practitioners usually do not extend analysis of test responses beyond simple measures such as facility or discrimination indices in classical test theory. Item response theory (IRT) has much to offer but its nonintuitive content and difficulty make it a formidable obstacle in…

  7. A Test of Objectification Theory in Adolescent Girls.

    ERIC Educational Resources Information Center

    Slater, Amy; Tiggemann, Marika

    2002-01-01

    Tested the components of a model proposed by Objectification Theory in a sample of adolescent girls who did and did not study classical ballet. Participant surveys examined self-objectification, body shame, appearance anxiety, and disordered eating. There was no difference between groups on self-objectification or any of its proposed consequences.…

  8. A Theoretical and Empirical Integration of the Rational-Emotive and Classical Conditioning Theories

    ERIC Educational Resources Information Center

    Russell, Phillip L.; Brandsma, Jeffrey M.

    1974-01-01

    Galvanic skin conductance response, respiration rate and respiration depth values of an experimental and control group were used to test the hypotheses of Albert Ellis's ABC Theory of psychopathology. (EK)

  9. Modern Psychometric Methodology: Applications of Item Response Theory

    ERIC Educational Resources Information Center

    Reid, Christine A.; Kolakowsky-Hayner, Stephanie A.; Lewis, Allen N.; Armstrong, Amy J.

    2007-01-01

    Item response theory (IRT) methodology is introduced as a tool for improving assessment instruments used with people who have disabilities. Need for this approach in rehabilitation is emphasized; differences between IRT and classical test theory are clarified. Concepts essential to understanding IRT are defined, necessary data assumptions are…

  10. Testing the Moral Algebra of Two Kohlbergian Informers

    ERIC Educational Resources Information Center

    Hommers, Wilfried; Lewand, Martin; Ehrmann, Dominic

    2012-01-01

    This paper seeks to unify two major theories of moral judgment: Kohlberg's stage theory and Anderson's moral information integration theory. Subjects were told about thoughts of actors in Kohlberg's classic altruistic Heinz dilemma and in a new egoistical dilemma. These actors' thoughts represented Kohlberg's stages I (Personal Risk) and IV…

  11. Using Classical Test Theory and Item Response Theory to Evaluate the LSCI

    NASA Astrophysics Data System (ADS)

    Schlingman, Wayne M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS

    2011-01-01

    Analyzing the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI), this project uses both Classical Test Theory (CTT) and Item Response Theory (IRT) to investigate the LSCI itself in order to better understand what it is actually measuring. We use Classical Test Theory to form a framework of results that can be used to evaluate the effectiveness of individual questions at measuring differences in student understanding and provide further insight into the prior results presented from this data set. In the second phase of this research, we use Item Response Theory to form a theoretical model that generates parameters accounting for a student's ability, a question's difficulty, and an estimate of the level of guessing. The combined results from our investigations using both CTT and IRT are used to better understand the learning that is taking place in classrooms across the country. The analysis will also allow us to evaluate the effectiveness of individual questions and determine whether the item difficulties are appropriately matched to the abilities of the students in our data set. These results may require that some questions be revised, motivating the need for further development of the LSCI. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
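The IRT model this record describes, with a student's ability, a question's difficulty, and a guessing level, corresponds to the three-parameter logistic (3PL) model. A minimal sketch with illustrative parameter values (not fitted to LSCI data):

```python
import math

def p_correct_3pl(theta, a, b, c):
    """3PL model: probability of a correct response given ability theta,
    discrimination a, difficulty b, and lower asymptote (guessing) c."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# A student of average ability on an average-difficulty item where blind
# guessing succeeds 25% of the time (hypothetical values):
p = p_correct_3pl(theta=0.0, a=1.0, b=0.0, c=0.25)
print(p)  # 0.625: halfway between the guessing floor 0.25 and 1.0
```

The guessing parameter c is what separates the 3PL from the 2PL model; with c = 0 the same function reduces to the 2PL used in several of the other records above.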

  12. A quantum theory account of order effects and conjunction fallacies in political judgments.

    PubMed

    Yearsley, James M; Trueblood, Jennifer S

    2017-09-06

    Are our everyday judgments about the world around us normative? Decades of research in the judgment and decision-making literature suggest the answer is no. If people's judgments do not follow normative rules, then what rules, if any, do they follow? Quantum probability theory is a promising new approach to modeling human behavior that is at odds with normative, classical rules. One key advantage of using quantum theory is that it explains multiple types of judgment errors using the same basic machinery, unifying what have previously been thought of as disparate phenomena. In this article, we test predictions from quantum theory related to the co-occurrence of two classic judgment phenomena, order effects and conjunction fallacies, using judgments about real-world events (related to the U.S. presidential primaries). We also show that our data obey two a priori, parameter-free constraints derived from quantum theory. Further, we examine two factors that moderate the effects, cognitive thinking style (as measured by the Cognitive Reflection Test) and political ideology.

  13. Theory and experiment in gravitational physics

    NASA Technical Reports Server (NTRS)

    Will, C. M.

    1981-01-01

    New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experiment. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.

  14. Theory and experiment in gravitational physics

    NASA Astrophysics Data System (ADS)

    Will, C. M.

    New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experiment. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.

  15. Assessment of Differential Item Functioning under Cognitive Diagnosis Models: The DINA Model Example

    ERIC Educational Resources Information Center

    Li, Xiaomin; Wang, Wen-Chung

    2015-01-01

    The assessment of differential item functioning (DIF) is routinely conducted to ensure test fairness and validity. Although many DIF assessment methods have been developed in the context of classical test theory and item response theory, they are not applicable for cognitive diagnosis models (CDMs), as the underlying latent attributes of CDMs are…

  16. An Introduction to Item Response Theory for Health Behavior Researchers

    ERIC Educational Resources Information Center

    Warne, Russell T.; McKyer, E. J. Lisako; Smith, Matthew L.

    2012-01-01

    Objective: To introduce item response theory (IRT) to health behavior researchers by contrasting it with classical test theory and providing an example of IRT in health behavior. Method: Demonstrate IRT by fitting the 2PL model to substance-use survey data from the Adolescent Health Risk Behavior questionnaire (n = 1343 adolescents). Results: An…

  17. Practice and Problems in Language Testing 5. Non-Classical Test Theory; Final Examinations in Secondary Schools. Papers Presented at the International Language Testing Symposium (5th, Arnhem, Netherlands, March 25-26, 1982).

    ERIC Educational Resources Information Center

    van Weeren, J., Ed.

    Presented in this symposium reader are nine papers, four of which deal with the theory and impact of the Rasch model on language testing and five of which discuss final examinations in secondary schools in both general and specific terms. The papers are: "Introduction to Rasch Measurement: Some Implications for Language Testing" (J. J.…

  18. Mixed Quantum/Classical Theory for Molecule-Molecule Inelastic Scattering: Derivations of Equations and Application to N2 + H2 System.

    PubMed

    Semenov, Alexander; Babikov, Dmitri

    2015-12-17

    The mixed quantum/classical theory (MQCT) for inelastic scattering of two molecules is developed, in which the internal (rotational, vibrational) motion of both collision partners is treated with quantum mechanics, and the molecule-molecule scattering (translational motion) is described by classical trajectories. The resultant MQCT formalism includes a system of coupled differential equations for quantum probability amplitudes, and the classical equations of motion in the mean-field potential. Numerical tests of this theory are carried out for several of the most important rotational state-to-state transitions in the N2 + H2 system, in a broad range of collision energies. Apart from scattering resonances (at low collision energies), excellent agreement with full-quantum results is obtained, including the excitation thresholds, the maxima of cross sections, and even some smaller features, such as slight oscillations of energy dependencies. Most importantly, at higher energies the results of MQCT are nearly identical to the full quantum results, which makes this approach a good alternative to the full-quantum calculations that become computationally expensive at higher collision energies and for heavier collision partners. Extensions of this theory to include vibrational transitions or general asymmetric-top rotor (polyatomic) molecules are relatively straightforward.

  19. Antigravity Acts on Photons

    NASA Astrophysics Data System (ADS)

    Brynjolfsson, Ari

    2002-04-01

    Einstein's general theory of relativity assumes that photons don't change frequency as they move from Sun to Earth. This assumption is correct in classical physics. All experiments confirming general relativity are in the domain of classical physics. These include the tests by Pound et al. of the gravitational redshift of 14.4 keV photons; the rocket experiments by Vessot et al.; the Galileo solar redshift experiments by Krisher et al.; the gravitational deflection of light experiments by Riveros and Vucetich; and delay of echoes of radar signals passing close to Sun as observed by Shapiro et al. Bohr's correspondence principle assures that quantum mechanical theory of general relativity agrees with Einstein's classical theory when frequency and gravitational field gradient approach zero, or when photons cannot interact with the gravitational field. When we treat photons as quantum mechanical particles, we find that the gravitational force on photons is reversed (antigravity). This modified theory contradicts the equivalence principle, but is consistent with all experiments. Solar lines and distant stars are redshifted in accordance with the author's plasma redshift theory. These changes result in a beautiful consistent cosmology.

  20. The Importance of Relying on the Manual: Scoring Error Variance in the WISC-IV Vocabulary Subtest

    ERIC Educational Resources Information Center

    Erdodi, Laszlo A.; Richard, David C. S.; Hopwood, Christopher

    2009-01-01

    Classical test theory assumes that ability level has no effect on measurement error. Newer test theories, however, argue that the precision of a measurement instrument changes as a function of the examinee's true score. Research has shown that administration errors are common in the Wechsler scales and that subtests requiring subjective scoring…

  1. A Practitioner's Introduction to Equating with Primers on Classical Test Theory and Item Response Theory

    ERIC Educational Resources Information Center

    Ryan, Joseph; Brockmann, Frank

    2009-01-01

    Equating is an essential tool in educational assessment due to the critical role it plays in several key areas: establishing validity across forms and years; fairness; test security; and, increasingly, continuity in programs that release items or require ongoing development. Although the practice of equating is rooted in long-standing practices that…

  2. Interest Inventory Items as Attitude Eliciting Stimuli in Classical Conditioning: A Test of the A-R-D Theory. Language, Personality, and Cross-Cultural Study and Measurement of the Human A-R-D (Motivational) System.

    ERIC Educational Resources Information Center

    Gross, Michael C.; Staats, Arthur W.

    An experiment was conducted to test the hypothesis that interest inventory items elicit classically conditionable attitudinal responses. A higher-order conditioning procedure was used in which items from the Strong Vocational Interest Blank were employed as unconditioned stimuli and nonsense syllables as conditioned stimuli. Items for which the…

  3. An Individualized Approach to Introductory Physics

    ERIC Educational Resources Information Center

    Rigden, John S.

    1970-01-01

    Explains individualization of a physics course in terms of organization, testing, and philosophy. Organization of laboratory and lecture is focused on two topics, classical mechanics and relativity theory. The testing consists of quantitative and qualitative questions. (DS)

  4. Quantum and classical behavior in interacting bosonic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertzberg, Mark P.

    It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.

  5. Combination of classical test theory (CTT) and item response theory (IRT) analysis to study the psychometric properties of the French version of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF).

    PubMed

    Bourion-Bédès, Stéphanie; Schwan, Raymund; Epstein, Jonathan; Laprevote, Vincent; Bédès, Alex; Bonnet, Jean-Louis; Baumann, Cédric

    2015-02-01

    The study aimed to examine the construct validity and reliability of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) according to both classical test and item response theories. The psychometric properties of the French version of this instrument were investigated in a cross-sectional, multicenter study. A total of 124 outpatients with a substance dependence diagnosis participated in the study. Psychometric evaluation included descriptive analysis, internal consistency, test-retest reliability, and validity. The dimensionality of the instrument was explored using a combination of the classical test, confirmatory factor analysis (CFA), and an item response theory analysis, the Person Separation Index (PSI), in a complementary manner. The results of the Q-LES-Q-SF revealed that the questionnaire was easy to administer and the acceptability was good. The internal consistency and the test-retest reliability were 0.9 and 0.88, respectively. All items were significantly correlated with the total score and the SF-12 used in the study. The CFA with one factor model was good, and for the unidimensional construct, the PSI was found to be 0.902. The French version of the Q-LES-Q-SF yielded valid and reliable clinical assessments of the quality of life for future research and clinical practice involving French substance abusers. In response to recent questioning regarding the unidimensionality or bidimensionality of the instrument and according to the underlying theoretical unidimensional construct used for its development, this study suggests the Q-LES-Q-SF as a one-dimension questionnaire in French QoL studies.

  6. An Application of the Rasch Measurement Theory to an Assessment of Geometric Thinking Levels

    ERIC Educational Resources Information Center

    Stols, Gerrit; Long, Caroline; Dunne, Tim

    2015-01-01

    The purpose of this study is to apply the Rasch model to investigate both the Van Hiele theory for geometric development and an associated test. In terms of the test, the objective is to investigate the functioning of a classic 25-item instrument designed to identify levels of geometric proficiency. The dataset of responses by 244 students (106…

  7. Diagrammar in classical scalar field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste

    2011-09-15

    In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancellation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: we provide the Feynman diagrams of perturbation theory for a classical field theory; we give a super-formalism which links the quantum diagrams to the classical ones; we check perturbatively the fluctuation-dissipation theorem.

  8. Modeling the Severity of Drinking Consequences in First-Year College Women: An Item Response Theory Analysis of the Rutgers Alcohol Problem Index*

    PubMed Central

    Cohn, Amy M.; Hagman, Brett T.; Graff, Fiona S.; Noel, Nora E.

    2011-01-01

    Objective: The present study examined the latent continuum of alcohol-related negative consequences among first-year college women using methods from item response theory and classical test theory. Method: Participants (N = 315) were college women in their freshman year who reported consuming any alcohol in the past 90 days and who completed assessments of alcohol consumption and alcohol-related negative consequences using the Rutgers Alcohol Problem Index. Results: Item response theory analyses showed poor model fit for five items identified in the Rutgers Alcohol Problem Index. Two-parameter item response theory logistic models were applied to the remaining 18 items to examine estimates of item difficulty (i.e., severity) and discrimination parameters. The item difficulty parameters ranged from 0.591 to 2.031, and the discrimination parameters ranged from 0.321 to 2.371. Classical test theory analyses indicated that the omission of the five misfit items did not significantly alter the psychometric properties of the construct. Conclusions: Findings suggest that those consequences that had greater severity and discrimination parameters may be used as screening items to identify female problem drinkers at risk for an alcohol use disorder. PMID:22051212

  9. Using Rasch Analysis to Inform Rating Scale Development

    ERIC Educational Resources Information Center

    Van Zile-Tamsen, Carol

    2017-01-01

    The use of surveys, questionnaires, and rating scales to measure important outcomes in higher education is pervasive, but reliability and validity information is often based on problematic Classical Test Theory approaches. Rasch Analysis, based on Item Response Theory, provides a better alternative for examining the psychometric quality of rating…

  10. Reliability of Test Scores in Nonparametric Item Response Theory.

    ERIC Educational Resources Information Center

    Sijtsma, Klaas; Molenaar, Ivo W.

    1987-01-01

    Three methods for estimating reliability are studied within the context of nonparametric item response theory. Two were proposed originally by Mokken and a third is developed in this paper. Using a Monte Carlo strategy, these three estimation methods are compared with four "classical" lower bounds to reliability. (Author/JAZ)
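One of the "classical" lower bounds to reliability this record compares against is Cronbach's alpha. A minimal sketch on a tiny made-up dataset (the record's Monte Carlo comparison is not reproduced here):

```python
def cronbach_alpha(items):
    """Cronbach's alpha, a classical lower bound to test-score reliability.
    items: one list of scores per item, all of equal length (one score per
    examinee). Population variances are used throughout."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-examinee sum scores
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items scored for four examinees (made-up data):
data = [[1, 2, 3, 4], [2, 1, 4, 3], [1, 3, 2, 4]]
alpha = cronbach_alpha(data)
print(round(alpha, 3))  # 0.724
```

Alpha equals the reliability only under essential tau-equivalence; in general it underestimates it, which is why it is grouped with the other lower bounds in this record.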

  11. The Reliability of Criterion-Referenced Measures.

    ERIC Educational Resources Information Center

    Livingston, Samuel A.

    The assumptions of the classical test-theory model are used to develop a theory of reliability for criterion-referenced measures which parallels that for norm-referenced measures. It is shown that the Spearman-Brown formula holds for criterion-referenced measures and that the criterion-referenced reliability coefficient can be used to correct…
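The Spearman-Brown formula this record invokes predicts the reliability of a test lengthened (or shortened) by a factor k from its current reliability; a quick sketch:

```python
def spearman_brown(rho, k):
    """Spearman-Brown prophecy formula: predicted reliability of a test
    whose length is changed by factor k, given current reliability rho."""
    return k * rho / (1 + (k - 1) * rho)

# Doubling a test whose reliability is 0.60 (hypothetical value):
print(spearman_brown(0.60, 2))  # 0.75
```

The same formula with k = 0.5 predicts the reliability of a half-length test, which is how split-half reliability estimates are stepped back up to full length.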

  12. Socio-Demographic Determinants of Economic Growth: Age-Structure, Preindustrial Heritage and Sociolinguistic Integration

    ERIC Educational Resources Information Center

    Crenshaw, Edward; Robison, Kristopher

    2010-01-01

    This study establishes a socio-demographic theory of international development derived from selected classical and contemporary sociological theories. Four hypotheses are tested: (1) population growth's effect on development depends on age-structure; (2) historic population density (used here as an indicator of preindustrial social complexity)…

  13. Cognitive Diagnostic Attribute-Level Discrimination Indices

    ERIC Educational Resources Information Center

    Henson, Robert; Roussos, Louis; Douglas, Jeff; He, Xuming

    2008-01-01

    Cognitive diagnostic models (CDMs) model the probability of correctly answering an item as a function of an examinee's attribute mastery pattern. Because estimation of the mastery pattern involves more than a continuous measure of ability, reliability concepts introduced by classical test theory and item response theory do not apply. The cognitive…

  14. The Development of a Psychometrically-Sound Instrument to Measure Teachers' Multidimensional Attitudes toward Inclusive Education

    ERIC Educational Resources Information Center

    Mahat, Marian

    2008-01-01

    The "Multidimensional Attitudes toward Inclusive Education Scale" (MATIES) was developed to effectively measure affective, cognitive and behavioural aspects of attitudes, within the realm of inclusive education that includes physical, social and curricular inclusion. Models within Item Response Theory and Classical Test Theory were used…

  15. A Theoretical and Empirical Comparison of Three Approaches to Achievement Testing.

    ERIC Educational Resources Information Center

    Haladyna, Tom; Roid, Gale

    Three approaches to the construction of achievement tests are compared: construct, operational, and empirical. The construct approach is based upon classical test theory and measures an abstract representation of the instructional objectives. The operational approach specifies instructional intent through instructional objectives, facet design,…

  16. Quantum theory for 1D X-ray free electron laser

    DOE PAGES

    Anisimov, Petr Mikhaylovich

    2017-09-19

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.

  17. Non-Noetherian symmetries for oscillators in classical mechanics and in field theory

    NASA Technical Reports Server (NTRS)

    Hojman, Sergio A.; Delajara, Jamie; Pena, Leda

    1995-01-01

    Infinitely many new conservation laws both for free fields as well as for test fields evolving on a given gravitational background are presented. The conserved currents are constructed using the field theoretical counterpart of a recently discovered non-Noetherian symmetry which gives rise to a new way of solving the classical small oscillations problem. Several examples are discussed.

  18. Experimental non-classicality of an indivisible quantum system.

    PubMed

    Lapkiewicz, Radek; Li, Peizhe; Schaeff, Christoph; Langford, Nathan K; Ramelow, Sven; Wieśniak, Marcin; Zeilinger, Anton

    2011-06-22

    In contrast to classical physics, quantum theory demands that not all properties can be simultaneously well defined; the Heisenberg uncertainty principle is a manifestation of this fact. Alternatives have been explored--notably theories relying on joint probability distributions or non-contextual hidden-variable models, in which the properties of a system are defined independently of their own measurement and any other measurements that are made. Various deep theoretical results imply that such theories are in conflict with quantum mechanics. Simpler cases demonstrating this conflict have been found and tested experimentally with pairs of quantum bits (qubits). Recently, an inequality satisfied by non-contextual hidden-variable models and violated by quantum mechanics for all states of two qubits was introduced and tested experimentally. A single three-state system (a qutrit) is the simplest system in which such a contradiction is possible; moreover, the contradiction cannot result from entanglement between subsystems, because such a three-state system is indivisible. Here we report an experiment with single photonic qutrits which provides evidence that no joint probability distribution describing the outcomes of all possible measurements--and, therefore, no non-contextual theory--can exist. Specifically, we observe a violation of the Bell-type inequality found by Klyachko, Can, Binicioğlu and Shumovsky. Our results illustrate a deep incompatibility between quantum mechanics and classical physics that cannot in any way result from entanglement.

  19. A Strategy for Replacing Sum Scoring

    ERIC Educational Resources Information Center

    Ramsay, James O.; Wiberg, Marie

    2017-01-01

    This article promotes the use of modern test theory in testing situations where sum scores for binary responses are now used. It directly compares the efficiencies and biases of classical and modern test analyses and finds an improvement in the root mean squared error of ability estimates of about 5% for two designed multiple-choice tests and…

  20. Determination of angle of light deflection in higher-derivative gravity theories

    NASA Astrophysics Data System (ADS)

    Xu, Chenmei; Yang, Yisong

    2018-03-01

    Gravitational light deflection is known as one of the three classical tests of general relativity, and the angle of deflection may be computed explicitly using approximate or exact solutions describing the gravitational force generated from a point mass. In various generalized gravity theories, however, such explicit determination is often impossible due to the difficulty in obtaining an exact expression for the deflection angle. In this work, we present some highly effective globally convergent iterative methods to determine the angle of semiclassical gravitational deflection in higher- and infinite-derivative formalisms of quantum gravity theories. We also establish the universal properties that the deflection angle always stays below the classical Einstein angle and is a strictly decreasing function of the incident photon energy, in these formalisms.
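For reference, the classical Einstein angle that the paper's bound is measured against is, to leading order, θ = 4GM/(c²b) for impact parameter b; for a light ray grazing the Sun this gives the famous value of about 1.75 arcseconds. A quick check with standard solar constants (not values from the paper):

```python
import math

G_M_SUN = 1.32712440018e20  # heliocentric gravitational parameter GM, m^3/s^2
C = 299792458.0             # speed of light, m/s
R_SUN = 6.957e8             # nominal solar radius, used as impact parameter b, m

def einstein_deflection_arcsec(gm, b):
    """Leading-order GR deflection angle theta = 4GM/(c^2 b), in arcseconds."""
    theta_rad = 4 * gm / (C ** 2 * b)
    return math.degrees(theta_rad) * 3600

theta = einstein_deflection_arcsec(G_M_SUN, R_SUN)
print(round(theta, 2))  # 1.75
```

In the higher-derivative theories the paper studies there is no such closed form, which is why iterative methods are needed; the value above is only the classical ceiling the paper proves those angles stay below.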

  1. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice

    NASA Astrophysics Data System (ADS)

    Bonilla, L. L.; Carretero, M.; Segura, A.

    2017-12-01

    When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  2. Two-dimensional collective electron magnetotransport, oscillations, and chaos in a semiconductor superlattice.

    PubMed

    Bonilla, L L; Carretero, M; Segura, A

    2017-12-01

    When quantized, classically chaotic single-particle systems leave traces such as eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support the idea that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period-doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.

  3. Testing strong-segregation theory against self-consistent-field theory for block copolymer melts

    NASA Astrophysics Data System (ADS)

    Matsen, M. W.

    2001-06-01

    We introduce a highly efficient self-consistent-field theory (SCFT) method for examining the cylindrical and spherical block copolymer morphologies using a standard unit cell approximation (UCA). The method is used to calculate the classical diblock copolymer phase boundaries deep into the strong-segregation regime, where they can be compared with recent improvements to strong-segregation theory (SST). The comparison suggests a significant discrepancy between the two theories indicating that our understanding of strongly stretched polymer brushes is still incomplete.

  4. Density-functional theory simulation of large quantum dots

    NASA Astrophysics Data System (ADS)

    Jiang, Hong; Baranger, Harold U.; Yang, Weitao

    2003-10-01

    Kohn-Sham spin-density functional theory provides an efficient and accurate model for studying electron-electron interaction effects in quantum dots, but its application to large systems is a challenge. Here an efficient method for the simulation of quantum dots using density-functional theory is developed; it includes the particle-in-the-box representation of the Kohn-Sham orbitals, an efficient conjugate-gradient method to directly minimize the total energy, a Fourier convolution approach for the calculation of the Hartree potential, and a simplified multigrid technique to accelerate the convergence. We test the methodology in a two-dimensional model system and show that numerical studies of large quantum dots with several hundred electrons become computationally affordable. In the noninteracting limit, the classical dynamics of the system we study can be continuously varied from integrable to fully chaotic. The qualitative difference in the noninteracting classical dynamics has an effect on the quantum properties of the interacting system: integrable classical dynamics leads to higher-spin states and a broader distribution of spacings between Coulomb blockade peaks.

  5. Computerized Adaptive Performance Evaluation.

    DTIC Science & Technology

    1980-02-01

    based on classical psychological test theory, with the result that the obtained measurements and statements of achievement or performance have... psychological aspects of the achievement testing environment. Results... Applications of Item Characteristic Curve Models and Adaptive Testing Strategies (ICC)...of immediate knowledge of results and adaptive testing on ability test performance (Research Report 76-4). Minneapolis: Department of Psychology

  6. Theory of mind in early psychosis.

    PubMed

    Langdon, Robyn; Still, Megan; Connors, Michael H; Ward, Philip B; Catts, Stanley V

    2014-08-01

    A deficit in theory of mind--the ability to infer and reason about the mental states of others--might underpin the poor social functioning of patients with psychosis. However, there is considerable variation in how such a deficit is assessed. The current study compared three classic tests of theory of mind in terms of their ability to detect impairment in patients in the early stages of psychosis. Twenty-three patients within 2 years of their first psychotic episode and 19 healthy controls received picture-sequencing, joke-appreciation and story-comprehension tests of theory of mind. Whereas the picture-sequencing and joke-appreciation tests successfully detected a selective theory-of-mind deficit in patients, the story-comprehension test did not. The findings suggest that tests that place minimal demands on language processing and involve indirect, rather than explicit, instructions to assess theory of mind might be best suited to detecting theory-of-mind impairment in early stages of psychosis. © 2013 Wiley Publishing Asia Pty Ltd.

  7. Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz

    An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…

  8. The value of item response theory in clinical assessment: a review.

    PubMed

    Thomas, Michael L

    2011-09-01

    Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical assessment are reviewed to appraise its current and potential value. Benefits of IRT include comprehensive analyses and reduction of measurement error, creation of computer adaptive tests, meaningful scaling of latent variables, objective calibration and equating, evaluation of test and item bias, greater accuracy in the assessment of change due to therapeutic intervention, and evaluation of model and person fit. The theory may soon reinvent the manner in which tests are selected, developed, and scored. Although challenges remain to the widespread implementation of IRT, its application to clinical assessment holds great promise. Recommendations for research, test development, and clinical practice are provided.
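
    As a concrete illustration of the measurement model behind the benefits listed above, here is a minimal sketch of the two-parameter logistic (2PL) item response function; the function name and parameter values are illustrative, not drawn from the review:

```python
import numpy as np

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability of a correct
    response given latent trait theta, item discrimination a, and item
    difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Illustrative item: moderate discrimination, average difficulty.
# The curve rises from near 0 for low-trait examinees to near 1 for
# high-trait examinees, passing through 0.5 exactly at theta = b.
theta = np.linspace(-3, 3, 7)
print(np.round(p_correct_2pl(theta, a=1.2, b=0.0), 3))
```

    Latent-variable scaling, test equating, and computer adaptive testing all build on curves of this form, which is what gives IRT its advantages over sum scores.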

  9. A simple test of expected utility theory using professional traders.

    PubMed

    List, John A; Haigh, Michael S

    2005-01-18

    We compare behavior across students and professional traders from the Chicago Board of Trade in a classic Allais paradox experiment. Our experiment tests whether independence, a necessary condition in expected utility theory, is systematically violated. We find that both students and professionals exhibit some behavior consistent with the Allais paradox, but the data pattern does suggest that the trader population falls prey to the Allais paradox less frequently than the student population.

  10. Force-field functor theory: classical force-fields which reproduce equilibrium quantum distributions

    PubMed Central

    Babbush, Ryan; Parkhill, John; Aspuru-Guzik, Alán

    2013-01-01

    Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map is unique and must exist. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective potential pairs then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory. PMID:24790954
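
    The Feynman-Hibbs starting point mentioned above is often quoted in its quadratic form, in which the classical potential is corrected by a local curvature term (a standard textbook expression, included for context; it is not the authors' numerically learned map):

```latex
V_{\mathrm{FH}}(x) \;=\; V(x) \;+\; \frac{\beta\hbar^{2}}{24\,m}\,V''(x),
\qquad \beta = \frac{1}{k_{B}T}
```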

  11. Pressure broadening of the electric dipole and Raman lines of CO2 by argon: Stringent test of the classical impact theory at different temperatures on a benchmark system

    NASA Astrophysics Data System (ADS)

    Ivanov, Sergey V.; Buzykin, Oleg G.

    2016-12-01

    A classical approach is applied to calculate pressure broadening coefficients of CO2 vibration-rotational spectral lines perturbed by Ar. Three types of spectra are examined: electric dipole (infrared) absorption, and isotropic and anisotropic Raman Q branches. Simple and explicit formulae of the classical impact theory are used along with exact 3D Hamilton equations for CO2-Ar molecular motion. The calculations utilize the vibrationally independent, highly accurate ab initio potential energy surface (PES) of Hutson et al., expanded in a Legendre polynomial series up to lmax = 24. A new, improved algorithm for classical rotational frequency selection is applied. The dependences of CO2 half-widths on rotational quantum number J up to J = 100 are computed for temperatures between 77 and 765 K and compared with available experimental data as well as with the results of fully quantum dynamical calculations performed on the same PES. To make the picture complete, the predictions of two independent variants of the semi-classical Robert-Bonamy formalism for dipole absorption lines are included. This method, however, demonstrated poor accuracy at almost all temperatures. On the contrary, the classical broadening coefficients are in excellent agreement with both measurements and quantum results at all temperatures. The classical impact theory in its present variant can quickly and accurately produce the pressure broadening coefficients of spectral lines of linear molecules for any J value (including high J) using a full-dimensional ab initio-based PES in cases where other computational methods are either extremely time consuming (like the quantum close coupling method) or give erroneous results (like semi-classical methods).

  12. SPSS and SAS programs for generalizability theory analyses.

    PubMed

    Mushquash, Christopher; O'Connor, Brian P

    2006-08-01

    The identification and reduction of measurement errors is a major challenge in psychological testing. Most investigators rely solely on classical test theory for assessing reliability, whereas most experts have long recommended using generalizability theory instead. One reason for the common neglect of generalizability theory is the absence of analytic facilities for this purpose in popular statistical software packages. This article provides a brief introduction to generalizability theory, describes easy-to-use SPSS, SAS, and MATLAB programs for conducting the recommended analyses, and provides an illustrative example using data (N = 329) for the Rosenberg Self-Esteem Scale. Program output includes variance components, relative and absolute errors, generalizability coefficients, coefficients for D studies, and graphs of D study results.
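
    The core computation the article automates can be sketched in a few lines for the simplest fully crossed persons-by-items (p x i) design; this is a minimal illustration in Python, not the SPSS/SAS/MATLAB programs described in the article:

```python
import numpy as np

def g_coefficient_p_by_i(scores):
    """Relative G coefficient for a fully crossed persons-by-items
    (p x i) design, via variance components estimated from mean squares:
    E(MS_p) = var_pi + n_i * var_p and E(MS_pi) = var_pi."""
    n_p, n_i = scores.shape
    grand = scores.mean()
    ms_p = n_i * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
    resid = (scores
             - scores.mean(axis=1, keepdims=True)
             - scores.mean(axis=0, keepdims=True)
             + grand)
    ms_pi = np.sum(resid ** 2) / ((n_p - 1) * (n_i - 1))
    var_pi = ms_pi                          # interaction + error
    var_p = max((ms_p - ms_pi) / n_i, 0.0)  # universe-score variance
    # Relative G: universe-score variance over itself plus relative
    # error variance (interaction averaged over the n_i items).
    return var_p / (var_p + var_pi / n_i)

# Hypothetical data: 4 persons rated on 3 items.
scores = np.array([[9., 8., 9.], [5., 4., 6.], [8., 7., 9.], [3., 4., 2.]])
print(round(g_coefficient_p_by_i(scores), 3))
```

    A full G study would also report the item variance component and D-study projections for other numbers of items, as the described programs do.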

  13. Fundamental theories of waves and particles formulated without classical mass

    NASA Astrophysics Data System (ADS)

    Fry, J. L.; Musielak, Z. E.

    2010-12-01

    Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, physical consequences of using the classical mass by both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy of performing calculations can be attained by such theory. Natural units in connection with the presented approach are also discussed and justification beyond dimensional analysis is given for the particular choice of such units.

  14. The contrasting roles of Planck's constant in classical and quantum theories

    NASA Astrophysics Data System (ADS)

    Boyer, Timothy H.

    2018-04-01

    We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles it plays in the foundations of the two theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one in which it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.

  15. Statistical correlation analysis for comparing vibration data from test and analysis

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.; Purves, L. R.; Hershfeld, D. J.

    1986-01-01

    A theory was developed to compare vibration modes obtained by NASTRAN analysis with those obtained experimentally. Because many more analytical modes can be obtained than experimental modes, the analytical set was treated as expansion functions for putting both sources in comparative form. The dimensional symmetry was developed for three general cases: a nonsymmetric whole model compared with a nonsymmetric whole structural test, a symmetric analytical portion compared with a symmetric experimental portion, and a symmetric analytical portion compared with a whole experimental test. The theory was coded and a statistical correlation program was installed as a utility. The theory is demonstrated with small classical structures.

  16. Taking-On: A Grounded Theory of Addressing Barriers in Task Completion

    ERIC Educational Resources Information Center

    Austinson, Julie Ann

    2011-01-01

    This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…

  17. On the classic and modern theories of matching.

    PubMed

    McDowell, J J

    2005-07-01

    Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power-function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
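
    For readers unfamiliar with the equations at issue, the two forms can be written as follows, with B1 and B2 the response rates and r1 and r2 the reinforcement rates on two alternatives, and with b and a the bias and sensitivity parameters of the power-function version (standard matching-law notation, included for reference, not reproduced from the article):

```latex
\frac{B_1}{B_1 + B_2} = \frac{r_1}{r_1 + r_2}
\quad\text{(classic matching)},
\qquad
\frac{B_1}{B_2} = b\left(\frac{r_1}{r_2}\right)^{a}
\quad\text{(modern, power-function matching)}
```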

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anisimov, Petr Mikhaylovich

    Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches the electron energy spread, push the classical theory to its limits of applicability. Despite substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to the design and development of future X-ray sources. We offer a new approach to formulating the quantum theory of 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. Finally, we exploit this connection to draw conclusions about the quantum nature of electrons and the generated radiation in terms of FEL variables.

  19. The Importance of the Assumption of Uncorrelated Errors in Psychometric Theory

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.; Patelis, Thanos

    2015-01-01

    A critical discussion of the assumption of uncorrelated errors in classical psychometric theory and its applications is provided. It is pointed out that this assumption is essential for a number of fundamental results and underlies the concept of parallel tests, the Spearman-Brown's prophecy and the correction for attenuation formulas as well as…
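
    The two formulas named in the abstract take the following standard classical-test-theory forms, where rho_xx' and rho_yy' are test reliabilities and k is the factor by which a test is lengthened (textbook expressions, included here for reference):

```latex
\rho_{kk'} = \frac{k\,\rho_{xx'}}{1 + (k-1)\,\rho_{xx'}}
\quad\text{(Spearman--Brown prophecy)},
\qquad
\rho_{T_x T_y} = \frac{\rho_{xy}}{\sqrt{\rho_{xx'}\,\rho_{yy'}}}
\quad\text{(correction for attenuation)}
```

    Both results presuppose uncorrelated measurement errors, which is why the assumption discussed in the article is so consequential.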

  20. Classical Field Theory and the Stress-Energy Tensor

    NASA Astrophysics Data System (ADS)

    Swanson, Mark S.

    2015-09-01

    This book is a concise introduction to the key concepts of classical field theory for beginning graduate students and advanced undergraduate students who wish to study the unifying structures and physical insights provided by classical field theory without dealing with the additional complication of quantization. In that regard, there are many important aspects of field theory that can be understood without quantizing the fields. These include the action formulation, Galilean and relativistic invariance, traveling and standing waves, spin angular momentum, gauge invariance, subsidiary conditions, fluctuations, spinor and vector fields, conservation laws and symmetries, and the Higgs mechanism, all of which are often treated briefly in a course on quantum field theory. The variational form of classical mechanics and continuum field theory are both developed in the time-honored graduate level text by Goldstein et al (2001). An introduction to classical field theory from a somewhat different perspective is available in Soper (2008). Basic classical field theory is often treated in books on quantum field theory. Two excellent texts where this is done are Greiner and Reinhardt (1996) and Peskin and Schroeder (1995). Green's function techniques are presented in Arfken et al (2013).

  1. A simple test of expected utility theory using professional traders

    PubMed Central

    List, John A.; Haigh, Michael S.

    2005-01-01

    We compare behavior across students and professional traders from the Chicago Board of Trade in a classic Allais paradox experiment. Our experiment tests whether independence, a necessary condition in expected utility theory, is systematically violated. We find that both students and professionals exhibit some behavior consistent with the Allais paradox, but the data pattern does suggest that the trader population falls prey to the Allais paradox less frequently than the student population. PMID:15634739

  2. A quantum-classical theory with nonlinear and stochastic dynamics

    NASA Astrophysics Data System (ADS)

    Burić, N.; Popović, D. B.; Radonjić, M.; Prvanović, S.

    2014-12-01

    The method of constrained dynamical systems on the quantum-classical phase space is utilized to develop a theory of quantum-classical hybrid systems. Effects of the classical degrees of freedom on the quantum part are modeled using an appropriate constraint, and the interaction also includes the effects of neglected degrees of freedom. Dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.

  3. Quantum theory of electromagnetic fields in a cosmological quantum spacetime

    NASA Astrophysics Data System (ADS)

    Lewandowski, Jerzy; Nouri-Zonoz, Mohammad; Parvizi, Ali; Tavakoli, Yaser

    2017-11-01

    The theory of quantum fields propagating on an isotropic cosmological quantum spacetime is reexamined by generalizing the scalar test field to an electromagnetic (EM) vector field. For any given polarization of the EM field on the classical background, the Hamiltonian can be written in the form of the Hamiltonian of a set of decoupled harmonic oscillators, each corresponding to a single mode of the field. In transition from the classical to quantum spacetime background, following the technical procedure given by Ashtekar et al. [Phys. Rev. D 79, 064030 (2009), 10.1103/PhysRevD.79.064030], a quantum theory of the test EM field on an effective (dressed) spacetime emerges. The nature of this emerging dressed geometry is independent of the chosen polarization, but it may depend on the energy of the corresponding field mode. Specifically, when the backreaction of the field on the quantum geometry is negligible (i.e., a test field approximation is assumed), all field modes probe the same effective background independent of the mode's energy. However, when the backreaction of the field modes on the quantum geometry is significant, by employing a Born-Oppenheimer approximation, it is shown that a rainbow (i.e., a mode-dependent) metric emerges. The emergence of this mode-dependent background in the Planck regime may have a significant effect on the creation of quantum particles. The production amount on the dressed background is computed and is compared with the familiar results on the classical geometry.

  4. Delamination Analysis Of Composite Curved Bars

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1990-01-01

    Classical anisotropic elasticity theory used to construct "multilayer" composite semicircular curved bar subjected to end forces and end moments. Radial location and intensity of open-mode delamination stress calculated and compared with results obtained from anisotropic continuum theory and from finite element method. Multilayer theory gave more accurate predictions of location and intensity of open-mode delamination stress. Currently being applied to predict open-mode delamination stress concentrations in horse-shoe-shaped composite test coupons.

  5. Evaluating the validity of the Work Role Functioning Questionnaire (Canadian French version) using classical test theory and item response theory.

    PubMed

    Hong, Quan Nha; Coutu, Marie-France; Berbiche, Djamal

    2017-01-01

    The Work Role Functioning Questionnaire (WRFQ) was developed to assess workers' perceived ability to perform job demands and is used to monitor presenteeism. Still, few studies on its validity can be found in the literature. The purpose of this study was to assess the items and factorial composition of the Canadian French version of the WRFQ (WRFQ-CF). Two measurement approaches were used to test the WRFQ-CF: Classical Test Theory (CTT) and non-parametric Item Response Theory (IRT). A total of 352 completed questionnaires were analyzed. Four-factor and three-factor models were tested and showed good fit with 14 items (Root Mean Square Error of Approximation (RMSEA) = 0.06, Standardized Root Mean Square Residual (SRMR) = 0.04, Bentler Comparative Fit Index (CFI) = 0.98) and 17 items (RMSEA = 0.059, SRMR = 0.048, CFI = 0.98), respectively. Using IRT, 13 problematic items were identified, of which 9 were common with CTT. This study tested different models; fewer problematic items were found in the three-factor model. Using non-parametric IRT together with CTT for item purification gave complementary results. IRT is still scarcely used and can be an interesting alternative method for enhancing the quality of a measurement instrument. More studies are needed on the WRFQ-CF to refine its items and factorial composition.

  6. Analyzing force concept inventory with item response theory

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Bao, Lei

    2010-10-01

    Item response theory is a popular assessment method used in education. It rests on a probabilistic framework that relates students' innate ability to their performance on test questions. Item response theory transforms students' raw test scores into a scaled proficiency score, which can be used to compare results obtained with different test questions. The scaled score also addresses the issues of ceiling effects and guessing, which commonly exist in quantitative assessment. We used item response theory to analyze the force concept inventory (FCI). Our results show that item response theory can be useful for analyzing physics concept surveys such as the FCI and produces results about the individual questions and student performance that are beyond the capability of classical statistics. The theory yields detailed measurement parameters regarding the difficulty, discrimination features, and probability of a correct guess for each of the FCI questions.
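
    The three item parameters mentioned at the end of the abstract (difficulty, discrimination, and probability of a correct guess) correspond to the three-parameter logistic (3PL) model; a minimal sketch follows, with illustrative rather than fitted FCI values:

```python
import numpy as np

def p_correct_3pl(theta, a, b, c):
    """3PL item response function: discrimination a, difficulty b, and
    lower asymptote c (the probability of a correct guess), suited to
    multiple-choice items such as those on the FCI."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# A very low-ability student still succeeds at roughly the guessing
# rate c; on a five-option multiple-choice item c is near 0.2.
print(round(p_correct_3pl(-4.0, a=1.5, b=0.5, c=0.2), 3))
```

    Fitting a, b, and c per question is what lets the analysis separate genuinely difficult items from items that are merely easy to guess.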

  7. Reporting Subscores Using R: A Software Review

    ERIC Educational Resources Information Center

    Dai, Shenghai; Svetina, Dubravka; Wang, Xiaolin

    2017-01-01

    There is an increasing interest in reporting test subscores for diagnostic purposes. In this article, we review nine popular R packages (subscore, mirt, TAM, sirt, CDM, NPCD, lavaan, sem, and OpenMX) that are capable of implementing subscore-reporting methods within one or more frameworks including classical test theory, multidimensional item…

  8. Analysis of Added Value of Subscores with Respect to Classification

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2014-01-01

    Brennan noted that users of test scores often want (indeed, demand) that subscores be reported, along with total test scores, for diagnostic purposes. Haberman suggested a method based on classical test theory (CTT) to determine if subscores have added value over the total score. One way to interpret the method is that a subscore has added value…
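
    Haberman's CTT-based criterion is commonly stated as follows: an observed subscore s has added value only if it predicts the true subscore S better than the observed total score x does, i.e. its proportional reduction in mean squared error (PRMSE) is larger (our paraphrase of the criterion, not a formula quoted from the article):

```latex
\mathrm{PRMSE}(s) > \mathrm{PRMSE}(x)
\quad\Longleftrightarrow\quad
\rho^{2}(S, s) > \rho^{2}(S, x)
```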

  9. Using Rasch Measurement to Score, Evaluate, and Improve Examinations in an Anatomy Course

    ERIC Educational Resources Information Center

    Royal, Kenneth D.; Gilliland, Kurt O.; Kernick, Edward T.

    2014-01-01

    Any examination that involves moderate to high stakes implications for examinees should be psychometrically sound and legally defensible. Currently, there are two broad and competing families of test theories that are used to score examination data. The majority of instructors outside the high-stakes testing arena rely on classical test theory…

  10. Application of an IRT Polytomous Model for Measuring Health Related Quality of Life

    ERIC Educational Resources Information Center

    Tejada, Antonio J. Rojas; Rojas, Oscar M. Lozano

    2005-01-01

    Background: Item Response Theory (IRT) has advantages over Classical Test Theory (CTT) for measuring Health Related Quality of Life (HRQOL). Objectives: To present the results of the application of a polytomous model based on IRT, specifically the Rating Scale Model (RSM), to measure HRQOL with the EORTC QLQ-C30. Methods: 103…

  11. An Investigation of the Accuracy of Alternative Methods of True Score Estimation in High-Stakes Mixed-Format Examinations.

    ERIC Educational Resources Information Center

    Klinger, Don A.; Rogers, W. Todd

    2003-01-01

    The estimation accuracy of procedures based on classical test score theory and item response theory (generalized partial credit model) were compared for examinations consisting of multiple-choice and extended-response items. Analysis of British Columbia Scholarship Examination results found an error rate of about 10 percent for both methods, with…

  12. A Cognitive Diagnostic Modeling of Attribute Mastery in Massachusetts, Minnesota, and the U.S. National Sample Using the TIMSS 2007

    ERIC Educational Resources Information Center

    Lee, Young-Sun; Park, Yoon Soo; Taylan, Didem

    2011-01-01

    Studies of international mathematics achievement such as the Trends in Mathematics and Science Study (TIMSS) have employed classical test theory and item response theory to rank individuals within a latent ability continuum. Although these approaches have provided insights into comparisons between countries, they have yet to examine how specific…

  13. Mixed quantum/classical theory of rotationally and vibrationally inelastic scattering in space-fixed and body-fixed reference frames

    NASA Astrophysics Data System (ADS)

    Semenov, Alexander; Babikov, Dmitri

    2013-11-01

    We formulated the mixed quantum/classical theory for the rotationally and vibrationally inelastic scattering process in a diatomic molecule + atom system. Two versions of the theory are presented: first in the space-fixed and second in the body-fixed reference frame. The first version is easy to derive and the resultant equations of motion are transparent, but the state-to-state transition matrix is complex-valued and dense. Such calculations may be computationally demanding for heavier molecules and/or higher temperatures, when the number of accessible channels becomes large. In contrast, the second version of the theory requires some tedious derivations and the final equations of motion are rather complicated (not particularly intuitive). However, the state-to-state transitions are driven by real-valued sparse matrices of much smaller size. Thus, this formulation is the method of choice from the computational point of view, while the space-fixed formulation can serve as a test of the body-fixed equations of motion and of the code. Rigorous numerical tests were carried out for a model system to ensure that all equations, matrices, and computer codes in both formulations are correct.

  14. Generalized classical and quantum signal theories

    NASA Astrophysics Data System (ADS)

    Rundblad, E.; Labunets, V.; Novak, P.

    2005-05-01

    In this paper we develop two topics and show their inter- and cross-relation. The first centers on general notions of the generalized classical signal theory on finite Abelian hypergroups. The second concerns the generalized quantum hyperharmonic analysis of quantum signals (Hermitean operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.

  15. Using 20-million-year-old amber to test the super-Arrhenius behaviour of glass-forming systems.

    PubMed

    Zhao, Jing; Simon, Sindee L; McKenna, Gregory B

    2013-01-01

    Fossil amber offers the opportunity to investigate the dynamics of glass-forming materials far below the nominal glass transition temperature. This is important in the context of classical theory, as well as some new theories that challenge the idea of an 'ideal' glass transition. Here we report results from calorimetric and stress relaxation experiments using a 20-million-year-old Dominican amber. By performing the stress relaxation experiments in a step-wise fashion, we measured the relaxation time at each temperature and, above the fictive temperature of this 20-million-year-old glass, this is an upper bound to the equilibrium relaxation time. The results deviate dramatically from the expectation of classical theory and are consistent with some modern ideas, in which the diverging timescale signature of complex fluids disappears below the glass transition temperature.
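
    The super-Arrhenius expectation being tested is commonly parameterized by the Vogel-Fulcher-Tammann (VFT) form, in which the relaxation time diverges at a finite temperature T0 below the glass transition (a standard expression, included for context):

```latex
\tau(T) = \tau_{0}\,\exp\!\left(\frac{B}{T - T_{0}}\right)
```

    The amber results deviate from this diverging-timescale form below the glass transition temperature, which is what makes them consistent with the newer theories mentioned above.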

  16. First Test of Long-Range Collisional Drag via Plasma Wave Damping

    NASA Astrophysics Data System (ADS)

    Affolter, Matthew

    2017-10-01

    In magnetized plasmas, the rate of particle collisions is enhanced over classical predictions when the cyclotron radius rc is less than the Debye length λD. Classical theories describe local velocity scattering collisions with impact parameters ρ

  17. Metacognitive development of deaf children: lessons from the appearance-reality and false belief tasks.

    PubMed

    Courtin, Cyril; Melot, Anne-Marie

    2005-01-01

    'Theory of mind' development is now an important research field in deaf studies. Past research with the classic false belief task has consistently reported a delay in theory of mind development in deaf children born of hearing parents, while performance of second-generation deaf children is more problematic with some contradictory results. The present paper is aimed at testing the metacognitive abilities of deaf children on two tasks: the appearance-reality paradigm designed by Flavell, Flavell and Green (1983) and the classic false belief inference task (Wimmer & Perner, 1983; Hogrefe, Wimmer & Perner, 1986). Twenty-eight second-generation deaf children, 60 deaf children of hearing parents and 36 hearing children, aged 5 to 7, were tested and compared on three appearance-reality and three false belief items. Results show that early exposure to language, be it signed or oral, facilitates performance on the two theory of mind tasks. In addition, native signers equal hearing children in the appearance-reality task while surpassing them on the false belief one. The differences of performance patterns in the two tasks are discussed in terms of linguistic and metarepresentational development.

  18. Development of Nonword and Irregular Word Lists for Australian Grade 3 Students Using Rasch Analysis

    ERIC Educational Resources Information Center

    Callinan, Sarah; Cunningham, Everarda; Theiler, Stephen

    2014-01-01

    Many tests used in educational settings to identify learning difficulties endeavour to pick up only the lowest performers. Yet these tests are generally developed within a Classical Test Theory (CTT) paradigm that assumes that data do not have significant skew. Rasch analysis is more tolerant of skew and was used to validate two newly developed…

  19. Interlaminar shear stress effects on the postbuckling response of graphite-epoxy panels

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Knight, N. F., Jr.; Reddy, J. N.

    1990-01-01

    The influence of shear flexibility on overall postbuckling response was assessed, and transverse shear stress distributions in relation to panel failure were examined. Nonlinear postbuckling results are obtained for finite element models based on classical laminated plate theory and first-order shear deformation theory. Good correlation between test and analysis is obtained. The results presented analytically substantiate the experimentally observed failure mode.

  20. Introduction to Classical Density Functional Theory by a Computational Experiment

    ERIC Educational Resources Information Center

    Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel

    2014-01-01

    We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…

  1. Design Equations and Criteria of Orthotropic Composite Panels

    DTIC Science & Technology

    2013-05-01

    Appendix A, Classical Laminate Theory (CLT): In Section 6 of this report (NSWCCD-65-TR–2004/16A), preliminary design … determined using Classical Laminate Theory (CLT) to predict equivalent stiffness characteristics and first-ply strength. Note: CLT is valid for …
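
    The "equivalent stiffness characteristics" that CLT predicts follow from the standard laminate constitutive relation (textbook form, given here for orientation), which couples the in-plane force resultants N and moment resultants M to the midplane strains and curvatures through the A, B, and D stiffness matrices:

```latex
\begin{pmatrix} \mathbf{N} \\ \mathbf{M} \end{pmatrix}
=
\begin{pmatrix} \mathbf{A} & \mathbf{B} \\ \mathbf{B} & \mathbf{D} \end{pmatrix}
\begin{pmatrix} \boldsymbol{\varepsilon}^{0} \\ \boldsymbol{\kappa} \end{pmatrix}
```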

  2. Perturbative quantum gravity as a double copy of gauge theory.

    PubMed

    Bern, Zvi; Carrasco, John Joseph M; Johansson, Henrik

    2010-08-06

    In a previous paper we observed that (classical) tree-level gauge-theory amplitudes can be rearranged to display a duality between color and kinematics. Once this is imposed, gravity amplitudes are obtained using two copies of gauge-theory diagram numerators. Here we conjecture that this duality persists to all quantum loop orders and can thus be used to obtain multiloop gravity amplitudes easily from gauge-theory ones. As a nontrivial test, we show that the three-loop four-point amplitude of N=4 super-Yang-Mills theory can be arranged into a form satisfying the duality, and by taking double copies of the diagram numerators we obtain the corresponding amplitude of N=8 supergravity. We also remark on a nonsupersymmetric two-loop test based on pure Yang-Mills theory resulting in gravity coupled to an antisymmetric tensor and dilaton.
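
    Schematically, in standard BCJ notation (not quoted from the paper), a gauge amplitude is organized as a sum over cubic diagrams i with color factors c_i, kinematic numerators n_i, and propagator denominators D_i; the double copy replaces each color factor by a second numerator:

```latex
\mathcal{A}_n \;\sim\; \sum_i \frac{c_i\, n_i}{D_i}
\qquad\longrightarrow\qquad
\mathcal{M}_n \;\sim\; \sum_i \frac{n_i\, \tilde{n}_i}{D_i}
```

    When the numerators n_i satisfy the same Jacobi identities as the color factors c_i, the second sum yields the corresponding gravity amplitude.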

  3. Semiclassical evaluation of quantum fidelity

    NASA Astrophysics Data System (ADS)

    Vanicek, Jiri

    2004-03-01

    We present a numerically feasible semiclassical method to evaluate quantum fidelity (the Loschmidt echo) in a classically chaotic system. It was thought that such an evaluation would be intractable, but we show that a uniform semiclassical expression is not only tractable but also gives remarkably accurate numerical results for the standard map in both the Fermi-golden-rule and Lyapunov regimes. Because it allows a Monte Carlo evaluation, this uniform expression is accurate at times where there are 10^70 semiclassical contributions. Remarkably, the method also explicitly contains the ``building blocks'' of analytical theories in the recent literature, and thus permits a direct test of the approximations made by other authors in these regimes, rather than an a posteriori comparison with numerical results. We explain in more detail the extended validity of the classical perturbation approximation and thus provide a ``defense'' of linear response theory against the famous Van Kampen objection. We point out the potential use of our uniform expression in other areas, because it gives a most direct link between the quantum Feynman propagator based on the path integral and the semiclassical Van Vleck propagator based on the sum over classical trajectories. Finally, we test the applicability of our method in integrable and mixed systems.
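
    The quantity evaluated here, quantum fidelity (the Loschmidt echo), is conventionally defined as the squared overlap of the same initial state propagated under a Hamiltonian H and under a perturbed Hamiltonian H' (standard definition, for orientation):

```latex
M(t) = \left| \left\langle \psi_0 \right| e^{\,i H' t/\hbar}\, e^{-i H t/\hbar} \left| \psi_0 \right\rangle \right|^2
```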

  4. k-Cosymplectic Classical Field Theories: Tulczyjew and Skinner-Rusk Formulations

    NASA Astrophysics Data System (ADS)

    Rey, Angel M.; Román-Roy, Narciso; Salgado, Modesto; Vilariño, Silvia

    2012-06-01

    The k-cosymplectic Lagrangian and Hamiltonian formalisms of first-order classical field theories are reviewed and completed. In particular, they are stated for singular and almost-regular systems. Subsequently, several alternative formulations for k-cosymplectic first-order field theories are developed: First, generalizing the construction of Tulczyjew for mechanics, we give a new interpretation of the classical field equations. Second, the Lagrangian and Hamiltonian formalisms are unified by giving an extension of the Skinner-Rusk formulation on classical mechanics.

  5. Unifying inflation with ΛCDM epoch in modified f(R) gravity consistent with Solar System tests

    NASA Astrophysics Data System (ADS)

    Nojiri, Shin'ichi; Odintsov, Sergei D.

    2007-12-01

    We suggest two realistic f(R) and one F(G) modified gravities which are consistent with local tests and cosmological bounds. The typical property of such theories is the presence of effective cosmological constant epochs, such that early-time inflation and late-time cosmic acceleration are naturally unified within a single model. It is shown that classical instability does not appear here and that Newton's law is respected. The possible appearance of an anti-gravity regime and the related modification of the theory are also discussed.

  6. Field testing, refinement, and psychometric evaluation of a new measure of quality of care for assisted living.

    PubMed

    Rantz, Marilyn J; Aud, Myra A; Zwygart-Stauffacher, Mary; Mehr, David R; Petroski, Gregory F; Owen, Steven V; Madsen, Richard W; Flesner, Marcia; Conn, Vicki; Maas, Meridean

    2008-01-01

    Field test results are reported for the Observable Indicators of Nursing Home Care Quality Instrument-Assisted Living Version (OIQ-AL), an instrument designed to measure the quality of care in assisted living facilities after a brief 30-minute walk-through. The OIQ-AL was tested in 207 assisted-living facilities in two states using classical test theory, generalizability theory, and exploratory factor analysis. The 34-item scale has a coherent six-factor structure that conceptually describes the multidimensional concept of care quality in assisted living. The six factors can be logically clustered into process (Homelike and Caring, 21 items) and structure (Access and Choice; Lighting; Plants and Pets; Outdoor Spaces) subscales and into a total quality score. Classical test theory results indicate that most subscales and the total quality score from the OIQ-AL have acceptable interrater and test-retest reliabilities and strong internal consistency. Generalizability theory analyses reveal that the dependability of scores from the instrument is strong, particularly when a second observer conducts a site visit and independently completes an instrument, or when a single observer conducts two site visits and completes an instrument during each visit. Scoring guidelines based on the total sample of observations (N = 358) help those who want to use the measure to interpret both subscale and total scores. Content validity was supported by two expert panels of people experienced in the assisted-living field, and the content validity index calculated for the first version of the scale is high (3.43 on a four-point scale). The OIQ-AL gives reliable and valid scores for researchers, and may be useful for consumers, providers, and others interested in measuring quality of care in assisted-living facilities.

  7. Time evolution of linearized gauge field fluctuations on a real-time lattice

    NASA Astrophysics Data System (ADS)

    Kurkela, A.; Lappi, T.; Peuron, J.

    2016-12-01

    Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Due to instabilities, small quantum fluctuations on top of the classical background may significantly affect the dynamics of the system. In this paper we argue for the need for a numerical calculation of a system of classical gauge fields and small linearized fluctuations in a way that keeps the separation between the two manifest. We derive and test an explicit algorithm to solve these equations on the lattice, maintaining gauge invariance and Gauss' law.

  8. Fitting the Rasch Model to Account for Variation in Item Discrimination

    ERIC Educational Resources Information Center

    Weitzman, R. A.

    2009-01-01

    Building on the Kelley and Gulliksen versions of classical test theory, this article shows that a logistic model having only a single item parameter can account for varying item discrimination, as well as difficulty, by using item-test correlations to adjust incorrect-correct (0-1) item responses prior to an initial model fit. The fit occurs…

  9. Development and validation of the irritable bowel syndrome scale under the system of quality of life instruments for chronic diseases QLICD-IBS: combinations of classical test theory and generalizability theory.

    PubMed

    Lei, Pingguang; Lei, Guanghe; Tian, Jianjun; Zhou, Zengfen; Zhao, Miao; Wan, Chonghua

    2014-10-01

    This paper aims to develop the irritable bowel syndrome (IBS) scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-IBS) by the modular approach and to validate it by both classical test theory and generalizability theory. The QLICD-IBS was developed using programmed decision procedures with multiple nominal group and focus group discussions, in-depth interviews, and quantitative statistical procedures. One hundred twelve inpatients with IBS provided data measuring QOL three times, before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability, and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t tests, and the G and D studies of generalizability theory analysis. Multi-trait scaling analysis, correlation, and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. Test-retest reliability coefficients (Pearson r and intra-class correlation (ICC)) for the overall score and all domains were higher than 0.80; the internal consistency α for all domains at the two measurements was higher than 0.70 except for the social domain (0.55 and 0.67, respectively). The overall score and the scores for all domains/facets showed statistically significant changes after treatment, with moderate or higher effect sizes (standardized response mean, SRM) ranging from 0.72 to 1.02 at the domain level. G coefficients and indexes of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-IBS has good validity, reliability, and responsiveness, along with some distinctive features, and can be used as a quality of life instrument for patients with IBS.
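
    The internal-consistency coefficients reported above (Cronbach's α) are easy to compute from first principles. A minimal sketch follows; the 5 × 4 response matrix is fabricated for illustration and has no connection to the QLICD-IBS data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Fabricated responses: 5 respondents, 4 Likert-type items.
data = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [3, 4, 3, 3],
    [5, 5, 5, 4],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(data), 3))  # → 0.968
```

    Values above 0.70, as reported for most QLICD-IBS domains, are conventionally taken to indicate acceptable internal consistency.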

  10. Generalized probability theories: what determines the structure of quantum theory?

    NASA Astrophysics Data System (ADS)

    Janotta, Peter; Hinrichsen, Haye

    2014-08-01

    The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.

  11. The complex and quaternionic quantum bit from relativity of simultaneity on an interferometer

    NASA Astrophysics Data System (ADS)

    Garner, Andrew J. P.; Müller, Markus P.; Dahlsten, Oscar C. O.

    2017-12-01

    The patterns of fringes produced by an interferometer have long been important testbeds for our best contemporary theories of physics. Historically, interference has been used to contrast quantum mechanics with classical physics, but recently experiments have been performed that test quantum theory against even more exotic alternatives. A physically motivated family of theories are those where the state space of a two-level system is given by a sphere of arbitrary dimension. This includes classical bits, and real, complex and quaternionic quantum theory. In this paper, we consider relativity of simultaneity (i.e. that observers may disagree about the order of events at different locations) as applied to a two-armed interferometer, and show that this forbids most interference phenomena more complicated than those of complex quantum theory. If interference must depend on some relational property of the setting (such as path difference), then relativity of simultaneity will limit state spaces to standard complex quantum theory, or a subspace thereof. If this relational assumption is relaxed, we find one additional theory compatible with relativity of simultaneity: quaternionic quantum theory. Our results have consequences for current laboratory interference experiments: they have to be designed carefully to avoid rendering beyond-quantum effects invisible by relativity of simultaneity.

  12. The complex and quaternionic quantum bit from relativity of simultaneity on an interferometer.

    PubMed

    Garner, Andrew J P; Müller, Markus P; Dahlsten, Oscar C O

    2017-12-01

    The patterns of fringes produced by an interferometer have long been important testbeds for our best contemporary theories of physics. Historically, interference has been used to contrast quantum mechanics with classical physics, but recently experiments have been performed that test quantum theory against even more exotic alternatives. A physically motivated family of theories are those where the state space of a two-level system is given by a sphere of arbitrary dimension. This includes classical bits, and real, complex and quaternionic quantum theory. In this paper, we consider relativity of simultaneity (i.e. that observers may disagree about the order of events at different locations) as applied to a two-armed interferometer, and show that this forbids most interference phenomena more complicated than those of complex quantum theory. If interference must depend on some relational property of the setting (such as path difference), then relativity of simultaneity will limit state spaces to standard complex quantum theory, or a subspace thereof. If this relational assumption is relaxed, we find one additional theory compatible with relativity of simultaneity: quaternionic quantum theory. Our results have consequences for current laboratory interference experiments: they have to be designed carefully to avoid rendering beyond-quantum effects invisible by relativity of simultaneity.

  13. Analysis of the psychometric properties of the Multiple Sclerosis Impact Scale-29 (MSIS-29) in relapsing–remitting multiple sclerosis using classical and modern test theory

    PubMed Central

    Wyrwich, KW; Phillips, GA; Vollmer, T; Guo, S

    2016-01-01

    Background Investigations using classical test theory support the psychometric properties of the original version of the Multiple Sclerosis Impact Scale (MSIS-29v1), a disease-specific measure of multiple sclerosis (MS) impact (physical and psychological subscales). Later, assessments of the MSIS-29v1 in an MS community-based sample using Rasch analysis led to revisions of the instrument’s response options (MSIS-29v2). Objective The objective of this paper is to evaluate the psychometric properties of the MSIS-29v1 in a clinical trial cohort of relapsing–remitting MS patients (RRMS). Methods Data from 600 patients with RRMS enrolled in the SELECT clinical trial were used. Assessments were performed at baseline and at Weeks 12, 24, and 52. In addition to traditional psychometric analyses, Item Response Theory (IRT) and Rasch analysis were used to evaluate the measurement properties of the MSIS-29v1. Results Both MSIS-29v1 subscales demonstrated strong reliability, construct validity, and responsiveness. The IRT and Rasch analysis showed overall support for response category threshold ordering, person-item fit, and item fit for both subscales. Conclusions Both MSIS-29v1 subscales demonstrated robust measurement properties using classical, IRT, and Rasch techniques. Unlike previous research using a community-based sample, the MSIS-29v1 was found to be psychometrically sound to assess physical and psychological impairments in a clinical trial sample of patients with RRMS. PMID:28607741

  14. Analysis of the psychometric properties of the Multiple Sclerosis Impact Scale-29 (MSIS-29) in relapsing-remitting multiple sclerosis using classical and modern test theory.

    PubMed

    Bacci, E D; Wyrwich, K W; Phillips, G A; Vollmer, T; Guo, S

    2016-01-01

    Investigations using classical test theory support the psychometric properties of the original version of the Multiple Sclerosis Impact Scale (MSIS-29v1), a disease-specific measure of multiple sclerosis (MS) impact (physical and psychological subscales). Later, assessments of the MSIS-29v1 in an MS community-based sample using Rasch analysis led to revisions of the instrument's response options (MSIS-29v2). The objective of this paper is to evaluate the psychometric properties of the MSIS-29v1 in a clinical trial cohort of relapsing-remitting MS patients (RRMS). Data from 600 patients with RRMS enrolled in the SELECT clinical trial were used. Assessments were performed at baseline and at Weeks 12, 24, and 52. In addition to traditional psychometric analyses, Item Response Theory (IRT) and Rasch analysis were used to evaluate the measurement properties of the MSIS-29v1. Both MSIS-29v1 subscales demonstrated strong reliability, construct validity, and responsiveness. The IRT and Rasch analysis showed overall support for response category threshold ordering, person-item fit, and item fit for both subscales. Both MSIS-29v1 subscales demonstrated robust measurement properties using classical, IRT, and Rasch techniques. Unlike previous research using a community-based sample, the MSIS-29v1 was found to be psychometrically sound to assess physical and psychological impairments in a clinical trial sample of patients with RRMS.

  15. Null but not void: considerations for hypothesis testing.

    PubMed

    Shaw, Pamela A; Proschan, Michael A

    2013-01-30

    Standard statistical theory teaches us that once the null and alternative hypotheses have been defined for a parameter, the choice of the statistical test is clear. Standard theory does not teach us how to choose the null or alternative hypothesis appropriate to the scientific question of interest. Neither does it tell us that in some cases, depending on which alternatives are realistic, we may want to define our null hypothesis differently. Problems in statistical practice are frequently not as pristinely summarized as the classic theory in our textbooks. In this article, we present examples in statistical hypothesis testing in which seemingly simple choices are in fact rich with nuance that, when given full consideration, make the choice of the right hypothesis test much less straightforward. Published 2012. This article is a US Government work and is in the public domain in the USA.

  16. Classical and quantum analysis of repulsive singularities in four-dimensional extended supergravity

    NASA Astrophysics Data System (ADS)

    Gaida, I.; Hollmann, H. R.; Stewart, J. M.

    1999-07-01

    Non-minimal repulsive singularities ('repulsons') in extended supergravity theories are investigated. The short-distance antigravity properties of the repulsons are tested at the classical and the quantum level by a scalar test particle. Using a partial wave expansion it is shown that the particle is totally reflected at the origin. A high-frequency incoming particle undergoes a phase shift of π/2. However, the phase shift for a low-frequency particle depends upon the physical data of the repulson. The curvature singularity at a finite distance r_h turns out to be transparent for the scalar test particle, and the coordinate singularity at the origin serves as the repulsive barrier that bounces the particles back.

  17. Methodological issues regarding power of classical test theory (CTT) and item response theory (IRT)-based approaches for the comparison of patient-reported outcomes in two groups of patients - a simulation study

    PubMed Central

    2010-01-01

    Background Patient-Reported Outcomes (PRO) are increasingly used in clinical and epidemiological research. Two main types of analytical strategies exist for such data: classical test theory (CTT), based on observed scores, and models coming from Item Response Theory (IRT). However, whether IRT or CTT is the more appropriate method to analyse PRO data remains unknown. The statistical properties of CTT and IRT, regarding power and corresponding effect sizes, were compared. Methods Two-group cross-sectional studies were simulated for the comparison of PRO data using IRT or CTT-based analysis. For IRT, different scenarios were investigated according to whether item or person parameters were assumed to be known (with precision ranging from good to poor for item parameters) or unknown and therefore had to be estimated. The powers obtained with IRT or CTT were compared, and the parameters having the strongest impact on them were identified. Results When person parameters were assumed to be unknown and item parameters to be either known or not, the power achieved using IRT or CTT was similar and always lower than the expected power using the well-known sample size formula for normally distributed endpoints. The number of items had a substantial impact on power for both methods. Conclusion Without any missing data, IRT and CTT seem to provide comparable power. The classical sample size formula for CTT seems to be adequate under some conditions but is not appropriate for IRT. In IRT, it seems important to take account of the number of items to obtain an accurate formula. PMID:20338031
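
    A minimal sketch of the CTT arm of such a power study (not the authors' code): binary item responses are simulated under a Rasch model for two groups whose latent means differ, and a two-sample t-test on the observed sum scores gives a Monte Carlo estimate of power. The group sizes, item bank, and effect size below are illustrative assumptions.

```python
import math
import random

random.seed(12345)

N_PER_GROUP = 50
N_ITEMS = 20
EFFECT = 0.5  # latent mean difference between the two groups
# Item difficulties spread evenly over [-2, 2] (an illustrative item bank).
DIFFICULTIES = [-2.0 + 4.0 * j / (N_ITEMS - 1) for j in range(N_ITEMS)]

def sum_score(theta):
    """Observed CTT score: number of correct responses under a Rasch model."""
    score = 0
    for b in DIFFICULTIES:
        p = 1.0 / (1.0 + math.exp(-(theta - b)))  # Rasch response probability
        if random.random() < p:
            score += 1
    return score

def t_statistic(xs, ys):
    """Welch two-sample t statistic."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

N_SIM = 200
rejections = 0
for _ in range(N_SIM):
    g0 = [sum_score(random.gauss(0.0, 1.0)) for _ in range(N_PER_GROUP)]
    g1 = [sum_score(random.gauss(EFFECT, 1.0)) for _ in range(N_PER_GROUP)]
    if abs(t_statistic(g0, g1)) > 1.984:  # approx. two-sided 5% cutoff
        rejections += 1

power = rejections / N_SIM  # Monte Carlo power estimate for the CTT analysis
print(power)
```

    As the abstract notes, the number of items matters: shrinking N_ITEMS lowers the reliability of the sum score and hence the estimated power.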

  18. Telling and Not-Telling: A Classic Grounded Theory of Sharing Life-Stories

    ERIC Educational Resources Information Center

    Powers, Trudy Lee

    2013-01-01

    This study of "Telling and Not-Telling" was conducted using the classic grounded theory methodology (Glaser 1978, 1992, 1998; Glaser & Strauss, 1967). This unique methodology systematically and inductively generates conceptual theories from data. The goal is to discover theory that explains, predicts, and provides practical…

  19. Compression failure of angle-ply laminates

    NASA Technical Reports Server (NTRS)

    Peel, L. D.; Hyer, M. W.; Shuart, M. J.

    1992-01-01

    Test results from the compression loading of (±Θ/∓Θ)_6s angle-ply IM7-8551-7a specimens, 0° ≤ Θ ≤ 90°, are presented. The observed failure strengths and modes are discussed, and typical stress-strain relations are shown. Using classical lamination theory and the maximum stress criterion, an attempt is made to predict failure stress as a function of Θ. This attempt results in poor correlation with test results, and thus a more advanced model is used. The model, which is based on a geometrically nonlinear theory and was taken from previous work, includes the influence of observed layer waviness. The waviness is described by its wavelength and amplitude. The theory is briefly described and its results are correlated with test results. It is shown that, using the levels of waviness observed in the specimens, the correlation between predictions and observations is good.

  20. Dealing with Omitted and Not-Reached Items in Competence Tests: Evaluating Approaches Accounting for Missing Responses in Item Response Theory Models

    ERIC Educational Resources Information Center

    Pohl, Steffi; Gräfe, Linda; Rose, Norman

    2014-01-01

    Data from competence tests usually show a number of missing responses on test items due to both omitted and not-reached items. Different approaches for dealing with missing responses exist, and there are no clear guidelines on which of those to use. While classical approaches rely on an ignorable missing data mechanism, the most recently developed…

  1. Nonadiabatic Molecular Dynamics and Orthogonality Constrained Density Functional Theory

    NASA Astrophysics Data System (ADS)

    Shushkov, Philip Georgiev

    The exact quantum dynamics of realistic, multidimensional systems remains a formidable computational challenge. In many chemical processes, however, quantum effects such as tunneling, zero-point energy quantization, and nonadiabatic transitions play an important role. Therefore, approximate approaches that improve on the classical mechanical framework are of special practical interest. We propose a novel ring polymer surface hopping method for the calculation of chemical rate constants. The method blends two approaches, namely ring polymer molecular dynamics, which accounts for tunneling and zero-point energy quantization, and surface hopping, which incorporates nonadiabatic transitions. We test the method against exact quantum mechanical calculations for a one-dimensional, two-state model system. The method reproduces quite accurately the tunneling contribution to the rate and the distribution of reactants between the electronic states for this model system. Semiclassical instanton theory, an approach related to ring polymer molecular dynamics, accounts for tunneling by the use of periodic classical trajectories on the inverted potential energy surface. We study a model of electron transfer in solution, a chemical process where nonadiabatic events are prominent. By representing the tunneling electron with a ring polymer, we derive Marcus theory of electron transfer from semiclassical instanton theory after a careful analysis of the tunneling mode. We demonstrate that semiclassical instanton theory can recover the limit of Fermi's Golden Rule rate in a low-temperature, deep-tunneling regime. Mixed quantum-classical dynamics treats a few important degrees of freedom quantum mechanically, while classical mechanics affordably describes the rest of the system. The interface between the quantum and classical descriptions, however, is a challenging theoretical problem, especially for low-energy chemical processes.
We therefore focus on the semiclassical limit of the coupled nuclear-electronic dynamics. We show that the time-dependent Schrödinger equation for the electrons employed in the widely used fewest switches surface hopping method is applicable only in the limit of nearly identical classical trajectories on the different potential energy surfaces. We propose a short-time decoupling algorithm that restricts the use of the Schrödinger equation to the interaction regions. We test the short-time approximation on three model systems against exact quantum-mechanical calculations. The approximation improves the performance of the surface hopping approach. Nonadiabatic molecular dynamics simulations require the efficient and accurate computation of ground and excited state potential energy surfaces. Unlike ground state calculations, for which standard methods exist, the computation of excited state properties is a challenging task. We employ time-independent density functional theory, in which the excited state energy is represented as a functional of the total density. We suggest an adiabatic-like approximation that simplifies the excited state exchange-correlation functional. We also derive a set of minimal conditions to impose exactly the orthogonality of the excited state Kohn-Sham determinant to the ground state determinant. This leads to an efficient, variational algorithm for the self-consistent optimization of the excited state energy. Finally, we assess the quality of the excitation energies obtained by the new method on a set of 28 organic molecules. The new approach provides results of similar accuracy to time-dependent density functional theory.

  2. New Equating Methods and Their Relationships with Levine Observed Score Linear Equating under the Kernel Equating Framework

    ERIC Educational Resources Information Center

    Chen, Haiwen; Holland, Paul

    2010-01-01

    In this paper, we develop a new curvilinear equating method for the nonequivalent groups with anchor test (NEAT) design under the assumptions of the classical test theory model, which we name curvilinear Levine observed score equating. In fact, by applying both the kernel equating framework and the mean preserving linear transformation of…
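The abstract is truncated, but the linear observed-score equating it builds on is standard classical test theory. The sketch below implements generic mean-sigma linear equating, not the authors' curvilinear Levine method; the data and function name are illustrative assumptions.

```python
from statistics import mean, pstdev

def linear_equate(x, scores_x, scores_y):
    """Map a score x from form X onto the scale of form Y by matching
    the two score distributions' means and standard deviations
    (mean-sigma linear equating)."""
    mu_x, mu_y = mean(scores_x), mean(scores_y)
    sd_x, sd_y = pstdev(scores_x), pstdev(scores_y)
    return mu_y + (sd_y / sd_x) * (x - mu_x)

# Example: form Y runs 5 points higher with the same spread,
# so a 30 on form X equates to a 35 on form Y.
print(linear_equate(30, [20, 30, 40], [25, 35, 45]))
```

Curvilinear methods such as the one named in the abstract generalize this by letting the transformation bend away from a straight line.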

  3. Sustainability Attitudes and Behavioral Motivations of College Students: Testing the Extended Parallel Process Model

    ERIC Educational Resources Information Center

    Perrault, Evan K.; Clark, Scott K.

    2018-01-01

    Purpose: A planet that can no longer sustain life is a frightening thought--and one that is often present in mass media messages. Therefore, this study aims to test the components of a classic fear appeal theory, the extended parallel process model (EPPM) and to determine how well its constructs predict sustainability behavioral intentions. This…

  4. Target Classification of Canonical Scatterers Using Classical Estimation and Dictionary Based Techniques

    DTIC Science & Technology

    2012-03-22

    shapes tested, when the objective parameter set was confined to a dictionary's defined parameter space. These physical characteristics included… 2.3 Hypothesis Testing and Detection Theory… 2.4 3-D SAR Scattering Models… basis pursuit de-noising (BPDN) algorithm is chosen to perform extraction due to inherent efficiency and error tolerance. Multiple shape dictionaries

  5. Characterizing Measurement Error in Test Scores across Studies: A Tutorial on Conducting "Reliability Generalization" Analyses.

    ERIC Educational Resources Information Center

    Henson, Robin K.; Thompson, Bruce

    Given the potential value of reliability generalization (RG) studies in the development of cumulative psychometric knowledge, the purpose of this paper is to provide a tutorial on how to conduct such studies and to serve as a guide for researchers wishing to use this methodology. After some brief comments on classical test theory, the paper…
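As brief background to the classical test theory comments this tutorial opens with: the reliability estimate most commonly pooled in reliability generalization studies is coefficient alpha. A minimal, illustrative implementation (not taken from the paper):

```python
from statistics import variance

def cronbach_alpha(items):
    """Coefficient alpha from item-level scores.

    `items` is a list of lists: one inner list per item, each holding the
    scores of the same respondents in the same order.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # total score per person
    item_var_sum = sum(variance(it) for it in items)
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Two perfectly parallel items yield alpha = 1.0.
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))
```

An RG study would collect such alphas (and their sample characteristics) across many administrations of a test and model their variation.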

  6. Finite-block-length analysis in classical and quantum information theory.

    PubMed

    Hayashi, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
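As an illustration of the finite-size effects this review covers, the sketch below evaluates the widely used normal (second-order) approximation to the maximal coding rate of a binary symmetric channel. The channel choice and parameter values are assumptions made for the example, not taken from the paper.

```python
from math import log2, sqrt
from statistics import NormalDist

def bsc_normal_approx_rate(n, p, eps):
    """Normal approximation to the maximal rate of a binary symmetric
    channel with crossover probability p, blocklength n, and block error
    probability eps:  R ~ C - sqrt(V/n) * Q^{-1}(eps) + log2(n)/(2n)."""
    h = -p * log2(p) - (1 - p) * log2(1 - p)          # binary entropy
    capacity = 1 - h
    dispersion = p * (1 - p) * log2((1 - p) / p) ** 2  # channel dispersion V
    q_inv = NormalDist().inv_cdf(1 - eps)              # Q^{-1}(eps)
    return capacity - sqrt(dispersion / n) * q_inv + log2(n) / (2 * n)

# At blocklength 1000 the achievable rate sits noticeably below capacity.
print(bsc_normal_approx_rate(1000, 0.11, 1e-3))
```

The gap between the finite-n rate and the asymptotic capacity is exactly the kind of finite-block-length correction the review quantifies.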

  7. Finite-block-length analysis in classical and quantum information theory

    PubMed Central

    HAYASHI, Masahito

    2017-01-01

    Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects. PMID:28302962

  8. Kappa and Rater Accuracy: Paradigms and Parameters

    ERIC Educational Resources Information Center

    Conger, Anthony J.

    2017-01-01

    Drawing parallels to classical test theory, this article clarifies the difference between rater accuracy and reliability and demonstrates how category marginal frequencies affect rater agreement and Cohen's kappa. Category assignment paradigms are developed: comparing raters to a standard (index) versus comparing two raters to one another…
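To make concrete how category marginal frequencies enter Cohen's kappa, here is a minimal, illustrative computation from a rater agreement table (a generic sketch, not the paradigms developed in the article): the marginals determine the chance-agreement term.

```python
def cohens_kappa(table):
    """Cohen's kappa from a square agreement table, where table[i][j]
    counts cases rated category i by rater A and category j by rater B."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n          # observed agreement
    row = [sum(table[i]) / n for i in range(k)]              # rater A marginals
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]  # rater B
    p_exp = sum(r * c for r, c in zip(row, col))             # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# 70% raw agreement, but skewed marginals pull kappa down to 0.4.
print(cohens_kappa([[20, 5], [10, 15]]))
```

Holding observed agreement fixed while shifting the marginals changes `p_exp`, and hence kappa, which is the effect the article analyzes.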

  9. Interlaminar shear stress effects on the postbuckling response of graphite-epoxy panels

    NASA Technical Reports Server (NTRS)

    Engelstad, S. P.; Reddy, J. N.; Knight, N. F., Jr.

    1990-01-01

    The objectives of the study are to assess the influence of shear flexibility on overall postbuckling response, and to examine transverse shear stress distributions in relation to panel failure. Nonlinear postbuckling results are obtained for finite element models based on classical laminated plate theory and first-order shear deformation theory. Good correlation between test and analysis is obtained. The results presented in this paper analytically substantiate the experimentally observed failure mode.

  10. Koopman-von Neumann formulation of classical Yang-Mills theories: I

    NASA Astrophysics Data System (ADS)

    Carta, P.; Gozzi, E.; Mauro, D.

    2006-03-01

    In this paper we present the Koopman-von Neumann (KvN) formulation of classical non-Abelian gauge field theories. In particular we shall explore the functional (or classical path integral) counterpart of the KvN method. In the quantum path integral quantization of Yang-Mills theories, concepts like gauge-fixing and the Faddeev-Popov determinant appear quite naturally. We will prove that these same objects are also needed in this classical path integral formulation of Yang-Mills theories. We shall also explore the classical path integral counterpart of the BFV formalism and build all the associated universal and gauge charges. The latter are quite different from their quantum analogs, and we shall show the relation between the two. This paper lays the foundation of this formalism which, due to the many auxiliary fields present, is rather heavy. Applications to specific topics outlined in the paper will appear in later publications.
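As background (a standard result of the KvN operatorial approach, not specific to the gauge-theory construction above), for a single classical degree of freedom the KvN formulation promotes the Liouville equation to a Schrödinger-like equation for a phase-space wavefunction:

```latex
i\,\frac{\partial \psi(q,p,t)}{\partial t} = \hat{L}\,\psi(q,p,t),
\qquad
\hat{L} = -\,i\left(\frac{\partial H}{\partial p}\frac{\partial}{\partial q}
          - \frac{\partial H}{\partial q}\frac{\partial}{\partial p}\right),
```

so that $|\psi(q,p,t)|^2$ evolves exactly as the classical phase-space density $\rho(q,p,t)$. The paper's functional formulation is the path-integral counterpart of this operatorial picture.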

  11. A Classical Isodual Theory of Antimatter and Its Prediction of Antigravity

    NASA Astrophysics Data System (ADS)

    Santilli, Ruggero Maria

    An inspection of the contemporary physics literature reveals that, while matter is treated at all levels of study, from Newtonian mechanics to quantum field theory, antimatter is solely treated at the level of second quantization. For the purpose of initiating the restoration of full equivalence in the treatment of matter and antimatter in due time, and as the classical foundations of an axiomatically consistent inclusion of gravitation in unified gauge theories recently appeared elsewhere, in this paper we present a classical representation of antimatter which begins at the primitive Newtonian level with corresponding formulations at all subsequent levels. By recalling that charge conjugation of particles into antiparticles is antiautomorphic, the proposed theory of antimatter is based on a new map, called isoduality, which is also antiautomorphic (and more generally, antiisomorphic), yet it is applicable beginning at the classical level and then persists at the quantum level where it becomes equivalent to charge conjugation. We therefore present, apparently for the first time, the classical isodual theory of antimatter, we identify the physical foundations of the theory as being the novel isodual Galilean, special and general relativities, and we show the compatibility of the theory with all available classical experimental data on antimatter. We identify the classical foundations of the prediction of antigravity for antimatter in the field of matter (or vice-versa) without any claim on its validity, and defer its resolution to specifically identified experiments. We identify the novel, classical, isodual electromagnetic waves which are predicted to be emitted by antimatter, the so-called space-time machine based on a novel non-Newtonian geometric propulsion, and other implications of the theory. 
We also introduce, apparently for the first time, the isodual space and time inversions and show that they are nontrivially different from the conventional ones, thus offering a possible route to the future resolution of whether faraway galaxies and quasars are made up of matter or of antimatter. The paper ends by noting that these studies are in their infancy and by indicating some of the open problems. To avoid a prohibitive length, the paper is restricted to the classical treatment, while studies on operator profiles are treated elsewhere.

  12. Fundamental finite key limits for one-way information reconciliation in quantum key distribution

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Martinez-Mateo, Jesus; Pacher, Christoph; Elkouss, David

    2017-11-01

    The security of quantum key distribution protocols is guaranteed by the laws of quantum mechanics. However, a precise analysis of the security properties requires tools from both classical cryptography and information theory. Here, we employ recent results in non-asymptotic classical information theory to show that one-way information reconciliation imposes fundamental limitations on the amount of secret key that can be extracted in the finite key regime. In particular, we find that an often used approximation for the information leakage during information reconciliation is not generally valid. We propose an improved approximation that takes into account finite key effects and numerically test it against codes for two probability distributions, that we call binary-binary and binary-Gaussian, that typically appear in quantum key distribution protocols.

  13. Towards the Fundamental Quantum Limit of Linear Measurements of Classical Signals

    NASA Astrophysics Data System (ADS)

    Miao, Haixing; Adhikari, Rana X.; Ma, Yiqiu; Pang, Belinda; Chen, Yanbei

    2017-08-01

    The quantum Cramér-Rao bound (QCRB) sets a fundamental limit for the measurement of classical signals with detectors operating in the quantum regime. Using linear-response theory and the Heisenberg uncertainty relation, we derive a general condition for achieving such a fundamental limit. When applied to classical displacement measurements with a test mass, this condition leads to an explicit connection between the QCRB and the standard quantum limit that arises from a tradeoff between the measurement imprecision and quantum backaction; the QCRB can be viewed as an outcome of a quantum nondemolition measurement with the backaction evaded. Additionally, we show that the test mass is more a resource for improving measurement sensitivity than a victim of the quantum backaction, which suggests a new approach to enhancing the sensitivity of a broad class of sensors. We illustrate these points with laser interferometric gravitational-wave detectors.

  14. From the limits of the classical model of sensitometric curves to a realistic model based on the percolation theory for GafChromic EBT films.

    PubMed

    del Moral, F; Vázquez, J A; Ferrero, J J; Willisch, P; Ramírez, R D; Teijeiro, A; López Medina, A; Andrade, B; Vázquez, J; Salvador, F; Medal, D; Salgado, M; Muñoz, V

    2009-09-01

    Modern radiotherapy uses complex treatments that necessitate more complex quality assurance procedures. As a continuous medium, GafChromic EBT films offer suitable features for such verification. However, their sensitometric curve is not fully understood in terms of classical theoretical models. In fact, measured optical densities and those predicted by the classical models differ significantly. This difference increases systematically with wider dose ranges. Thus, achieving the accuracy required for intensity-modulated radiotherapy (IMRT) by classical methods is not possible, precluding their use. As a result, experimental parametrizations, such as polynomial fits, are replacing phenomenological expressions in modern investigations. This article focuses on identifying new theoretical ways to describe sensitometric curves and on evaluating the quality of fit for experimental data based on four proposed models. A whole mathematical formalism starting with a geometrical version of the classical theory is used to develop new expressions for the sensitometric curves. General results from percolation theory are also used. A flat-bed-scanner-based method was chosen for the film analysis. Different tests were performed, such as consistency of the numeric results for the proposed model and double examination using data from independent researchers. Results show that the percolation-theory-based model provides the best theoretical explanation for the sensitometric behavior of GafChromic films. The different sizes of active centers or monomer crystals of the film are the basis of this model, allowing acquisition of information about the internal structure of the films. Values for the mean size of the active centers were obtained in accordance with technical specifications. In this model, the dynamics of the interaction between the active centers of GafChromic film and radiation is also characterized by means of its interaction cross-section value.
The percolation model fulfills the accuracy requirements for quality-control procedures when large ranges of doses are used and offers a physical explanation for the film response.

  15. Experimental test of state-independent quantum contextuality of an indivisible quantum system

    NASA Astrophysics Data System (ADS)

    Li, Meng; Huang, Yun-Feng; Cao, Dong-Yang; Zhang, Chao; Zhang, Yong-Sheng; Liu, Bi-Heng; Li, Chuan-Feng; Guo, Guang-Can

    2014-05-01

    Ever since quantum mechanics was born, it has been debated among scientists because of its differences from classical physics. This motivated hidden variable theories, one class of which is the non-contextual hidden variable theories, for which the Kochen-Specker (KS) inequalities provide a well-known test. The original KS argument, however, requires 117 measurement directions, making it almost impossible to test experimentally. About two years ago, Sixia Yu and C. H. Oh pointed out that for a single qutrit only 13 directions need to be measured to test the KS inequalities, which makes an experimental test feasible. We use the polarization and the path of a single photon to construct a qutrit, and we use half-wave plates, beam displacers and polarizing beam splitters to prepare the quantum states and perform the measurements. The results agree with quantum mechanics and rule out non-contextual hidden variable theories.

  16. Development and validation of the coronary heart disease scale under the system of quality of life instruments for chronic diseases QLICD-CHD: combinations of classical test theory and Generalizability Theory.

    PubMed

    Wan, Chonghua; Li, Hezhan; Fan, Xuejin; Yang, Ruixue; Pan, Jiahua; Chen, Wenru; Zhao, Rong

    2014-06-04

    Quality of life (QOL) for patients with coronary heart disease (CHD) is now a worldwide concern, yet disease-specific instruments are scarce and none has been developed by the modular approach. This paper aims to develop the CHD scale of the system of Quality of Life Instruments for Chronic Diseases (QLICD-CHD) by the modular approach and to validate it with both classical test theory and Generalizability Theory. The QLICD-CHD was developed using programmed decision procedures comprising nominal and focus group discussions, in-depth interviews, pre-testing and quantitative statistical procedures. Data were provided by 146 inpatients with CHD, whose QOL was measured three times before and after treatment. The psychometric properties of the scale were evaluated with respect to validity, reliability and responsiveness, employing correlation analysis, factor analyses, multi-trait scaling analysis, t-tests, and the G studies and D studies of Generalizability Theory. Multi-trait scaling analysis, correlation and factor analyses confirmed good construct validity and criterion-related validity when using the SF-36 as a criterion. The internal consistency α and test-retest reliability coefficients (Pearson r and intra-class correlations, ICC) for the overall instrument and all domains were higher than 0.70 and 0.80, respectively. The overall instrument and all domains except the social domain showed statistically significant changes after treatment, with moderate effect sizes (standardized response mean, SRM) ranging from 0.32 to 0.67. G-coefficients and indexes of dependability (Ф coefficients) further confirmed the reliability of the scale with more exact variance components. The QLICD-CHD has good validity and reliability and moderate responsiveness, and can be used as a quality of life instrument for patients with CHD.
However, to obtain better reliability, the number of items in the social domain should be increased or the items' quality, rather than quantity, should be improved.
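The responsiveness statistic reported above, the standardized response mean (SRM), is simply the mean pre-post change divided by the standard deviation of the change scores. An illustrative sketch (the data are invented for the example):

```python
from statistics import mean, stdev

def srm(before, after):
    """Standardized response mean: mean change / SD of change scores."""
    change = [a - b for b, a in zip(before, after)]
    return mean(change) / stdev(change)

# Example: each patient improves, with change scores 1, 2, 3.
print(srm([1, 2, 3], [2, 4, 6]))
```

Values around 0.2, 0.5, and 0.8 are conventionally read as small, moderate, and large responsiveness, which is how the abstract's 0.32 to 0.67 range is interpreted as moderate.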

  17. TESTING GRAVITY WITH QUASI-PERIODIC OSCILLATIONS FROM ACCRETING BLACK HOLES: THE CASE OF THE EINSTEIN–DILATON–GAUSS–BONNET THEORY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maselli, Andrea; Gualtieri, Leonardo; Ferrari, Valeria

    Quasi-periodic oscillations (QPOs) observed in the X-ray flux emitted by accreting black holes are associated with phenomena occurring near the horizon. Future very large area X-ray instruments will be able to measure QPO frequencies with very high precision, thus probing this strong-field region. Using the relativistic precession model, we show the way in which QPO frequencies could be used to test general relativity (GR) against those alternative theories of gravity which predict deviations from the classical theory in the strong-field and high-curvature regimes. We consider one of the best-motivated high-curvature corrections to GR, namely, the Einstein–Dilaton–Gauss–Bonnet theory, and show that a detection of QPOs with the expected sensitivity of the proposed ESA M-class mission LOFT would set the most stringent constraints on the parameter space of this theory.

  18. Is wave-particle objectivity compatible with determinism and locality?

    PubMed

    Ionicioiu, Radu; Jennewein, Thomas; Mann, Robert B; Terno, Daniel R

    2014-09-26

    Wave-particle duality, superposition and entanglement are among the most counterintuitive features of quantum theory. Their clash with our classical expectations motivated hidden-variable (HV) theories. With the emergence of quantum technologies, we can test experimentally the predictions of quantum theory versus HV theories and put strong restrictions on their key assumptions. Here, we study an entanglement-assisted version of the quantum delayed-choice experiment and show that the extension of HV to the controlling devices only exacerbates the contradiction. We compare HV theories that satisfy the conditions of objectivity (a property of photons being either particles or waves, but not both), determinism and local independence of hidden variables with quantum mechanics. Any two of the above conditions are compatible with it. The conflict becomes manifest when all three conditions are imposed and persists for any non-zero value of entanglement. We propose an experiment to test our conclusions.

  19. Is wave–particle objectivity compatible with determinism and locality?

    PubMed Central

    Ionicioiu, Radu; Jennewein, Thomas; Mann, Robert B.; Terno, Daniel R.

    2014-01-01

    Wave–particle duality, superposition and entanglement are among the most counterintuitive features of quantum theory. Their clash with our classical expectations motivated hidden-variable (HV) theories. With the emergence of quantum technologies, we can test experimentally the predictions of quantum theory versus HV theories and put strong restrictions on their key assumptions. Here, we study an entanglement-assisted version of the quantum delayed-choice experiment and show that the extension of HV to the controlling devices only exacerbates the contradiction. We compare HV theories that satisfy the conditions of objectivity (a property of photons being either particles or waves, but not both), determinism and local independence of hidden variables with quantum mechanics. Any two of the above conditions are compatible with it. The conflict becomes manifest when all three conditions are imposed and persists for any non-zero value of entanglement. We propose an experiment to test our conclusions. PMID:25256419

  20. The Effect of the Media on Suicide: The Great Depression.

    ERIC Educational Resources Information Center

    Stack, Steven

    1992-01-01

    Tests thesis that degree of media influence is contingent on audience receptivity. Audience receptivity to suicide stories assumed high during Great Depression. Developed taxonomy of stories using classic imitation, social learning, and differential identification theories. Analysis of monthly data on suicides and publicized stories revealed…

  1. Cutting Cakes Carefully

    ERIC Educational Resources Information Center

    Hill, Theodore P.; Morrison, Kent E.

    2010-01-01

    This paper surveys the fascinating mathematics of fair division, and provides a suite of examples using basic ideas from algebra, calculus, and probability which can be used to examine and test new and sometimes complex mathematical theories and claims involving fair division. Conversely, the classical cut-and-choose and moving-knife algorithms…

  2. Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual

    NASA Astrophysics Data System (ADS)

    Lillystone, Piers; Wallman, Joel J.

    Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.

  3. Evaluation of the mathematical and economic basis for conversion processes in the LEAP energy-economy model

    NASA Astrophysics Data System (ADS)

    Oblow, E. M.

    1982-10-01

    An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.

  4. The equivalence principle in a quantum world

    NASA Astrophysics Data System (ADS)

    Bjerrum-Bohr, N. E. J.; Donoghue, John F.; El-Menoufi, Basem Kamal; Holstein, Barry R.; Planté, Ludovic; Vanhove, Pierre

    2015-09-01

    We show how modern methods can be applied to quantum gravity at low energy. We test how quantum corrections challenge the classical framework behind the equivalence principle (EP), for instance through introduction of nonlocality from quantum physics, embodied in the uncertainty principle. When the energy is small, we now have the tools to address this conflict explicitly. Despite the violation of some classical concepts, the EP continues to provide the core of the quantum gravity framework through the symmetry — general coordinate invariance — that is used to organize the effective field theory (EFT).

  5. (Re)igniting a Sociological Imagination in Adult Education: The Continuing Relevance of Classical Theory

    ERIC Educational Resources Information Center

    Lange, Elizabeth

    2015-01-01

    This article argues that sociology has been a foundational discipline for the field of adult education, but it has been largely implicit, until recently. This article contextualizes classical theories of sociology within contemporary critiques, reviews the historical roots of sociology and then briefly introduces the classical theories…

  6. Item Response Theory and Health Outcomes Measurement in the 21st Century

    PubMed Central

    Hays, Ron D.; Morales, Leo S.; Reise, Steve P.

    2006-01-01

    Item response theory (IRT) has a number of potential advantages over classical test theory in assessing self-reported health outcomes. IRT models yield invariant item and latent trait estimates (within a linear transformation), standard errors conditional on trait level, and trait estimates anchored to item content. IRT also facilitates evaluation of differential item functioning, inclusion of items with different response formats in the same scale, and assessment of person fit and is ideally suited for implementing computer adaptive testing. Finally, IRT methods can be helpful in developing better health outcome measures and in assessing change over time. These issues are reviewed, along with a discussion of some of the methodological and practical challenges in applying IRT methods. PMID:10982088
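A minimal sketch of one common IRT model, the two-parameter logistic (chosen here purely for illustration; the article discusses IRT models generally), showing how trait-conditional standard errors arise from the item information function:

```python
from math import exp

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic model: probability of endorsing/answering an
    item correctly at trait level theta, with discrimination a and
    difficulty b."""
    return 1.0 / (1.0 + exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information contributed by the item at theta; summing over
    items and taking 1/sqrt(total) gives the conditional standard error."""
    p = p_correct_2pl(theta, a, b)
    return a * a * p * (1 - p)

# At theta equal to the item difficulty, the response probability is 0.5
# and the item is maximally informative.
print(p_correct_2pl(1.0, 1.5, 1.0), item_information(1.0, 1.5, 1.0))
```

Information peaking near `theta = b` is what makes standard errors depend on trait level, unlike the single reliability coefficient of classical test theory.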

  7. An Introduction to Item Response Theory for Patient-Reported Outcome Measurement

    PubMed Central

    Nguyen, Tam H.; Han, Hae-Ra; Kim, Miyong T.

    2015-01-01

    The growing emphasis on patient-centered care has accelerated the demand for high-quality data from patient-reported outcome (PRO) measures. Traditionally, the development and validation of these measures has been guided by classical test theory. However, item response theory (IRT), an alternate measurement framework, offers promise for addressing practical measurement problems found in health-related research that have been difficult to solve through classical methods. This paper introduces foundational concepts in IRT, as well as commonly used models and their assumptions. Existing data on a combined sample (n = 636) of Korean American and Vietnamese American adults who responded to the High Blood Pressure Health Literacy Scale and the Patient Health Questionnaire-9 are used to exemplify typical applications of IRT. These examples illustrate how IRT can be used to improve the development, refinement, and evaluation of PRO measures. Greater use of methods based on this framework can increase the accuracy and efficiency with which PROs are measured. PMID:24403095

  8. An introduction to item response theory for patient-reported outcome measurement.

    PubMed

    Nguyen, Tam H; Han, Hae-Ra; Kim, Miyong T; Chan, Kitty S

    2014-01-01

    The growing emphasis on patient-centered care has accelerated the demand for high-quality data from patient-reported outcome (PRO) measures. Traditionally, the development and validation of these measures has been guided by classical test theory. However, item response theory (IRT), an alternate measurement framework, offers promise for addressing practical measurement problems found in health-related research that have been difficult to solve through classical methods. This paper introduces foundational concepts in IRT, as well as commonly used models and their assumptions. Existing data on a combined sample (n = 636) of Korean American and Vietnamese American adults who responded to the High Blood Pressure Health Literacy Scale and the Patient Health Questionnaire-9 are used to exemplify typical applications of IRT. These examples illustrate how IRT can be used to improve the development, refinement, and evaluation of PRO measures. Greater use of methods based on this framework can increase the accuracy and efficiency with which PROs are measured.

  9. On Complexity in Bilingual Research: The Causes, Effects, and Breadth of Content and Language Integrated Learning--A Reply to Bruton (2011)

    ERIC Educational Resources Information Center

    Lorenzo, Francisco; Moore, Pat; Casal, Sonia

    2011-01-01

    This article proposes that a complex issue such as bilingualism gives rise to a need for complex research. Complexity theories, both in the psycholinguistic and educational fields, may inspire new empirical studies on bilingualism that will likely provide data otherwise unattainable through classic pre-test/post-test methods. The article also…

  10. Application of the Rasch Rating Scale Model to the Assessment of Quality of Life of Persons with Intellectual Disability

    ERIC Educational Resources Information Center

    Gomez, Laura E.; Arias, Benito; Verdugo, Miguel Angel; Navas, Patricia

    2012-01-01

    Background: Most instruments that assess quality of life have been validated by means of the classical test theory (CTT). However, CTT limitations have resulted in the development of alternative models, such as the Rasch rating scale model (RSM). The main goal of this paper is testing and improving the psychometric properties of the INTEGRAL…

  11. Contemporary understanding of riots: Classical crowd psychology, ideology and the social identity approach.

    PubMed

    Stott, Clifford; Drury, John

    2016-04-01

    This article explores the origins and ideology of classical crowd psychology, a body of theory reflected in contemporary popularised understandings such as of the 2011 English 'riots'. This article argues that during the nineteenth century, the crowd came to symbolise a fear of 'mass society' and that 'classical' crowd psychology was a product of these fears. Classical crowd psychology pathologised, reified and decontextualised the crowd, offering the ruling elites a perceived opportunity to control it. We contend that classical theory misrepresents crowd psychology and survives in contemporary understanding because it is ideological. We conclude by discussing how classical theory has been supplanted in academic contexts by an identity-based crowd psychology that restores the meaning to crowd action, replaces it in its social context and in so doing transforms theoretical understanding of 'riots' and the nature of the self. © The Author(s) 2016.

  12. Influence of an asymmetric ring on the modeling of an orthogonally stiffened cylindrical shell

    NASA Technical Reports Server (NTRS)

    Rastogi, Naveen; Johnson, Eric R.

    1994-01-01

    Structural models are examined for the influence of a ring with an asymmetrical cross section on the linear elastic response of an orthogonally stiffened cylindrical shell subjected to internal pressure. The first structural model employs classical theory for the shell and stiffeners. The second model employs transverse shear deformation theories for the shell and stringer and classical theory for the ring. Closed-end pressure vessel effects are included. Interacting line load intensities are computed in the stiffener-to-skin joints for an example problem having the dimensions of the fuselage of a large transport aircraft. Classical structural theory is found to exaggerate the asymmetric response compared to the transverse shear deformation theory.

  13. Attentional Sensitization of Unconscious Cognition: Task Sets Modulate Subsequent Masked Semantic Priming

    ERIC Educational Resources Information Center

    Kiefer, Markus; Martens, Ulla

    2010-01-01

    According to classical theories, automatic processes are autonomous and independent of higher level cognitive influence. In contrast, the authors propose that automatic processing depends on attentional sensitization of task-congruent processing pathways. In 3 experiments, the authors tested this hypothesis with a modified masked semantic priming…

  14. IRT Equating of the MCAT. MCAT Monograph.

    ERIC Educational Resources Information Center

    Hendrickson, Amy B.; Kolen, Michael J.

    This study compared various equating models and procedures for a sample of data from the Medical College Admission Test (MCAT), considering how item response theory (IRT) equating results compare with classical equipercentile results and how the results based on use of various IRT models, observed score versus true score, direct versus linked…

  15. Inferential Procedures for Correlation Coefficients Corrected for Attenuation.

    ERIC Educational Resources Information Center

    Hakstian, A. Ralph; And Others

    1988-01-01

    A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
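
    The correction referred to is Spearman's classical disattenuation formula, which divides the observed correlation by the geometric mean of the two reliabilities. A minimal sketch of that formula (not of the article's delta or Monte Carlo procedures):

```python
import math

def correct_for_attenuation(r_xy, rel_x, rel_y):
    """Spearman's classical correction: the observed correlation divided by
    the geometric mean of the two reliabilities estimates the correlation
    between the underlying true scores."""
    if not (0.0 < rel_x <= 1.0 and 0.0 < rel_y <= 1.0):
        raise ValueError("reliabilities must lie in (0, 1]")
    return r_xy / math.sqrt(rel_x * rel_y)

# An observed r of .42 between two scales with reliabilities .80 and .70
# implies a true-score correlation of about .56.
r_true = correct_for_attenuation(0.42, 0.80, 0.70)
```

    With perfectly reliable measures the correction leaves the correlation unchanged; as reliabilities fall, the corrected value grows, which is why its sampling behavior (the article's power analysis) matters in practice.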

  16. The Positivity Scale

    ERIC Educational Resources Information Center

    Caprara, Gian Vittorio; Alessandri, Guido; Eisenberg, Nancy; Kupfer, A.; Steca, Patrizia; Caprara, Maria Giovanna; Yamaguchi, Susumu; Fukuzawa, Ai; Abela, John

    2012-01-01

    Five studies document the validity of a new 8-item scale designed to measure "positivity," defined as the tendency to view life and experiences with a positive outlook. In the first study (N = 372), the psychometric properties of the Positivity Scale (P Scale) were examined in accordance with classical test theory using a large number of…

  17. Integrating Incremental Learning and Episodic Memory Models of the Hippocampal Region

    ERIC Educational Resources Information Center

    Meeter, M.; Myers, C. E.; Gluck, M. A.

    2005-01-01

    By integrating previous computational models of corticohippocampal function, the authors develop and test a unified theory of the neural substrates of familiarity, recollection, and classical conditioning. This approach integrates models from 2 traditions of hippocampal modeling, those of episodic memory and incremental learning, by drawing on an…

  18. Constrained variational calculus for higher order classical field theories

    NASA Astrophysics Data System (ADS)

    Campos, Cédric M.; de León, Manuel; Martín de Diego, David

    2010-11-01

    We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.

  19. Chance, determinism and the classical theory of probability.

    PubMed

    Vasudevan, Anubav

    2018-02-01

    This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Dressing the post-Newtonian two-body problem and classical effective field theory

    NASA Astrophysics Data System (ADS)

    Kol, Barak; Smolkin, Michael

    2009-12-01

    We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.

  1. Bosonic Loop Diagrams as Perturbative Solutions of the Classical Field Equations in ϕ4-Theory

    NASA Astrophysics Data System (ADS)

    Finster, Felix; Tolksdorf, Jürgen

    2012-05-01

    Solutions of the classical ϕ4-theory in Minkowski space-time are analyzed in a perturbation expansion in the nonlinearity. Using the language of Feynman diagrams, the solution of the Cauchy problem is expressed in terms of tree diagrams which involve the retarded Green's function and have one outgoing leg. In order to obtain general tree diagrams, we set up a "classical measurement process" in which a virtual observer of a scattering experiment modifies the field and detects suitable energy differences. By adding a classical stochastic background field, we even obtain all loop diagrams. The expansions are compared with the standard Feynman diagrams of the corresponding quantum field theory.
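
    The iteration behind those tree diagrams can be written out explicitly; in one common normalization (sign and coupling conventions vary between references):

```latex
(\Box + m^2)\,\phi + \lambda\,\phi^3 = 0, \qquad
(\Box + m^2)\,G_{\mathrm{ret}}(x-y) = \delta^4(x-y),
```
```latex
\phi^{(n+1)}(x) = \phi^{(0)}(x)
  - \lambda \int \mathrm{d}^4 y \; G_{\mathrm{ret}}(x-y)\,\bigl(\phi^{(n)}(y)\bigr)^{3},
```

    where φ⁽⁰⁾ solves the free equation with the prescribed Cauchy data. Expanding the iterates in powers of λ reproduces exactly the tree diagrams described in the abstract: retarded propagators on internal lines and one outgoing leg.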

  2. Modeling and analysis of the TF30-P-3 compressor system with inlet pressure distortion

    NASA Technical Reports Server (NTRS)

    Mazzawy, R. S.; Banks, G. A.

    1976-01-01

    Circumferential inlet distortion testing of a TF30-P-3 afterburning turbofan engine was conducted at NASA-Lewis Research Center. Pratt and Whitney Aircraft analyzed the data using its multiple segment parallel compressor model and classical compressor theory. Distortion attenuation analysis resulted in a detailed flow field calculation with good agreement between multiple segment model predictions and the test data. Sensitivity of the engine stall line to circumferential inlet distortion was calculated on the basis of parallel compressor theory to be more severe than indicated by the data. However, the calculated stall site location was in agreement with high response instrumentation measurements.

  3. Measuring Quality and Outcomes in Sports Medicine.

    PubMed

    Ruzbarsky, Joseph J; Marom, Niv; Marx, Robert G

    2018-07-01

    Patient-reported outcome measures (PROMs) are objective metrics critical to evaluating outcomes throughout orthopedic surgery. New instruments continue to emerge, increasing the breadth of information required for those intending to use these measures for research or clinical care. Although earlier metrics were developed using the principles of classic test theory, newer instruments constructed using item response theory are amenable to computer-adaptive testing and may change the way these instruments are administered. This article aims to define the psychometric properties that are important to understand when using all PROMs and to review the most widely used instruments in sports medicine. Copyright © 2018 Elsevier Inc. All rights reserved.

  4. How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations

    ERIC Educational Resources Information Center

    Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg

    2007-01-01

    Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…

  5. Assessment of Work Climates: The Appropriateness of Classical-Management Theory and Human-Relations Theory under Various Contingencies. Final Report.

    ERIC Educational Resources Information Center

    Langdale, John A.

    The construct of "organizational climate" was explicated and various ways of operationalizing it were reviewed. A survey was made of the literature pertinent to the classical-human relations dimension of environmental quality. As a result, it was hypothesized that the appropriateness of the classical and human-relations master plans is moderated…

  6. The evolving Planck mass in classically scale-invariant theories

    NASA Astrophysics Data System (ADS)

    Kannike, K.; Raidal, M.; Spethmann, C.; Veermäe, H.

    2017-04-01

    We consider classically scale-invariant theories with non-minimally coupled scalar fields, where the Planck mass and the hierarchy of physical scales are dynamically generated. The classical theories possess a fixed point, where scale invariance is spontaneously broken. In these theories, however, the Planck mass becomes unstable in the presence of explicit sources of scale invariance breaking, such as non-relativistic matter and cosmological constant terms. We quantify the constraints on such classical models from Big Bang Nucleosynthesis that lead to an upper bound on the non-minimal coupling and require trans-Planckian field values. We show that quantum corrections to the scalar potential can stabilise the fixed point close to the minimum of the Coleman-Weinberg potential. The time-averaged motion of the evolving fixed point is strongly suppressed, thus the limits on the evolving gravitational constant from Big Bang Nucleosynthesis and other measurements do not presently constrain this class of theories. Field oscillations around the fixed point, if not damped, contribute to the dark matter density of the Universe.

  7. A quantum probability framework for human probabilistic inference.

    PubMed

    Trueblood, Jennifer S; Yearsley, James M; Pothos, Emmanuel M

    2017-09-01

    There is considerable variety in human inference (e.g., a doctor inferring the presence of a disease, a juror inferring the guilt of a defendant, or someone inferring future weight loss based on diet and exercise). As such, people display a wide range of behaviors when making inference judgments. Sometimes, people's judgments appear Bayesian (i.e., normative), but in other cases, judgments deviate from the normative prescription of classical probability theory. How can we combine both Bayesian and non-Bayesian influences in a principled way? We propose a unified explanation of human inference using quantum probability theory. In our approach, we postulate a hierarchy of mental representations, from 'fully' quantum to 'fully' classical, which could be adopted in different situations. In our hierarchy of models, moving from the lowest level to the highest involves changing assumptions about compatibility (i.e., how joint events are represented). Using results from 3 experiments, we show that our modeling approach explains 5 key phenomena in human inference including order effects, reciprocity (i.e., the inverse fallacy), memorylessness, violations of the Markov condition, and antidiscounting. As far as we are aware, no existing theory or model can explain all 5 phenomena. We also explore transitions in our hierarchy, examining how representations change from more quantum to more classical. We show that classical representations provide a better account of data as individuals gain familiarity with a task. We also show that representations vary between individuals, in a way that relates to a simple measure of cognitive style, the Cognitive Reflection Test. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
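
    The order effects mentioned have a simple geometric reading in quantum probability: sequential "yes" answers are modeled by non-commuting projections, so the two question orders yield different probabilities. A toy two-dimensional sketch of that mechanism (the vectors and angles are invented for illustration and are not taken from the authors' models):

```python
import math

def unit(angle):
    """Unit vector in the plane at the given angle (radians)."""
    return (math.cos(angle), math.sin(angle))

def proj(v, x):
    """Project vector x onto the ray spanned by unit vector v."""
    d = v[0] * x[0] + v[1] * x[1]
    return (d * v[0], d * v[1])

def seq_prob(psi, first, second):
    """Probability of answering 'yes' to `first` and then 'yes' to `second`:
    the squared norm of the twice-projected state."""
    y = proj(second, proj(first, psi))
    return y[0] ** 2 + y[1] ** 2

psi = unit(0.3)                       # initial belief state
a, b = unit(0.0), unit(math.pi / 4)   # two incompatible questions
p_ab = seq_prob(psi, a, b)            # 'A then B'
p_ba = seq_prob(psi, b, a)            # 'B then A'
# Because the two projectors do not commute, p_ab != p_ba: an order effect.
```

    In a fully classical (compatible) representation the two orders would agree; the hierarchy the authors propose varies exactly this compatibility assumption.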

  8. Theory of Neutron Chain Reactions: Extracts from Volume I, Diffusion and Slowing Down of Neutrons: Chapter I. Elementary Theory of Neutron Diffusion. Chapter II. Second Order Diffusion Theory. Chapter III. Slowing Down of Neutrons

    DOE R&D Accomplishments Database

    Weinberg, Alvin M.; Noderer, L. C.

    1951-05-15

    The large scale release of nuclear energy in a uranium fission chain reaction involves two essentially distinct physical phenomena. On the one hand there are the individual nuclear processes such as fission, neutron capture, and neutron scattering. These are essentially quantum mechanical in character, and their theory is non-classical. On the other hand, there is the process of diffusion -- in particular, diffusion of neutrons, which is of fundamental importance in a nuclear chain reaction. This process is classical; insofar as the theory of the nuclear chain reaction depends on the theory of neutron diffusion, the mathematical study of chain reactions is an application of classical, not quantum mechanical, techniques.
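
    The diffusion process described is governed, in the standard one-speed approximation, by the neutron diffusion equation (notation as in standard reactor-physics texts, not quoted from this report):

```latex
\frac{1}{v}\,\frac{\partial \phi}{\partial t}
  = D\,\nabla^2 \phi \;-\; \Sigma_a\,\phi \;+\; S,
```

    where φ is the scalar neutron flux, v the neutron speed, D the diffusion coefficient, Σₐ the macroscopic absorption cross section, and S the source term. This is a classical partial differential equation, which is the sense in which the chain-reaction mathematics is classical rather than quantum mechanical.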

  9. Eclipsing binary stars as tests of gravity theories - The apsidal motion of AS Camelopardalis

    NASA Technical Reports Server (NTRS)

    Maloney, Frank P.; Guinan, Edward F.; Boyd, Patricia T.

    1989-01-01

    AS Camelopardalis is an 8th-magnitude eclipsing binary that consists of two main-sequence (B8 V and a B9.5 V) components in an eccentric orbit (e = 0.17) with an orbital period of 3.43 days. Like the eccentric eclipsing system DI Herculis, and a few other systems, AS Cam is an important test case for studying relativistic apsidal motion. In these systems, the theoretical general relativistic apsidal motion is comparable to that expected from classical effects arising from tidal and rotational deformation of the stellar components. Accurate determinations of the orbital and stellar properties of AS Cam have been made by Hilditch (1972) and Khalliulin and Kozyreva (1983) that permit the theoretical relativistic and classical contributions to the apsidal motion to be determined reasonably well. All the published timings of primary and secondary minima have been gathered and supplemented with eclipse timings from 1899 to 1920 obtained from the Harvard plate collection. Least-squares solutions of the eclipse timings extending over an 80 yr interval yield a smaller than expected apsidal motion, in agreement with that found by Khalliulin and Kozyreva from a smaller set of data. The observed apsidal motion for AS Cam is about one-third that expected from the combined relativistic and classical effects. Thus, AS Cam joins DI Her in having an observed apsidal motion significantly less than that predicted from theory.

  10. Using classical test theory, item response theory, and Rasch measurement theory to evaluate patient-reported outcome measures: a comparison of worked examples.

    PubMed

    Petrillo, Jennifer; Cano, Stefan J; McLeod, Lori D; Coon, Cheryl D

    2015-01-01

    To provide comparisons and a worked example of item- and scale-level evaluations based on three psychometric methods used in patient-reported outcome development-classical test theory (CTT), item response theory (IRT), and Rasch measurement theory (RMT)-in an analysis of the National Eye Institute Visual Functioning Questionnaire (VFQ-25). Baseline VFQ-25 data from 240 participants with diabetic macular edema from a randomized, double-masked, multicenter clinical trial were used to evaluate the VFQ at the total score level. CTT, RMT, and IRT evaluations were conducted, and results were assessed in a head-to-head comparison. Results were similar across the three methods, with IRT and RMT providing more detailed diagnostic information on how to improve the scale. CTT led to the identification of two problematic items that threaten the validity of the overall scale score, sets of redundant items, and skewed response categories. IRT and RMT additionally identified poor fit for one item, many locally dependent items, poor targeting, and disordering of over half the response categories. Selection of a psychometric approach depends on many factors. Researchers should justify their evaluation method and consider the intended audience. If the instrument is being developed for descriptive purposes and on a restricted budget, a cursory examination of the CTT-based psychometric properties may be all that is possible. In a high-stakes situation, such as the development of a patient-reported outcome instrument for consideration in pharmaceutical labeling, however, a thorough psychometric evaluation including IRT or RMT should be considered, with final item-level decisions made on the basis of both quantitative and qualitative results. Copyright © 2015. Published by Elsevier Inc.
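
    As a concrete instance of the CTT side of such an evaluation, internal consistency is commonly summarized by Cronbach's alpha. A minimal sketch in pure Python (the VFQ-25 data themselves are not reproduced here; the matrix below is invented for illustration):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a persons-by-items score matrix (list of lists):
    the standard CTT internal-consistency estimate, computed from the item
    variances and the variance of the total score."""
    k = len(scores[0])
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1.0 - sum(item_vars) / total_var)

# Perfectly parallel items (every person gives the same answer to each
# item) yield alpha = 1.0, the theoretical maximum.
data = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
alpha = cronbach_alpha(data)
```

    IRT and RMT go beyond such scale-level summaries by modeling each item's response function, which is why they surface the item-level problems (misfit, local dependence, disordered categories) that the abstract reports CTT missed.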

  11. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    ERIC Educational Resources Information Center

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…

  12. Psychometric Properties of the HOME Inventory Using Rasch Analysis

    ERIC Educational Resources Information Center

    Glad, Johan; Kottorp, Anders; Jergeby, Ulla; Gustafsson, Carina; Sonnander, Karin

    2014-01-01

    Objectives: The aim of this pilot study was to explore psychometric properties of two versions of the Home Observation for Measurement of the Environment Inventory in a Swedish social service sample. Method: Social workers employed at 22 Swedish child protections agencies participated in the data collection. Both classic test theory approaches and…

  13. Measuring Integration of Information and Communication Technology in Education: An Item Response Modeling Approach

    ERIC Educational Resources Information Center

    Peeraer, Jef; Van Petegem, Peter

    2012-01-01

    This research describes the development and validation of an instrument to measure integration of Information and Communication Technology (ICT) in education. After literature research on definitions of integration of ICT in education, a comparison is made between the classical test theory and the item response modeling approach for the…

  14. Renewing Aristotelian Theory: The Cold Fusion Controversy as a Test Case.

    ERIC Educational Resources Information Center

    Gross, Alan G.

    1995-01-01

    Exhibits the strength and flexibility of science as a rhetorical enterprise via a rhetorical analysis of cold fusion which reveals science under considerable stress. Assumes the continuing viability of classical rhetoric as an explanation for the persuasiveness of texts, while acknowledging the need to reexamine its central concepts. (SR)

  15. Classical conformality in the Standard Model from Coleman’s theory

    NASA Astrophysics Data System (ADS)

    Kawana, Kiyoharu

    2016-09-01

    The classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of baby universes.

  16. Maximal incompatibility of locally classical behavior and global causal order in multiparty scenarios

    NASA Astrophysics Data System (ADS)

    Baumeler, Ämin; Feix, Adrien; Wolf, Stefan

    2014-10-01

    Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.

  17. A post-classical theory of enamel biomineralization… and why we need one.

    PubMed

    Simmer, James P; Richardson, Amelia S; Hu, Yuan-Yuan; Smith, Charles E; Ching-Chun Hu, Jan

    2012-09-01

    Enamel crystals are unique in shape, orientation and organization. They are hundreds of thousands times longer than they are wide, run parallel to each other, are oriented with respect to the ameloblast membrane at the mineralization front and are organized into rod or interrod enamel. The classical theory of amelogenesis postulates that extracellular matrix proteins shape crystallites by specifically inhibiting ion deposition on the crystal sides, orient them by binding multiple crystallites and establish higher levels of crystal organization. Elements of the classical theory are supported in principle by in vitro studies; however, the classical theory does not explain how enamel forms in vivo. In this review, we describe how amelogenesis is highly integrated with ameloblast cell activities and how the shape, orientation and organization of enamel mineral ribbons are established by a mineralization front apparatus along the secretory surface of the ameloblast cell membrane.

  18. Hydrostatic Stress Effect on the Yield Behavior of Inconel 100

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wilson, Christopher D.

    2003-01-01

    Classical metal plasticity theory assumes that hydrostatic stress has a negligible effect on the yield and postyield behavior of metals. Recent reexaminations of classical theory have revealed a significant effect of hydrostatic stress on the yield behavior of various geometries. Fatigue tests and nonlinear finite element analyses (FEA) of Inconel 100 (IN100) equal-arm bend specimens, together with new monotonic tests and nonlinear finite element analyses of IN100 smooth tension, smooth compression, and double-edge notch tension (DENT) test specimens, have revealed the effect of internal hydrostatic tensile stresses on yielding. Nonlinear FEA using the von Mises (yielding is independent of hydrostatic stress) and the Drucker-Prager (yielding is linearly dependent on hydrostatic stress) yield functions were performed. A new FEA constitutive model was developed that incorporates a pressure-dependent yield function with combined multilinear kinematic and multilinear isotropic hardening, using the ABAQUS user subroutine (UMAT) utility. In all monotonic tensile test cases, the von Mises constitutive model overestimated the load for a given displacement or strain. Considering the failure displacements or strains for the DENT specimen, the Drucker-Prager FEMs predicted loads that were approximately 3% lower than the von Mises values. For the failure loads, the Drucker-Prager FEMs predicted strains that were up to 35% greater than the von Mises values. Both the Drucker-Prager model and the von Mises model performed equally well in simulating the equal-arm bend fatigue test.
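
    The two yield functions compared differ only in a pressure term; in one common form (conventions for the material constants vary):

```latex
f_{\mathrm{DP}} = \sqrt{J_2} \;+\; \alpha\, I_1 \;-\; k = 0,
```

    where I₁ = tr σ carries the hydrostatic dependence, J₂ is the second invariant of the deviatoric stress, and α, k are material constants. Setting α = 0 recovers the von Mises condition √J₂ = k, making explicit why the two models diverge only where hydrostatic stresses are significant, as in the notched DENT specimens.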

  19. Statistical mechanics in the context of special relativity. II.

    PubMed

    Kaniadakis, G

    2005-09-01

    The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
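
    The one-parameter entropy in question is usually written with the κ-deformed logarithm; as given in the κ-statistics literature (the abstract itself does not restate the formulas):

```latex
\ln_\kappa(x) = \frac{x^{\kappa} - x^{-\kappa}}{2\kappa}, \qquad
S_\kappa = -\sum_i p_i \,\ln_\kappa p_i ,
```

    which reduce to the natural logarithm and the Boltzmann-Gibbs-Shannon entropy in the κ → 0 limit, consistent with the classical limit described in the abstract.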

  20. Creation and validation of the barriers to alcohol reduction (BAR) scale using classical test theory and item response theory.

    PubMed

    Kunicki, Zachary J; Schick, Melissa R; Spillane, Nichea S; Harlow, Lisa L

    2018-06-01

    Those who binge drink are at increased risk for alcohol-related consequences when compared to non-binge drinkers. Research shows individuals may face barriers to reducing their drinking behavior, but few measures exist to assess these barriers. This study created and validated the Barriers to Alcohol Reduction (BAR) scale. Participants were college students (n = 230) who endorsed at least one instance of past-month binge drinking (4+ drinks for women or 5+ drinks for men). Using classical test theory, exploratory structural equation modeling found a two-factor structure of personal/psychosocial barriers and perceived program barriers. The sub-factors and the full scale had reasonable internal consistency (coefficient omega = 0.78 for personal/psychosocial barriers, 0.82 for program barriers, and 0.83 for the full measure). The BAR also showed evidence for convergent validity with the Brief Young Adult Alcohol Consequences Questionnaire (r = 0.39, p < .001) and discriminant validity with Barriers to Physical Activity (r = -0.02, p = .81). Item response theory (IRT) analysis showed that the two factors separately met the unidimensionality assumption and provided further evidence for the severity of the items on the two factors. Results suggest that the BAR measure appears reliable and valid for use in an undergraduate student population of binge drinkers. Future studies may want to re-examine this measure in a more diverse sample.

  1. Testing Relativity with Electrodynamics

    NASA Astrophysics Data System (ADS)

    Bailey, Quentin; Kostelecky, Alan

    2004-04-01

    Lorentz and CPT violation is a promising candidate signal for Planck-scale physics. Low-energy effects of Lorentz and CPT violation are described by the general theoretical framework called the Standard-Model Extension (SME). This talk focuses on Lorentz-violating effects arising in the classical electrodynamics limit of the SME. Analysis of the theory shows that suitable experiments could improve by several orders of magnitude certain sensitivities achieved in modern Michelson-Morley and Kennedy-Thorndike tests.

  2. Tests of Lorentz Symmetry with Electrodynamics

    NASA Astrophysics Data System (ADS)

    Bailey, Quentin; Kostelecky, Alan

    2004-05-01

    Lorentz and CPT violation is a promising candidate signal for Planck-scale physics. Low-energy effects of Lorentz and CPT violation are described by the general theoretical framework called the Standard-Model Extension (SME). This talk focuses on Lorentz-violating effects arising in the limit of classical electrodynamics. Analysis of the theory shows that suitable experiments could improve by several orders of magnitude on the sensitivities achieved in modern Michelson-Morley and Kennedy-Thorndike tests.

  3. Construction of Chained True Score Equipercentile Equatings under the Kernel Equating (KE) Framework and Their Relationship to Levine True Score Equating. Research Report. ETS RR-09-24

    ERIC Educational Resources Information Center

    Chen, Haiwen; Holland, Paul

    2009-01-01

    In this paper, we develop a new chained equipercentile equating procedure for the nonequivalent groups with anchor test (NEAT) design under the assumptions of the classical test theory model. This new equating is named chained true score equipercentile equating. We also apply the kernel equating framework to this equating design, resulting in a…

  4. Leading-order classical Lagrangians for the nonminimal standard-model extension

    NASA Astrophysics Data System (ADS)

    Reis, J. A. A. S.; Schreck, M.

    2018-03-01

    In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.

  5. JOURNAL SCOPE GUIDELINES: Paper classification scheme

    NASA Astrophysics Data System (ADS)

    2005-06-01

    This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).

    1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory

    2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models

    3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods

    4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics

    5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials

    6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas

  6. Analysis of delamination related fracture processes in composites

    NASA Technical Reports Server (NTRS)

    Armanios, Erian A.

    1992-01-01

    An anisotropic thin-walled closed-section beam theory was developed based on an asymptotic analysis of the shell energy functional. The displacement field is not assumed a priori and emerges as a result of the analysis. In addition to the classical out-of-plane torsional warping, two new contributions are identified, namely axial strain and bending warping. A comparison of the derived governing equations confirms the theory developed by Reissner and Tsai. Also, explicit closed-form expressions for the beam stiffness coefficients, the stress and displacement fields are provided. The predictions of the present theory were validated by comparison with finite element simulation, other closed-form analyses and test data.

  7. Quasi-Static Analysis of Round LaRC THUNDER Actuators

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.

    2007-01-01

    An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
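
    For readers unfamiliar with classical lamination theory, the laminate stiffness it rests on can be sketched as the standard A, B, D matrices assembled ply by ply. This is a generic textbook construction, not code from the paper; the identity-matrix ply stiffness and the symmetric two-ply example are placeholders.

```python
import numpy as np

def abd_matrices(Qbars, z):
    """Classical lamination theory stiffness matrices:
    A = sum_k Qbar_k (z_k - z_{k-1})            (extensional)
    B = 1/2 sum_k Qbar_k (z_k^2 - z_{k-1}^2)    (coupling)
    D = 1/3 sum_k Qbar_k (z_k^3 - z_{k-1}^3)    (bending)
    Qbars[k] is the 3x3 transformed reduced stiffness of ply k and
    z holds the n+1 ply interface coordinates through the thickness."""
    A = np.zeros((3, 3)); B = np.zeros((3, 3)); D = np.zeros((3, 3))
    for k, Q in enumerate(Qbars):
        A += Q * (z[k + 1] - z[k])
        B += Q * (z[k + 1] ** 2 - z[k] ** 2) / 2.0
        D += Q * (z[k + 1] ** 3 - z[k] ** 3) / 3.0
    return A, B, D

# Two identical plies placed symmetrically about the midplane:
Q = np.eye(3)                  # placeholder ply stiffness
A, B, D = abd_matrices([Q, Q], [-1e-3, 0.0, 1e-3])
print(np.allclose(B, 0.0))     # True: a symmetric laminate has no coupling
```

    Unsymmetric layups such as those in THUNDER devices have nonzero B, which is what couples in-plane piezoelectric strain to the out-of-plane curvature the paper analyzes.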

  8. Quasi-Static Analysis of LaRC THUNDER Actuators

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.

    2007-01-01

    An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.

  9. Navigating the grounded theory terrain. Part 2.

    PubMed

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    In this paper, the choice of classic grounded theory is discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author considered classic grounded theory a suitable research methodology because it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper explores the need for researchers to visit and understand the various grounded theory options. It argues that researchers new to grounded theory must be familiar with and understand these options; they will then be able to apply the methodology they choose consistently and critically, develop theory rigorously, and ultimately better defend their final methodological destinations.

  10. Geometric Algebra for Physicists

    NASA Astrophysics Data System (ADS)

    Doran, Chris; Lasenby, Anthony

    2007-11-01

    Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.
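
    A flavor of the book's early chapters can be given in a few lines: the geometric product of the plane, with multivectors stored as (scalar, e1, e2, e12) coefficient tuples. The representation is a toy of our own devising, not the book's notation or code.

```python
def gp(a, b):
    """Geometric product in 2-D: multivectors are (scalar, e1, e2, e12)
    tuples, with e1*e1 = e2*e2 = 1, e1*e2 = -e2*e1 = e12."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return (a0*b0 + a1*b1 + a2*b2 - a3*b3,
            a0*b1 + a1*b0 - a2*b3 + a3*b2,
            a0*b2 + a2*b0 + a1*b3 - a3*b1,
            a0*b3 + a3*b0 + a1*b2 - a2*b1)

e1 = (0, 1, 0, 0)
e12 = (0, 0, 0, 1)
print(gp(e1, e1))    # (1, 0, 0, 0): basis vectors square to +1
print(gp(e12, e12))  # (-1, 0, 0, 0): the bivector e12 squares to -1
```

    That the unit bivector squares to -1 is the book's starting point for treating complex numbers, rotations, and spinors inside one real algebra.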

  11. Rydberg Atoms in Strong Fields: a Testing Ground for Quantum Chaos.

    NASA Astrophysics Data System (ADS)

    Courtney, Michael

    1995-01-01

    Rydberg atoms in strong static electric and magnetic fields provide experimentally accessible systems for studying the connections between classical chaos and quantum mechanics in the semiclassical limit. This experimental accessibility has motivated the development of reliable quantum mechanical solutions. This thesis uses both experimental and computed quantum spectra to test the central approaches to quantum chaos. These central approaches consist mainly of developing methods to compute the spectra of quantum systems in non-perturbative regimes, correlating statistical descriptions of eigenvalues with the classical behavior of the same Hamiltonian, and the development of semiclassical methods such as periodic-orbit theory. Particular emphasis is given to identifying the spectral signature of recurrences, quantum wave packets which follow classical orbits. The new findings include: the breakdown of the connection between energy-level statistics and classical chaos in odd-parity diamagnetic lithium, the discovery of the signature of very long period orbits in atomic spectra, quantitative evidence for the scattering of recurrences by the alkali-metal core, quantitative description of the behavior of recurrences near bifurcations, and a semiclassical interpretation of the evolution of continuum Stark spectra. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
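
    The recurrence signature mentioned above is usually extracted by Fourier transforming a spectrum sampled in scaled energy: peaks appear at the scaled actions of closed classical orbits. A minimal synthetic sketch, in which the single "orbit" of scaled action 3.0 is invented purely for illustration:

```python
import numpy as np

def recurrence_spectrum(w, signal):
    """Fourier transform a spectrum sampled on a uniform scaled-energy
    grid w; peaks of |FT| sit at the scaled actions of closed orbits."""
    ft = np.fft.rfft(signal - signal.mean())
    actions = np.fft.rfftfreq(len(w), d=w[1] - w[0])
    return actions, np.abs(ft)

# Toy spectrum carrying one "orbit" of scaled action 3.0 (invented):
w = np.linspace(0.0, 50.0, 4096)
sig = np.cos(2.0 * np.pi * 3.0 * w)
S, amp = recurrence_spectrum(w, sig)
print(S[np.argmax(amp)])  # peak near the input action of 3.0
```

    Real recurrence spectra superpose many such oscillations, and bifurcations or core scattering show up as changes in the corresponding peak heights.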

  12. Optimal stomatal behavior with competition for water and risk of hydraulic impairment.

    PubMed

    Wolf, Adam; Anderegg, William R L; Pacala, Stephen W

    2016-11-15

    For over 40 years the dominant theory of stomatal behavior has been that plants should open stomata until the carbon gained by an infinitesimal additional opening balances the additional water lost times a water price that is constant at least over short periods. This theory has persisted because of its remarkable success in explaining strongly supported simple empirical models of stomatal conductance, even though we have also known for over 40 years that the theory is not consistent with competition among plants for water. We develop an alternative theory in which plants maximize carbon gain without pricing water loss and also add two features to both this and the classical theory, which are strongly supported by empirical evidence: (i) water flow through xylem that is progressively impaired as xylem water potential drops and (ii) fitness or carbon costs associated with low water potentials, caused by a variety of mechanisms including xylem damage repair. We show that our alternative carbon-maximization optimization is consistent with plant competition because it yields an evolutionarily stable strategy (ESS): species with the ESS stomatal behavior will outcompete all others. We further show that, like the classical theory, the alternative theory also explains the functional forms of empirical stomatal models. We derive ways to test between the alternative optimization criteria by introducing a metric, the marginal xylem tension efficiency, which quantifies the amount of photosynthesis a plant will forgo by opening stomata an infinitesimal amount more to avoid a drop in water potential.
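
    The classical water-pricing theory the authors build on can be sketched numerically: choose the stomatal conductance g that maximizes carbon gain minus a fixed price lambda on water loss. The functional forms and every constant below are toy assumptions of ours, not the paper's model.

```python
import numpy as np

def optimal_g(lam, gmax=0.5):
    """Classical optimization: maximize A(g) - lam * E(g) over stomatal
    conductance g on a grid. Toy saturating assimilation and linear
    transpiration; all constants are illustrative."""
    g = np.linspace(1e-4, gmax, 2000)
    A = 20.0 * g / (g + 0.1)   # assimilation, saturating in g
    E = 1.6 * 0.015 * g        # transpiration, linear in g
    return g[np.argmax(A - lam * E)]

# A higher water price closes stomata:
print(optimal_g(1000.0) > optimal_g(4000.0))  # True
```

    The paper's point is that no single lambda is competitively stable; the carbon-maximization alternative replaces the fixed price with explicit hydraulic costs.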

  13. The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory

    NASA Astrophysics Data System (ADS)

    Frey, Kimberly

    The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. 
Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.

  14. Nucleation theory - Is replacement free energy needed? [error analysis of capillary approximation]

    NASA Technical Reports Server (NTRS)

    Doremus, R. H.

    1982-01-01

    It has been suggested that the classical theory of nucleation of liquid from its vapor as developed by Volmer and Weber (1926) needs modification with a factor referred to as the replacement free energy and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs' result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
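
    The classical rate defended here has the familiar Volmer form: a kinetic prefactor times exp(-W*/kT), with W* the reversible work to form the critical nucleus. A rough numerical sketch; the prefactor and the water-like constants are illustrative only, not values from the paper.

```python
from math import pi, exp, log

def critical_work(sigma, v_mol, kT, S):
    """Gibbs' reversible work to form a critical nucleus:
    W* = 16*pi*sigma^3*v^2 / (3*(kT*ln S)^2), for surface tension
    sigma, molecular volume v_mol, and supersaturation ratio S."""
    return 16.0 * pi * sigma ** 3 * v_mol ** 2 / (3.0 * (kT * log(S)) ** 2)

def nucleation_rate(prefactor, sigma, v_mol, kT, S):
    """Classical (Volmer) nucleation rate J = K * exp(-W*/kT)."""
    return prefactor * exp(-critical_work(sigma, v_mol, kT, S) / kT)

# Water-like numbers at 300 K; prefactor and constants illustrative:
kT = 1.38e-23 * 300.0    # J
sigma = 0.072            # N/m
v = 3.0e-29              # m^3 per molecule
for S in (3.0, 4.0, 5.0):
    print(S, nucleation_rate(1e31, sigma, v, kT, S))  # rises steeply with S
```

    The extreme sensitivity of J to S is why the dispute over an extra "replacement free energy" factor mattered: even modest prefactor corrections are dwarfed by the exponential.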

  15. Effective model hierarchies for dynamic and static classical density functional theories

    NASA Astrophysics Data System (ADS)

    Majaniemi, S.; Provatas, N.; Nonomura, M.

    2010-09-01

    The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.

  16. Foundations of Quantum Mechanics: recent developments at INRIM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Genovese, Marco; Piacentini, Fabrizio

    2011-09-23

    This paper's purpose is to show some experiments performed in recent years in the 'Carlo Novero' labs of the Optics Division of the National Institute of Metrological Research (INRIM, Torino, Italy), aiming to discriminate between Standard Quantum Mechanics and some specific, restricted classes of Hidden Variable Theories (HVTs). The first experiment, realized in two different configurations, performs the Alicki-Van Ryn non-classicality test on single particles, in our specific case heralded single photons. The second experiment instead tests two restricted Local Realistic Theories (LRTs), properly built to describe polarization-entangled photon experiments, whose inequalities are not affected by the detection loophole.

  17. How Often Do Subscores Have Added Value? Results from Operational and Simulated Data

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2010-01-01

    Recently, there has been an increasing level of interest in subscores for their potential diagnostic value. Haberman suggested a method based on classical test theory to determine whether subscores have added value over total scores. In this article I first provide a rich collection of results regarding when subscores were found to have added…

  18. Item Response Modeling: An Evaluation of the Children's Fruit and Vegetable Self-Efficacy Questionnaire

    ERIC Educational Resources Information Center

    Watson, Kathy; Baranowski, Tom; Thompson, Debbe

    2006-01-01

    Perceived self-efficacy (SE) for eating fruit and vegetables (FV) is a key variable mediating FV change in interventions. This study applies item response modeling (IRM) to a fruit, juice and vegetable self-efficacy questionnaire (FVSEQ) previously validated with classical test theory (CTT) procedures. The 24-item (five-point Likert scale) FVSEQ…

  19. Item Construction Using Reflective, Formative, or Rasch Measurement Models: Implications for Group Work

    ERIC Educational Resources Information Center

    Peterson, Christina Hamme; Gischlar, Karen L.; Peterson, N. Andrew

    2017-01-01

    Measures that accurately capture the phenomenon are critical to research and practice in group work. The vast majority of group-related measures were developed using the reflective measurement model rooted in classical test theory (CTT). Depending on the construct definition and the measure's purpose, the reflective model may not always be the…

  20. The Assessment Revolution that Has Passed England By: Rasch Measurement

    ERIC Educational Resources Information Center

    Panayides, Panayiotis; Robinson, Colin; Tymms, Peter

    2010-01-01

    Assessment has been dominated by Classical Test Theory for the last half century although the radically different approach known as Rasch measurement briefly blossomed in England during the 1960s and 1970s. Its open development was stopped dead in the 1980s, whilst some work has continued almost surreptitiously. Elsewhere Rasch has assumed…
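
    The Rasch model at the heart of that "revolution" is compact enough to state in one line: the probability of a correct response depends only on the difference between person ability and item difficulty on a shared logit scale. A minimal sketch with the conventional parameter names:

```python
from math import exp

def rasch_p(theta, b):
    """Rasch model: probability of a correct response for a person of
    ability theta on an item of difficulty b (both in logits)."""
    return 1.0 / (1.0 + exp(-(theta - b)))

print(rasch_p(0.0, 0.0))  # 0.5: ability exactly matches difficulty
```

    Unlike classical test theory statistics, the two parameters are separable, which is what makes Rasch-calibrated item banks and equated tests possible.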

  1. Electromagnetic Waves in a Uniform Gravitational Field and Planck's Postulate

    ERIC Educational Resources Information Center

    Acedo, Luis; Tung, Michael M.

    2012-01-01

    The gravitational redshift forms the central part of the majority of the classical tests for the general theory of relativity. It could be successfully checked even in laboratory experiments on the earth's surface. The standard derivation of this effect is based on the distortion of the local structure of spacetime induced by large masses. The…

  2. The Proper Sequence for Correcting Correlation Coefficients for Range Restriction and Unreliability.

    ERIC Educational Resources Information Center

    Stauffer, Joseph M.; Mendoza, Jorge L.

    2001-01-01

    Uses classical test theory to show that it is the nature of the range restriction, rather than the nature of the available reliability coefficient, that determines the sequence for applying corrections for range restriction and unreliability. Shows how the common rule of thumb for choosing the sequence is tenable only when the correction does not…
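
    The best-known of the corrections at issue can be sketched: Thorndike's Case II formula for direct range restriction, which is applied before or after Spearman's reliability correction depending on the sequence the article analyzes. A generic textbook sketch, not the authors' derivation:

```python
from math import sqrt

def correct_range_restriction(r, u):
    """Thorndike Case II correction for direct range restriction:
    r is the correlation observed in the restricted sample, u = S/s the
    ratio of unrestricted to restricted predictor standard deviations."""
    return r * u / sqrt(1.0 - r ** 2 + (r ** 2) * (u ** 2))

print(round(correct_range_restriction(0.30, 2.0), 3))  # 0.532
```

    With u = 1 (no restriction) the formula returns r unchanged; larger u inflates the estimate toward the unrestricted-population correlation.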

  3. Modifying Spearman's Attenuation Equation to Yield Partial Corrections for Measurement Error--With Application to Sample Size Calculations

    ERIC Educational Resources Information Center

    Nicewander, W. Alan

    2018-01-01

    Spearman's correction for attenuation (measurement error) corrects a correlation coefficient for measurement errors in either-or-both of two variables, and follows from the assumptions of classical test theory. Spearman's equation removes all measurement error from a correlation coefficient which translates into "increasing the reliability of…
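
    Spearman's equation itself is short enough to show; the partial corrections the article proposes would rescale the reliabilities before applying it. This sketch shows only the classical full correction, with function and argument names of our own:

```python
from math import sqrt

def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation: estimated correlation
    between true scores, from the observed correlation r_xy and the
    reliabilities of the two measures."""
    return r_xy / sqrt(rel_x * rel_y)

# Observed r = .30 with reliabilities .80 and .72 (illustrative):
print(round(disattenuate(0.30, 0.80, 0.72), 3))  # 0.395
```

    Because the denominator is at most 1, the corrected coefficient is never smaller than the observed one, which is exactly the "increasing the reliability" effect the abstract describes.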

  4. Psychometric Properties of the Fatigue Severity Scale in Polio Survivors

    ERIC Educational Resources Information Center

    Burger, Helena; Franchignoni, Franco; Puzic, Natasa; Giordano, Andrea

    2010-01-01

    The objective of this study was to evaluate by means of classical test theory and Rasch analysis the scaling characteristics and psychometric properties of the Fatigue Severity Scale (FSS) in polio survivors. A questionnaire, consisting of five general questions (sex, age, age at time of acute polio, sequelae of polio, and new symptoms), the FSS,…
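
    A classical-test-theory item analysis of the kind reported here typically starts from internal-consistency reliability, e.g. Cronbach's alpha over the person-by-item score matrix. A generic sketch; the FSS data themselves are not reproduced and the demo matrix is invented:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_persons, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

scores = [[1, 2], [2, 3], [3, 4]]  # three persons, two parallel items
print(cronbach_alpha(scores))      # 1.0 for perfectly parallel items
```

    Rasch analysis, used alongside CTT in this study, instead checks whether the items fit a common unidimensional measurement model rather than summarizing consistency in a single coefficient.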

  5. Fundamental Structure of Loop Quantum Gravity

    NASA Astrophysics Data System (ADS)

    Han, Muxin; Ma, Yongge; Huang, Weiming

    In the past twenty years, loop quantum gravity, a background independent approach to unify general relativity and quantum mechanics, has been widely investigated. The aim of loop quantum gravity is to construct a mathematically rigorous, background independent, non-perturbative quantum theory for a Lorentzian gravitational field on a four-dimensional manifold. In the approach, the principles of quantum mechanics are combined with those of general relativity naturally. Such a combination provides us with a picture of so-called quantum Riemannian geometry, which is discrete on the fundamental scale. Imposing the quantum constraints in analogy with the classical ones, the quantum dynamics of gravity is being studied as one of the most important issues in loop quantum gravity. On the other hand, the semi-classical analysis is being carried out to test the classical limit of the quantum theory. In this review, the fundamental structure of loop quantum gravity is presented pedagogically. Our main aim is to help non-experts to understand the motivations, basic structures, as well as general results. It may also be beneficial to practitioners to gain insights from different perspectives on the theory. We will focus on the theoretical framework itself, rather than its applications, and do our best to write it in modern and precise language while keeping the presentation accessible for beginners. After reviewing the classical connection dynamical formalism of general relativity, as a foundation, the construction of the kinematical Ashtekar-Isham-Lewandowski representation is introduced in the context of quantum kinematics. The algebraic structure of quantum kinematics is also discussed. In the context of quantum dynamics, we mainly introduce the construction of a Hamiltonian constraint operator and the master constraint project. Finally, some applications and recent advances are outlined.
It should be noted that this strategy of quantizing gravity can also be extended to obtain other background-independent quantum gauge theories. There is no divergence within this background-independent and diffeomorphism-invariant quantization program of matter coupled to gravity.

  6. Using extant literature in a grounded theory study: a personal account.

    PubMed

    Yarwood-Ross, Lee; Jack, Kirsten

    2015-03-01

    To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.

  7. Ward identity and basis tensor gauge theory at one loop

    NASA Astrophysics Data System (ADS)

    Chung, Daniel J. H.

    2018-06-01

    Basis tensor gauge theory (BTGT) is a reformulation of ordinary gauge theory that is an analog of the vierbein formulation of gravity and is related to the Wilson line formulation. To match ordinary gauge theories coupled to matter, the BTGT formalism requires a continuous symmetry that we call the BTGT symmetry in addition to the ordinary gauge symmetry. After classically interpreting the BTGT symmetry, we construct using the BTGT formalism the Ward identities associated with the BTGT symmetry and the ordinary gauge symmetry. For a way of testing the quantum stability and the consistency of the Ward identities with a known regularization method, we explicitly renormalize the scalar QED at one loop using dimensional regularization using the BTGT formalism.

  8. Evaluation of Wall Interference Effects in a Two-Dimensional Transonic Wind Tunnel by Subsonic Linear Theory,

    DTIC Science & Technology

    1979-02-01

    Tests were conducted on two geometrically similar models of each of two aerofoil sections - the NACA 0012 and the BGK-1 sections - and covered a ... and slotted-wall test sections are corrected for wind tunnel wall interference effects by the application of classical linearized theory. For the ... solid wall results, these corrections appear to produce data which are very close to being free of the effects of interference. In the case of

  9. Classical BV Theories on Manifolds with Boundary

    NASA Astrophysics Data System (ADS)

    Cattaneo, Alberto S.; Mnev, Pavel; Reshetikhin, Nicolai

    2014-12-01

    In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples including electrodynamics, Yang-Mills theory and topological field theories coming from the AKSZ construction, in particular, the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.

  10. From quantum to classical interactions between a free electron and a surface

    NASA Astrophysics Data System (ADS)

    Beierle, Peter James

    Quantum theory is often cited as one of the most empirically validated theories in terms of its predictive power and precision. These attributes have led to numerous scientific discoveries and technological advancements. However, the precise relationship between quantum and classical physics remains obscure. The prevailing description is known as decoherence theory, in which classical physics emerges from a more general quantum theory through environmental interaction. Sometimes referred to as the decoherence program, it does not solve the quantum measurement problem. We believe experiments performed between the microscopic and macroscopic worlds may help finish the program. The following considers a free electron that interacts with a surface (the environment), providing a controlled decoherence mechanism. There are non-decohering interactions to be examined and quantified before the weaker decohering effects can be filtered out. In the first experiment, an electron beam passes over a surface that is illuminated by low-power laser light. This induces a surface charge redistribution that deflects the electron. The parameters of this phenomenon are investigated. This system can be well understood in terms of classical electrodynamics, and the technological applications of this electron-beam switch are considered. Such phenomena may mask decoherence effects. A second experiment tests decoherence theory by introducing a nanofabricated diffraction grating before the surface. The electron undergoes diffraction through the grating, but as the electron passes over the surface various physical models predict that it will lose its wave interference property. Image-charge-based models, which predict a larger loss of contrast than what is observed, are falsified (despite the electron experiencing an image charge force).
A theoretical study demonstrates how a loss of contrast may be due not to the irreversible process of decoherence but to dephasing, a reversible process caused by randomization of the wavefunction's phase. To resolve this ambiguity, a correlation function on an ensemble of diffraction patterns is analyzed after an electron undergoes either process in a path integral calculation. The diffraction pattern is successfully recovered for dephasing but not for decoherence, verifying the correlation function as a potential tool in experimental studies to determine the nature of the observed process.

  11. Bukhvostov-Lipatov model and quantum-classical duality

    NASA Astrophysics Data System (ADS)

    Bazhanov, Vladimir V.; Lukyanov, Sergei L.; Runov, Boris A.

    2018-02-01

    The Bukhvostov-Lipatov model is an exactly soluble model of two interacting Dirac fermions in 1 + 1 dimensions. The model describes weakly interacting instantons and anti-instantons in the O(3) non-linear sigma model. In our previous work [arXiv:1607.04839] we have proposed an exact formula for the vacuum energy of the Bukhvostov-Lipatov model in terms of special solutions of the classical sinh-Gordon equation, which can be viewed as an example of a remarkable duality between integrable quantum field theories and integrable classical field theories in two dimensions. Here we present a complete derivation of this duality based on the classical inverse scattering transform method, traditional Bethe ansatz techniques and analytic theory of ordinary differential equations. In particular, we show that the Bethe ansatz equations defining the vacuum state of the quantum theory also define connection coefficients of an auxiliary linear problem for the classical sinh-Gordon equation. Moreover, we also present details of the derivation of the non-linear integral equations determining the vacuum energy and other spectral characteristics of the model in the case when the vacuum state is filled by 2-string solutions of the Bethe ansatz equations.

  12. Lenard-Balescu calculations and classical molecular dynamics simulations of electrical and thermal conductivities of hydrogen plasmas

    DOE PAGES

    Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...

    2014-12-04

    Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from our particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
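
    The Kubo route from trajectories to a conductivity can be sketched: accumulate the autocorrelation of the total current and integrate it in time. A schematic one-component version with invented inputs; production analyses use all current components, proper units, and careful truncation of the integral.

```python
import numpy as np

def green_kubo_sigma(J, dt, volume, kT):
    """Green-Kubo style conductivity estimate from a time series of the
    total current J(t), assumed zero-mean in equilibrium:
    sigma = (1/(V*kT)) * integral_0^inf <J(0) J(t)> dt,
    with the integral approximated by a simple rectangle sum."""
    J = np.asarray(J, dtype=float)
    n = len(J)
    # unbiased autocorrelation at each non-negative lag
    acf = np.correlate(J, J, mode="full")[n - 1:] / np.arange(n, 0, -1)
    return float(np.sum(acf) * dt / (volume * kT))
```

    The thermal conductivity follows the same recipe with the heat current in place of the charge current.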

  13. Impact of population structure, effective bottleneck time, and allele frequency on linkage disequilibrium maps

    PubMed Central

    Zhang, Weihua; Collins, Andrew; Gibson, Jane; Tapper, William J.; Hunt, Sarah; Deloukas, Panos; Bentley, David R.; Morton, Newton E.

    2004-01-01

    Genetic maps in linkage disequilibrium (LD) units play the same role for association mapping as maps in centimorgans provide at much lower resolution for linkage mapping. Association mapping of genes determining disease susceptibility and other phenotypes is based on the theory of LD, here applied to relations with three phenomena. To test the theory, markers at high density along a 10-Mb continuous segment of chromosome 20q were studied in African-American, Asian, and Caucasian samples. Population structure, whether created by pooling samples from divergent populations or by the mating pattern in a mixed population, is accurately bioassayed from genotype frequencies. The effective bottleneck time for Eurasians is substantially less than for migration out of Africa, reflecting later bottlenecks. The classical dependence of allele frequency on mutation age does not hold for the generally shorter time span of inbreeding and LD. Limitation of the classical theory to mutation age justifies the assumption of constant time in a LD map, except for alleles that were rare at the effective bottleneck time or have arisen since. This assumption is derived from the Malecot model and verified in all samples. Tested measures of relative efficiency, support intervals, and localization error determine the operating characteristics of LD maps that are applicable to every sexually reproducing species, with implications for association mapping, high-resolution linkage maps, evolutionary inference, and identification of recombinogenic sequences. PMID:15604137
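
    The Malecot model underlying these LD maps predicts association declining exponentially with distance toward a background level. A toy sketch; the parameter values are invented for illustration and the distance units are left abstract:

```python
from math import exp

def malecot_rho(d, M=1.0, eps=0.5, L=0.05):
    """Malecot model: expected association at distance d,
    rho(d) = (1 - L) * M * exp(-eps * d) + L, where M is the association
    at zero distance, eps the decay rate, and L the background
    (asymptotic) association. Parameter values are illustrative."""
    return (1.0 - L) * M * exp(-eps * d) + L

print(round(malecot_rho(0.0), 6))  # 1.0 at zero distance
```

    An LD map is built by choosing eps locally so that cumulative eps*d, the distance in LD units, makes this decay look uniform along the chromosome.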

  14. Impact of population structure, effective bottleneck time, and allele frequency on linkage disequilibrium maps.

    PubMed

    Zhang, Weihua; Collins, Andrew; Gibson, Jane; Tapper, William J; Hunt, Sarah; Deloukas, Panos; Bentley, David R; Morton, Newton E

    2004-12-28

    Genetic maps in linkage disequilibrium (LD) units play the same role for association mapping as maps in centimorgans provide at much lower resolution for linkage mapping. Association mapping of genes determining disease susceptibility and other phenotypes is based on the theory of LD, here applied to relations with three phenomena. To test the theory, markers at high density along a 10-Mb continuous segment of chromosome 20q were studied in African-American, Asian, and Caucasian samples. Population structure, whether created by pooling samples from divergent populations or by the mating pattern in a mixed population, is accurately bioassayed from genotype frequencies. The effective bottleneck time for Eurasians is substantially less than for migration out of Africa, reflecting later bottlenecks. The classical dependence of allele frequency on mutation age does not hold for the generally shorter time span of inbreeding and LD. Limitation of the classical theory to mutation age justifies the assumption of constant time in a LD map, except for alleles that were rare at the effective bottleneck time or have arisen since. This assumption is derived from the Malecot model and verified in all samples. Tested measures of relative efficiency, support intervals, and localization error determine the operating characteristics of LD maps that are applicable to every sexually reproducing species, with implications for association mapping, high-resolution linkage maps, evolutionary inference, and identification of recombinogenic sequences.

  15. Further Development of an Optimal Design Approach Applied to Axial Magnetic Bearings

    NASA Technical Reports Server (NTRS)

    Bloodgood, V. Dale, Jr.; Groom, Nelson J.; Britcher, Colin P.

    2000-01-01

    Classical design methods for magnetic bearings and magnetic suspension systems have always had limitations, so the overall effectiveness of a design has relied heavily on the skill and experience of the individual designer. This paper combines two approaches developed to aid the accuracy and efficiency of magnetostatic design. The first integrates classical magnetic circuit theory with modern optimization theory to increase design efficiency. The second uses loss factors to increase the accuracy of classical magnetic circuit theory. As an example, an axial magnetic thrust bearing is designed for minimum power.

  16. Ethical and Stylistic Implications in Delivering Conference Papers.

    ERIC Educational Resources Information Center

    Enos, Theresa

    1986-01-01

    Analyzes shortcomings of conference papers intended for the eye rather than the ear. Referring to classical oratory, speech act theory, and cognitive theory, recommends revising papers for oral presentation by using classical disposition; deductive rather than inductive argument; formulaic repetition of words and phrases; non-inverted clause…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio

    We study the Hilbert space structure of classical spacetimes under the assumption that entanglement in holographic theories determines semiclassical geometry. We show that this simple assumption has profound implications; for example, a superposition of classical spacetimes may lead to another classical spacetime. Despite its unconventional nature, this picture admits the standard interpretation of superpositions of well-defined semiclassical spacetimes in the limit that the number of holographic degrees of freedom becomes large. We illustrate these ideas using a model for the holographic theory of cosmological spacetimes.

  18. Classical theory of radiating strings

    NASA Technical Reports Server (NTRS)

    Copeland, Edmund J.; Haws, D.; Hindmarsh, M.

    1990-01-01

    The divergent part of the self force of a radiating string coupled to gravity, an antisymmetric tensor and a dilaton in four dimensions are calculated to first order in classical perturbation theory. While this divergence can be absorbed into a renormalization of the string tension, demanding that both it and the divergence in the energy momentum tensor vanish forces the string to have the couplings of compactified N = 1 D = 10 supergravity. In effect, supersymmetry cures the classical infinities.

  19. Emergence of a classical Universe from quantum gravity and cosmology.

    PubMed

    Kiefer, Claus

    2012-09-28

    I describe how we can understand the classical appearance of our world from a universal quantum theory. The essential ingredient is the process of decoherence. I start with a general discussion in ordinary quantum theory and then turn to quantum gravity and quantum cosmology. There is a whole hierarchy of classicality from the global gravitational field to the fluctuations in the cosmic microwave background, which serve as the seeds for the structure in the Universe.

  20. On the Nature of Intelligence

    NASA Astrophysics Data System (ADS)

    Churchland, Paul M.

    Alan Turing is the consensus patron saint of the classical research program in Artificial Intelligence (AI), and his behavioral test for the possession of conscious intelligence has become his principal legacy in the mind of the academic public. Both takes are mistakes. That test is a dialectical throwaway line even for Turing himself, a tertiary gesture aimed at softening the intellectual resistance to a research program which, in his hands, possessed real substance, both mathematical and theoretical. The wrangling over his celebrated test has deflected attention away from those more substantial achievements, and away from the enduring obligation to construct a substantive theory of what conscious intelligence really is, as opposed to an epistemological account of how to tell when you are confronting an instance of it. This essay explores Turing's substantive research program on the nature of intelligence, and argues that the classical AI program is not its best expression, nor even the expression intended by Turing. It then attempts to put the famous Test into its proper, and much reduced, perspective.

  1. Emergent dark energy via decoherence in quantum interactions

    NASA Astrophysics Data System (ADS)

    Altamirano, Natacha; Corona-Ugalde, Paulina; Khosla, Kiran E.; Milburn, Gerard J.; Mann, Robert B.

    2017-06-01

    In this work we consider a recent proposal that gravitational interactions are mediated via classical information and apply it to a relativistic context. We study a toy model of a quantized Friedmann-Robertson-Walker (FRW) universe with the assumption that any test particles must feel a classical metric. We show that such a model results in decoherence in the FRW state that manifests itself as a dark energy fluid that fills the spacetime. Analysis of the resulting fluid shows that the equation of state asymptotically oscillates around the value w = -1/3, regardless of the spatial curvature, which provides the bound between accelerating and decelerating expanding FRW cosmologies. Motivated by quantum-classical interactions, this model is yet another example of a theory with violation of energy-momentum conservation whose signature could have significant consequences for the observable universe.

  2. Classical gluon and graviton radiation from the bi-adjoint scalar double copy

    NASA Astrophysics Data System (ADS)

    Goldberger, Walter D.; Prabhu, Siddharth G.; Thompson, Jedidiah O.

    2017-09-01

    We find double-copy relations between classical radiating solutions in Yang-Mills theory coupled to dynamical color charges and their counterparts in a cubic bi-adjoint scalar field theory which interacts linearly with particles carrying bi-adjoint charge. The particular color-to-kinematics replacements we employ are motivated by the Bern-Carrasco-Johansson double-copy correspondence for on-shell amplitudes in gauge and gravity theories. They are identical to those recently used to establish relations between classical radiating solutions in gauge theory and in dilaton gravity. Our explicit bi-adjoint solutions are constructed to second order in a perturbative expansion, and map under the double copy onto gauge theory solutions which involve at most cubic gluon self-interactions. If the correspondence is found to persist to higher orders in perturbation theory, our results suggest the possibility of calculating gravitational radiation from colliding compact objects, directly from a scalar field with vastly simpler (purely cubic) Feynman vertices.

  3. Combinatorial Market Processing for Multilateral Coordination

    DTIC Science & Technology

    2005-09-01

    8 In the classical auction theory literature, most of the attention is focused on one-sided, single-item auctions [86]. There is now a growing body of...Programming in Infinite-dimensional Spaces: Theory and Applications, Wiley, 1987. [3] K. J. Arrow, “An extension of the basic theorems of classical ...Commodities, Princeton University Press, 1969. [43] D. Friedman and J. Rust, The Double Auction Market: Institutions, Theories, and Evidence, Addison

  4. The Classical Vacuum.

    ERIC Educational Resources Information Center

    Boyer, Timothy H.

    1985-01-01

    The classical vacuum of physics is not empty, but contains a distinctive pattern of electromagnetic fields. Discovery of the vacuum, thermal spectrum, classical electron theory, zero-point spectrum, and effects of acceleration are discussed. Connection between thermal radiation and the classical vacuum reveals unexpected unity in the laws of…

  5. Random walk in generalized quantum theory

    NASA Astrophysics Data System (ADS)

    Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.

    2005-01-01

    One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.

  6. Developing a Measure of Therapist Adherence to Contingency Management: An Application of the Many-Facet Rasch Model

    ERIC Educational Resources Information Center

    Chapman, Jason E.; Sheidow, Ashli J.; Henggeler, Scott W.; Halliday-Boykins, Colleen A.; Cunningham, Phillippe B.

    2008-01-01

    A unique application of the Many-Facet Rasch Model (MFRM) is introduced as the preferred method for evaluating the psychometric properties of a measure of therapist adherence to Contingency Management (CM) treatment of adolescent substance use. The utility of psychometric methods based in Classical Test Theory was limited by complexities of the…

  7. Multilayer theory for delamination analysis of a composite curved bar subjected to end forces and end moments

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1989-01-01

    A composite test specimen in the shape of a semicircular curved bar subjected to bending offers an excellent stress field for studying the open-mode delamination behavior of laminated composite materials. This is because the open-mode delamination nucleates at the midspan of the curved bar. The classical anisotropic elasticity theory was used to construct a 'multilayer' theory for the calculations of the stress and deformation fields induced in the multilayered composite semicircular curved bar subjected to end forces and end moments. The radial location and intensity of the open-mode delamination stress were calculated and were compared with the results obtained from the anisotropic continuum theory and from the finite element method. The multilayer theory gave more accurate predictions of the location and the intensity of the open-mode delamination stress than those calculated from the anisotropic continuum theory.

  8. Multilayer theory for delamination analysis of a composite curved bar subjected to end forces and end moments

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Jackson, Raymond H.

    1989-01-01

    A composite test specimen in the shape of a semicircular curved bar subjected to bending offers an excellent stress field for studying the open-mode delamination behavior of laminated composite materials. This is because the open-mode delamination nucleates at the midspan of the curved bar. The classical anisotropic elasticity theory was used to construct a multilayer theory for the calculations of the stress and deformation fields induced in the multilayered composite semicircular curved bar subjected to end forces and end moments. The radial location and intensity of the open-mode delamination stress were calculated and were compared with the results obtained from the anisotropic continuum theory and from the finite element method. The multilayer theory gave more accurate predictions of the location and the intensity of the open-mode delamination stress than those calculated from the anisotropic continuum theory.

  9. The Legacy of Seligman's "Phobias and Preparedness" (1971).

    PubMed

    McNally, Richard J

    2016-09-01

    Seligman's (1971) classic article, "Phobias and Preparedness," marked a break from traditional conditioning theories of the etiology of phobias, inspiring a line of research integrating evolutionary theory with learning theory. In this article, I briefly sketch the context motivating the preparedness theory of phobias before summarizing the initial wave of laboratory conditioning experiments pioneered by Öhman and conducted by his team and by others to test predictions derived from Seligman's theory. Finally, I review the legacy of Seligman's article, including theoretical developments embodied in Öhman and Mineka's fear module approach as well as alternatives for explaining "preparedness" phenomena, including the selective sensitization, expectancy, and nonassociative theories. Although Seligman himself soon moved on to other topics, his seminal article in Behavior Therapy continues to inspire research more than four decades later that has deepened our understanding of the etiology of phobias. Copyright © 2015. Published by Elsevier Ltd.

  10. Hydrostatic Stress Effect On the Yield Behavior of Inconel 100

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wilson, Christopher D.

    2002-01-01

    Classical metal plasticity theory assumes that hydrostatic stress has no effect on the yield and postyield behavior of metals. Recent reexaminations of classical theory have revealed a significant effect of hydrostatic stress on the yield behavior of notched geometries. New experiments and nonlinear finite element analyses (FEA) of Inconel 100 (IN 100) equal-arm bend and double-edge notch tension (DENT) test specimens have revealed the effect of internal hydrostatic tensile stresses on yielding. Nonlinear FEA using the von Mises (yielding is independent of hydrostatic stress) and the Drucker-Prager (yielding is linearly dependent on hydrostatic stress) yield functions was performed. In all test cases, the von Mises constitutive model, which is independent of hydrostatic pressure, overestimated the load for a given displacement or strain. Considering the failure displacements or strains, the Drucker-Prager FEMs predicted loads that were 3% to 5% lower than the von Mises values. For the failure loads, the Drucker Prager FEMs predicted strains that were 20% to 35% greater than the von Mises values. The Drucker-Prager yield function seems to more accurately predict the overall specimen response of geometries with significant internal hydrostatic stress influence.
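    The contrast between the two yield functions compared above can be sketched as follows; the material constants and stress state are illustrative assumptions, not fitted IN 100 values.

```python
import math

# von Mises: yielding depends only on the deviatoric invariant J2.
# Drucker-Prager: adds a linear dependence on the hydrostatic invariant I1,
# so internal hydrostatic tension lowers the load needed to yield.
# Material constants here are illustrative, not fitted IN 100 values.

def invariants(s):
    """I1 and J2 for stress components (s11, s22, s33, s12, s13, s23)."""
    s11, s22, s33, s12, s13, s23 = s
    i1 = s11 + s22 + s33
    j2 = ((s11 - s22) ** 2 + (s22 - s33) ** 2 + (s33 - s11) ** 2) / 6.0 \
         + s12 ** 2 + s13 ** 2 + s23 ** 2
    return i1, j2

def f_von_mises(s, sigma_y):
    """Yield function: f >= 0 means the material point has yielded."""
    _, j2 = invariants(s)
    return math.sqrt(3.0 * j2) - sigma_y

def f_drucker_prager(s, sigma_y, alpha):
    """Calibrated to coincide with von Mises in pure shear (I1 = 0)."""
    i1, j2 = invariants(s)
    return math.sqrt(3.0 * j2) + alpha * i1 - sigma_y

# A shear stress superposed on a large hydrostatic tension (MPa):
state = (300.0, 300.0, 300.0, 100.0, 0.0, 0.0)
print(f_von_mises(state, 400.0))            # well below yield
print(f_drucker_prager(state, 400.0, 0.1))  # hydrostatic tension erodes the margin
```

    At the same applied state, the Drucker-Prager function sits closer to yield whenever I1 > 0, which is why the von Mises model overestimates the load for a given strain in notched geometries.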

  11. Calculations of the surface tensions of liquid metals

    NASA Technical Reports Server (NTRS)

    Stroud, D. G.

    1981-01-01

    The understanding of the surface tension of liquid metals and alloys from as close to first principles as possible is discussed. Two ingredients are combined in these calculations: the electron theory of metals, and the classical theory of liquids as worked out within the framework of statistical mechanics. The result is a new theory of surface tensions and surface density profiles based purely on knowledge of the bulk properties of the coexisting liquid and vapor phases. The method is found to work well for the pure liquid metals on which it was tested; the work is extended to mixtures of liquid metals, to interfaces between immiscible liquid metals, and to the temperature derivative of the surface tension.

  12. Neo-classical theory of competition or Adam Smith's hand as mathematized ideology

    NASA Astrophysics Data System (ADS)

    McCauley, Joseph L.

    2001-10-01

    Orthodox economic theory (utility maximization, rational agents, efficient markets in equilibrium) is based on arbitrarily postulated, nonempiric notions. The disagreement between economic reality and a key feature of neo-classical economic theory was criticized empirically by Osborne. I show that the orthodox theory is internally self-inconsistent for the very reason suggested by Osborne: lack of invertibility of demand and supply as functions of price to obtain price as functions of supply and demand. The reason for the noninvertibility arises from nonintegrable excess demand dynamics, a feature of their theory completely ignored by economists.

  13. Measurement incompatibility and Schrödinger-Einstein-Podolsky-Rosen steering in a class of probabilistic theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banik, Manik, E-mail: manik11ju@gmail.com

    Steering is one of the most counterintuitive non-classical features of a bipartite quantum system, first noticed by Schrödinger in the early days of quantum theory. Measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] investigated the relation between these two distinct non-classical features. They showed that a set of measurements is not jointly measurable (i.e., incompatible) if and only if it can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized to more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss how the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory.

  14. What is Quantum Mechanics? A Minimal Formulation

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2018-03-01

    This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.

  15. Theory of mind deficit in adult patients with congenital heart disease.

    PubMed

    Chiavarino, Claudia; Bianchino, Claudia; Brach-Prever, Silvia; Riggi, Chiara; Palumbo, Luigi; Bara, Bruno G; Bosco, Francesca M

    2015-10-01

    This article provides the first assessment of theory of mind, that is, the ability to reason about mental states, in adult patients with congenital heart disease. Patients with congenital heart disease and matched healthy controls were administered classical theory of mind tasks and a semi-structured interview which provides a multidimensional evaluation of theory of mind (Theory of Mind Assessment Scale). The patients with congenital heart disease performed worse than the controls on the Theory of Mind Assessment Scale, whereas they did as well as the control group on the classical theory-of-mind tasks. These findings provide the first evidence that adults with congenital heart disease may display specific impairments in theory of mind. © The Author(s) 2013.

  16. Scalability of a Low-Cost Multi-Teraflop Linux Cluster for High-End Classical Atomistic and Quantum Mechanical Simulations

    NASA Technical Reports Server (NTRS)

    Kikuchi, Hideaki; Kalia, Rajiv K.; Nakano, Aiichiro; Vashishta, Priya; Shimojo, Fuyuki; Saini, Subhash

    2003-01-01

    Scalability of a low-cost, Intel Xeon-based, multi-Teraflop Linux cluster is tested for two high-end scientific applications: Classical atomistic simulation based on the molecular dynamics method and quantum mechanical calculation based on the density functional theory. These scalable parallel applications use space-time multiresolution algorithms and feature computational-space decomposition, wavelet-based adaptive load balancing, and spacefilling-curve-based data compression for scalable I/O. Comparative performance tests are performed on a 1,024-processor Linux cluster and a conventional higher-end parallel supercomputer, 1,184-processor IBM SP4. The results show that the performance of the Linux cluster is comparable to that of the SP4. We also study various effects, such as the sharing of memory and L2 cache among processors, on the performance.

  17. The effects of diseases, drugs, and chemicals on the creativity and productivity of famous sculptors, classic painters, classic music composers, and authors.

    PubMed

    Wolf, Paul L

    2005-11-01

    Many myths, theories, and speculations exist as to the exact etiology of the diseases, drugs, and chemicals that affected the creativity and productivity of famous sculptors, classic painters, classic music composers, and authors. To emphasize the importance of a modern clinical chemistry laboratory and hematology coagulation laboratory in interpreting the basis for the creativity and productivity of various artists. This investigation analyzed the lives of famous artists, including classical sculptor Benvenuto Cellini; classical sculptor and painter Michelangelo Buonarroti; classic painters Ivar Arosenius, Edvard Munch, and Vincent Van Gogh; classic music composer Louis Hector Berlioz; and English essayist Thomas De Quincey. The analysis includes their illnesses, their famous artistic works, and the modern clinical chemistry, toxicology, and hematology coagulation tests that would have been important in the diagnosis and treatment of their diseases. The associations between illness and art may be close and many because of both the actual physical limitations of the artists and their mental adaptation to disease. Although they were ill, many continued to be productive. If modern clinical chemistry, toxicology, and hematology coagulation laboratories had existed during the lifetimes of these various well-known individuals, clinical laboratories might have unraveled the mysteries of their afflictions. The illnesses these people endured probably could have been ascertained and perhaps treated. Diseases, drugs, and chemicals may have influenced their creativity and productivity.

  18. Classical Physics and the Bounds of Quantum Correlations.

    PubMed

    Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán

    2016-06-24

    A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.

  19. Testing for detailed balance in a financial market

    NASA Astrophysics Data System (ADS)

    Fiebig, H. R.; Musgrove, D. P.

    2015-06-01

    We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory the term equilibrium here is tied to the returns, rather than the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set, and then analyzing S by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
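    The detailed balance condition itself can be checked directly on a binned series, as sketched below. Note that the paper's actual test builds an action functional S and analyzes it by simulated annealing; this simpler count-based check, on synthetic i.i.d. "returns", only illustrates what detailed balance requires.

```python
import numpy as np

# Direct detailed-balance check on a binned series. Detailed balance for a
# stationary Markov chain requires pi_i P_ij = pi_j P_ji, i.e. the observed
# i->j and j->i transition counts should agree up to sampling noise.
# Synthetic i.i.d. "returns" are used here; an i.i.d. series is trivially
# reversible, so the asymmetry measure should be close to zero.

def transition_counts(states, n_bins):
    c = np.zeros((n_bins, n_bins))
    for a, b in zip(states[:-1], states[1:]):
        c[a, b] += 1
    return c

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)                # synthetic returns
edges = np.quantile(x, [0.25, 0.5, 0.75])   # quartile bin edges
s = np.digitize(x, edges)                   # states 0..3

c = transition_counts(s, 4)
flow = c - c.T                              # antisymmetric net probability flow
asym = np.abs(flow).sum() / c.sum()         # relative violation measure
print(asym)  # near zero for a reversible (detailed-balance) chain
```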

  20. Open or closed? Dirac, Heisenberg, and the relation between classical and quantum mechanics

    NASA Astrophysics Data System (ADS)

    Bokulich, Alisa

    2004-09-01

    This paper describes a long-standing, though little known, debate between Dirac and Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg's terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are "open" or "closed." A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.

  1. An Applied Ecological Framework for Evaluating Infrastructure to Promote Walking and Cycling: The iConnect Study

    PubMed Central

    Bull, Fiona; Powell, Jane; Cooper, Ashley R.; Brand, Christian; Mutrie, Nanette; Preston, John; Rutter, Harry

    2011-01-01

    Improving infrastructure for walking and cycling is increasingly recommended as a means to promote physical activity, prevent obesity, and reduce traffic congestion and carbon emissions. However, limited evidence from intervention studies exists to support this approach. Drawing on classic epidemiological methods, psychological and ecological models of behavior change, and the principles of realistic evaluation, we have developed an applied ecological framework by which current theories about the behavioral effects of environmental change may be tested in heterogeneous and complex intervention settings. Our framework guides study design and analysis by specifying the most important data to be collected and relations to be tested to confirm or refute specific hypotheses and thereby refine the underlying theories. PMID:21233429

  2. Development of novel optical fiber sensors for measuring tilts and displacements of geotechnical structures

    NASA Astrophysics Data System (ADS)

    Pei, Hua-Fu; Yin, Jian-Hua; Jin, Wei

    2013-09-01

    Two kinds of innovative sensors based on optical fiber sensing technologies have been proposed and developed for measuring tilts and displacements in geotechnical structures. The newly developed tilt sensors are based on classical beam theory and were successfully used to measure inclinations in a physical model test. Conventional inclinometers, including in-place and portable types, are key instruments commonly used in geotechnical engineering. In this paper, fiber Bragg grating sensing technology is used to measure strains along a standard inclinometer casing, and these strains are used to calculate the lateral and/or horizontal deflections of the casing using beam theory and a finite difference method. Finally, the monitoring results are verified by laboratory tests.
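    The strain-to-deflection step described above can be sketched with Euler-Bernoulli beam theory and a simple finite difference double integration; the geometry and strain values below are invented for illustration.

```python
# Recovering lateral deflection from strains measured along a casing,
# via Euler-Bernoulli beam theory: curvature kappa(z) = strain(z) / y,
# where y is the sensor's distance from the neutral axis, then
# w''(z) = kappa(z) is integrated twice with w(0) = w'(0) = 0.
# Geometry and strain values are invented for illustration.

def deflection_from_strains(strains, dz, y):
    kappa = [e / y for e in strains]      # curvature profile along depth z
    slope, w, out = 0.0, 0.0, [0.0]
    for k in kappa[:-1]:                  # rectangle-rule double integration
        slope += k * dz
        w += slope * dz
        out.append(w)
    return out

# Uniform strain -> constant curvature -> near-quadratic deflection profile:
strains = [1e-4] * 11                     # strain at 11 stations, 0.5 m apart
w = deflection_from_strains(strains, dz=0.5, y=0.05)
print(w[-1])  # tip deflection, close to kappa * L^2 / 2 = 0.025 m
```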

  3. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    PubMed

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.
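    The matching-theory benchmark against which the model's behavior is compared can be sketched as follows; the parameter values are illustrative.

```python
# Classic matching: B1/(B1+B2) = r1/(r1+r2). The generalized (modern)
# form expresses the behavior ratio as a power function of the
# reinforcement ratio:  log(B1/B2) = a*log(r1/r2) + log(b),
# with sensitivity a and bias b (a = b = 1 recovers strict matching).
# Parameter values below are illustrative.

def generalized_matching_ratio(r1, r2, a=1.0, b=1.0):
    """Predicted response ratio B1/B2 for reinforcement rates r1, r2."""
    return b * (r1 / r2) ** a

# A 2:1 reinforcement ratio under strict matching:
print(generalized_matching_ratio(2.0, 1.0))         # 2.0
# Undermatching (a < 1) compresses the ratio toward indifference:
print(generalized_matching_ratio(2.0, 1.0, a=0.8))  # ~1.74
```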

  4. The role of a posteriori mathematics in physics

    NASA Astrophysics Data System (ADS)

    MacKinnon, Edward

    2018-05-01

    The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.

  5. The Lack of Chemical Equilibrium does not Preclude the Use of the Classical Nucleation Theory in Circumstellar Outflows

    NASA Technical Reports Server (NTRS)

    Paquette, John A.; Nuth, Joseph A., III

    2011-01-01

    Classical nucleation theory has been used in models of dust nucleation in circumstellar outflows around oxygen-rich asymptotic giant branch stars. One objection to the application of classical nucleation theory (CNT) to astrophysical systems of this sort is that an equilibrium distribution of clusters (assumed by CNT) is unlikely to exist in such conditions due to a low collision rate of condensable species. A model of silicate grain nucleation and growth was modified to evaluate the effect of a nucleation flux orders of magnitude below the equilibrium value. The results show that a lack of chemical equilibrium has only a small effect on the ultimate grain distribution.
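    The steady-state CNT rate at the heart of such models can be sketched as follows; the parameter values are illustrative only, not the paper's silicate values.

```python
import math

# Steady-state classical nucleation theory (CNT) rate:
#   J = J0 * exp(-dG*/kT),  dG* = 16*pi*sigma^3*v^2 / (3*(kT*ln S)^2)
# sigma: surface tension, v: monomer volume, S: supersaturation.
# All parameter values below are illustrative assumptions.

def cnt_barrier(sigma, v, kT, S):
    """Height of the free-energy barrier dG* at supersaturation S."""
    return 16.0 * math.pi * sigma ** 3 * v ** 2 / (3.0 * (kT * math.log(S)) ** 2)

def cnt_rate(J0, sigma, v, kT, S):
    """Nucleation rate; the prefactor J0 hides the kinetic details."""
    return J0 * math.exp(-cnt_barrier(sigma, v, kT, S) / kT)

kT = 1.38e-23 * 1000.0                 # ~1000 K, in joules
sigma, v, J0 = 0.1, 3e-29, 1e30        # J m^-2, m^3, illustrative prefactor
for S in (2.0, 5.0, 10.0):
    # The rate is extremely steep in S: a modest change in supersaturation
    # shifts J by orders of magnitude.
    print(S, cnt_rate(J0, sigma, v, kT, S))
```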

  6. In search of a general theory of species' range evolution.

    PubMed

    Connallon, Tim; Sgrò, Carla M

    2018-06-13

Despite the pervasiveness of the world's biodiversity, no single species has a truly global distribution. In fact, most species have very restricted distributions. What limits species from expanding beyond their current geographic ranges? This has been classically treated by ecologists as an ecological problem and by evolutionary biologists as an evolutionary problem. Such a dichotomy is false; the problem of species' ranges sits firmly within the realm of evolutionary ecology. In support of this view, Polechová presents new theory that explains species' range limits with reference to two key factors central to both ecological and evolutionary theory: migration and population size. This new model sets the scene for empirical tests of range limit theory and builds the case for assisted gene flow as a key management tool for threatened species.

  7. S-Duality, Deconstruction and Confinement for a Marginal Deformation of N=4 SUSY Yang-Mills

    NASA Astrophysics Data System (ADS)

    Dorey, Nick

    2004-08-01

We study an exactly marginal deformation of N = 4 SUSY Yang-Mills with gauge group U(N) using field theory and string theory methods. The classical theory has a Higgs branch for rational values of the deformation parameter. We argue that the quantum theory also has an S-dual confining branch which cannot be seen classically. The low-energy effective theory on these branches is a six-dimensional non-commutative gauge theory with sixteen supercharges. Confinement of magnetic and electric charges, on the Higgs and confining branches respectively, occurs due to the formation of BPS-saturated strings in the low energy theory. The results also suggest a new way of deconstructing Little String Theory as a large-N limit of a confining gauge theory in four dimensions.

  8. High-pressure phase transitions - Examples of classical predictability

    NASA Astrophysics Data System (ADS)

    Celebonovic, Vladan

    1992-09-01

The applicability of the Savic and Kasanin (1962-1967) classical theory of dense matter to laboratory experiments requiring estimates of high-pressure phase transitions was examined by determining phase transition pressures for a set of 19 chemical substances (including elements, hydrocarbons, metal oxides, and salts) for which experimental data were available. A comparison between experimental transition points and those predicted by the Savic-Kasanin theory showed that the theory can be used for estimating values of transition pressures. The results also support conclusions obtained in previous astronomical applications of the Savic-Kasanin theory.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    We present fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is the comeback of classical wave mechanics. Our approach also can be considered as incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.

  10. Uniting the Spheres: Modern Feminist Theory and Classic Texts in AP English

    ERIC Educational Resources Information Center

    Drew, Simao J. A.; Bosnic, Brenda G.

    2008-01-01

    High school teachers Simao J. A. Drew and Brenda G. Bosnic help familiarize students with gender role analysis and feminist theory. Students examine classic literature and contemporary texts, considering characters' historical, literary, and social contexts while expanding their understanding of how patterns of identity and gender norms exist and…

  11. Aesthetic Creativity: Insights from Classical Literary Theory on Creative Learning

    ERIC Educational Resources Information Center

    Hellstrom, Tomas Georg

    2011-01-01

    This paper addresses the subject of textual creativity by drawing on work done in classical literary theory and criticism, specifically new criticism, structuralism and early poststructuralism. The question of how readers and writers engage creatively with the text is closely related to educational concerns, though they are often thought of as…

  12. Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment

    DOE R&D Accomplishments Database

    Marcus, R. A.

    1964-01-01

    In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.

  13. Analogy between electromagnetic potentials and wave-like dynamic variables with connections to quantum theory

    NASA Astrophysics Data System (ADS)

    Yang, Chen

    2018-05-01

The transitions from classical theories to quantum theories have attracted much interest. This paper demonstrates the analogy between the electromagnetic potentials and wave-like dynamic variables, with their connections to quantum theory, for audiences at the advanced undergraduate level and above. In the first part, the counterpart relations in classical electrodynamics (e.g. the gauge transform and Lorenz condition) and classical mechanics (e.g. the Legendre transform and free-particle condition) are presented. These relations lead to similar governing equations for the field variables and dynamic variables. The Lorenz gauge, scalar potential and vector potential manifest a one-to-one similarity to the action, Hamiltonian and momentum, respectively. In the second part, the connections between the classical pictures of the electromagnetic field and particle and the quantum picture are presented. By characterising the states of the electromagnetic field and particle via their corresponding variables, their evolution pictures manifest the same (isomorphic) algebraic structure. Subsequently, the pictures of the electromagnetic field and particle are compared to the quantum picture and their interconnections are given. A brief summary of the obtained results is presented at the end of the paper.

  14. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    NASA Astrophysics Data System (ADS)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  15. Operator Formulation of Classical Mechanics.

    ERIC Educational Resources Information Center

    Cohn, Jack

    1980-01-01

    Discusses the construction of an operator formulation of classical mechanics which is directly concerned with wave packets in configuration space and is more similar to that of convential quantum theory than other extant operator formulations of classical mechanics. (Author/HM)

  16. Epistemic View of Quantum States and Communication Complexity of Quantum Channels

    NASA Astrophysics Data System (ADS)

    Montina, Alberto

    2012-09-01

The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previously known upper bound was 1.85 bits.
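The communication cost quoted above is a classical mutual information between the quantum state and the hidden classical state. A minimal stdlib sketch of that quantity for a discrete joint distribution (the dict-of-probabilities representation is an illustrative choice, not part of the paper):

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint distribution
    given as {(x, y): probability}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p  # marginal over x
        py[y] += p  # marginal over y
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Perfectly correlated bits carry 1 bit of mutual information...
assert math.isclose(mutual_information({(0, 0): 0.5, (1, 1): 0.5}), 1.0)
# ...while independent uniform bits carry none.
assert math.isclose(mutual_information({(0, 0): 0.25, (0, 1): 0.25,
                                        (1, 0): 0.25, (1, 1): 0.25}),
                    0.0, abs_tol=1e-12)
```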

  17. Nonrelativistic para-Lorentzian mechanics

    NASA Astrophysics Data System (ADS)

    Vargas, J. G.

    1981-04-01

    After reviewing the foundations of special relativity and the room left for rival theories, a set of nonrelativistic para-Lorentzian transformations is derived uniquely, based on (a) a weaker first principle, (b) the requirement that the transformations sought do not give rise to the clock “paradox” (in a refined version), and (c) the compliance of the transformations with the classical experiments of Michelson-Morley, Kennedy-Thorndike, and Ives-Stilwell. The corresponding dynamics is developed. Most of the experimental support of special relativity is reconsidered in the light of the new theory. It is concluded that the relativity of simultaneity has so far not been tested.

  18. Projective limits of state spaces III. Toy-models

    NASA Astrophysics Data System (ADS)

    Lanéry, Suzanne; Thiemann, Thomas

    2018-01-01

    In this series of papers, we investigate the projective framework initiated by Kijowski (1977) and Okołów (2009, 2014, 2013) [1,2], which describes the states of a quantum theory as projective families of density matrices. A short reading guide to the series can be found in Lanéry (2016). A strategy to implement the dynamics in this formalism was presented in our first paper Lanéry and Thiemann (2017) (see also Lanéry, 2016, section 4), which we now test in two simple toy-models. The first one is a very basic linear model, meant as an illustration of the general procedure, and we will only discuss it at the classical level. In the second one, we reformulate the Schrödinger equation, treated as a classical field theory, within this projective framework, and proceed to its (non-relativistic) second quantization. We are then able to reproduce the physical content of the usual Fock quantization.

  19. Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Gang; Duan, Yi-Shi

General relativity and quantum mechanics are two separate rules of modern physics explaining how nature works. Both theories are accurate, but the direct connection between the two theories has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here, we showed that Duan's extended harmonic map (HM) theory, which has the solutions of general relativity, can also have the solutions of the classic chaos equations and even the solution of the Schrödinger equation in quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.

  20. Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism

    DOE PAGES

    Ren, Gang; Duan, Yi-Shi

    2017-07-20

General relativity and quantum mechanics are two separate rules of modern physics explaining how nature works. Both theories are accurate, but the direct connection between the two theories has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here, we showed that Duan's extended harmonic map (HM) theory, which has the solutions of general relativity, can also have the solutions of the classic chaos equations and even the solution of the Schrödinger equation in quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.

  1. Computation in generalised probabilistic theories

    NASA Astrophysics Data System (ADS)

    Lee, Ciarán M.; Barrett, Jonathan

    2015-08-01

    From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally-defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. Then, we show there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.

  2. Independent evolution of the sexes promotes amphibian diversification

    PubMed Central

    De Lisle, Stephen P.; Rowe, Locke

    2015-01-01

    Classic ecological theory predicts that the evolution of sexual dimorphism constrains diversification by limiting morphospace available for speciation. Alternatively, sexual selection may lead to the evolution of reproductive isolation and increased diversification. We test contrasting predictions of these hypotheses by examining the relationship between sexual dimorphism and diversification in amphibians. Our analysis shows that the evolution of sexual size dimorphism (SSD) is associated with increased diversification and speciation, contrary to the ecological theory. Further, this result is unlikely to be explained by traditional sexual selection models because variation in amphibian SSD is unlikely to be driven entirely by sexual selection. We suggest that relaxing a central assumption of classic ecological models—that the sexes share a common adaptive landscape—leads to the alternative hypothesis that independent evolution of the sexes may promote diversification. Once the constraints of sexual conflict are relaxed, the sexes can explore morphospace that would otherwise be inaccessible. Consistent with this novel hypothesis, the evolution of SSD in amphibians is associated with reduced current extinction threat status, and an historical reduction in extinction rate. Our work reconciles conflicting predictions from ecological and evolutionary theory and illustrates that the ability of the sexes to evolve independently is associated with a spectacular vertebrate radiation. PMID:25694616

  3. Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khrennikov, Andrei

    2010-08-15

One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.

  4. Development and validation of the nasopharyngeal cancer scale among the system of quality of life instruments for cancer patients (QLICP-NA V2.0): combined classical test theory and generalizability theory.

    PubMed

    Wu, Jiayuan; Hu, Liren; Zhang, Gaohua; Liang, Qilian; Meng, Qiong; Wan, Chonghua

    2016-08-01

    This research was designed to develop a nasopharyngeal cancer (NPC) scale based on the system of quality of life (QOL) instruments for cancer patients (QLICP-NA). The scale was developed using a modular approach and was evaluated by classical test and generalizability theories. Programmed decision procedures and theories on instrument development were applied to create QLICP-NA V2.0. A total of 121 NPC inpatients were assessed using QLICP-NA V2.0 to measure their QOL from hospital admission until discharge. Scale validity, reliability, and responsiveness were evaluated by correlation, factor, parallel, multi-trait scaling, and t test analyses, as well as by generalizability (G) and decision (D) studies of generalizability theory. Results of multi-trait scaling, correlation, factor, and parallel analyses indicated that QLICP-NA V2.0 exhibited good construct validity. The significant difference in QOL between the treated and untreated NPC patients indicated good clinical validity of the questionnaire. The internal consistency (α) and test-retest reliability coefficients (intra-class correlations) of each domain, as well as the overall scale, were all >0.70. Ceiling effects were absent in all domains and most facets, except for common side effects (24.8 %) in the domain of common symptoms and side effects, and tumor early symptoms (27.3 %) and therapeutic side effects (23.2 %) in the specific domain, whereas floor effects were absent in every domain/facet. The overall changes in the physical and social domains were significantly different between pre- and post-treatment, with a moderate effect size (standardized response mean) ranging from 0.21 to 0.27 (p < 0.05), but these changes were not obvious in the other domains or in the overall scale. Scale reliability was further confirmed by G coefficients and the index of dependability, with more exact variance components based on generalizability theory.
QLICP-NA V2.0 exhibited reasonable degrees of validity, reliability, and responsiveness. However, this scale must be further improved before it can be used as a practical instrument to evaluate the QOL of NPC patients in China.
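The internal consistency (α) coefficients reported above are Cronbach's alpha, the standard classical-test-theory reliability statistic. A minimal stdlib sketch on made-up item scores (the data below are hypothetical, not from the QLICP-NA study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    one entry per respondent):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Three hypothetical items scored by five respondents.
items = [[3, 4, 2, 5, 4],
         [2, 4, 3, 5, 3],
         [3, 5, 2, 4, 4]]
print(round(cronbach_alpha(items), 2))  # → 0.87
```

An alpha above 0.70, as in the abstract, is the conventional threshold for acceptable internal consistency.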

  5. Opening Switch Research on a Plasma Focus VI.

    DTIC Science & Technology

    1988-02-26

    Sausage Instability in the Plasma Focus. In this section the classical Kruskal-Schwarzschild theory for the sausage mode is applied to the pinch phase...on 1) the shape of the pinch, 2) axial flow of plasma, and 3) self-generated magnetic fields are also presented. The Kruskal-Schwarzschild Theory. The...classical mhd theory for the m=0 mode in a plasma supported by a magnetic field against gravity; this is the well-known Kruskal-Schwarzschild

  6. The Effect of Substituting p for alpha on the Unconditional and Conditional Powers of a Null Hypothesis Test.

    ERIC Educational Resources Information Center

    Martuza, Victor R.; Engel, John D.

    Results from classical power analysis (Brewer, 1972) suggest that a researcher should not set alpha = p (when p is less than alpha) in a posteriori fashion when a study yields statistically significant results because of a resulting decrease in power. The purpose of the present report is to use Bayesian theory in examining the validity of this…

  7. A New Interpretation of Augmented Subscores and Their Added Value in Terms of Parallel Forms

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2018-01-01

    The value-added method of Haberman is arguably one of the most popular methods to evaluate the quality of subscores. The method is based on the classical test theory and deems a subscore to be of added value if the subscore predicts the corresponding true subscore better than does the total score. Sinharay provided an interpretation of the added…

  8. The Meaning of Goodness-of-Fit Tests: Commentary on "Goodness-of-Fit Assessment of Item Response Theory Models"

    ERIC Educational Resources Information Center

    Thissen, David

    2013-01-01

    In this commentary, David Thissen states that "Goodness-of-fit assessment for IRT models is maturing; it has come a long way from zero." Thissen then references prior works on "goodness of fit" in the index of Lord and Novick's (1968) classic text; Yen (1984); Drasgow, Levine, Tsien, Williams, and Mead (1995); Chen and…

  9. Multivariate Distributions in Reliability Theory and Life Testing.

    DTIC Science & Technology

    1981-04-01

    Downton Distribution. This distribution is a special case of a classical bivariate gamma distribution due to Wicksell and to Kibble. See Krishnaiah and...Krishnamoorthy and Parthasarathy (1951) (see also Krishnaiah and Rao (1961) and Krishnaiah (1977)) and also within the framework of the Arnold classes. A...for these distributions and their properties is Johnson and Kotz (1972). Krishnaiah (1977) has specifically discussed multivariate gamma

  10. Ethics and Economics in International Business Education: A Comparison of Kuwaiti and U.S. Students Using an Ultimatum Game Scenario

    ERIC Educational Resources Information Center

    Wittmer, Dennis; Al-Kazemi, Ali

    2013-01-01

    In the past 20 years a body of research in behavioral and experimental economics has challenged classical economic theory. Yet, this body of research seems relatively unknown in business education. One behavioral test with implications for international business education has been the use of ultimatum games, which has more recently expanded to…

  11. When Can Subscores Be Expected to Have Added Value? Results from Operational and Simulated Data. Research Report. ETS RR-10-16

    ERIC Educational Resources Information Center

    Sinharay, Sandip

    2010-01-01

    Recently, there has been an increasing level of interest in subscores for their potential diagnostic value. Haberman (2008) suggested a method based on classical test theory to determine whether subscores have added value over total scores. This paper provides a literature review and reports when subscores were found to have added value for…

  12. Aeroacoustics Computation for Nearly Fully Expanded Supersonic Jets Using the CE/SE Method

    NASA Technical Reports Server (NTRS)

    Loh, Ching Y.; Hultgren, Lennart S.; Wang, Xiao Y.; Chang, Sin-Chung; Jorgenson, Philip C. E.

    2000-01-01

    In this paper, the space-time conservation element solution element (CE/SE) method is tested in the classical axisymmetric jet instability problem, rendering good agreement with the linear theory. The CE/SE method is then applied to numerical simulations of several nearly fully expanded axisymmetric jet flows and their noise fields and qualitative agreement with available experimental and theoretical results is demonstrated.

  13. A Study of General Education Astronomy Students' Understandings of Cosmology. Part II. Evaluating Four Conceptual Cosmology Surveys: A Classical Test Theory Approach

    ERIC Educational Resources Information Center

    Wallace, Colin S.; Prather, Edward E.; Duncan, Douglas K.

    2011-01-01

    This is the second of five papers detailing our national study of general education astronomy students' conceptual and reasoning difficulties with cosmology. This article begins our quantitative investigation of the data. We describe how we scored students' responses to four conceptual cosmology surveys, and we present evidence for the inter-rater…

  14. Phenomenologically viable Lorentz-violating quantum gravity.

    PubMed

    Sotiriou, Thomas P; Visser, Matt; Weinfurtner, Silke

    2009-06-26

Horava's "Lifshitz point gravity" has many desirable features, but in its original incarnation one is forced to accept a nonzero cosmological constant of the wrong sign to be compatible with observation. We develop an extension of Horava's model that abandons "detailed balance" and regains parity invariance, and in 3+1 dimensions exhibit all five marginal (renormalizable) and four relevant (super-renormalizable) operators, as determined by power counting. We also consider the classical limit of this theory, evaluate the Hamiltonian and supermomentum constraints, and extract the classical equations of motion in a form similar to the Arnowitt-Deser-Misner formulation of general relativity. This puts the model in a framework amenable to developing detailed precision tests.

  15. Hamilton-Jacobi theory in multisymplectic classical field theories

    NASA Astrophysics Data System (ADS)

    de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia

    2017-09-01

    The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended for multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.

  16. Rotational quenching of H2O by He: mixed quantum/classical theory and comparison with quantum results.

    PubMed

    Ivanov, Mikhail; Dubernet, Marie-Lise; Babikov, Dmitri

    2014-04-07

The mixed quantum/classical theory (MQCT) formulated in the space-fixed reference frame is used to compute quenching cross sections of several rotationally excited states of the water molecule by impact of a He atom in a broad range of collision energies, and is tested against full-quantum calculations on the same potential energy surface. In the current implementation of the MQCT method, there are two major sources of error: one affects results at energies below 10 cm(-1), while the other shows up at energies above 500 cm(-1). Namely, when the collision energy E is below the state-to-state transition energy ΔE, the MQCT method becomes less accurate due to its intrinsic classical approximation, although employment of the average-velocity principle (scaling of the collision energy in order to satisfy microscopic reversibility) helps dramatically. At higher energies, MQCT is expected to be accurate, but in the current implementation, in order to make calculations computationally affordable, we had to truncate the basis set. This can be avoided by using a more efficient body-fixed formulation of MQCT. Overall, the errors of the MQCT method are within 20% of the full-quantum results almost everywhere through a four-orders-of-magnitude range of collision energies, except near resonances, where the errors are somewhat larger.

  17. Properties of the Boltzmann equation in the classical approximation

    DOE PAGES

    Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...

    2014-12-30

We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Since solving the Boltzmann equation numerically with the unapproximated collision term poses no problem, this allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup that we consider here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.

  18. Finite element stress, vibration, and buckling analysis of laminated beams with the use of refined elements

    NASA Astrophysics Data System (ADS)

    Borovkov, Alexei I.; Avdeev, Ilya V.; Artemyev, A.

    1999-05-01

    In the present work, stress, vibration and buckling finite element analysis of laminated beams is performed. A review of the equivalent single-layer (ESL) laminate theories is given. Finite element algorithms and procedures, integrated into the original FEA program system and based on the classical laminated plate theory (CLPT), first-order shear deformation theory (FSDT), third-order theory of Reddy (TSDT-R) and third-order theory of Kant (TSDT-K), with the Lanczos method used for solving the eigenproblem, are developed. Several numerical tests and examples of bending, free vibration and buckling of multilayered and sandwich beams with various material and geometry properties and boundary conditions are solved. A new, effective higher-order hierarchical element for the accurate calculation of transverse shear stress is proposed. A comparative analysis of the results obtained by the considered models against solutions of 2D problems of heterogeneous anisotropic elasticity is carried out.

  19. Experimental Observation of Two Features Unexpected from the Classical Theories of Rubber Elasticity

    NASA Astrophysics Data System (ADS)

    Nishi, Kengo; Fujii, Kenta; Chung, Ung-il; Shibayama, Mitsuhiro; Sakai, Takamasa

    2017-12-01

Although the elastic modulus of a Gaussian chain network is thought to be successfully described by classical theories of rubber elasticity, such as the affine and phantom models, verification experiments are largely lacking owing to difficulties in precisely controlling the network structure. We prepared well-defined model polymer networks experimentally, and measured the elastic modulus G for a broad range of polymer concentrations and connectivity probabilities, p. In our experiment, we observed two features that were distinct from those predicted by classical theories. First, we observed the critical behavior G ~ |p - p_c|^1.95 near the sol-gel transition, where p_c is the sol-gel transition point. This scaling law differs from the prediction of classical theories, but can be explained by analogy between the electric conductivity of resistor networks and the elasticity of polymer networks. Furthermore, we found that the experimental G-p relations in the region above C* did not follow the affine or phantom theories. Instead, all the G/G0 vs. p curves fell onto a single master curve when G was normalized by the elastic modulus at p = 1, G0. We show that the effective medium approximation for Gaussian chain networks explains this master curve.
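A critical exponent like the 1.95 in G ~ |p - p_c|^1.95 is typically estimated as the slope of log G against log|p - p_c|. A minimal sketch that recovers a known exponent from synthetic, noise-free data (all values below are illustrative, not the paper's measurements):

```python
import math

def fit_power_law_exponent(ps, gs, pc):
    """Least-squares slope of log(g) vs log|p - pc|, i.e. the exponent t
    in g ~ |p - pc|^t."""
    xs = [math.log(abs(p - pc)) for p in ps]
    ys = [math.log(g) for g in gs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic data generated with exponent 1.95 recovers that slope.
pc = 0.39
ps = [0.5, 0.6, 0.7, 0.8, 0.9]
gs = [abs(p - pc) ** 1.95 for p in ps]
print(round(fit_power_law_exponent(ps, gs, pc), 2))  # → 1.95
```

With real data, the estimate is sensitive to the assumed p_c, which is why the transition point must be located independently.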

  20. Early development of rostrum saw-teeth in a fossil ray tests classical theories of the evolution of vertebrate dentitions.

    PubMed

    Smith, Moya Meredith; Riley, Alex; Fraser, Gareth J; Underwood, Charlie; Welten, Monique; Kriwet, Jürgen; Pfaff, Cathrin; Johanson, Zerina

    2015-10-07

    In classical theory, teeth of vertebrate dentitions evolved through co-option of external skin denticles into the oral cavity. This hypothesis predicts that ordered tooth arrangement and regulated replacement in the oral dentition were also derived from skin denticles. The fossil batoid ray Schizorhiza stromeri (Chondrichthyes; Cretaceous) provides a test of this theory. Schizorhiza preserves an extended cartilaginous rostrum with closely spaced, alternating saw-teeth, different from those of sawfish and sawsharks today. Micro-CT scanning of the multiple replacement teeth yields unique new data showing how the 'cone-in-cone' series of ordered saw-tooth sets arrange themselves developmentally to become enclosed by the roots of pre-existing saw-teeth. At the rostrum tip, newly developing saw-teeth are present as mineralized crown tips within a vascular, cartilaginous furrow; these reorient via two 90° rotations, then relocate laterally between previously formed roots. Saw-tooth replacement slows mid-rostrum, where fewer saw-teeth are regenerated. These exceptional developmental data reveal a regulated order for serial self-renewal, maintaining the saw edge with ever-increasing saw-tooth size. This mimics tooth replacement in chondrichthyans but differs in the reorientation of the crowns and their enclosure directly between the roots of predecessor saw-teeth. Schizorhiza saw-tooth development is decoupled from that of the jaw teeth and their replacement, which depend on a dental lamina. This highly specialized rostral saw, derived from diversification of skin denticles, is distinct from the dentition and demonstrates the potential developmental plasticity of skin denticles. © 2015 The Authors.

  1. Marshaling Resources: A Classic Grounded Theory Study of Online Learners

    ERIC Educational Resources Information Center

    Yalof, Barbara

    2012-01-01

    Students who enroll in online courses comprise one quarter of an increasingly diverse student body in higher education today. Yet, it is not uncommon for an online program to lose over 50% of its enrolled students prior to graduation. This study used a classic grounded theory qualitative methodology to investigate the persistent problem of…

  2. The Integration of Gender into the Teaching of Classical Social Theory: Help from "The Handmaid's Tale."

    ERIC Educational Resources Information Center

    Gotsch-Thomson, Susan

    1990-01-01

    Describes how gender is integrated into a classical social theory course by including a female theorist in the reading assignments and using "The Handmaid's Tale" by Margaret Atwood as the basis for class discussion. Reviews the course objectives and readings; describes the process of the class discussions; and provides student…

  3. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    NASA Astrophysics Data System (ADS)

    Zhang, Yifei

    2018-03-01

    Bayesian theory originated with an essay by the British mathematician Thomas Bayes in 1763, and after its development in the 20th century, Bayesian statistics has come to play a significant part in statistical study across all fields. Owing to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and refined, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts, and applications of Bayesian statistics in five parts: the history of Bayesian statistics, the weaknesses of classical statistics, Bayesian theory, its development, and its applications. The first two parts compare Bayesian and classical statistics at a macroscopic level. The last three parts focus on Bayesian theory specifically, from introducing particular concepts of Bayesian statistics to outlining their development and, finally, their applications.
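As a minimal illustration of the Bayesian updating the paper surveys, here is Bayes' theorem applied to hypothetical coin-flip data with a three-point prior over the coin's bias:

```python
from math import comb

# Updating a discrete prior over a coin's bias after observing 7 heads in
# 10 flips (made-up data). posterior ∝ prior × likelihood.
biases = [0.3, 0.5, 0.7]
prior = {b: 1/3 for b in biases}                                  # uniform prior
likelihood = {b: comb(10, 7) * b**7 * (1 - b)**3 for b in biases} # binomial

evidence = sum(prior[b] * likelihood[b] for b in biases)          # normalizer
posterior = {b: prior[b] * likelihood[b] / evidence for b in biases}

# The posterior concentrates on the bias closest to the observed frequency 0.7.
best = max(posterior, key=posterior.get)
```

The "high-dimensional integral" the abstract mentions is the continuous analogue of the `evidence` sum above, which is what modern computational methods made tractable.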

  4. Constraining torsion with Gravity Probe B

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mao Yi; Guth, Alan H.; Cabi, Serkan

    2007-11-15

    It is well-entrenched folklore that all torsion gravity theories predict observationally negligible torsion in the solar system, since torsion (if it exists) couples only to the intrinsic spin of elementary particles, not to rotational angular momentum. We argue that this assumption has a logical loophole which can and should be tested experimentally, and consider nonstandard torsion theories in which torsion can be generated by macroscopic rotating objects. In the spirit of action=reaction, if a rotating mass like a planet can generate torsion, then a gyroscope would be expected to feel torsion. An experiment with a gyroscope (without nuclear spin) such as Gravity Probe B (GPB) can test theories where this is the case. Using symmetry arguments, we show that to lowest order, any torsion field around a uniformly rotating spherical mass is determined by seven dimensionless parameters. These parameters effectively generalize the parametrized post-Newtonian formalism and provide a concrete framework for further testing Einstein's general theory of relativity (GR). We construct a parametrized Lagrangian that includes both standard torsion-free GR and Hayashi-Shirafuji maximal torsion gravity as special cases. We demonstrate that classic solar system tests rule out the latter and constrain two observable parameters. We show that Gravity Probe B is an ideal experiment for further constraining nonstandard torsion theories, and work out the most general torsion-induced precession of its gyroscope in terms of our torsion parameters.

  5. Thermal Stress Analysis of a Continuous and Pulsed End-Pumped Nd:YAG Rod Crystal Using Non-Classic Conduction Heat Transfer Theory

    NASA Astrophysics Data System (ADS)

    Mojahedi, Mahdi; Shekoohinejad, Hamidreza

    2018-02-01

    In this paper, temperature distribution in the continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. In order to find the temperature distribution in crystal, heat transfer differential equations of crystal with consideration of boundary conditions are derived based on non-Fourier's model and temperature distribution of the crystal is achieved by an analytical method. Then, by transferring non-Fourier differential equations to matrix equations, using finite element method, temperature and stress of every point of crystal are calculated in the time domain. According to the results, a comparison between classical and nonclassical theories is represented to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress compared to Fourier theory. It also shows that with an increase in relaxation time, crystal rupture power decreases. Despite of these results, in single rectangular pulsed end-pumping condition, with an equal input power, Fourier theory indicates higher temperature and stress rather than non-Fourier theory. It is also observed that, when the relaxation time increases, maximum amounts of temperature and stress decrease.

  6. Phylogenetic escalation and decline of plant defense strategies

    PubMed Central

    Agrawal, Anurag A.; Fishbein, Mark

    2008-01-01

    As the basal resource in most food webs, plants have evolved myriad strategies to battle consumption by herbivores. Over the past 50 years, plant defense theories have been formulated to explain the remarkable variation in abundance, distribution, and diversity of secondary chemistry and other defensive traits. For example, classic theories of enemy-driven evolutionary dynamics have hypothesized that defensive traits escalate through the diversification process. Despite the fact that macroevolutionary patterns are an explicit part of defense theories, phylogenetic analyses have not been previously attempted to disentangle specific predictions concerning (i) investment in resistance traits, (ii) recovery after damage, and (iii) plant growth rate. We constructed a molecular phylogeny of 38 species of milkweed and tested four major predictions of defense theory using maximum-likelihood methods. We did not find support for the growth-rate hypothesis. Our key finding was a pattern of phyletic decline in the three most potent resistance traits (cardenolides, latex, and trichomes) and an escalation of regrowth ability. Our neontological approach complements more common paleontological approaches to discover directional trends in the evolution of life and points to the importance of natural enemies in the macroevolution of species. The finding of macroevolutionarily escalating regrowth ability and declining resistance provides a window into the ongoing coevolutionary dynamics between plants and herbivores and suggests a revision of classic plant defense theory. Where plants are primarily consumed by specialist herbivores, regrowth (or tolerance) may be favored over resistance traits during the diversification process. PMID:18645183

  7. Absolute pitch in children prior to the beginning of musical training.

    PubMed

    Ross, David A; Marks, Lawrence E

    2009-07-01

    Absolute pitch (AP) is a rare skill, historically defined as the ability to name notes. Until now, methodologic limitations made it impossible to directly test the extent to which the development of AP depends on musical training. Using a new paradigm, we tested children with minimal musical experience. Although most children performed poorly, two performed comparably to adult possessors of AP. Follow-up testing showed that the performance of both children progressed to that of "classic" AP. These data support the theory that AP can result from differences in the encoding of stimulus frequency that are independent of musical experience.

  8. The nutrition for sport knowledge questionnaire (NSKQ): development and validation using classical test theory and Rasch analysis.

    PubMed

    Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina

    2017-01-01

    Appropriate dietary intake can have a significant influence on athletic performance. There is growing consensus on sports nutrition, and professionals working with athletes often provide dietary education. However, due to the limitations of existing sports nutrition knowledge questionnaires, previous reports of athletes' nutrition knowledge may be inaccurate. An updated questionnaire has been developed based on a recent review of sports nutrition guidelines. The tool has been validated using a robust methodology that incorporates relevant techniques from classical test theory (CTT) and item response theory (IRT), namely Rasch analysis. The final questionnaire has 89 questions and six sub-sections (weight management, macronutrients, micronutrients, sports nutrition, supplements, and alcohol). The content and face validity of the tool have been confirmed based on feedback from expert sports dietitians and university sports students, respectively. The internal reliability of the questionnaire as a whole is high (KR = 0.88), and most sub-sections achieved acceptable internal reliability. Construct validity has been confirmed, with an independent t-test revealing a significant (p < 0.001) difference in the knowledge scores of nutrition (64 ± 16%) and non-nutrition students (51 ± 19%). Test-retest reliability has been assured, with a strong correlation (r = 0.92, p < 0.001) between individuals' scores on two attempts of the test, 10 days to 2 weeks apart. Three of the sub-sections fit the Rasch unidimensional model. The final version of the questionnaire represents a significant improvement over previous tools. Each nutrition sub-section is unidimensional, and therefore researchers and practitioners can use these individually, as required. Use of the questionnaire will allow researchers to draw conclusions about the effectiveness of nutrition education programs, and differences in knowledge across athletes of varying ages, genders, and athletic calibres.
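The internal-reliability coefficient reported above is a Kuder-Richardson statistic for dichotomously scored (right/wrong) items. A minimal sketch of the KR-20 computation, with an invented response matrix:

```python
# KR-20 internal-consistency coefficient from classical test theory.
def kr20(responses):
    """responses: list of per-person lists of 0/1 item scores."""
    n_items = len(responses[0])
    n = len(responses)
    # Proportion answering each item correctly (item difficulty).
    p = [sum(r[i] for r in responses) / n for i in range(n_items)]
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # total-score variance
    return (n_items / (n_items - 1)) * (1 - sum(pi * (1 - pi) for pi in p) / var)

# Made-up data: 6 respondents, 4 items.
data = [[1,1,1,0], [1,1,0,0], [1,0,0,0], [1,1,1,1], [0,0,0,0], [1,1,1,0]]
reliability = kr20(data)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the criterion the abstract applies to its sub-sections.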

  9. Making classical and quantum canonical general relativity computable through a power series expansion in the inverse cosmological constant.

    PubMed

    Gambini, R; Pullin, J

    2000-12-18

    We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism invariant field theory. This theory is the lambda --> infinity limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at a quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.

  10. On the Anticipatory Aspects of the Four Interactions: what the Known Classical and Semi-Classical Solutions Teach us

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lusanna, Luca

    2004-08-19

    The four (electro-magnetic, weak, strong and gravitational) interactions are described by singular Lagrangians and by the Dirac-Bergmann theory of Hamiltonian constraints. As a consequence, a subset of the original configuration variables are gauge variables, not determined by the equations of motion. Only at the Hamiltonian level is it possible to separate the gauge variables from the deterministic physical degrees of freedom, the Dirac observables, and to formulate a well-posed Cauchy problem for them, both in special and general relativity. The requirement of causality then dictates the choice of retarded solutions at the classical level. However, both the problems of the classical theory of the electron, leading to the choice of (1/2) (retarded + advanced) solutions, and the regularization of quantum field theory, leading to the Feynman propagator, introduce anticipatory aspects. The determination of the relativistic Darwin potential as a semi-classical approximation to the Lienard-Wiechert solution for particles with Grassmann-valued electric charges, regularizing the Coulomb self-energies, shows that these anticipatory effects live beyond the semi-classical approximation (tree level) in the form of radiative corrections, at least for the electro-magnetic interaction. Talk and 'best contribution' at The Sixth International Conference on Computing Anticipatory Systems CASYS'03, Liege, August 11-16, 2003.

  11. The dynamical mass of a classical Cepheid variable star in an eclipsing binary system.

    PubMed

    Pietrzyński, G; Thompson, I B; Gieren, W; Graczyk, D; Bono, G; Udalski, A; Soszyński, I; Minniti, D; Pilecki, B

    2010-11-25

    Stellar pulsation theory provides a means of determining the masses of pulsating classical Cepheid supergiants; it is the pulsation that causes their luminosity to vary. Such pulsational masses are found to be smaller than the masses derived from stellar evolution theory: this is the Cepheid mass discrepancy problem, for which a solution is missing. An independent, accurate dynamical mass determination for a classical Cepheid variable star (as opposed to type-II Cepheids, low-mass stars with a very different evolutionary history) in a binary system is needed in order to determine which is correct. The accuracy of previous efforts to establish a dynamical Cepheid mass from Galactic single-lined non-eclipsing binaries was typically about 15-30% (refs 6, 7), which is not good enough to resolve the mass discrepancy problem. In spite of many observational efforts, no firm detection of a classical Cepheid in an eclipsing double-lined binary has hitherto been reported. Here we report the discovery of a classical Cepheid in a well detached, double-lined eclipsing binary in the Large Magellanic Cloud. We determine the mass to a precision of 1% and show that it agrees with its pulsation mass, providing strong evidence that pulsation theory correctly and precisely predicts the masses of classical Cepheids.
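The dynamical mass determination described above rests on Kepler's third law combined with the mass ratio from the radial-velocity amplitudes of a double-lined binary. A sketch with illustrative numbers (not the paper's values):

```python
# Kepler's third law in solar units: M1 + M2 = a^3 / P^2, with a in AU and
# P in years; the mass ratio M2/M1 equals the ratio of RV amplitudes K1/K2.
def binary_masses(a_au, p_yr, k1, k2):
    total = a_au**3 / p_yr**2   # total system mass in solar masses
    q = k1 / k2                 # mass ratio M2/M1
    m1 = total / (1 + q)
    return m1, total - m1

# Hypothetical system: 2 AU separation, 1 yr period, RV amplitudes 30 and 60 km/s.
m1, m2 = binary_masses(a_au=2.0, p_yr=1.0, k1=30.0, k2=60.0)
# m1 = 16/3 ≈ 5.33, m2 ≈ 2.67 solar masses
```

Eclipses matter because they pin down the orbital inclination, removing the sin(i) ambiguity that limits non-eclipsing systems.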

  12. Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.

    2011-01-01

    The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes have only phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with both phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images that most closely mimic those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.

  13. Cantilever testing of sintered-silver interconnects

    DOE PAGES

    Wereszczak, Andrew A.; Chen, Branndon R.; Jadaan, Osama M.; ...

    2017-10-19

    Cantilever testing is an underutilized test method from which results and interpretations promote greater understanding of the tensile and shear failure responses of interconnects, metallizations, or bonded joints. The use and analysis of this method were pursued through the mechanical testing of sintered-silver interconnects that joined Ni/Au-plated copper pillars or Ti/Ni/Ag-plated silicon pillars to Ag-plated direct bonded copper substrates. Sintered-silver was chosen as the interconnect test medium because of its high electrical and thermal conductivities and high-temperature capability—attractive characteristics for a candidate interconnect in power electronic components and other devices. Deep beam theory was used to improve upon the estimationsmore » of the tensile and shear stresses calculated from classical beam theory. The failure stresses of the sintered-silver interconnects were observed to be dependent on test-condition and test-material-system. In conclusion, the experimental simplicity of cantilever testing, and the ability to analytically calculate tensile and shear stresses at failure, result in it being an attractive mechanical test method to evaluate the failure response of interconnects.« less
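Classical beam theory, which the study refines with deep-beam corrections, gives elementary estimates of the joint stresses in a cantilever test. A sketch for a cantilevered square pillar loaded at its free end; the dimensions and load are hypothetical, not from the study:

```python
# Classical beam theory estimate of stresses in a bonded joint at the base of
# a cantilevered square pillar of width w, loaded transversely at its free end.
def joint_stresses(P, L, w):
    I = w**4 / 12                  # second moment of area of a square section
    sigma = P * L * (w / 2) / I    # peak bending (tensile) stress, sigma = M*c/I
    tau = P / w**2                 # average shear stress over the section
    return sigma, tau

# Hypothetical case: 10 N applied 4 mm above the joint on a 2 mm square pillar.
sigma, tau = joint_stresses(P=10.0, L=4e-3, w=2e-3)
# sigma = 30 MPa, tau = 2.5 MPa: bending dominates shear for slender pillars.
```

Deep beam theory adds shear-deformation and stress-distribution corrections to these estimates when the pillar's span-to-depth ratio is small, which is the refinement the abstract describes.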

  14. Analysis test of understanding of vectors with the three-parameter logistic model of item response theory and item response curves technique

    NASA Astrophysics Data System (ADS)

    Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan

    2016-12-01

    This study investigated the multiple-choice test of understanding of vectors (TUV) by applying item response theory (IRT). The difficulty, discrimination, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT using the PARSCALE program. The TUV ability is an ability parameter, estimated here assuming unidimensionality and local independence. Moreover, all distractors of the TUV were analyzed using item response curves (IRC), which represent a simplified form of IRT. Data were gathered from 2392 science and engineering freshmen at three universities in Thailand. The results revealed IRT analysis to be useful in assessing the test, since its item parameters are independent of the ability parameters. The IRT framework reveals item-level information and indicates appropriate ability ranges for the test. Moreover, the IRC analysis can be used to assess the effectiveness of the test's distractors. Both the IRT and IRC approaches reveal test characteristics beyond those revealed by classical test analysis methods. Test developers can apply these methods to diagnose and evaluate the features of items at various ability levels of test takers.
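The three-parameter logistic (3PL) model used above has a standard closed form: the probability of a correct response is a logistic function of ability, floored at the guessing parameter. A sketch with illustrative parameter values (not fitted TUV values):

```python
from math import exp

# 3PL item response model: probability that an examinee of ability theta
# answers correctly, given discrimination a, difficulty b, and guessing c.
def p_correct(theta, a, b, c):
    return c + (1 - c) / (1 + exp(-a * (theta - b)))

# A guessing floor of 0.25 would suit a four-option multiple-choice item.
low = p_correct(theta=-3.0, a=1.2, b=0.5, c=0.25)   # weak examinee
high = p_correct(theta=3.0, a=1.2, b=0.5, c=0.25)   # strong examinee
# low approaches c (pure guessing); high approaches 1.
```

Plotting `p_correct` against theta for each response option, rather than only the key, is essentially the item response curve (IRC) analysis the abstract describes.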

  15. Cantilever testing of sintered-silver interconnects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wereszczak, Andrew A.; Chen, Branndon R.; Jadaan, Osama M.

    Cantilever testing is an underutilized test method from which results and interpretations promote greater understanding of the tensile and shear failure responses of interconnects, metallizations, or bonded joints. The use and analysis of this method were pursued through the mechanical testing of sintered-silver interconnects that joined Ni/Au-plated copper pillars or Ti/Ni/Ag-plated silicon pillars to Ag-plated direct bonded copper substrates. Sintered-silver was chosen as the interconnect test medium because of its high electrical and thermal conductivities and high-temperature capability—attractive characteristics for a candidate interconnect in power electronic components and other devices. Deep beam theory was used to improve upon the estimations of the tensile and shear stresses calculated from classical beam theory. The failure stresses of the sintered-silver interconnects were observed to be dependent on test-condition and test-material-system. In conclusion, the experimental simplicity of cantilever testing, and the ability to analytically calculate tensile and shear stresses at failure, result in it being an attractive mechanical test method to evaluate the failure response of interconnects.

  16. Bertrand's theorem and virial theorem in fractional classical mechanics

    NASA Astrophysics Data System (ADS)

    Yu, Rui-Yan; Wang, Towe

    2017-09-01

    Fractional classical mechanics is the classical counterpart of fractional quantum mechanics. The central force problem in this theory is investigated. Bertrand's theorem is generalized and the virial theorem is revisited, both in three spatial dimensions. To produce stable, closed, non-circular orbits, the inverse-square law and Hooke's law must be modified in fractional classical mechanics.

  17. The Caregiver Contribution to Heart Failure Self-Care (CACHS): Further Psychometric Testing of a Novel Instrument.

    PubMed

    Buck, Harleah G; Harkness, Karen; Ali, Muhammad Usman; Carroll, Sandra L; Kryworuchko, Jennifer; McGillion, Michael

    2017-04-01

    Caregivers (CGs) contribute important assistance with heart failure (HF) self-care, including daily maintenance, symptom monitoring, and management. Until CGs' contributions to self-care can be quantified, it is impossible to characterize them, account for their impact on patient outcomes, or perform meaningful cost analyses. The purpose of this study was to conduct psychometric testing and item reduction on the recently developed 34-item Caregiver Contribution to Heart Failure Self-care (CACHS) instrument using classical test theory and item response theory methods. Fifty CGs (mean age 63 ± 12.84 years; 70% female) recruited from an HF clinic completed the CACHS in 2014, and the results were evaluated using classical test theory and item response theory. Items were deleted for low (<.05) or high (>.95) endorsement, low (<.3) or high (>.7) corrected item-total correlations, significant pairwise correlation coefficients, floor or ceiling effects, relatively low latent trait and item information function levels (<1.5 and p > .5), and differential item functioning. After analysis, 14 items were excluded, resulting in a 20-item instrument (self-care maintenance, eight items; monitoring, seven items; management, five items). Most items demonstrated moderate to high discrimination (median 2.13, minimum .77, maximum 5.05) and appropriate item difficulty (-2.7 to 1.4). Internal consistency reliability was excellent (Cronbach α = .94, average inter-item correlation = .41) with no ceiling effects. The newly developed 20-item version of the CACHS is supported by rigorous instrument development and represents a novel instrument to measure CGs' contribution to HF self-care. © 2016 Wiley Periodicals, Inc.
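One of the classical-test-theory screening statistics used above, the corrected item-total correlation, correlates an item's scores with the total of the remaining items (values below .3 or above .7 were flagged). A sketch with an invented score matrix:

```python
# Corrected item-total correlation: Pearson correlation between one item's
# scores and the sum of all OTHER items, so the item does not inflate its
# own correlation.
def corrected_item_total(scores, item):
    x = [row[item] for row in scores]
    rest = [sum(row) - row[item] for row in scores]   # total minus this item
    n = len(scores)
    mx, mr = sum(x) / n, sum(rest) / n
    cov = sum((a - mx) * (b - mr) for a, b in zip(x, rest)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sr = (sum((b - mr) ** 2 for b in rest) / n) ** 0.5
    return cov / (sx * sr)

# Invented data: 5 respondents, 3 items on a 1-5 scale.
scores = [[3, 4, 3], [1, 2, 1], [4, 5, 4], [2, 2, 3], [5, 4, 5]]
r = corrected_item_total(scores, item=0)
```

Here the first item tracks the rest of the scale closely, so r is high; a near-zero value would instead suggest the item measures something different from the scale.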

  18. The Tensile Strength of Liquid Nitrogen

    NASA Astrophysics Data System (ADS)

    Huang, Jian

    1992-01-01

    The tensile strength of liquids has been a puzzling subject. On the one hand, classical nucleation theory has met with great success in predicting the nucleation rates of superheated liquids. On the other hand, most reported experimental values of the tensile strength of different liquids fall far below the prediction of classical nucleation theory. In this study, homogeneous nucleation in liquid nitrogen and its tensile strength have been investigated. Different approaches for determining the pressure amplitude were studied carefully. It is shown that Raman-Nath theory, as modified by the introduction of an effective interaction length, can be used to determine the pressure amplitude in the focal plane of a focusing ultrasonic transducer. The results obtained from different diffraction orders are consistent and in good agreement with other approaches, including Debye's theory and solving the KZK equation. The measurement of the tensile strength was carried out in a high-pressure stainless steel dewar. A high-intensity ultrasonic wave was focused into a small volume of liquid nitrogen over a short time period. A probe laser beam passes through the focal region of a concave spherical transducer with a small aperture angle, and the transmitted light is detected with a photodiode. The pressure amplitude at the focus is calculated from the acoustic power radiated into the liquid. In the experiment, the electrical signal on the transducer is gated at its resonance frequency with gate widths of 20 μs to 0.2 ms and temperatures ranging from 77 K to near 100 K. The calculated pressure amplitude is in agreement with the prediction of classical nucleation theory for nucleation rates from 10^6 to 10^11 bubbles/(cm^3 s). This work provides experimental evidence that the validity of classical nucleation theory extends to negative pressures down to -90 atm. Nitrogen is only the second cryogenic liquid to reach the tensile strength predicted by classical nucleation theory.
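The classical nucleation theory rate against which such measurements are compared has the standard form J = J0 exp(-ΔG*/kT) with barrier ΔG* = 16πσ³/(3ΔP²). A sketch; the kinetic prefactor J0 and the inputs are illustrative, not the study's fitted values:

```python
from math import pi, exp

K_B = 1.380649e-23   # Boltzmann constant, J/K

# Classical nucleation theory: energy barrier for forming a critical bubble
# under tension dP, and the resulting homogeneous nucleation rate.
def nucleation_rate(sigma, dP, T, J0=1e32):
    barrier = 16 * pi * sigma**3 / (3 * dP**2)   # critical-bubble work, J
    return J0 * exp(-barrier / (K_B * T))        # bubbles per cm^3 per s (scale set by J0)

# Liquid nitrogen near 77 K: surface tension ~8.9 mN/m; tension 9 MPa ≈ 90 atm.
J = nucleation_rate(sigma=8.9e-3, dP=9.0e6, T=77.0)
```

Because the barrier scales as σ³/ΔP², the rate is extraordinarily sensitive to the applied tension, which is why the measured threshold pins down the theory so sharply.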

  19. Symmetrical windowing for quantum states in quasi-classical trajectory simulations: Application to electron transfer

    NASA Astrophysics Data System (ADS)

    Cotton, Stephen J.; Igumenshchev, Kirill; Miller, William H.

    2014-08-01

    It has recently been shown [S. J. Cotton and W. H. Miller, J. Chem. Phys. 139, 234112 (2013)] that a symmetrical windowing quasi-classical (SQC) approach [S. J. Cotton and W. H. Miller, J. Phys. Chem. A 117, 7190 (2013)] applied to the Meyer-Miller model [H.-D. Meyer and W. H. Miller, J. Chem. Phys. 70, 3214 (1979)] for the electronic degrees of freedom in electronically non-adiabatic dynamics is capable of quantitatively reproducing quantum mechanical results for a variety of test applications, including cases where "quantum" coherence effects are significant. Here we apply this same SQC methodology, within a flux-side correlation function framework, to calculate thermal rate constants corresponding to several proposed models of electron transfer processes [P. Huo, T. F. Miller III, and D. F. Coker, J. Chem. Phys. 139, 151103 (2013); A. R. Menzeleev, N. Ananth, and T. F. Miller III, J. Chem. Phys. 135, 074106 (2011)]. Good quantitative agreement with Marcus Theory is obtained over several orders of magnitude variation in non-adiabatic coupling. Moreover, the "inverted regime" in thermal rate constants (with increasing bias) known from Marcus Theory is also reproduced with good accuracy by this very simple classical approach. The SQC treatment is also applied to a recent model of photoinduced proton coupled electron transfer [C. Venkataraman, A. V. Soudackov, and S. Hammes-Schiffer, J. Chem. Phys. 131, 154502 (2009)] and population decay of the photoexcited donor state is found to be in reasonable agreement with results calculated via reduced density matrix theory.

  20. Symmetrical windowing for quantum states in quasi-classical trajectory simulations: Application to electron transfer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cotton, Stephen J.; Igumenshchev, Kirill; Miller, William H., E-mail: millerwh@berkeley.edu

    It has recently been shown [S. J. Cotton and W. H. Miller, J. Chem. Phys. 139, 234112 (2013)] that a symmetrical windowing quasi-classical (SQC) approach [S. J. Cotton and W. H. Miller, J. Phys. Chem. A 117, 7190 (2013)] applied to the Meyer-Miller model [H.-D. Meyer and W. H. Miller, J. Chem. Phys. 70, 3214 (1979)] for the electronic degrees of freedom in electronically non-adiabatic dynamics is capable of quantitatively reproducing quantum mechanical results for a variety of test applications, including cases where “quantum” coherence effects are significant. Here we apply this same SQC methodology, within a flux-side correlation function framework, to calculate thermal rate constants corresponding to several proposed models of electron transfer processes [P. Huo, T. F. Miller III, and D. F. Coker, J. Chem. Phys. 139, 151103 (2013); A. R. Menzeleev, N. Ananth, and T. F. Miller III, J. Chem. Phys. 135, 074106 (2011)]. Good quantitative agreement with Marcus Theory is obtained over several orders of magnitude variation in non-adiabatic coupling. Moreover, the “inverted regime” in thermal rate constants (with increasing bias) known from Marcus Theory is also reproduced with good accuracy by this very simple classical approach. The SQC treatment is also applied to a recent model of photoinduced proton coupled electron transfer [C. Venkataraman, A. V. Soudackov, and S. Hammes-Schiffer, J. Chem. Phys. 131, 154502 (2009)] and population decay of the photoexcited donor state is found to be in reasonable agreement with results calculated via reduced density matrix theory.

  1. Study of homogeneous bubble nucleation in liquid carbon dioxide by a hybrid approach combining molecular dynamics simulation and density gradient theory

    NASA Astrophysics Data System (ADS)

    Langenbach, K.; Heilig, M.; Horsch, M.; Hasse, H.

    2018-03-01

    A new method for predicting homogeneous bubble nucleation rates of pure compounds from vapor-liquid equilibrium (VLE) data is presented. It combines molecular dynamics simulation on the one side with density gradient theory using an equation of state (EOS) on the other. The new method is applied here to predict bubble nucleation rates in metastable liquid carbon dioxide (CO2). The molecular model of CO2 is taken from previous work of our group. PC-SAFT is used as an EOS. The consistency between the molecular model and the EOS is achieved by adjusting the PC-SAFT parameters to VLE data obtained from the molecular model. The influence parameter of density gradient theory is fitted to the surface tension of the molecular model. Massively parallel molecular dynamics simulations are performed close to the spinodal to compute bubble nucleation rates. From these simulations, the kinetic prefactor of the hybrid nucleation theory is estimated, whereas the nucleation barrier is calculated from density gradient theory. This enables the extrapolation of molecular simulation data to the whole metastable range including technically relevant densities. The results are tested against available experimental data and found to be in good agreement. The new method does not suffer from typical deficiencies of classical nucleation theory concerning the thermodynamic barrier at the spinodal and the bubble size dependence of surface tension, which is typically neglected in classical nucleation theory. In addition, the density in the center of critical bubbles and their surface tension is determined as a function of their radius. The usual linear Tolman correction to the capillarity approximation is found to be invalid.

  2. Study of homogeneous bubble nucleation in liquid carbon dioxide by a hybrid approach combining molecular dynamics simulation and density gradient theory.

    PubMed

    Langenbach, K; Heilig, M; Horsch, M; Hasse, H

    2018-03-28

    A new method for predicting homogeneous bubble nucleation rates of pure compounds from vapor-liquid equilibrium (VLE) data is presented. It combines molecular dynamics simulation on the one side with density gradient theory using an equation of state (EOS) on the other. The new method is applied here to predict bubble nucleation rates in metastable liquid carbon dioxide (CO2). The molecular model of CO2 is taken from previous work of our group. PC-SAFT is used as an EOS. The consistency between the molecular model and the EOS is achieved by adjusting the PC-SAFT parameters to VLE data obtained from the molecular model. The influence parameter of density gradient theory is fitted to the surface tension of the molecular model. Massively parallel molecular dynamics simulations are performed close to the spinodal to compute bubble nucleation rates. From these simulations, the kinetic prefactor of the hybrid nucleation theory is estimated, whereas the nucleation barrier is calculated from density gradient theory. This enables the extrapolation of molecular simulation data to the whole metastable range including technically relevant densities. The results are tested against available experimental data and found to be in good agreement. The new method does not suffer from typical deficiencies of classical nucleation theory concerning the thermodynamic barrier at the spinodal and the bubble size dependence of surface tension, which is typically neglected in classical nucleation theory. In addition, the density in the center of critical bubbles and their surface tension is determined as a function of their radius. The usual linear Tolman correction to the capillarity approximation is found to be invalid.

  3. The effects of physical aging at elevated temperatures on the viscoelastic creep of IM7/K3B

    NASA Technical Reports Server (NTRS)

    Gates, Thomas S.; Feldman, Mark

    1994-01-01

    Physical aging at elevated temperature of the advanced composite IM7/K3B was investigated through the use of creep compliance tests. Testing consisted of short-term isothermal creep/recovery tests, with the creep segments performed at constant load. The matrix-dominated transverse tensile and in-plane shear behaviors were measured at temperatures ranging from 200 to 230 C. Through the use of time-based shifting procedures, the aging shift factors, shift rates, and momentary master curve parameters were found at each temperature. These material parameters were used as input to a predictive methodology based upon effective time theory and linear viscoelasticity combined with classical lamination theory. Long-term creep compliance test data were compared to predictions to verify the method. The model was then used to predict the long-term creep behavior for several general laminates.
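The momentary master curve and aging shift factors in this record can be sketched as follows. This assumes a stretched-exponential (KWW) form for the momentary compliance and a power-law aging shift of the retardation time, which is the usual structure of effective time theory; the parameter values are illustrative and are not the IM7/K3B results.

```python
import math

def momentary_compliance(t, s0, tau, beta):
    """KWW-form momentary creep compliance S(t) = S0*exp((t/tau)**beta),
    a common fit for polymer-matrix composites (parameters illustrative)."""
    return s0 * math.exp((t / tau) ** beta)

def aged_compliance(t, aging_time, ref_aging_time, shift_rate,
                    s0, tau_ref, beta):
    """Aging shifts the retardation time by the aging shift factor:
    tau(te) = tau_ref * (te/te_ref)**mu, where mu is the shift rate."""
    tau = tau_ref * (aging_time / ref_aging_time) ** shift_rate
    return momentary_compliance(t, s0, tau, beta)
```

At a fixed loading time, a longer prior aging time shifts the retardation time upward and lowers the creep compliance, reproducing the stiffening that the predictive methodology accounts for.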

  4. Nonequilibrium dynamics of the O(N) model on dS3 and AdS crunches

    NASA Astrophysics Data System (ADS)

    Kumar, S. Prem; Vaganov, Vladislav

    2018-03-01

    We study the nonperturbative quantum evolution of the interacting O(N) vector model at large N, formulated on a spatial two-sphere, with time dependent couplings which diverge at finite time. This model, the so-called "E-frame" theory, is related via a conformal transformation to the interacting O(N) model in three dimensional global de Sitter spacetime with time independent couplings. We show that with a purely quartic, relevant deformation the quantum evolution of the E-frame model is regular even when the classical theory is rendered singular at the end of time by the diverging coupling. Time evolution drives the E-frame theory to the large-N Wilson-Fisher fixed point when the classical coupling diverges. We study the quantum evolution numerically for a variety of initial conditions and demonstrate the finiteness of the energy at the classical "end of time". With an additional (time dependent) mass deformation, quantum backreaction lowers the mass, with a putative smooth time evolution only possible in the limit of infinite quartic coupling. We discuss the relevance of these results for the resolution of crunch singularities in AdS geometries dual to E-frame theories with a classical gravity dual.

  5. Classical theory of atom-surface scattering: The rainbow effect

    NASA Astrophysics Data System (ADS)

    Miret-Artés, Salvador; Pollak, Eli

    2012-07-01

    The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments has gathered data on the angular distributions of the scattered species, their energy loss distribution, sticking probability, dependence on surface temperature, and more. For many years these phenomena have been considered theoretically in the framework of the “washboard model”, in which the interaction of the incident particle with the surface is described in terms of hard wall potentials. Although this class of models has helped in elucidating some of the features, it leaves many questions open: true potentials are clearly not hard wall potentials; it does not provide a realistic framework for phonon scattering; and it cannot explain the incident angle and incident energy dependence of rainbow scattering, nor can it provide a consistent theory for sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation as well as interaction with surface phonons in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction induced rainbows, energy loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.

  6. Classical theory of atom-surface scattering: The rainbow effect

    NASA Astrophysics Data System (ADS)

    Miret-Artés, Salvador; Pollak, Eli

    The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments has gathered data on the angular distributions of the scattered species, their energy loss distribution, sticking probability, dependence on surface temperature, and more. For many years these phenomena have been considered theoretically in the framework of the "washboard model", in which the interaction of the incident particle with the surface is described in terms of hard wall potentials. Although this class of models has helped in elucidating some of the features, it leaves many questions open: true potentials are clearly not hard wall potentials; it does not provide a realistic framework for phonon scattering; and it cannot explain the incident angle and incident energy dependence of rainbow scattering, nor can it provide a consistent theory for sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation as well as interaction with surface phonons in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction induced rainbows, energy loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.

  7. Loop Quantum Cosmology.

    PubMed

    Bojowald, Martin

    2008-01-01

    Quantum gravity is expected to be necessary in order to understand situations in which classical general relativity breaks down. In particular in cosmology one has to deal with initial singularities, i.e., the fact that the backward evolution of a classical spacetime inevitably comes to an end after a finite amount of proper time. This presents a breakdown of the classical picture and requires an extended theory for a meaningful description. Since small length scales and high curvatures are involved, quantum effects must play a role. Not only the singularity itself but also the surrounding spacetime is then modified. One particular theory is loop quantum cosmology, an application of loop quantum gravity to homogeneous systems, which removes classical singularities. Its implications can be studied at different levels. The main effects are introduced into effective classical equations, which allow one to avoid the interpretational problems of quantum theory. They give rise to new kinds of early-universe phenomenology with applications to inflation and cyclic models. To resolve classical singularities and to understand the structure of geometry around them, the quantum description is necessary. Classical evolution is then replaced by a difference equation for a wave function, which allows an extension of quantum spacetime beyond classical singularities. One main question is how these homogeneous scenarios are related to full loop quantum gravity, which can be dealt with at the level of distributional symmetric states. Finally, the new structure of spacetime arising in loop quantum gravity and its application to cosmology sheds light on more general issues, such as the nature of time. Supplementary material is available for this article at 10.12942/lrr-2008-4.

  8. Insights into Ventilatory Inhomogeneity from Respiratory Measurements on Spacelab Mission D-2

    NASA Technical Reports Server (NTRS)

    Paiva, Manuel; Verbanck, Sylvia; Linnarsson, Dag; Prisk, Kim; West, John B.

    1996-01-01

    The relative contributions of inter-regional and intra-regional ventilation inhomogeneities of Spacelab astronauts are studied. The classical theory of ventilation distribution in the lung is that the top-to-bottom (inter-regional) ventilation inhomogeneities are primarily gravity dependent, whereas the peripheral (intra-regional) ventilation distribution is gravity independent. Argon rebreathing tests showed that gravity independent specific ventilation (ventilation per unit volume) inhomogeneities are at least as large as gravity dependent ones. Single breath tests with helium and sulfur hexafluoride showed the different sensitivity of these gases to microgravity.

  9. Baseline mathematics and geodetics for tracking operations

    NASA Technical Reports Server (NTRS)

    James, R.

    1981-01-01

    Various geodetic and mapping algorithms are analyzed as they apply to radar tracking systems and tested in extended BASIC computer language for real time computer applications. Closed-form approaches to the solution of converting Earth centered coordinates to latitude, longitude, and altitude are compared with classical approximations. A simplified approach to atmospheric refractivity called gradient refraction is compared with conventional ray tracing processes. An extremely detailed set of documentation which provides the theory, derivations, and application of algorithms used in the programs is included. Validation methods are also presented for testing the accuracy of the algorithms.
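The conversion compared in this report, Earth-centered coordinates to latitude, longitude, and altitude, can be illustrated with the standard iterative scheme. A minimal Python sketch using WGS-84 ellipsoid constants (a later datum than the 1981 report; values purely illustrative):

```python
import math

# WGS-84 ellipsoid (illustrative; the 1981 report predates this datum)
A = 6378137.0                 # semi-major axis, m
F = 1.0 / 298.257223563       # flattening
E2 = F * (2.0 - F)            # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt):
    """Forward conversion: latitude/longitude/altitude to ECEF x, y, z."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + alt) * math.cos(lat) * math.cos(lon)
    y = (n + alt) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + alt) * math.sin(lat)
    return x, y, z

def ecef_to_geodetic(x, y, z, iterations=10):
    """Inverse conversion by fixed-point iteration on the latitude."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1.0 - E2))  # initial spherical-style guess
    for _ in range(iterations):
        n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
        alt = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1.0 - E2 * n / (n + alt)))
    n = A / math.sqrt(1.0 - E2 * math.sin(lat) ** 2)
    alt = p / math.cos(lat) - n
    return math.degrees(lat), math.degrees(lon), alt
```

Closed-form (e.g. Bowring-type) solutions avoid the iteration entirely; the report's comparison of closed-form approaches against classical approximations concerns exactly this trade-off for real-time tracking use.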

  10. On the co-creation of classical and modern physics.

    PubMed

    Staley, Richard

    2005-12-01

    While the concept of "classical physics" has long framed our understanding of the environment from which modern physics emerged, it has consistently been read back into a period in which the physicists concerned initially considered their work in quite other terms. This essay explores the shifting currency of the rich cultural image of the classical/modern divide by tracing empirically different uses of "classical" within the physics community from the 1890s to 1911. A study of fin-de-siècle addresses shows that the earliest general uses of the concept proved controversial. Our present understanding of the term was in large part shaped by its incorporation (in different ways) within the emerging theories of relativity and quantum theory--where the content of "classical" physics was defined by proponents of the new. Studying the diverse ways in which Boltzmann, Larmor, Poincaré, Einstein, Minkowski, and Planck invoked the term "classical" will help clarify the critical relations between physicists' research programs and their use of worldview arguments in fashioning modern physics.

  11. Contact stresses in gear teeth: A new method of analysis

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.

    1991-01-01

    A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and over existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.
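For context, the classical Hertz baseline that such results are checked against can be sketched for two parallel cylinders, a common idealization of mating tooth flanks at the pitch point. A hedged Python sketch using the textbook Hertz line-contact formulas; the inputs are illustrative:

```python
import math

def hertz_line_contact(load_per_length, r1, r2, e1, nu1, e2, nu2):
    """Classical Hertz solution for two parallel cylinders in contact:
    effective radius 1/R = 1/r1 + 1/r2, contact modulus
    1/E* = (1-nu1^2)/e1 + (1-nu2^2)/e2, contact half-width
    b = sqrt(4*P*R/(pi*E*)), peak pressure p0 = 2*P/(pi*b),
    with P the load per unit length of contact."""
    r_eff = 1.0 / (1.0 / r1 + 1.0 / r2)
    e_star = 1.0 / ((1 - nu1**2) / e1 + (1 - nu2**2) / e2)
    b = math.sqrt(4.0 * load_per_length * r_eff / (math.pi * e_star))
    p0 = 2.0 * load_per_length / (math.pi * b)
    return b, p0
```

Eliminating b gives the equivalent form p0 = sqrt(P·E*/(pi·R)), a convenient cross-check; the Hertz solution is frictionless, which is precisely the limitation the point load superposition procedure is designed to remove.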

  12. Testing the Quantum-Classical Boundary and Dimensionality of Quantum Systems

    NASA Astrophysics Data System (ADS)

    Shun, Poh Hou

    Quantum theory introduces a cut between the observer and the observed system [1], but does not provide a definition of what is an observer [2]. Based on an informational definition of the observer, Grinbaum has recently [3] predicted an upper bound on bipartite correlations in the Clauser-Horne-Shimony-Holt (CHSH) Bell scenario equal to 2.82537, which is slightly smaller than the Tsirelson bound [4] of standard quantum theory, but is consistent with all the available experimental results [5--17]. Not being able to exceed Grinbaum's limit would support that quantum theory is only an effective description of a more fundamental theory and would have a deep impact in physics and quantum information processing. In this thesis, we present a test of the CHSH inequality on photon pairs in maximally entangled states of polarization in which a value 2.8276 +/- 0.00082 is observed, violating Grinbaum's bound by 2.72 standard deviations and providing the smallest distance with respect to Tsirelson's bound ever reported, namely, 0.0008 +/- 0.00082. (Abstract shortened by UMI.).
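For orientation, the quantum-mechanical prediction at the standard optimal settings can be computed directly; the Tsirelson bound 2*sqrt(2) ≈ 2.82843 sits just above Grinbaum's proposed 2.82537. A small Python sketch; the cos 2(a−b) correlation assumes ideal maximally entangled polarization states:

```python
import math

def correlation(a, b):
    """QM correlation for polarization-entangled photons in a maximally
    entangled state, linear polarizers at angles a, b (radians):
    E(a, b) = cos(2*(a - b))."""
    return math.cos(2.0 * (a - b))

def chsh(a, a_prime, b, b_prime):
    """CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')."""
    return (correlation(a, b) - correlation(a, b_prime)
            + correlation(a_prime, b) + correlation(a_prime, b_prime))

# Standard optimal settings: 0, 45, 22.5, 67.5 degrees
deg = math.pi / 180.0
s = chsh(0.0, 45 * deg, 22.5 * deg, 67.5 * deg)
```

At these settings S reaches the Tsirelson bound, so any experimental value between 2.82537 and 2*sqrt(2), such as the 2.8276 reported here, discriminates between the two bounds.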

  13. Monopole operators and Hilbert series of Coulomb branches of 3d N = 4 gauge theories

    NASA Astrophysics Data System (ADS)

    Cremonesi, Stefano; Hanany, Amihay; Zaffaroni, Alberto

    2014-01-01

    This paper addresses a long-standing problem: to identify the chiral ring and moduli space (i.e. as an algebraic variety) on the Coulomb branch of an N = 4 superconformal field theory in 2+1 dimensions. Previous techniques involved a computation of the metric on the moduli space and/or mirror symmetry. These methods are limited to sufficiently small moduli spaces, with enough symmetry, or to Higgs branches of sufficiently small gauge theories. We introduce a simple formula for the Hilbert series of the Coulomb branch, which applies to any good or ugly three-dimensional N = 4 gauge theory. The formula counts monopole operators which are dressed by classical operators, the Casimir invariants of the residual gauge group that is left unbroken by the magnetic flux. We apply our formula to several classes of gauge theories. Along the way we make various tests of mirror symmetry, successfully comparing the Hilbert series of the Coulomb branch with the Hilbert series of the Higgs branch of the mirror theory.
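The counting described above can be made concrete in the simplest example commonly used for this formula: the Coulomb branch of U(1) with N fundamental hypermultiplets, where a bare monopole of charge m has dimension N|m|/2 and dressings by the residual-U(1) Casimir contribute a factor 1/(1−t). A hedged Python sketch (N = 4 here so all powers are integers; the closed form is the Hilbert series of the corresponding A-type singularity):

```python
def monopole_series(n_flavors, order):
    """Coefficients of the Coulomb-branch Hilbert series of U(1) with
    n_flavors hypermultiplets from the monopole formula
    H(t) = sum_m t**(n_flavors*|m|/2) / (1 - t); the 1/(1-t) factor
    counts dressings by the U(1) Casimir. Requires even n_flavors."""
    step = n_flavors // 2  # dimension gap between adjacent bare monopoles
    # coefficient of t^n: dressings of all charges m with step*|m| <= n
    return [1 + 2 * (n // step) for n in range(order + 1)]

def closed_form_series(n_flavors, order):
    """Series of (1 + t^s)/((1 - t)(1 - t^s)), s = n_flavors/2,
    the expected closed form for this Coulomb branch."""
    s = n_flavors // 2
    inv1 = [1] * (order + 1)                               # 1/(1-t)
    invs = [1 if k % s == 0 else 0 for k in range(order + 1)]  # 1/(1-t^s)
    num = [0] * (order + 1)                                # 1 + t^s
    num[0] = 1
    if s <= order:
        num[s] += 1
    def mul(p, q):  # truncated power-series product
        r = [0] * (order + 1)
        for i, pi in enumerate(p):
            if pi:
                for j in range(order + 1 - i):
                    r[i + j] += pi * q[j]
        return r
    return mul(mul(num, inv1), invs)
```

Agreement of the two expansions order by order is a toy version of the consistency checks (and mirror-symmetry tests) the paper performs for larger gauge theories.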

  14. Finite element modelling versus classic beam theory: comparing methods for stress estimation in a morphologically diverse sample of vertebrate long bones

    PubMed Central

    Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.

    2013-01-01

    Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σmin) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r2 = 0.56). Under bending, FEA values of maximum principal stress (σmax) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τtorsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
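The beam-theory side of this comparison reduces to the bending formula sigma = M*c/I, with the diaphysis idealized as a slender prism; a hollow ellipse is a common cross-section idealization for long bones. A minimal Python sketch (dimensions and applied moment are illustrative, not the paper's specimens):

```python
import math

def hollow_ellipse_second_moment(a_out, b_out, a_in, b_in):
    """Second moment of area about the x-axis for a hollow ellipse:
    I_x = pi/4 * (a_out*b_out**3 - a_in*b_in**3),
    with a the semi-axis along x and b the semi-axis along y."""
    return math.pi / 4.0 * (a_out * b_out**3 - a_in * b_in**3)

def bending_stress(moment, c, second_moment):
    """Classic beam theory: sigma = M*c/I at distance c from the
    neutral axis (c = b_out at the outer surface)."""
    return moment * c / second_moment
```

The paper's point is that such estimates deviate from finite element results in proportion to shaft curvature and cross-sectional asymmetry, so the sketch is only trustworthy for straight, symmetric diaphyses under bending.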

  15. Generalized Quantum Theory of Bianchi IX Cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James

    2003-04-01

    We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse-graining. For a restricted class of initial conditions and coarse-grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with a probability near unity.

  16. Generalized mutual information and Tsirelson's bound

    NASA Astrophysics Data System (ADS)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-01

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
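The chain rule invoked here is, for classical variables, the identity I(X;YZ) = I(X;Z) + I(X;Y|Z); the paper's contribution is extending such structure to general probabilistic theories. A purely classical numerical check in Python (random three-bit joint distribution; illustrative only):

```python
import itertools
import math
import random

def mutual_information(p_xy):
    """I(X;Y) in bits from a joint distribution dict {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in p_xy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in p_xy.items() if p > 0)

random.seed(0)
states = list(itertools.product([0, 1], repeat=3))
weights = [random.random() for _ in states]
total = sum(weights)
p_xyz = {s: w / total for s, w in zip(states, weights)}

# I(X; YZ): treat the pair (y, z) as a single variable
i_x_yz = mutual_information({(x, (y, z)): p
                             for (x, y, z), p in p_xyz.items()})
# I(X; Z): marginalize over y
p_xz = {}
for (x, y, z), p in p_xyz.items():
    p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p
i_x_z = mutual_information(p_xz)
# I(X; Y | Z) = sum_z p(z) * I(X; Y | Z = z)
p_z = {}
for (x, y, z), p in p_xyz.items():
    p_z[z] = p_z.get(z, 0.0) + p
i_x_y_given_z = 0.0
for z0, pz in p_z.items():
    cond = {(x, y): p / pz
            for (x, y, z), p in p_xyz.items() if z == z0}
    i_x_y_given_z += pz * mutual_information(cond)
```

Because the conditional term is non-negative, the chain rule also gives the data-processing-style inequality I(X;YZ) >= I(X;Z), the kind of monotonicity that the no-supersignalling condition generalizes.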

  17. Generalized mutual information and Tsirelson's bound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wakakuwa, Eyuri; Murao, Mio

    2014-12-04

    We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the 'no-supersignalling condition' (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.

  18. Quantum mechanics as classical statistical mechanics with an ontic extension and an epistemic restriction.

    PubMed

    Budiyono, Agung; Rohrlich, Daniel

    2017-11-03

    Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.

  19. "Fathers" and "sons" of theories in cell physiology: the membrane theory.

    PubMed

    Matveev, V V; Wheatley, D N

    2005-12-16

    The last 50 years in the history of the life sciences are remarkable for a new and important feature that looks like a great threat to their future. The profound specialization that dominates quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is the unity of two elements, the experimental data and the theory that explains them. Classically, the "fathers" of science were the creators of new ideas and theories. They were the true experts on their own theories. It is only they who had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology cannot yet be based on strict mathematical proofs. This is not true for the "sons". As a result of massive specialization, modern experts operate in very confined spaces. They formulate particular rules far from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to discuss on a broader theoretical level today? How can a classical theory--for example, the membrane one--be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the membrane theory's defects? As a result, "global" theories have few critics and little control. Due to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for "Membrane Theory", to which we will largely refer throughout this article.

  20. Color measurement and discrimination

    NASA Technical Reports Server (NTRS)

    Wandell, B. A.

    1985-01-01

    Theories of color measurement attempt to provide a quantitative means for predicting whether two lights will be discriminable to an average observer. All color measurement theories can be characterized as follows: suppose lights a and b evoke responses from three color channels characterized as vectors, v(a) and v(b); the vector difference v(a) - v(b) corresponds to a set of channel responses that would be generated by some real light, call it *. According to the theory, a and b will be discriminable when * is detectable. A detailed development and test of the classic color measurement approach are reported. In the absence of a luminance component in the test stimuli, a and b, the theory holds well. In the presence of a luminance component, the theory is clearly false. When a luminance component is present, discrimination judgments depend largely on whether the lights being discriminated fall in separate, categorical regions of color space. The results suggest that sensory estimation of surface color uses different methods, and the choice of method depends upon properties of the image. When there is significant luminance variation a categorical method is used, while in the absence of significant luminance variation judgments are continuous and consistent with the measurement approach.

  1. An Investigation of the Impact of Guessing on Coefficient α and Reliability

    PubMed Central

    2014-01-01

    Guessing is known to influence the test reliability of multiple-choice tests. Although there are many studies that have examined the impact of guessing, they used rather restrictive assumptions (e.g., parallel test assumptions, homogeneous inter-item correlations, homogeneous item difficulty, and homogeneous guessing levels across items) to evaluate the relation between guessing and test reliability. Based on the item response theory (IRT) framework, this study investigated the extent of the impact of guessing on reliability under more realistic conditions where item difficulty, item discrimination, and guessing levels actually vary across items with three different test lengths (TL). By accommodating multiple item characteristics simultaneously, this study also focused on examining interaction effects between guessing and other variables entered in the simulation to be more realistic. The simulation of the more realistic conditions and calculations of reliability and classical test theory (CTT) item statistics were facilitated by expressing CTT item statistics, coefficient α, and reliability in terms of IRT model parameters. In addition to the general negative impact of guessing on reliability, results showed interaction effects between TL and guessing and between guessing and test difficulty.
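The classical test theory side of this study starts from coefficient α itself. A minimal Python sketch of α computed from a persons-by-items score matrix using the standard Cronbach formula, α = k/(k−1)·(1 − Σσ²_item/σ²_total); the data below are illustrative:

```python
def cronbach_alpha(scores):
    """Coefficient alpha for a persons x items score matrix
    (list of lists): alpha = k/(k-1) * (1 - sum(item var)/var(total)).
    Population (biased) variances are used; either convention works
    as long as it is applied consistently."""
    k = len(scores[0])  # number of items

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)
```

In the study's simulations, guessing adds score variance that is unrelated to the latent trait, which shrinks inter-item covariances relative to item variances and so depresses α, with the size of the effect interacting with test length and difficulty.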

  2. A test of the AdS/CFT duality on the Coulomb branch

    NASA Astrophysics Data System (ADS)

    Costa, M. S.

    2000-06-01

    We consider the N=4 SU(N) super Yang-Mills theory on the Coulomb branch with gauge symmetry broken to S(U(N1)×U(N2)). By integrating out the W particles, the effective action near the IR SU(Ni) conformal fixed points is seen to be a deformation of the super Yang-Mills theory by a non-renormalized, irrelevant, dimension 8 operator. The correction to the two-point function of the operator dual to the dilaton field near the IR is related to a three-point function of chiral primary operators at the conformal fixed points and agrees with the classical gravity prediction, including the numerical factor.

  3. Quantum cybernetics and its test in “late choice” experiments

    NASA Astrophysics Data System (ADS)

    Grössing, Gerhard

    1986-11-01

    A relativistically invariant wave equation for the propagation of wave fronts S = const (S being the action function) is derived on the basis of a cybernetic model of quantum systems involving “hidden variables”. This equation can be considered both as an expression of Huygens' principle and as a general continuity equation providing a close link between classical and quantum mechanics. Although the theory reproduces ordinary quantum mechanics, there are particular situations providing experimental predictions that differ from those of existing theories. Such predictions are made for so-called “late choice” experiments, which are modified versions of the familiar “delayed choice” experiments.

  4. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches.

    PubMed

    Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten

    2017-06-01

    This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable, and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.

  5. The Institution of Sociological Theory in Canada.

    PubMed

    Guzman, Cinthya; Silver, Daniel

    2018-02-01

    Using theory syllabi and departmental data collected for three academic years, this paper investigates the institutional practice of theory in sociology departments across Canada. In particular, it examines the position of theory within the sociological curriculum, and how this varies among universities. Taken together, our analyses indicate that theory remains deeply institutionalized at the core of sociological education and Canadian sociologists' self-understanding; that theorists as a whole show some coherence in how they define themselves, but differ in various ways, especially along lines of region, intellectual background, and gender; that despite these differences, the classical versus contemporary heuristic largely cuts across these divides, as does the strongly ingrained position of a small group of European authors as classics of the discipline as a whole. Nevertheless, who is a classic remains an unsettled question, alternatives to the "classical versus contemporary" heuristic do exist, and theorists' syllabi reveal diverse "others" as potential candidates. Our findings show that the field of sociology is neither marked by universal agreement nor by absolute division when it comes to its theoretical underpinnings. To the extent that they reveal a unified field, the findings suggest that unity lies more in a distinctive form than in a distinctive content, which defines the space and structure of the field of sociology. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.

  6. On the effective field theory of intersecting D3-branes

    NASA Astrophysics Data System (ADS)

    Abbaspur, Reza

    2018-05-01

    We study the effective field theory of two intersecting D3-branes with one common dimension along the lines recently proposed in ref. [1]. We introduce a systematic way of deriving the classical effective action to arbitrary orders in perturbation theory. Using a proper renormalization prescription to handle logarithmic divergencies arising at all orders in the perturbation series, we recover the first order renormalization group equation of ref. [1] plus an infinite set of higher order equations. We show the consistency of the higher order equations with the first order one and hence interpret the first order result as an exact RG flow equation in the classical theory.

  7. Ginzburg-Landau theory for the solid-liquid interface of bcc elements. II - Application to the classical one-component plasma, the Wigner crystal, and He-4

    NASA Technical Reports Server (NTRS)

    Zeng, X. C.; Stroud, D.

    1989-01-01

    The previously developed Ginzburg-Landau theory for calculating the crystal-melt interfacial tension of bcc elements is extended to treat the classical one-component plasma (OCP), the charged fermion system, and the Bose crystal. For the OCP, a direct application of the theory of Shih et al. (1987) yields for the surface tension 0.0012(Z-squared e-squared/a-cubed), where Ze is the ionic charge and a is the radius of the ionic sphere. The Bose crystal-melt interface is treated by a quantum extension of the classical density-functional theory, using the Feynman formalism to estimate the relevant correlation functions. The theory is applied to the metastable He-4 solid-superfluid interface at T = 0, with a resulting surface tension of 0.085 erg/sq cm, in reasonable agreement with the value extrapolated from the measured surface tension of the bcc solid in the range 1.46-1.76 K. These results suggest that the density-functional approach is a satisfactory mean-field theory for estimating the equilibrium properties of liquid-solid interfaces, given knowledge of the uniform phases.

  8. A Classic Test of the Hubbert-Rubey Weakening Mechanism: M7.6 Thrust-Belt Earthquake Taiwan

    NASA Astrophysics Data System (ADS)

    Yue, L.; Suppe, J.

    2005-12-01

    The Hubbert-Rubey (1959) fluid-pressure hypothesis has long been accepted as a classic solution to the problem of the apparent weakness of long thin thrust sheets. This hypothesis, in its classic form, argues that ambient high pore-fluid pressures, which are common in sedimentary basins, reduce the normalized shear traction on the fault τb/ρ g H = μb(1-λb), where λb = Pf/ρ g H is the normalized pore-fluid pressure and μb is the coefficient of friction. Remarkably, there have been few large-scale tests of this classic hypothesis. Here we document ambient pore-fluid pressures surrounding the active frontal thrusts of western Taiwan, including the Chulungpu thrust that slipped in the 1999 Mw7.6 Chi-Chi earthquake. We show from 3-D mapping of these thrusts that they flatten to a shallow detachment at about 5 km depth in the Pliocene Chinshui Shale. Using critical-taper wedge theory and the dip of the detachment and surface slope, we constrain the basal shear traction τb/ρ g H ≍ 0.1, which is substantially weaker than common lab friction values of Byerlee's law (μb = 0.85-0.6). We have determined the pore-fluid pressures as a function of depth in 76 wells, based on in-situ formation tests, sonic logs and mud densities. Fluid pressures are regionally controlled stratigraphically by sedimentary facies. The top of overpressures is everywhere below the base of the Chinshui Shale; therefore the entire Chinshui thrust system is at ambient hydrostatic pore-fluid pressures (λb ≍ 0.4). According to the classic Hubbert-Rubey hypothesis, the required basal coefficient of friction is therefore μb ≍ 0.1-0.2. Therefore the classic Hubbert & Rubey mechanism involving static ambient excess fluid pressures is not the cause of extreme fault weakening in this western Taiwan example. We must look to other mechanisms of large-scale fault weakening, many of which are difficult to test.
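The abstract's friction argument can be checked arithmetically. A minimal sketch (the numbers are the values quoted above; the helper function name is ours):

```python
# Numeric check of the Hubbert-Rubey relation quoted in the abstract:
#   tau_b / (rho g H) = mu_b * (1 - lambda_b),  lambda_b = P_f / (rho g H)
# Given the wedge-theory traction and hydrostatic pore pressure reported for
# western Taiwan, solve for the implied basal friction coefficient mu_b.

def required_friction(normalized_traction, lambda_b):
    """Solve tau_b/(rho g H) = mu_b (1 - lambda_b) for mu_b."""
    return normalized_traction / (1.0 - lambda_b)

normalized_traction = 0.1   # tau_b / (rho g H) from critical-taper wedge theory
lambda_b = 0.4              # hydrostatic normalized pore-fluid pressure

mu_b = required_friction(normalized_traction, lambda_b)
print(f"required mu_b = {mu_b:.2f}")   # ~0.17, far below Byerlee's 0.6-0.85
```

The result falls in the 0.1-0.2 range stated in the abstract, far below laboratory friction values, which is the basis for rejecting the classic mechanism here.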

  9. Nonclassicality and Entanglement in multimode radiation fields under the action of classicality preserving devices

    NASA Astrophysics Data System (ADS)

    Chaturvedi, S.

    2011-09-01

    In this work we examine the possibilities of converting quantum optical nonclassicality into entanglement in multimode radiation fields under the action of classicality preserving devices such as beamsplitters. While the single mode case is amenable to a complete analysis, the non-availability of certain crucial results in the classical theory of moments in the multimode situation forces us to treat these cases with a lesser degree of generality by taking recourse to the familiar Mandel matrix and its extensions. We generalize the Mandel matrix from one-mode states to the two-mode situation, leading to a natural classification of states with varying levels of nonclassicality. For two-mode states we present a single test which, if successful, simultaneously witnesses nonclassicality as well as NPT entanglement. We develop a test for NPT entanglement after beamsplitter action on a nonclassical state, designed in such a way that it remains `close' to that for nonclassicality. In the same spirit we analyse the result of three-mode `beamsplitter' action after coupling to an ancilla in the Fock ground state. The concept of genuine tripartite entanglement and scalar measures of nonclassicality at the Mandel level for two mode systems are discussed and illustrated with the help of several examples.

  10. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
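The single reliability estimate that G theory provides can be illustrated with a minimal sketch. This is not the ERA Toolbox API (which is Matlab); it is an illustrative one-facet persons × trials design, with variance components estimated from two-way ANOVA mean squares:

```python
# Sketch of a generalizability-theory reliability estimate for a one-facet
# persons-x-trials design (illustrative only, NOT the ERA Toolbox API).
import numpy as np

def g_coefficient(scores):
    """scores: persons x trials matrix; returns dependability of the mean score."""
    n_p, n_t = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    trial_means = scores.mean(axis=0)
    ss_p = n_t * ((person_means - grand) ** 2).sum()
    ss_t = n_p * ((trial_means - grand) ** 2).sum()
    ss_e = ((scores - grand) ** 2).sum() - ss_p - ss_t
    ms_p = ss_p / (n_p - 1)
    ms_e = ss_e / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_e) / n_t, 0.0)  # person (universe-score) variance
    var_e = ms_e                           # residual error variance
    # Reliability of a score averaged over n_t trials:
    return var_p / (var_p + var_e / n_t)

rng = np.random.default_rng(0)
true_scores = rng.normal(0, 2, size=(30, 1))          # person effects
data = true_scores + rng.normal(0, 1, size=(30, 8))   # plus trial-level noise
print(f"G coefficient: {g_coefficient(data):.2f}")
```

Averaging over more trials shrinks the error term `var_e / n_t`, which is how the toolbox quantifies the contribution of the number of retained trials to ERP score reliability.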

  11. An enstrophy-based linear and nonlinear receptivity theory

    NASA Astrophysics Data System (ADS)

    Sengupta, Aditi; Suman, V. K.; Sengupta, Tapan K.; Bhaumik, Swagata

    2018-05-01

    In the present research, a new theory of instability based on enstrophy is presented for incompressible flows. Explaining instability through enstrophy is counter-intuitive, as it has been usually associated with dissipation for the Navier-Stokes equation (NSE). This developed theory is valid for both linear and nonlinear stages of disturbance growth. A previously developed nonlinear theory of incompressible flow instability based on total mechanical energy described in the work of Sengupta et al. ["Vortex-induced instability of an incompressible wall-bounded shear layer," J. Fluid Mech. 493, 277-286 (2003)] is used to compare with the present enstrophy based theory. The developed equations for disturbance enstrophy and disturbance mechanical energy are derived from NSE without any simplifying assumptions, as compared to other classical linear/nonlinear theories. The theory is tested for bypass transition caused by free stream convecting vortex over a zero pressure gradient boundary layer. We explain the creation of smaller scales in the flow by a cascade of enstrophy, which creates rotationality, in general inhomogeneous flows. Linear and nonlinear versions of the theory help explain the vortex-induced instability problem under consideration.

  12. Do event horizons exist?

    NASA Astrophysics Data System (ADS)

    Baccetti, Valentina; Mann, Robert B.; Terno, Daniel R.

    Event horizons are the defining feature of classical black holes. They are the key ingredient of the information loss paradox which, like paradoxes in quantum foundations, is built on a combination of predictions of quantum theory and counterfactual classical features: neither horizon formation nor its crossing by a test body can be detected by a distant observer. Furthermore, horizons are unnecessary for the production of Hawking-like radiation. We demonstrate that when this radiation is taken into account, it can prevent horizon crossing/formation in a large class of models. We conjecture that horizon avoidance is a general feature of collapse. The nonexistence of event horizons dispels the paradox, but opens up important questions about thermodynamic properties of the resulting objects and correlations between different degrees of freedom.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horn, Paul R., E-mail: prhorn@berkeley.edu; Mao, Yuezhi; Head-Gordon, Martin, E-mail: mhg@cchem.berkeley.edu

    In energy decomposition analysis of Kohn-Sham density functional theory calculations, the so-called frozen (or pre-polarization) interaction energy contains contributions from permanent electrostatics, dispersion, and Pauli repulsion. The standard classical approach to separate them suffers from several well-known limitations. We introduce an alternative scheme that employs valid antisymmetric electronic wavefunctions throughout and is based on the identification of individual fragment contributions to the initial supersystem wavefunction as determined by an energetic optimality criterion. The density deformations identified with individual fragments upon formation of the initial supersystem wavefunction are analyzed along with the distance dependence of the new and classical terms for test cases that include the neon dimer, ammonia borane, water-Na+, water-Cl-, and the naphthalene dimer.

  14. Classical closure theory and Lam's interpretation of epsilon-RNG

    NASA Technical Reports Server (NTRS)

    Zhou, YE

    1995-01-01

    Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.

  15. New variables for classical and quantum gravity

    NASA Technical Reports Server (NTRS)

    Ashtekar, Abhay

    1986-01-01

    A Hamiltonian formulation of general relativity based on certain spinorial variables is introduced. These variables simplify the constraints of general relativity considerably and enable one to imbed the constraint surface in the phase space of Einstein's theory into that of Yang-Mills theory. The imbedding suggests new ways of attacking a number of problems in both classical and quantum gravity. Some illustrative applications are discussed.

  16. Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory

    DTIC Science & Technology

    2013-12-10

    screech during a public address, or sustain and amplify it during a guitar solo. Since the systems are nonlinear, understanding cause and effect... Classics, 2007), 12. 34 those frames.58 A technique to cope with the potentially confusing...Reynolds, Paul Davidson. A Primer in Theory Construction. Boston: Allyn and Bacon Classics, 2007. Riolo, Rick L. "The Effects and Evolution of Tag

  17. Quantum kinetic theory of the filamentation instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bret, A.; Haas, F.

    2011-07-15

    The quantum electromagnetic dielectric tensor for a multi-species plasma is re-derived from the gauge-invariant Wigner-Maxwell system and presented under a form very similar to the classical one. The resulting expression is then applied to a quantum kinetic theory of the electromagnetic filamentation instability. Comparison is made with the quantum fluid theory including a Bohm pressure term and with the cold classical plasma result. A number of analytical expressions are derived for the cutoff wave vector, the largest growth rate, and the most unstable wave vector.

  18. A classical density-functional theory for describing water interfaces.

    PubMed

    Hughes, Jessica; Krebs, Eric J; Roundy, David

    2013-01-14

    We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.

  19. Geometry, topology, and string theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varadarajan, Uday

    A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.

  20. Psychometric properties of the Chinese version of resilience scale specific to cancer: an item response theory analysis.

    PubMed

    Ye, Zeng Jie; Liang, Mu Zi; Zhang, Hao Wei; Li, Peng Fei; Ouyang, Xue Ren; Yu, Yuan Liang; Liu, Mei Ling; Qiu, Hong Zhong

    2018-06-01

    Classical test theory has been used to develop and validate the 25-item Resilience Scale Specific to Cancer (RS-SC) in Chinese patients with cancer. This study was designed to provide additional information about the discriminative value of the individual items tested with an item response theory analysis. A two-parameter graded response model was performed to examine whether any of the items of the RS-SC exhibited problems with the ordering and steps of thresholds, as well as the ability of items to discriminate patients with different resilience levels using item characteristic curves. A sample of 214 Chinese patients with a cancer diagnosis was analyzed. The established three-dimension structure of the RS-SC was confirmed. Several items showed problematic thresholds or discrimination ability and require further revision. Some problematic items should be refined, and a short form of the RS-SC may be feasible in clinical settings in order to reduce the burden on patients. However, the generalizability of these findings warrants further investigation.
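The two-parameter graded response model referenced above (Samejima's model) can be sketched briefly. The item parameters below are made up for illustration, not the RS-SC estimates:

```python
# Illustrative sketch of a two-parameter graded response model: each ordered
# category probability is the difference of adjacent cumulative boundary curves.
import math

def boundary_prob(theta, a, b):
    """P(X >= k | theta): logistic boundary curve with discrimination a."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def category_probs(theta, a, thresholds):
    """Probabilities of each ordered response category for one item."""
    cum = [1.0] + [boundary_prob(theta, a, b) for b in thresholds] + [0.0]
    return [cum[k] - cum[k + 1] for k in range(len(cum) - 1)]

a = 1.7                        # discrimination (hypothetical)
thresholds = [-1.0, 0.0, 1.2]  # ordered category thresholds (hypothetical)
for theta in (-2.0, 0.0, 2.0):
    probs = category_probs(theta, a, thresholds)
    print(theta, [round(p, 3) for p in probs])
```

Disordered or closely spaced thresholds, and flat boundary curves (small `a`), are the kinds of item problems the abstract's item-characteristic-curve analysis flags.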

  1. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary.

    PubMed

    Petscher, Yaacov; Mitchell, Alison M; Foorman, Barbara R

    2015-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5 % when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed.

  2. Improving the Reliability of Student Scores from Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure of Vocabulary

    PubMed Central

    Petscher, Yaacov; Mitchell, Alison M.; Foorman, Barbara R.

    2016-01-01

    A growing body of literature suggests that response latency, the amount of time it takes an individual to respond to an item, may be an important factor to consider when using assessment data to estimate the ability of an individual. Considering that tests of passage and list fluency are being adapted to a computer administration format, it is possible that accounting for individual differences in response times may be an increasingly feasible option to strengthen the precision of individual scores. The present research evaluated the differential reliability of scores when using classical test theory and item response theory as compared to a conditional item response model which includes response time as an item parameter. Results indicated that the precision of student ability scores increased by an average of 5 % when using the conditional item response model, with greater improvements for those who were average or high ability. Implications for measurement models of speeded assessments are discussed. PMID:27721568

  3. Testing the adaptive radiation hypothesis for the lemurs of Madagascar.

    PubMed

    Herrera, James P

    2017-01-01

    Lemurs, the diverse, endemic primates of Madagascar, are thought to represent a classic example of adaptive radiation. Based on the most complete phylogeny of living and extinct lemurs yet assembled, I tested predictions of adaptive radiation theory by estimating rates of speciation, extinction and adaptive phenotypic evolution. As predicted, lemur speciation rate exceeded that of their sister clade by nearly twofold, indicating the diversification dynamics of lemurs and mainland relatives may have been decoupled. Lemur diversification rates did not decline over time, however, as predicted by adaptive radiation theory. Optimal body masses diverged among dietary and activity pattern niches as lineages diversified into unique multidimensional ecospace. Based on these results, lemurs only partially fulfil the predictions of adaptive radiation theory, with phenotypic evolution corresponding to an 'early burst' of adaptive differentiation. The results must be interpreted with caution, however, because over the long evolutionary history of lemurs (approx. 50 million years), the 'early burst' signal of adaptive radiation may have been eroded by extinction.

  4. Testing the adaptive radiation hypothesis for the lemurs of Madagascar

    PubMed Central

    2017-01-01

    Lemurs, the diverse, endemic primates of Madagascar, are thought to represent a classic example of adaptive radiation. Based on the most complete phylogeny of living and extinct lemurs yet assembled, I tested predictions of adaptive radiation theory by estimating rates of speciation, extinction and adaptive phenotypic evolution. As predicted, lemur speciation rate exceeded that of their sister clade by nearly twofold, indicating the diversification dynamics of lemurs and mainland relatives may have been decoupled. Lemur diversification rates did not decline over time, however, as predicted by adaptive radiation theory. Optimal body masses diverged among dietary and activity pattern niches as lineages diversified into unique multidimensional ecospace. Based on these results, lemurs only partially fulfil the predictions of adaptive radiation theory, with phenotypic evolution corresponding to an ‘early burst’ of adaptive differentiation. The results must be interpreted with caution, however, because over the long evolutionary history of lemurs (approx. 50 million years), the ‘early burst’ signal of adaptive radiation may have been eroded by extinction. PMID:28280597

  5. Semiclassical theory of electronically nonadiabatic transitions in molecular collision processes

    NASA Technical Reports Server (NTRS)

    Lam, K. S.; George, T. F.

    1979-01-01

    An introductory account of the semiclassical theory of the S-matrix for molecular collision processes is presented, with special emphasis on electronically nonadiabatic transitions. This theory is based on the incorporation of classical mechanics with quantum superposition, and in practice makes use of the analytic continuation of classical mechanics into the complex time domain. The relevant concepts of molecular scattering theory and related dynamical models are described, and the formalism is developed and illustrated with a simple example: a collinear collision of the A + BC type. The theory is then extended to include the effects of laser-induced nonadiabatic transitions. Two bound-continuum processes, collisional ionization and collision-induced emission, which are also amenable to the same general semiclassical treatment, are discussed.

  6. Classical theory of atomic collisions - The first hundred years

    NASA Astrophysics Data System (ADS)

    Grujić, Petar V.

    2012-05-01

    Classical calculations of atomic processes started in 1911 with Rutherford's famous evaluation of the differential cross section for α particles scattered on foil atoms [1]. The success of these calculations was soon overshadowed by the rise of quantum mechanics in 1925 and its triumphal success in describing processes at the atomic and subatomic levels. It was generally recognized that the classical approach should be inadequate, and it was neglected until 1953, when the famous paper by Gregory Wannier appeared, in which the threshold law for the behaviour of the single-ionization cross section under electron impact was derived. All later calculations and experimental studies confirmed the law derived by purely classical theory. The next step was taken by Ian Percival and collaborators in the 1960s, who developed a general classical three-body computer code, which was used by many researchers in evaluating various atomic processes such as ionization, excitation, detachment, and dissociation. Another approach was pursued by Michal Gryzinski from Warsaw, who started a far-reaching programme for treating atomic particles and processes as purely classical objects [2]. Though often criticized for overestimating the domain of the classical theory, the results of his group were able to match many experimental data. The Belgrade group pursued the classical approach using both analytical and numerical calculations, studying a number of atomic collisions, in particular near-threshold processes. The Riga group, led by Modris Gailitis [3], contributed considerably to the field, as did Valentin Ostrovsky and coworkers from Saint Petersburg, who developed powerful analytical methods within purely classical mechanics [4]. We shall make an overview of these approaches and show some of the remarkable results, which were subsequently confirmed by semiclassical and quantum mechanical calculations, as well as by the experimental evidence. Finally, we discuss the theoretical and epistemological background of the classical calculations and explain why these turned out so successful, despite the essentially quantum nature of atomic and subatomic systems.

  7. Independent evolution of the sexes promotes amphibian diversification.

    PubMed

    De Lisle, Stephen P; Rowe, Locke

    2015-03-22

    Classic ecological theory predicts that the evolution of sexual dimorphism constrains diversification by limiting morphospace available for speciation. Alternatively, sexual selection may lead to the evolution of reproductive isolation and increased diversification. We test contrasting predictions of these hypotheses by examining the relationship between sexual dimorphism and diversification in amphibians. Our analysis shows that the evolution of sexual size dimorphism (SSD) is associated with increased diversification and speciation, contrary to the ecological theory. Further, this result is unlikely to be explained by traditional sexual selection models because variation in amphibian SSD is unlikely to be driven entirely by sexual selection. We suggest that relaxing a central assumption of classic ecological models-that the sexes share a common adaptive landscape-leads to the alternative hypothesis that independent evolution of the sexes may promote diversification. Once the constraints of sexual conflict are relaxed, the sexes can explore morphospace that would otherwise be inaccessible. Consistent with this novel hypothesis, the evolution of SSD in amphibians is associated with reduced current extinction threat status, and an historical reduction in extinction rate. Our work reconciles conflicting predictions from ecological and evolutionary theory and illustrates that the ability of the sexes to evolve independently is associated with a spectacular vertebrate radiation. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  8. Perception of emotional climate in a revolution: Test of a multistage theory of revolution in the Tunisian context.

    PubMed

    Rimé, Bernard; Yzerbyt, Vincent; Mahjoub, Abdelwahab

    2017-12-01

    Participation in social movements and collective action depends upon people's capacity to perceive their societal context. We examined this question in the context of the Arab Spring revolutions. In a classic theory of revolution highlighting the role of collective emotions, Brinton (1938) claimed that revolutions, far from chaos, proceed in an orderly sequence involving four stages: euphoria, degradation, terror, and restoration. The emotional climate (EC) as perceived by ordinary Tunisian citizens (2,699 women and 3,816 men) was measured during the 4 years of the Tunisian revolution. First, a quadratic pattern of perceived EC measures over time provided strong support for Brinton's model. Second, three different analyses suggested the presence of four distinct stages in the evolution of perceived EC. Third, the socio-political developments in Tunisia during the four stages proved entirely consistent with both Brinton's theoretical model and the perceived EC indicators. Finally, social identification proved closely related to the temporal evolution of positive EC scores. In sum, data from this study not only lend support to the views put forth in a heretofore untested classic theory of revolution but also demonstrate that psychosocial measurements can validly monitor a major process of socio-political transformation. © 2017 The British Psychological Society.

  9. Classical space-times from the S-matrix

    NASA Astrophysics Data System (ADS)

    Neill, Duff; Rothstein, Ira Z.

    2013-12-01

    We show that classical space-times can be derived directly from the S-matrix for a theory of massive particles coupled to a massless spin two particle. As an explicit example we derive the Schwarzschild space-time as a series in G_N. At no point of the derivation is any use made of the Einstein-Hilbert action or the Einstein equations. The intermediate steps involve only on-shell S-matrix elements which are generated via BCFW recursion relations and unitarity sewing techniques. The notion of a space-time metric is only introduced at the end of the calculation, where it is extracted by matching the potential determined by the S-matrix to the geodesic motion of a test particle. Other static space-times such as Kerr follow in a similar manner. Furthermore, given that the procedure is action independent and depends only upon the choice of the representation of the little group, solutions to Yang-Mills (YM) theory can be generated in the same fashion. Moreover, the squaring relation between the YM and gravity three point functions shows that the seeds that generate solutions in the two theories are algebraically related. From a technical standpoint our methodology can also be utilized to calculate quantities relevant for the binary inspiral problem more efficiently than the more traditional Feynman diagram approach.

  10. Gravitational tides in the outer planets. I - Implications of classical tidal theory. II - Interior calculations and estimation of the tidal dissipation factor

    NASA Technical Reports Server (NTRS)

    Ioannou, Petros J.; Lindzen, Richard S.

    1993-01-01

    Classical tidal theory is applied to the atmospheres of the outer planets. The tidal geopotential due to satellites of the outer planets is discussed, and the solution of Laplace's tidal equation for Hough modes appropriate to tides on the outer planets is examined. The vertical structure of tidal modes is described, noting that only relatively high-order meridional mode numbers can propagate vertically with growing amplitude. Expected magnitudes for tides in the visible atmosphere of Jupiter are discussed. The classical theory is extended to planetary interiors, taking the effects of sphericity and self-gravity into account. The thermodynamic structure of Jupiter is described and the WKB theory of the vertical structure equation is presented. The regions for which inertial, gravity, and acoustic oscillations are possible are delineated. The case of a planet with a neutral interior is treated, discussing the various atmospheric boundary conditions and showing that the tidal response is small.

  11. Physics of automated driving in framework of three-phase traffic theory.

    PubMed

    Kerner, Boris S

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
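The contrast between the two control rules can be sketched schematically. The functional forms and gains below are assumptions for illustration, not Kerner's published equations:

```python
# Schematic comparison (assumed forms) of the two adaptive-cruise-control rules
# contrasted in the abstract.  The classical rule regulates toward a fixed
# desired time headway tau_d; the three-phase rule has no fixed headway and,
# inside an "indifference zone" of space gaps, reacts only to the speed
# difference with the preceding vehicle.

def classical_acc(gap, v, v_lead, tau_d=1.5, k1=0.3, k2=0.6):
    """Acceleration driving the gap toward the fixed desired headway tau_d."""
    return k1 * (gap - v * tau_d) + k2 * (v_lead - v)

def three_phase_acc(gap, v, v_lead, g_safe=10.0, g_max=60.0, k=0.6, k1=0.3):
    """No fixed headway: pure speed adaptation while g_safe <= gap <= g_max."""
    if gap < g_safe:           # too close: open the gap
        return k1 * (gap - g_safe) + k * (v_lead - v)
    if gap > g_max:            # too far: close in
        return k1 * (gap - g_max) + k * (v_lead - v)
    return k * (v_lead - v)    # indifference zone: any gap in range is fine

# Same speeds, two different gaps inside the three-phase indifference zone:
for gap in (20.0, 40.0):
    print(gap, classical_acc(gap, v=20.0, v_lead=20.0),
          three_phase_acc(gap, v=20.0, v_lead=20.0))
```

At equal speeds the classical controller still accelerates or brakes to restore its fixed headway, while the three-phase controller leaves any gap inside the zone undisturbed, which is the mechanism behind the smaller disturbances at bottlenecks claimed in the abstract.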

  12. Physics of automated driving in framework of three-phase traffic theory

    NASA Astrophysics Data System (ADS)

    Kerner, Boris S.

    2018-04-01

    We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.

  13. Urns and Chameleons: two metaphors for two different types of measurements

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi

    2013-09-01

    The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical (Kolmogorov) probabilistic model. In geometry, the mathematical construction of several non-Euclidean models of space preceded their application in physics, which came with the theory of relativity, by about a century. In physics the opposite situation took place. While the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena, and they interfered negatively with each other because of the absence, for many decades, of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. These attempts, however, have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity. 
Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that read pre-existing properties (the urn metaphor) and measurements consisting in reading a response to an interaction (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside of classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological or biomedical phenomena.

  14. The sociobiology of genes: the gene's eye view as a unifying behavioural-ecological framework for biological evolution.

    PubMed

    De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B

    2017-11-22

    Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the still, to some extent, 'organism-centred' approach of classical evolutionary theory. Second, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo. Overall, we aim for the conceptual integration between the gene's eye view on the one hand, and more organism-centred evolutionary models (both classical evolutionary theory and evo-devo) on the other.

  15. Nonlinear effects in evolution - an ab initio study: A model in which the classical theory of evolution occurs as a special case.

    PubMed

    Clerc, Daryl G

    2016-07-21

    An ab initio approach was used to study the molecular-level interactions that connect gene-mutation to changes in an organism's phenotype. The study provides new insights into the evolutionary process and presents a simplification whereby changes in phenotypic properties may be studied in terms of the binding affinities of the chemical interactions affected by mutation, rather than by correlation to the genes. The study also reports the role that nonlinear effects play in the progression of organs, and how those effects relate to the classical theory of evolution. Results indicate that the classical theory of evolution occurs as a special case within the ab initio model - a case having two attributes. The first attribute: proteins and promoter regions are not shared among organs. The second attribute: continuous limiting behavior exists in the physical properties of organs as well as in the binding affinity of the associated chemical interactions, with respect to displacements in the chemical properties of proteins and promoter regions induced by mutation. Outside of the special case, second-order coupling contributions are significant and nonlinear effects play an important role, a result corroborated by analyses of published activity levels in binding and transactivation assays. Further, gradations in the state of perfection of an organ may be small or large depending on the type of mutation, and not necessarily closely separated as maintained by the classical theory. Results also indicate that organs progress with varying degrees of interdependence, the likelihood of successful mutation decreases with increasing complexity of the affected chemical system, and differences between the ab initio model and the classical theory increase with increasing complexity of the organism. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.

  16. The Basics: What's Essential about Theory for Community Development Practice?

    ERIC Educational Resources Information Center

    Hustedde, Ronald J.; Ganowicz, Jacek

    2002-01-01

    Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…

  17. A viscoplastic constitutive theory for metal matrix composites at high temperature

    NASA Technical Reports Server (NTRS)

    Robinson, David N.; Duffy, Stephen F.; Ellis, John R.

    1988-01-01

    A viscoplastic constitutive theory is presented for representing the high temperature deformation behavior of metal matrix composites. The point of view taken is a continuum one where the composite is considered a material in its own right, with its own properties that can be determined for the composite as a whole. It is assumed that a single preferential (fiber) direction is identifiable at each material point (continuum element) admitting the idealization of local transverse isotropy. A key ingredient is the specification of an experimental program for the complete determination of the material functions and parameters for characterizing a particular metal matrix composite. The parameters relating to the strength of anisotropy can be determined through tension/torsion tests on longitudinally and circumferentially reinforced thin walled tubes. Fundamental aspects of the theory are explored through a geometric interpretation of some basic features analogous to those of the classical theory of plasticity.

  18. A viscoplastic constitutive theory for metal matrix composites at high temperature

    NASA Technical Reports Server (NTRS)

    Robinson, D. N.; Duffy, S. F.; Ellis, J. R.

    1986-01-01

    A viscoplastic constitutive theory is presented for representing the high-temperature deformation behavior of metal matrix composites. The point of view taken is a continuum one where the composite is considered a material in its own right, with its own properties that can be determined for the composite as a whole. It is assumed that a single preferential (fiber) direction is identifiable at each material point (continuum element) admitting the idealization of local transverse isotropy. A key ingredient in this work is the specification of an experimental program for the complete determination of the material functions and parameters for characterizing a particular metal matrix composite. The parameters relating to the strength of anisotropy can be determined through tension/torsion tests on longitudinally and circumferentially reinforced thin-walled tubes. Fundamental aspects of the theory are explored through a geometric interpretation of some basic features analogous to those of the classical theory of plasticity.

  19. A viscoplastic constitutive theory for metal matrix composites at high temperature

    NASA Technical Reports Server (NTRS)

    Robinson, D. N.; Ellis, J. R.; Duffy, S. F.

    1987-01-01

    A viscoplastic theory is presented for representing the high-temperature deformation behavior of metal matrix composites. The point of view taken is a continuum one where the composite is considered a material in its own right, with its own properties that can be determined for the composite as a whole. It is presumed that a single preferential (fiber) direction is identifiable at each material point (continuum element) admitting the idealization of local transverse isotropy. A key ingredient in this work is the specification of an experimental program for the complete determination of the material functions and parameters for characterizing a particular metal matrix composite. The parameters relating to the strength of anisotropy can be determined through tension/torsion tests on longitudinally and circumferentially reinforced thin-walled tubes. Fundamental aspects of the theory are explored through a geometric interpretation of some basic features analogous to those of the classical theory of plasticity.

  20. Using generalizability theory to develop clinical assessment protocols.

    PubMed

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
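    The abstract's core quantities follow standard reliability formulas: the standard error of measurement SEM = SD * sqrt(1 - r), the 95% minimal detectable change MDC95 = 1.96 * sqrt(2) * SEM, and the Spearman-Brown prophecy for the reliability of a mean of k repeated measures. The sketch below illustrates these textbook relations only (it is not the paper's generalizability-theory analysis, which partitions error into multiple facets); the SD and reliability values are hypothetical.

    ```python
    import math

    def sem(sd: float, reliability: float) -> float:
        """Standard error of measurement: SEM = SD * sqrt(1 - r)."""
        return sd * math.sqrt(1.0 - reliability)

    def mdc95(sem_value: float) -> float:
        """Minimal detectable change at 95% confidence: 1.96 * sqrt(2) * SEM."""
        return 1.96 * math.sqrt(2.0) * sem_value

    def spearman_brown(reliability: float, k: int) -> float:
        """Reliability of the mean of k repeated measures (Spearman-Brown)."""
        return k * reliability / (1.0 + (k - 1) * reliability)

    # Hypothetical scale: SD = 10 points, test-retest reliability r = 0.80.
    sd, r = 10.0, 0.80
    for k in (1, 2, 4):
        r_k = spearman_brown(r, k)
        print(f"k={k}: reliability={r_k:.3f}, MDC95={mdc95(sem(sd, r_k)):.2f}")
    ```

    The loop shows the point made in the abstract: averaging repeated measures raises reliability and shrinks the MDC, so a protocol can be tuned to reach a clinically attainable MDC.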

  1. Entanglement entropy of ABJM theory and entropy of topological black hole

    NASA Astrophysics Data System (ADS)

    Nian, Jun; Zhang, Xinyu

    2017-07-01

    In this paper we discuss the supersymmetric localization of the 4D N = 2 off-shell gauged supergravity on the background of the AdS4 neutral topological black hole, which is the gravity dual of the ABJM theory defined on the boundary S^1 × H^2. We compute the large-N expansion of the supergravity partition function. The result gives the black hole entropy with the logarithmic correction, which matches the previous result for the entanglement entropy of the ABJM theory up to some stringy effects. Our result is consistent with the previous on-shell one-loop computation of the logarithmic correction to black hole entropy. It provides an explicit example of the identification of the entanglement entropy of the boundary conformal field theory with the bulk black hole entropy beyond the leading order given by the classical Bekenstein-Hawking formula, and consequently tests the AdS/CFT correspondence at subleading order.

  2. On classical and quantum dynamics of tachyon-like fields and their cosmological implications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dimitrijević, Dragoljub D., E-mail: ddrag@pmf.ni.ac.rs; Djordjević, Goran S.; Milošević, Milan

    2014-11-24

    We consider a class of tachyon-like potentials, motivated by string theory, D-brane dynamics and inflation theory, in the context of classical and quantum mechanics. A formalism is proposed for describing the dynamics of tachyon fields in the spatially homogeneous, one-dimensional, classical and quantum mechanical limit. A few models with concrete potentials are considered. Additionally, possibilities for p-adic and adelic generalization of these models are discussed. Classical actions and the corresponding quantum propagators, in the Feynman path-integral approach, are calculated in a form invariant under a change of the background number fields, i.e. on both Archimedean and non-Archimedean spaces. Looking for a quantum origin of inflation, the relevance of p-adic and adelic generalizations is briefly discussed.

  3. The Classical Theory of Light Colors: a Paradigm for Description of Particle Interactions

    NASA Astrophysics Data System (ADS)

    Mazilu, Nicolae; Agop, Maricel; Gatu, Irina; Iacob, Dan Dezideriu; Butuc, Irina; Ghizdovat, Vlad

    2016-06-01

    The color is an interaction property: it arises from the interaction of light with matter. Classically speaking, it is therefore akin to the forces. But while forces engendered the mechanical view of the world, colors generated the optical view. One of the modern concepts of interaction between the fundamental particles of matter, quantum chromodynamics, aims to fill the gap between mechanics and optics in a specific description of the strong interactions. We show here that this modern description of particle interactions has ties with both the classical and quantum theories of light, regardless of the connection between forces and colors. In a word, light is a universal model for the description of matter. The description involves classical Yang-Mills fields related to color.

  4. The polymer physics of single DNA confined in nanochannels.

    PubMed

    Dai, Liang; Renner, C Benjamin; Doyle, Patrick S

    2016-06-01

    In recent years, applications and experimental studies of DNA in nanochannels have stimulated the investigation of the polymer physics of DNA in confinement. Recent advances in the physics of confined polymers, using DNA as a model polymer, have moved beyond the classic Odijk theory for the strong confinement, and the classic blob theory for the weak confinement. In this review, we present the current understanding of the behaviors of confined polymers while briefly reviewing classic theories. Three aspects of confined DNA are presented: static, dynamic, and topological properties. The relevant simulation methods are also summarized. In addition, comparisons of confined DNA with DNA under tension and DNA in semidilute solution are made to emphasize universal behaviors. Finally, an outlook of the possible future research for confined DNA is given. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Classical and non-classical effective medium theories: New perspectives

    NASA Astrophysics Data System (ADS)

    Tsukerman, Igor

    2017-05-01

    Future research in electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine what effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.

  6. Measurement error: Implications for diagnosis and discrepancy models of developmental dyslexia.

    PubMed

    Cotton, Sue M; Crewther, David P; Crewther, Sheila G

    2005-08-01

    The diagnosis of developmental dyslexia (DD) is reliant on a discrepancy between intellectual functioning and reading achievement. Discrepancy-based formulae have frequently been employed to establish the significance of the difference between 'intelligence' and 'actual' reading achievement. These formulae, however, often fail to take into consideration test reliability and the error associated with a single test score. This paper provides an illustration of the potential effects that test reliability and measurement error can have on the diagnosis of dyslexia, with particular reference to discrepancy models. The roles of reliability and the standard error of measurement (SEM) in classical test theory are also briefly reviewed. This is followed by illustrations of how SEM and test reliability can aid the interpretation of a simple discrepancy-based formula of DD. It is proposed that a lack of consideration of test theory in the use of discrepancy-based models of DD can lead to misdiagnosis (both false positives and false negatives). Further, misdiagnosis in research samples affects reproducibility and generalizability of findings. This, in turn, may explain current inconsistencies in research on the perceptual, sensory, and motor correlates of dyslexia.
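    The point about measurement error in discrepancy formulae can be made concrete with the classical-test-theory standard error of a difference score, SE_diff = sqrt(SEM_iq^2 + SEM_read^2). The sketch below is an illustration of that textbook formula, not the paper's own procedure; the reliability values, score metric, and function names are hypothetical.

    ```python
    import math

    def sem(sd: float, reliability: float) -> float:
        """Standard error of measurement: SEM = SD * sqrt(1 - r)."""
        return sd * math.sqrt(1.0 - reliability)

    def discrepancy_significant(iq: float, reading: float,
                                r_iq: float = 0.95, r_read: float = 0.90,
                                sd: float = 15.0, z_crit: float = 1.96) -> bool:
        """Is the IQ-minus-reading discrepancy larger than expected from
        measurement error alone?  Uses the standard error of the difference,
        SE_diff = sqrt(SEM_iq^2 + SEM_read^2)."""
        se_diff = math.sqrt(sem(sd, r_iq) ** 2 + sem(sd, r_read) ** 2)
        return abs(iq - reading) / se_diff > z_crit

    # Both tests on a mean-100, SD-15 metric (values are illustrative).
    print(discrepancy_significant(110, 85))   # 25-point gap -> True
    print(discrepancy_significant(100, 92))   # 8-point gap  -> False
    ```

    With these reliabilities, SE_diff is about 5.8 points, so an 8-point gap is well within measurement error: treating it as a meaningful discrepancy would be exactly the kind of false positive the abstract warns about.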

  7. Simulation of transient flow in a shock tunnel and a high Mach number nozzle

    NASA Technical Reports Server (NTRS)

    Jacobs, P. A.

    1991-01-01

    A finite volume Navier-Stokes code was used to simulate the shock reflection and nozzle starting processes in an axisymmetric shock tube and a high Mach number nozzle. The simulated nozzle starting processes were found to match the classical quasi-1-D theory and some features of the experimental measurements. The shock reflection simulation illustrated a new mechanism for the driver gas contamination of the stagnated test gas.

  8. Wavefront aberrations of x-ray dynamical diffraction beams.

    PubMed

    Liao, Keliang; Hong, Youli; Sheng, Weifan

    2014-10-01

    The effects of dynamical diffraction in x-ray diffractive optics with large numerical aperture render the wavefront aberrations difficult to describe using the aberration polynomials, yet knowledge of them plays an important role in a vast variety of scientific problems ranging from optical testing to adaptive optics. Although the diffraction theory of optical aberrations was established decades ago, its application in the area of x-ray dynamical diffraction theory (DDT) is still lacking. Here, we conduct a theoretical study on the aberration properties of x-ray dynamical diffraction beams. By treating the modulus of the complex envelope as the amplitude weight function in the orthogonalization procedure, we generalize the nonrecursive matrix method for the determination of orthonormal aberration polynomials, wherein Zernike DDT and Legendre DDT polynomials are proposed. As an example, we investigate the aberration evolution inside a tilted multilayer Laue lens. The corresponding Legendre DDT polynomials are obtained numerically, which represent balanced aberrations yielding minimum variance of the classical aberrations of an anamorphic optical system. The balancing of classical aberrations and their standard deviations are discussed. We also present the Strehl ratio of the primary and secondary balanced aberrations.

  9. Quantum formalism as an optimisation procedure of information flows for physical and biological systems.

    PubMed

    Baladrón, Carlos; Khrennikov, Andrei

    2016-12-01

    The similarities between biological and physical systems as respectively defined in quantum information biology (QIB) and in a Darwinian approach to quantum mechanics (DAQM) have been analysed. In both theories the processing of information is a central feature characterising the systems. The analysis highlights a mutual support on the thesis contended by each theory. On the one hand, DAQM provides a physical basis that might explain the key role played by quantum information at the macroscopic level for bio-systems in QIB. On the other hand, QIB offers the possibility, acting as a macroscopic testing ground, to analyse the emergence of quantumness from classicality in the terms held by DAQM. As an added result of the comparison, a tentative definition of quantum information in terms of classical information flows has been proposed. The quantum formalism would appear from this comparative analysis between QIB and DAQM as an optimal information scheme that would maximise the stability of biological and physical systems at any scale. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Linear and angular coherence momenta in the classical second-order coherence theory of vector electromagnetic fields.

    PubMed

    Wang, Wei; Takeda, Mitsuo

    2006-09-01

    A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.

  11. A method for testing whether model predictions fall within a prescribed factor of true values, with an application to pesticide leaching

    USGS Publications Warehouse

    Parrish, Rudolph S.; Smith, Charles N.

    1990-01-01

    A quantitative method is described for testing whether model predictions fall within a specified factor of true values. The technique is based on classical theory for confidence regions on unknown population parameters and can be related to hypothesis testing in both univariate and multivariate situations. A capability index is defined that can be used as a measure of predictive capability of a model, and its properties are discussed. The testing approach and the capability index should facilitate model validation efforts and permit comparisons among competing models. An example is given for a pesticide leaching model that predicts chemical concentrations in the soil profile.
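    A simple univariate reading of the idea can be sketched as follows: form the log ratios of predicted to true values, build a t-based confidence interval on their mean, and declare the model "within a factor f" if the whole interval lies inside [-ln(f), +ln(f)]. This is an illustration in the spirit of the abstract, not the authors' exact confidence-region procedure or capability index; the data and the fixed critical value are hypothetical.

    ```python
    import math
    import statistics

    def within_factor(pred, true, factor=2.0, t_crit=2.262):
        """Check whether model predictions fall within a prescribed factor of
        true values: a t confidence interval on the mean log ratio must lie
        entirely inside [-ln(factor), +ln(factor)].
        t_crit must match the sample size (2.262 is the 95% value for n=10)."""
        d = [math.log(p / t) for p, t in zip(pred, true)]
        n = len(d)
        mean, sd = statistics.mean(d), statistics.stdev(d)
        half = t_crit * sd / math.sqrt(n)
        bound = math.log(factor)
        return (mean - half >= -bound) and (mean + half <= bound)

    # Illustrative paired observations (e.g. soil concentrations).
    true = [10, 12, 8, 15, 9, 11, 14, 10, 13, 12]
    pred = [12, 14, 10, 18, 11, 13, 17, 12, 15, 14]
    print(within_factor(pred, true, factor=2.0))   # within a factor of 2
    print(within_factor(pred, true, factor=1.1))   # but not a factor of 1.1
    ```

    The log transform makes over- and under-prediction by the same factor symmetric, which is why "within a factor f" criteria are naturally stated on the log scale.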

  12. Application of ply level analysis to flexural wave propagation

    NASA Astrophysics Data System (ADS)

    Valisetty, R. R.; Rehfield, L. W.

    1988-10-01

    A brief survey is presented of the shear deformation theories of laminated plates. It indicates that certain non-classical influences affect bending-related behavior in the same way as the transverse shear stresses do. They include bending- and stretching-related section warping, the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply level analysis can be used to model such disparate shear deformations. Here, the equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as the ply level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.

  13. Beyond Moore's law: towards competitive quantum devices

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2015-05-01

    A century after the invention of quantum theory and fifty years after Bell's inequality, we see the first quantum devices emerge as products that aim to be competitive with the best classical computing devices. While a universal quantum computer of non-trivial size is still out of reach, there exist a number of commercial and experimental devices: quantum random number generators, quantum simulators and quantum annealers. In this colloquium I will present some of these devices and the validation tests we performed on them. Quantum random number generators use the inherent randomness in quantum measurements to produce true random numbers, unlike classical pseudorandom number generators, which are inherently deterministic. Optical lattice emulators use ultracold atomic gases in optical lattices to mimic typical models of condensed matter physics. In my talk I will focus especially on the devices built by the Canadian company D-Wave Systems, which are special-purpose quantum simulators for solving hard classical optimization problems. I will review the controversy around the quantum nature of these devices and will compare them to state-of-the-art classical algorithms. I will conclude with an outlook towards universal quantum computing and with the question: which important problems that are intractable even for post-exa-scale classical computers could we expect to solve once we have a universal quantum computer?

  14. Geometric Theory of Reduction of Nonlinear Control Systems

    NASA Astrophysics Data System (ADS)

    Elkin, V. I.

    2018-02-01

    The foundations of a differential geometric theory of nonlinear control systems are described on the basis of categorical concepts (isomorphism, factorization, restrictions) by analogy with classical mathematical theories (of linear spaces, groups, etc.).

  15. Comment on Gallistel: behavior theory and information theory: some parallels.

    PubMed

    Nevin, John A

    2012-05-01

    In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. EPRL/FK asymptotics and the flatness problem

    NASA Astrophysics Data System (ADS)

    Oliveira, José Ricardo

    2018-05-01

    Spin foam models are an approach to quantum gravity based on the concept of sum over states, which aims to describe quantum spacetime dynamics in a way that its parent framework, loop quantum gravity, has not as of yet succeeded. Since these models' relation to classical Einstein gravity is not explicit, an important test of their viability is the study of asymptotics: the classical theory should be obtained in a limit where quantum effects are negligible, taken to be the limit of large triangle areas in a triangulated manifold with boundary. In this paper we briefly introduce the EPRL/FK spin foam model and known results about its asymptotics, proceeding then to describe a practical computation of spin foam and semiclassical geometric data for a simple triangulation with only one interior triangle. The results are used to comment on the 'flatness problem', a hypothesis raised by Bonzom (2009 Phys. Rev. D 80 064028) suggesting that EPRL/FK's classical limit only describes flat geometries in vacuum.

  17. Representational Realism, Closed Theories and the Quantum to Classical Limit

    NASA Astrophysics Data System (ADS)

    de Ronde, Christian

    In this chapter, we discuss the representational realist stance as a pluralist ontic approach to inter-theoretic relationships. Our stance stresses the fact that physical theories require the necessary consideration of a conceptual level of discourse which determines and configures the specific field of phenomena discussed by each particular theory. We will criticize the orthodox line of research which has grounded the analysis of QM in two (Bohrian) metaphysical presuppositions, accepted in the present as dogmas that all interpretations must follow. We will also examine how the orthodox project of "bridging the gap" between the quantum and the classical domains has constrained the possibilities of research, producing only a limited set of interpretational problems which focus only on the justification of "classical reality" and exclude the possibility of analyzing non-classical conceptual representations of QM. The representational realist stance introduces two new problems, namely, the superposition problem and the contextuality problem, which consider explicitly the conceptual representation of orthodox QM beyond the mere reference to mathematical structures and measurement outcomes. In the final part of the chapter, we revisit, from the representational realist perspective, the quantum to classical limit and the orthodox claim that this inter-theoretic relation can be explained through the principle of decoherence.

  18. [Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].

    PubMed

    Bao, Yan-ju; Hua, Bao-jin

    2012-12-01

    The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. Following the principle of formulas corresponding to syndromes is important for achieving clinical effect. However, some medical practitioners find that the actual clinical effect is far less than expected. Six errors in the use of classic prescriptions, and of the theory of formulas corresponding to syndromes, are the most important causes to be considered: attending only to local syndromes while neglecting the whole; attending only to formulas corresponding to syndromes while neglecting the pathogenesis; attending only to syndromes while neglecting the pulse diagnosis; attending only to a unilateral prescription while neglecting combined prescriptions; attending only to classic prescriptions while neglecting modern formulas; and attending only to the formulas while neglecting the drug dosage. Therefore, in the clinical application of classic prescriptions and the theory of formulas corresponding to syndromes, it is necessary to consider not only the patient's clinical syndromes but also the combination of the main syndrome and its pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and drug dosage all contribute to avoiding clinical errors and improving clinical effects.

  19. A Comparison of Kinetic Energy and Momentum in Special Relativity and Classical Mechanics

    ERIC Educational Resources Information Center

    Riggs, Peter J.

    2016-01-01

    Kinetic energy and momentum are indispensable dynamical quantities in both the special theory of relativity and in classical mechanics. Although momentum and kinetic energy are central to understanding dynamics, the differences between their relativistic and classical notions have not always received adequate treatment in undergraduate teaching.…

  20. A Comparative Analysis of Three Unique Theories of Organizational Learning

    ERIC Educational Resources Information Center

    Leavitt, Carol C.

    2011-01-01

    The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories -- experiential learning theory and adaptive-generative learning theory -- represent the thinking of the cognitive perspective, while…

  1. Classical evolution and quantum generation in generalized gravity theories including string corrections and tachyons: Unified analyses

    NASA Astrophysics Data System (ADS)

    Hwang, Jai-Chan; Noh, Hyerim

    2005-03-01

    We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.

  2. Infinite derivative gravity: non-singular cosmology & blackhole solutions

    NASA Astrophysics Data System (ADS)

    Mazumdar, A.

    Both Einstein's theory of General Relativity and Newton's theory of gravity suffer from a catastrophe at short distances and small time scales. The blackhole singularity and the cosmological Big Bang singularity highlight that current theories of gravity are incomplete descriptions at early times and small distances. I will discuss how one can potentially resolve these fundamental problems at both the classical and the quantum level. In particular, I will discuss infinite derivative theories of gravity, in which gravitational interactions become weaker in the ultraviolet, thereby resolving some of the classical singularities, such as the Big Bang and Schwarzschild singularities, for compact non-singular objects with mass up to 10^25 grams. In this lecture, I will discuss quantum aspects of infinite derivative gravity and a few aspects which can make the theory asymptotically free in the UV.

  3. Psychodrama: group psychotherapy through role playing.

    PubMed

    Kipper, D A

    1992-10-01

    The theory and the therapeutic procedure of classical psychodrama are described along with brief illustrations. Classical psychodrama and sociodrama stemmed from role theory, enactments, "tele," the reciprocity of choices, and the theory of spontaneity-robopathy and creativity. The discussion focuses on key concepts such as the therapeutic team, the structure of the session, transference and reality, countertransference, the here-and-now and the encounter, the group-as-a-whole, resistance and difficult clients, and affect and cognition. Also described are the neoclassical approaches of psychodrama, action methods, and clinical role playing, and the significance of the concept of behavioral simulation in group psychotherapy.

  4. The confrontation between general relativity and experiment

    NASA Technical Reports Server (NTRS)

    Will, C. M.

    1980-01-01

    Experiments that test the foundations of gravitation theory in terms of the Einstein equivalence principle are discussed along with solar system tests of general relativity at the post-Newtonian level. These include classical (light-deflection, time delay and perihelion shift) tests as well as tests of the strong equivalence principle. The binary pulsar is discussed as an extra-solar-system gravitational testing ground, and attention is given to the multipolarity of the waves and the amount of radiation damping. The mass function, periastron shift, redshift-Doppler parameter and rate of change of the orbit period (Pb) of the binary pulsar are also considered, and it is suggested that the measurement of Pb represents the first observation of the effects of gravitational radiation.

  5. Spinning particles, axion radiation, and the classical double copy

    NASA Astrophysics Data System (ADS)

    Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.

    2018-05-01

    We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) ϕ, and the Kalb-Ramond axion field Bμν. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (ϕ, gμν, Bμν) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.

  6. Lamb wave extraction of dispersion curves in micro/nano-plates using couple stress theories

    NASA Astrophysics Data System (ADS)

    Ghodrati, Behnam; Yaghootian, Amin; Ghanbar Zadeh, Afshin; Mohammad-Sedighi, Hamid

    2018-01-01

    In this paper, Lamb wave propagation in homogeneous and isotropic non-classical micro/nano-plates is investigated. To consider the effect of material microstructure on the wave propagation, three size-dependent models, namely the indeterminate, modified, and consistent couple stress theories, are used to extract the dispersion equations. In the mentioned theories, a parameter called the 'characteristic length' is used to account for the size of the material microstructure in the governing equations. To generalize the parametric studies and examine the effect of thickness, propagation wavelength, and characteristic length on the behavior of miniature plate structures, the governing equations are nondimensionalized by defining appropriate dimensionless parameters. The dispersion curves for phase and group velocities are then plotted over a wide frequency-thickness range to study Lamb wave propagation, including microstructure effects, at very high frequencies. The results show that the couple stress theories predict more rigidity in Cosserat-type materials than the classical theory: in a plate of constant thickness, increasing the thickness-to-characteristic-length ratio drives the results toward the classical theory, while reducing this ratio significantly increases the wave propagation speed in the plate. In addition, it is demonstrated that the speed of high-frequency Lamb waves converges to the dispersive Rayleigh wave velocity.

  7. Causality re-established.

    PubMed

    D'Ariano, Giacomo Mauro

    2018-07-13

    Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).

  8. Soliton Gases and Generalized Hydrodynamics

    NASA Astrophysics Data System (ADS)

    Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien

    2018-01-01

    We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.
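
    The velocity-dependent jump rule described above can be illustrated, in its simplest hard-rod form, with a toy event simulation. This is only a hedged sketch, not the paper's GHD solver: the function name, the fixed jump length `a`, and the naive order-flip crossing detection are assumptions made for illustration.

```python
import itertools

def simulate_hard_rod_gas(positions, velocities, a, t_end, dt=1e-3):
    """Free-stream point particles; whenever two trajectories cross,
    the faster particle jumps forward by `a` and the slower one
    backward by `a` (the hard-rod special case of the jump rule)."""
    x, v = list(positions), list(velocities)
    t = 0.0
    while t < t_end:
        order_before = sorted(range(len(x)), key=lambda i: x[i])
        x = [xi + vi * dt for xi, vi in zip(x, v)]  # ballistic step
        order_after = sorted(range(len(x)), key=lambda i: x[i])
        if order_after != order_before:
            for i, j in itertools.combinations(range(len(x)), 2):
                # i and j crossed during this step if their order flipped
                if ((order_before.index(i) - order_before.index(j))
                        * (order_after.index(i) - order_after.index(j)) < 0):
                    s = a if v[i] > v[j] else -a
                    x[i] += s  # faster particle jumps forward
                    x[j] -= s  # slower particle jumps backward
        t += dt
    return x

# Two particles on a collision course: after crossing, each ends up
# displaced by the jump length relative to pure free streaming.
final = simulate_hard_rod_gas([0.0, 1.0], [1.0, -1.0], a=0.1, t_end=1.0)
```

    Velocities are never altered, only positions jump; this is what makes such gases integrable and amenable to the hydrodynamic description in the abstract.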

  9. Topological and Orthomodular Modeling of Context in Behavioral Science

    NASA Astrophysics Data System (ADS)

    Narens, Louis

    2017-02-01

    Two non-Boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of a probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory in a variety of ways. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker or the context produced by the influence of other items in a choice experiment. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.

  10. A classical density functional theory of ionic liquids.

    PubMed

    Forsman, Jan; Woodward, Clifford E; Trulsson, Martin

    2011-04-28

    We present a simple, classical density functional approach to the study of simple models of room temperature ionic liquids. Dispersion attractions as well as ion correlation effects and excluded volume packing are taken into account. The oligomeric structure, common to many ionic liquid molecules, is handled by a polymer density functional treatment. The theory is evaluated by comparisons with simulations, with an emphasis on the differential capacitance, an experimentally measurable quantity of significant practical interest.

  11. Generalized quantum theory of recollapsing homogeneous cosmologies

    NASA Astrophysics Data System (ADS)

    Craig, David; Hartle, James B.

    2004-06-01

    A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic “J·dΣ” rule of quantum cosmology, as well as a generalization of this rule to generic initial states.

  12. Quantum solitonic wave-packet of a meso-scopic system in singularity free gravity

    NASA Astrophysics Data System (ADS)

    Buoninfante, Luca; Lambiase, Gaetano; Mazumdar, Anupam

    2018-06-01

    In this paper we will discuss how to localise the quantum wave-packet of a self-gravitating meso-scopic object by taking into account gravitational self-interaction in the Schrödinger equation beyond General Relativity. In particular, we will study soliton-like solutions in infinite derivative, ghost-free theories of gravity, which resolve the gravitational 1/r singularity in the potential. We will show the unique feature that the quantum spread of such a gravitational system is larger than that of Newtonian gravity, giving us a window of opportunity to test classical and quantum properties of such theories of gravity in the near future in a table-top experiment.

  13. The Frame of Fixed Stars in Relational Mechanics

    NASA Astrophysics Data System (ADS)

    Ferraro, Rafael

    2017-01-01

    Relational mechanics is a gauge theory of classical mechanics whose laws do not govern the motion of individual particles but the evolution of the distances between particles. Its formulation gives a satisfactory answer to Leibniz's and Mach's criticisms of Newton's mechanics: relational mechanics does not rely on the idea of an absolute space. When describing the behavior of small subsystems with respect to the so-called "fixed stars", relational mechanics basically agrees with Newtonian mechanics. However, those subsystems having huge angular momentum will deviate from the Newtonian behavior if they are described in the frame of fixed stars. Such subsystems naturally belong to the field of astronomy; they can be used to test the relational theory.

  14. Vibration-translation energy transfer in anharmonic diatomic molecules. 1: A critical evaluation of the semiclassical approximation

    NASA Technical Reports Server (NTRS)

    Mckenzie, R. L.

    1974-01-01

    The semiclassical approximation is applied to anharmonic diatomic oscillators in excited initial states. Multistate numerical solutions giving the vibrational transition probabilities for collinear collisions with an inert atom are compared with equivalent, exact quantum-mechanical calculations. Several symmetrization methods are shown to correlate accurately the predictions of both theories for all initial states, transitions, and molecular types tested, but only if coupling of the oscillator motion and the classical trajectory of the incident particle is considered. In anharmonic heteronuclear molecules, the customary semiclassical method of computing the classical trajectory independently leads to transition probabilities with anomalous low-energy resonances. Proper accounting of the effects of oscillator compression and recoil on the incident particle trajectory removes the anomalies and restores the applicability of the semiclassical approximation.

  15. Fermi orbital derivatives in self-interaction corrected density functional theory: Applications to closed shell atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pederson, Mark R., E-mail: mark.pederson@science.doe.gov

    2015-02-14

    A recent modification of the Perdew-Zunger self-interaction correction to the density-functional formalism has provided a framework for explicitly restoring unitary invariance to the expression for the total energy. The formalism depends upon the construction of Löwdin-orthonormalized Fermi orbitals which depend parametrically on variational quasi-classical electronic positions. Derivatives of these quasi-classical electronic positions, required for efficient minimization of the self-interaction-corrected energy, are derived and tested here on atoms. Total energies and ionization energies of closed-shell singlet atoms, where correlation is less important, computed with the Perdew-Wang 1992 Local Density Approximation (PW92) functional, are in good agreement with experiment and non-relativistic quantum Monte Carlo results, albeit slightly too low.

  16. A psychometric evaluation of the digital logic concept inventory

    NASA Astrophysics Data System (ADS)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
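
    The CTT half of such a psychometric evaluation typically reports an internal-consistency reliability coefficient. As a minimal sketch (not the DLCI's actual analysis; the function name and toy data are illustrative), Cronbach's alpha can be computed directly from an examinee-by-item score matrix:

```python
def cronbach_alpha(scores):
    """Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / var(total)).
    `scores` is a list of rows, one row of k item scores per examinee."""
    k = len(scores[0])

    def var(xs):  # unbiased sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent items (every item gives the same score) yield alpha = 1.
perfect = [[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]]
```

    Higher alpha indicates that items covary strongly relative to the total-score variance; what counts as "sufficiently reliable for research purposes" is a threshold judgment made by the instrument's developers.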

  17. SurfKin: an ab initio kinetic code for modeling surface reactions.

    PubMed

    Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K

    2014-10-05

    In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.

  18. Thermodynamics of H-bonding in alcohols and water. The mobile order theory as opposed to the classical multicomponent order theories

    NASA Astrophysics Data System (ADS)

    Huyskens, P.; Kapuku, F.; Colemonts-Vandevyvere, C.

    1990-09-01

    In liquids the partners of H-bonds constantly change. As a consequence, the entities observed by IR spectroscopy are not the same as those considered for thermodynamic properties; for the latter, the H-bonds are shared by all the molecules. The thermodynamic "monomeric fraction" γ, the time fraction during which an alcohol molecule is vaporizable, is the square root of the spectroscopic monomeric fraction, and is the fraction of molecules which, during a time interval of 10^-14 s, have their hydroxylic proton and their lone pairs free. The classical thermodynamic treatments of Mecke and Prigogine consider the spectroscopic entities as real thermodynamic entities. Opposed to this, the mobile order theory considers all the formal molecules as equal but with a reduction of the entropy due to the fact that during a fraction 1-γ of the time, the OH proton follows a neighbouring oxygen atom on its journey through the liquid. Mobile order theory and the classic multicomponent treatment lead, in binary mixtures of the associated substance A with the inert substance S, to expressions for the chemical potentials μ_A and μ_S that are fundamentally different. However, the differences become very important only when the molar volumes V̄_S and V̄_A differ by a factor larger than 2. As a consequence, the equations of the classic theory can still fit the experimental vapour pressure data of mixtures of liquid alcohols and liquid alkanes. However, the solubilities of solid alkanes in water, for which V̄_S > 3 V̄_A, are only correctly predicted by the mobile order theory.

  19. From Foucault to Freire through Facebook: Toward an Integrated Theory of mHealth

    ERIC Educational Resources Information Center

    Bull, Sheana; Ezeanochie, Nnamdi

    2016-01-01

    Objective: To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. Method: A secondary review of research syntheses and meta-analyses…

  20. Correlated evolution between male ejaculate allocation and female remating behaviour in seed beetles (Bruchidae).

    PubMed

    Katvala, M; Rönn, J L; Arnqvist, G

    2008-03-01

    Sperm competition theory suggests that female remating rate determines the selective regime that dictates the evolution of male ejaculate allocation. To test for correlated evolution between female remating behaviour and male ejaculate traits, we subjected detailed experimental data on female and male reproductive traits in seven seed beetle species to phylogenetic comparative analyses. The evolution of a larger first ejaculate was positively correlated with the evolution of a more rapid decline in ejaculate size over successive matings. Further, as predicted by theory, an increase in female remating rate correlated with the evolution of larger male testes but smaller ejaculates. However, an increase in female remating was associated with the evolution of a less even allocation of ejaculate resources over successive matings, contrary to classic sperm competition theory. We failed to find any evidence for coevolution between the pattern of male ejaculate allocation and variation in female quality, and we conclude that some patterns of correlated evolution are congruent with current theory, whereas some are not. We suggest that this may reflect the fact that much sperm competition theory does not fully incorporate other factors that may affect the evolution of male and female traits, such as trade-offs between ejaculate expenditure and other competing demands and the evolution of resource acquisition.

  1. Resource Theory of Quantum Memories and Their Faithful Verification with Minimal Assumptions

    NASA Astrophysics Data System (ADS)

    Rosset, Denis; Buscemi, Francesco; Liang, Yeong-Cherng

    2018-04-01

    We provide a complete set of game-theoretic conditions equivalent to the existence of a transformation from one quantum channel into another one, by means of classically correlated preprocessing and postprocessing maps only. Such conditions naturally induce tests to certify that a quantum memory is capable of storing quantum information, as opposed to memories that can be simulated by measurement and state preparation (corresponding to entanglement-breaking channels). These results are formulated as a resource theory of genuine quantum memories (correlated in time), mirroring the resource theory of entanglement in quantum states (correlated spatially). As the set of conditions is complete, the corresponding tests are faithful, in the sense that any non-entanglement-breaking channel can be certified. Moreover, they only require the assumption of trusted inputs, known to be unavoidable for quantum channel verification. As such, the tests we propose are intrinsically different from the usual process tomography, for which the probes of both the input and the output of the channel must be trusted. An explicit construction is provided and shown to be experimentally realizable, even in the presence of arbitrarily strong losses in the memory or detectors.

  2. Theories of the Alcoholic Personality.

    ERIC Educational Resources Information Center

    Cox, W. Miles

    Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…

  3. The Giffen Effect: A Note on Economic Purposes.

    ERIC Educational Resources Information Center

    Williams, William D.

    1990-01-01

    Describes the Giffen effect: demand for a commodity increases as price increases. Explains how applying control theory eliminates the paradox that the Giffen effect presents to classic economics supply and demand theory. Notes the differences in how conventional demand theory and control theory treat consumer behavior. (CH)

  4. Personality Theories for the 21st Century

    ERIC Educational Resources Information Center

    McCrae, Robert R.

    2011-01-01

    Classic personality theories, although intriguing, are outdated. The five-factor model of personality traits reinvigorated personality research, and the resulting findings spurred a new generation of personality theories. These theories assign a central place to traits and acknowledge the crucial role of evolved biology in shaping human…

  5. Continuous Time in Consistent Histories

    NASA Astrophysics Data System (ADS)

    Savvidou, Konstantina

    1999-12-01

    We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time evolution of standard quantum theory: the 'state-vector reduction' and the unitary time evolution. We construct the corresponding classical histories and demonstrate their relation to the quantum histories; we demonstrate how the requirement of the temporal logic structure of the theory is sufficient for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.

  6. Effects of Extrinsic Mortality on the Evolution of Aging: A Stochastic Modeling Approach

    PubMed Central

    Shokhirev, Maxim Nikolaievich; Johnson, Adiv Adam

    2014-01-01

    The evolutionary theories of aging are useful for gaining insights into the complex mechanisms underlying senescence. Classical theories argue that high levels of extrinsic mortality should select for the evolution of shorter lifespans and earlier peak fertility. Non-classical theories, in contrast, posit that an increase in extrinsic mortality could select for the evolution of longer lifespans. Although numerous studies support the classical paradigm, recent data challenge classical predictions, finding that high extrinsic mortality can select for the evolution of longer lifespans. To further elucidate the role of extrinsic mortality in the evolution of aging, we implemented a stochastic, agent-based, computational model. We used a simulated annealing optimization approach to predict which model parameters predispose populations to evolve longer or shorter lifespans in response to increased levels of predation. We report that longer lifespans evolved in the presence of rising predation if the cost of mating is relatively high and if energy is available in excess. Conversely, we found that dramatically shorter lifespans evolved when mating costs were relatively low and food was relatively scarce. We also analyzed the effects of increased predation on various parameters related to density dependence and energy allocation. Longer and shorter lifespans were accompanied by increased and decreased investments of energy into somatic maintenance, respectively. Similarly, earlier and later maturation ages were accompanied by increased and decreased energetic investments into early fecundity, respectively. Higher predation significantly decreased the total population size, enlarged the shared resource pool, and redistributed energy reserves for mature individuals. These results both corroborate and refine classical predictions, demonstrating a population-level trade-off between longevity and fecundity and identifying conditions that produce both classical and non-classical lifespan effects. PMID:24466165

  7. Assessing the quantum physics impacts on future x-ray free-electron lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmitt, Mark J.; Anisimov, Petr Mikhaylovich

    A new quantum mechanical theory of x-ray free electron lasers (XFELs) has been successfully developed that has placed LANL at the forefront of the understanding of quantum effects in XFELs. Our quantum theory describes the interaction of relativistic electrons with x-ray radiation in the periodic magnetic field of an undulator using the same mathematical formalism as classical XFEL theory. This places classical and quantum treatments on the same footing and allows for a continuous transition from one regime to the other, eliminating the disparate analytical approaches previously used. Moreover, Dr. Anisimov, the architect of this new theory, is now considered a resource in the international FEL community for assessing quantum effects in XFELs.

  8. Clinical outcome measurement: Models, theory, psychometrics and practice.

    PubMed

    McClimans, Leah; Browne, John; Cano, Stefan

    In the last decade much has been made of the role that models play in the epistemology of measurement. Specifically, philosophers have been interested in the role of models in producing measurement outcomes. This discussion has proceeded largely within the context of the physical sciences, with notable exceptions considering measurement in economics. However, models also play a central role in the methods used to develop instruments that purport to quantify psychological phenomena. These methods fall under the umbrella term 'psychometrics'. In this paper, we focus on Clinical Outcome Assessments (COAs) and discuss two measurement theories and their associated models: Classical Test Theory (CTT) and Rasch Measurement Theory. We argue that models have an important role to play in coordinating theoretical terms with empirical content, but to do so they must serve: 1) as a representation of the measurement interaction; and 2) in conjunction with a theory of the attribute in which we are interested. We conclude that Rasch Measurement Theory is a more promising approach than CTT in these regards despite the latter's popularity with health outcomes researchers. Copyright © 2017. Published by Elsevier Ltd.
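For readers comparing the two measurement theories discussed above, the Rasch item response function is easy to state concretely. A minimal sketch of its standard logistic form (the ability and difficulty values are hypothetical):

```python
import math

def rasch_prob(theta, b):
    """Rasch model: probability of a keyed (correct) response for a person
    with ability theta on an item with difficulty b, both on the logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the probability is exactly 0.5.
p_equal = rasch_prob(0.0, 0.0)
# An easier item (lower b) raises the probability for the same person.
p_easy = rasch_prob(0.0, -1.0)
```

Unlike CTT's observed-score decomposition, this function ties each response probability to a person parameter and an item parameter on a common scale, which is part of the coordination argument the authors make.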

  9. Model-Selection Theory: The Need for a More Nuanced Picture of Use-Novelty and Double-Counting

    PubMed Central

    Steele, Katie; Werndl, Charlotte

    2018-01-01

    This article argues that common intuitions regarding (a) the specialness of ‘use-novel’ data for confirmation and (b) that this specialness implies the ‘no-double-counting rule’, which says that data used in ‘constructing’ (calibrating) a model cannot also play a role in confirming the model’s predictions, are too crude. The intuitions in question are pertinent in all the sciences, but we appeal to a climate science case study to illustrate what is at stake. Our strategy is to analyse the intuitive claims in light of prominent accounts of confirmation of model predictions. We show that on the Bayesian account of confirmation, and also on the standard classical hypothesis-testing account, claims (a) and (b) are not generally true; but for some select cases, it is possible to distinguish data used for calibration from use-novel data, where only the latter confirm. The more specialized classical model-selection methods, on the other hand, uphold a nuanced version of claim (a), but this comes apart from (b), which must be rejected in favour of a more refined account of the relationship between calibration and confirmation. Thus, depending on the framework of confirmation, either the scope or the simplicity of the intuitive position must be revised. Contents: 1 Introduction; 2 A Climate Case Study; 3 The Bayesian Method vis-à-vis Intuitions; 4 Classical Tests vis-à-vis Intuitions; 5 Classical Model-Selection Methods vis-à-vis Intuitions (5.1 Introducing classical model-selection methods; 5.2 Two cases); 6 Re-examining Our Case Study; 7 Conclusion. PMID:29780170

  10. Model-Selection Theory: The Need for a More Nuanced Picture of Use-Novelty and Double-Counting.

    PubMed

    Steele, Katie; Werndl, Charlotte

    2018-06-01

    This article argues that common intuitions regarding (a) the specialness of 'use-novel' data for confirmation and (b) that this specialness implies the 'no-double-counting rule', which says that data used in 'constructing' (calibrating) a model cannot also play a role in confirming the model's predictions, are too crude. The intuitions in question are pertinent in all the sciences, but we appeal to a climate science case study to illustrate what is at stake. Our strategy is to analyse the intuitive claims in light of prominent accounts of confirmation of model predictions. We show that on the Bayesian account of confirmation, and also on the standard classical hypothesis-testing account, claims (a) and (b) are not generally true; but for some select cases, it is possible to distinguish data used for calibration from use-novel data, where only the latter confirm. The more specialized classical model-selection methods, on the other hand, uphold a nuanced version of claim (a), but this comes apart from (b), which must be rejected in favour of a more refined account of the relationship between calibration and confirmation. Thus, depending on the framework of confirmation, either the scope or the simplicity of the intuitive position must be revised. Contents: 1 Introduction; 2 A Climate Case Study; 3 The Bayesian Method vis-à-vis Intuitions; 4 Classical Tests vis-à-vis Intuitions; 5 Classical Model-Selection Methods vis-à-vis Intuitions (5.1 Introducing classical model-selection methods; 5.2 Two cases); 6 Re-examining Our Case Study; 7 Conclusion.

  11. Development and Evaluation of the Lifestyle History Questionnaire (LHQ) for People Entering Treatment for Substance Addictions.

    PubMed

    Martin, Linda M; Triscari, Robert; Boisvert, Rosemary; Hipp, Kristi; Gersten, Jennifer; West, Rachel C; Kisling, Elizabeth; Donham, Aaron; Kollar, Naomi; Escobar, Patricia

    2015-01-01

    We developed and investigated the psychometric properties of the Lifestyle History Questionnaire (LHQ), a self-report instrument designed to measure the extent of occupational dysfunction attributable to substance abuse. The instrument was developed using concepts in the ecological models of occupational therapy and in the work of William L. White, who defined addiction culture in terms of the patterns of life in context. We analyzed data from two field tests using both classical test theory and item response theory. The final version of the instrument has 70 items, 1 unifying construct, and 8 subscales. We found it to be valid and reliable (α=.93) for measuring the extent of occupational dysfunction and specific areas of strengths and weaknesses. The LHQ is a promising new instrument, the first of its kind to measure occupational dysfunction in context for people with substance addictions. Copyright © 2015 by the American Occupational Therapy Association, Inc.

  12. Polynomial elimination theory and non-linear stability analysis for the Euler equations

    NASA Technical Reports Server (NTRS)

    Kennon, S. R.; Dulikravich, G. S.; Jespersen, D. C.

    1986-01-01

    Numerical methods are presented that exploit the polynomial properties of discretizations of the Euler equations. It is noted that most finite difference or finite volume discretizations of the steady-state Euler equations produce a polynomial system of equations to be solved. These equations are solved using classical polynomial elimination theory, with some innovative modifications. This paper also presents some preliminary results of a new non-linear stability analysis technique. This technique is applicable to determining the stability of polynomial iterative schemes. Results are presented for applying the elimination technique to a one-dimensional test case. For this test case, the exact solution is computed in three iterations. The non-linear stability analysis is applied to determine the optimal time step for solving Burgers' equation using the MacCormack scheme. The estimated optimal time step is very close to the time step that arises from a linear stability analysis.
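The MacCormack scheme and the linear-stability time-step bound mentioned above can be sketched for inviscid Burgers' equation. This is a minimal periodic-grid version under standard assumptions; the paper's exact discretization and its estimated optimal time step are not reproduced here:

```python
import numpy as np

def maccormack_burgers_step(u, dx, cfl=0.9):
    """One MacCormack predictor-corrector step for inviscid Burgers' equation
    u_t + (u^2/2)_x = 0 on a periodic grid, with the time step chosen from the
    linear stability bound dt <= cfl * dx / max|u|."""
    dt = cfl * dx / np.max(np.abs(u))
    f = 0.5 * u ** 2
    # Predictor: forward difference of the flux.
    up = u - dt / dx * (np.roll(f, -1) - f)
    fp = 0.5 * up ** 2
    # Corrector: backward difference of the predicted flux, averaged in.
    return 0.5 * (u + up - dt / dx * (fp - np.roll(fp, 1))), dt

x = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
u = 1.5 + np.sin(x)          # smooth initial profile, max|u| = 2.5
u1, dt = maccormack_burgers_step(u, x[1] - x[0])
```

Because the scheme is in conservation form on a periodic grid, the mean of `u` is preserved exactly, which is a quick sanity check on the implementation.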

  13. Decision Processes in Discrimination: Fundamental Misrepresentations of Signal Detection Theory

    NASA Technical Reports Server (NTRS)

    Balakrishnan, J. D.

    1998-01-01

    In the first part of this article, I describe a new approach to studying decision making in discrimination tasks that does not depend on the technical assumptions of signal detection theory (e.g., normality of the encoding distributions). Applying these new distribution-free tests to data from three experiments, I show that base rate and payoff manipulations had substantial effects on the participants' encoding distributions but no effect on their decision rules, which were uniformly unbiased in equal and unequal base rate conditions and in symmetric and asymmetric payoff conditions. In the second part of the article, I show that this seemingly paradoxical result is readily explained by the sequential sampling models of discrimination. I then propose a new, "model-free" test for response bias that seems to more properly identify both the nature and direction of the biases induced by the classical bias manipulations.
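The classical signal detection indices against which such distribution-free tests are contrasted are simple to compute. These are the standard equal-variance definitions, not the article's new tests:

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Equal-variance signal detection indices:
    d' = z(H) - z(F) measures sensitivity;
    c  = -(z(H) + z(F)) / 2 measures response bias
    (c = 0 corresponds to an unbiased decision rule)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate), -(z(hit_rate) + z(fa_rate)) / 2.0

# Symmetric hit and false-alarm rates imply an unbiased criterion.
d, c = dprime_and_criterion(0.84, 0.16)
```

Note that these indices presuppose normal encoding distributions, which is exactly the technical assumption the article's distribution-free approach avoids.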

  14. Development and validation of a patient-reported outcome measure for stroke patients.

    PubMed

    Luo, Yanhong; Yang, Jie; Zhang, Yanbo

    2015-05-08

    Family support and patient satisfaction with treatment are crucial for aiding in the recovery from stroke. However, current validated stroke-specific questionnaires may not adequately capture the impact of these two variables on patients undergoing clinical trials of new drugs. Therefore, the aim of this study was to develop and evaluate a new stroke patient-reported outcome measure (Stroke-PROM) instrument for capturing more comprehensive effects of stroke on patients participating in clinical trials of new drugs. A conceptual framework and a pool of items for the preliminary Stroke-PROM were generated by consulting the relevant literature and other questionnaires created in China and other countries, and interviewing 20 patients and 4 experts to ensure that all germane parameters were included. During the first item-selection phase, classical test theory and item response theory were applied to an initial scale completed by 133 patients with stroke. During the item-revaluation phase, classical test theory and item response theory were used again, this time with 475 patients with stroke and 104 healthy participants. During the scale assessment phase, confirmatory factor analysis was applied to the final scale of the Stroke-PROM using the same study population as in the second item-selection phase. Reliability, validity, responsiveness and feasibility of the final scale were tested. The final scale of Stroke-PROM contained 46 items describing four domains (physiology, psychology, society and treatment). These four domains were subdivided into 10 subdomains. Cronbach's α coefficients for the four domains ranged from 0.861 to 0.908. Confirmatory factor analysis supported the validity of the final scale, and the model fit index satisfied the criterion. Differences in the Stroke-PROM mean scores were significant between patients with stroke and healthy participants in nine subdomains (P < 0.001), indicating that the scale showed good responsiveness. 
The Stroke-PROM is a patient-reported outcome multidimensional questionnaire developed especially for clinical trials of new drugs and is focused on issues of family support and patient satisfaction with treatment. Extensive data analyses supported the validity, reliability and responsiveness of the Stroke-PROM.

  15. NP-hardness of decoding quantum error-correction codes

    NASA Astrophysics Data System (ADS)

    Hsieh, Min-Hsiu; Le Gall, François

    2011-05-01

    Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as for their classical counterparts. Instead, decoding QECCs can be very different from decoding classical codes due to the degeneracy property. Intuitively, one expects degeneracy to simplify the decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no considerably fast decoding algorithm exists for the general quantum decoding problem, and it suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.

  16. On quantum effects in a theory of biological evolution.

    PubMed

    Martin-Delgado, M A

    2012-01-01

    We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.

  17. On Quantum Effects in a Theory of Biological Evolution

    PubMed Central

    Martin-Delgado, M. A.

    2012-01-01

    We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable. PMID:22413059

  18. Synthesis of active controls for flutter suppression on a flight research wing

    NASA Technical Reports Server (NTRS)

    Abel, I.; Perry, B., III; Murrow, H. N.

    1977-01-01

    This paper describes some activities associated with the preliminary design of an active control system for flutter suppression capable of demonstrating a 20% increase in flutter velocity. Results from two control system synthesis techniques are given. One technique uses classical control theory, and the other uses an 'aerodynamic energy method' where control surface rates or displacements are minimized. Analytical methods used to synthesize the control systems and evaluate their performance are described. Some aspects of a program for flight testing the active control system are also given. This program, called DAST (Drones for Aerodynamics and Structural Testing), employs modified drone-type vehicles for flight assessments and validation testing.

  19. Quasinormal modes of scale dependent black holes in (1 +2 )-dimensional Einstein-power-Maxwell theory

    NASA Astrophysics Data System (ADS)

    Rincón, Ángel; Panotopoulos, Grigoris

    2018-01-01

    We study for the first time the stability against scalar perturbations, and we compute the spectrum of quasinormal modes of three-dimensional charged black holes in Einstein-power-Maxwell nonlinear electrodynamics assuming running couplings. Adopting the sixth order Wentzel-Kramers-Brillouin (WKB) approximation we investigate how the running of the couplings changes the spectrum of the classical theory. Our results show that all modes corresponding to nonvanishing angular momentum are unstable both in the classical theory and with the running of the couplings, while the fundamental mode can be stable or unstable depending on the running parameter and the electric charge.

  20. Constraints on primordial magnetic fields from inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Daniel; Kobayashi, Takeshi, E-mail: drgreen@cita.utoronto.ca, E-mail: takeshi.kobayashi@sissa.it

    2016-03-01

    We present generic bounds on magnetic fields produced from cosmic inflation. By investigating field bounds on the vector potential, we constrain both the quantum mechanical production of magnetic fields and their classical growth in a model independent way. For classical growth, we show that magnetic fields of 10^−15 G can be produced on Mpc scales in the present universe only if the reheating temperature is as low as T_reh ≲ 10^2 MeV. For purely quantum mechanical scenarios, even stronger constraints are derived. Our bounds on classical and quantum mechanical scenarios apply to generic theories of inflationary magnetogenesis with a two-derivative time kinetic term for the vector potential. In both cases, the magnetic field strength is limited by the gravitational back-reaction of the electric fields that are produced simultaneously. As an example of quantum mechanical scenarios, we construct vector field theories whose time diffeomorphisms are spontaneously broken, and explore magnetic field generation in theories with a variable speed of light. Transitions of quantum vector field fluctuations into classical fluctuations are also analyzed in the examples.

  1. Quantum Matching Theory (with new complexity-theoretic, combinatorial and topological insights on the nature of the quantum entanglement)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gurvits, L.

    2002-01-01

    Classical matching theory can be defined in terms of matrices with nonnegative entries. The notion of a positive operator, central in quantum theory, is a natural generalization of matrices with nonnegative entries. Based on this point of view, we introduce a definition of perfect quantum (operator) matching. We show that the new notion inherits many 'classical' properties, but not all of them. This new notion goes somewhat beyond matroids. For separable bipartite quantum states this new notion coincides with the full rank property of the intersection of two corresponding geometric matroids. In the classical situation, permanents are naturally associated with perfect matchings. We introduce an analog of permanents for positive operators, called the Quantum Permanent, and show how this generalization of the permanent is related to quantum entanglement. Besides many other things, Quantum Permanents provide new rational inequalities necessary for the separability of bipartite quantum states. Using Quantum Permanents, we give a deterministic poly-time algorithm to solve the Hidden Matroids Intersection Problem and indicate some 'classical' complexity difficulties associated with quantum entanglement. Finally, we prove that the weak membership problem for the convex set of separable bipartite density matrices is NP-HARD.

  2. Ghost imaging of phase objects with classical incoherent light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirai, Tomohiro; Setaelae, Tero; Friberg, Ari T.

    2011-10-15

    We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.

  3. Quantum-mechanical machinery for rational decision-making in classical guessing game

    NASA Astrophysics Data System (ADS)

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung

    2016-02-01

    In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (we call ‘reasoning’) to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that quantum reasoning may make the player fail, and even make the situation worse, given inappropriate preferences.

  4. Quantum-mechanical machinery for rational decision-making in classical guessing game

    PubMed Central

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung

    2016-01-01

    In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (we call ‘reasoning’) to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that quantum reasoning may make the player fail, and even make the situation worse, given inappropriate preferences. PMID:26875685

  5. Quantum-mechanical machinery for rational decision-making in classical guessing game.

    PubMed

    Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S; Lee, Jinhyoung

    2016-02-15

    In quantum game theory, one of the most intriguing and important questions is, "Is it possible to get quantum advantages without any modification of the classical game?" The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (we call 'reasoning') to generate the best strategy, which may occur internally, e.g., in the player's brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In such settings, we present a scenario where a rational player is able to make better use of his/her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that quantum reasoning may make the player fail, and even make the situation worse, given inappropriate preferences.

  6. Toda theories as contractions of affine Toda theories

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, A.; Khorrami, M.; Shariati, A.

    1996-02-01

    Using a contraction procedure, we obtain Toda theories and their structures, from affine Toda theories and their corresponding structures. By structures, we mean the equation of motion, the classical Lax pair, the boundary term for half line theories, and the quantum transfer matrix. The Lax pair and the transfer matrix so obtained, depend nontrivially on the spectral parameter.

  7. An Approach to Biased Item Identification Using Latent Trait Measurement Theory.

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.

    Because it is a true score model employing item parameters which are independent of the examined sample, item characteristic curve theory (ICC) offers several advantages over classical measurement theory. In this paper an approach to biased item identification using ICC theory is described and applied. The ICC theory approach is attractive in that…

  8. Quantum-like model of processing of information in the brain based on classical electromagnetic field.

    PubMed

    Khrennikov, Andrei

    2011-09-01

    We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of the cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by joint activity of neurons. This novel approach to quantum information is based on representation of quantum mechanics as a version of classical signal theory which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. Two processes, classical and QL, are performed parallely. Moreover, information is actively transmitted from one representation to another. A QL concept given in our model by a density operator can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has the covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays the crucial role in creation of QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains in the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Testing quantum gravity

    NASA Astrophysics Data System (ADS)

    Hansson, Johan; Francois, Stephane

    The search for a theory of quantum gravity is the most fundamental problem in all of theoretical physics, but there are as yet no experimental results at all to guide this endeavor. What seems to be needed is a pragmatic way to test if gravitation really occurs between quantum objects or not. In this paper, we suggest such a potential way out of this deadlock, utilizing macroscopic quantum systems: superfluid helium, gaseous Bose-Einstein condensates and “macroscopic” molecules. It turns out that true quantum gravity effects — here defined as observable gravitational interactions between truly quantum objects — could and should be seen (if they occur in nature) using existing technology. A falsification of the low-energy limit in the accessible weak-field regime would also falsify the full theory of quantum gravity, making it enter the realm of testable, potentially falsifiable theories, i.e. becoming real physics after almost a century of pure theorizing. If weak-field gravity between quantum objects is shown to be absent (in the regime where the approximation should apply), we know that gravity then is a strictly classical phenomenon absent at the quantum level.

  10. A Classical Based Derivation of Time Dilation Providing First Order Accuracy to Schwarzschild's Solution of Einstein's Field Equations

    NASA Astrophysics Data System (ADS)

    Austin, Rickey W.

    In Einstein's theory of Special Relativity (SR), one method to derive relativistic kinetic energy is to apply the classical work-energy theorem to relativistic momentum. This approach starts with a classical work-energy theorem and applies SR's momentum to the derivation. One outcome of this derivation is relativistic kinetic energy. From this derivation, it is straightforward to form a kinetic-energy-based time dilation function. In the derivation of General Relativity, a common approach is to bypass classical laws as a starting point. Instead, a rigorous development of differential geometry and Riemannian space is constructed, from which classically based laws are derived. This is in contrast to SR's approach of starting with classical laws and applying the consequence that the speed of light is the same for all observers. A possible method to derive time dilation due to Newtonian gravitational potential energy (NGPE) is to apply SR's approach to deriving relativistic kinetic energy. It will be shown this method gives first-order accuracy compared to Schwarzschild's metric. The SR kinetic energy and the newly derived NGPE are combined to form a Riemannian metric based on these two energies. A geodesic is derived and calculations are compared to Schwarzschild's geodesic for a test mass orbiting a central, non-rotating, non-charged massive body. The new metric results in high-accuracy calculations when compared to General Relativity's prediction. The new method provides a candidate approach for starting with classical laws and deriving General Relativity effects. This approach mimics SR's method of starting with classical mechanics when deriving relativistic equations. As a complement to introducing General Relativity, it provides a plausible scaffolding method from classical physics when teaching introductory General Relativity. A straightforward path from classical laws to General Relativity will be derived.
This derivation provides a minimum first order accuracy to Schwarzschild's solution to Einstein's field equations.
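A hedged sketch of the first-order agreement the abstract describes, using standard symbols rather than the paper's own notation:

```latex
% SR time dilation, obtainable from the work-energy route to relativistic KE:
\frac{d\tau}{dt} = \sqrt{1 - \frac{v^2}{c^2}} \approx 1 - \frac{v^2}{2c^2}
% Applying the same route to the Newtonian potential \Phi = -GM/r suggests
\frac{d\tau}{dt} \approx 1 + \frac{\Phi}{c^2} = 1 - \frac{GM}{rc^2},
% which matches the Schwarzschild factor to first order in GM/(rc^2):
\sqrt{1 - \frac{2GM}{rc^2}} \approx 1 - \frac{GM}{rc^2}.
```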

  11. Transfer function modeling of damping mechanisms in viscoelastic plates

    NASA Technical Reports Server (NTRS)

    Slater, J. C.; Inman, D. J.

    1991-01-01

    This work formulates a method for modeling material damping characteristics in plates. The Sophie Germain equation of classical plate theory is modified to incorporate hysteresis effects represented by complex stiffness using the transfer function approach proposed by Golla and Hughes (1985). However, this procedure is not limited to this representation. The governing characteristic equation is decoupled through separation of variables, yielding a solution similar to that of undamped classical plate theory, allowing solution of the steady-state as well as the transient response problem.

  12. Color Memory: A Yang-Mills Analog of Gravitational Wave Memory.

    PubMed

    Pate, Monica; Raclariu, Ana-Maria; Strominger, Andrew

    2017-12-29

    A transient color flux across null infinity in classical Yang-Mills theory is considered. It is shown that a pair of test "quarks" initially in a color singlet generically acquire net color as a result of the flux. A nonlinear formula is derived for the relative color rotation of the quarks. For a weak color flux, the formula linearizes to the Fourier transform of the soft gluon theorem. This color memory effect is the Yang-Mills analog of the gravitational memory effect.

  13. Color Memory: A Yang-Mills Analog of Gravitational Wave Memory

    NASA Astrophysics Data System (ADS)

    Pate, Monica; Raclariu, Ana-Maria; Strominger, Andrew

    2017-12-01

    A transient color flux across null infinity in classical Yang-Mills theory is considered. It is shown that a pair of test "quarks" initially in a color singlet generically acquire net color as a result of the flux. A nonlinear formula is derived for the relative color rotation of the quarks. For a weak color flux, the formula linearizes to the Fourier transform of the soft gluon theorem. This color memory effect is the Yang-Mills analog of the gravitational memory effect.

  14. An overview of coefficient alpha and a reliability matrix for estimating adequacy of internal consistency coefficients with psychological research measures.

    PubMed

    Ponterotto, Joseph G; Ruckdeschel, Daniel E

    2007-12-01

    The present article addresses issues in reliability assessment that are often neglected in psychological research such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (alpha), and considerations for interpreting alpha within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge adequacy of internal consistency coefficients with research measures. Guidelines and cautions in applying the matrix are provided.
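Coefficient alpha itself is straightforward to compute from an items-by-persons score matrix. A minimal sketch with hypothetical data (the article's reliability matrix and adequacy thresholds are not reproduced):

```python
import statistics as st

def cronbach_alpha(items):
    """Cronbach's coefficient alpha from classical test theory.
    items[i][p] is person p's score on item i;
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[p] for item in items) for p in range(n)]
    item_var = sum(st.variance(item) for item in items)
    return k / (k - 1) * (1.0 - item_var / st.variance(totals))

# Three positively correlated items (hypothetical data) yield a high alpha.
alpha = cronbach_alpha([[2, 3, 4, 5, 4], [1, 3, 4, 4, 5], [2, 2, 5, 4, 4]])
```

As the article stresses, what counts as an "acceptable" value of alpha depends on the research context, scale length, and sample, not on the formula alone.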

  15. Catastrophe optics of sharp-edge diffraction.

    PubMed

    Borghi, Riccardo

    2016-07-01

    A classical problem of diffraction theory, namely plane-wave diffraction by sharp-edge apertures, is here reformulated from the viewpoint of the fairly new subject of catastrophe optics. On using purely geometrical arguments, properly embedded into a wave-optics context, uniform analytical estimates of the diffracted wavefield at points close to fold caustics are obtained, within the paraxial approximation, in terms of the Airy function and its first derivative. Diffraction from parabolic apertures is proposed as a test of the reliability and accuracy of our theoretical predictions.

  16. Static Strength Characteristics of Mechanically Fastened Composite Joints

    NASA Technical Reports Server (NTRS)

    Fox, D. E.; Swaim, K. W.

    1999-01-01

    The analysis of mechanically fastened composite joints presents a great challenge to structural analysts because of the large number of parameters that influence strength. These parameters include edge distance, width, bolt diameter, laminate thickness, ply orientation, and bolt torque. The research presented in this report investigates the influence of some of these parameters through testing and analysis. A methodology is presented for estimating the strength of the bolt-hole based on classical lamination theory, using the Tsai-Hill failure criterion and typical bolt-hole bearing analysis methods.
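
    For a single orthotropic ply under plane stress, the Tsai-Hill criterion named above reduces to a quadratic failure index: (s1/X)^2 - s1*s2/X^2 + (s2/Y)^2 + (t12/S)^2, with failure predicted at values of 1 or more. A minimal sketch, using hypothetical carbon/epoxy-like strengths rather than the report's test data:

```python
# Tsai-Hill failure index for one ply under plane stress. All numbers below
# are hypothetical illustration values, not data from the report.

def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Return the Tsai-Hill failure index; failure is predicted when >= 1.
    s1, s2: normal stresses along / transverse to the fibres (MPa),
    t12: in-plane shear stress (MPa); X, Y, S: corresponding strengths."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# Hypothetical ply strengths (MPa) and a candidate bearing-stress state.
X, Y, S = 1500.0, 40.0, 70.0
fi = tsai_hill_index(s1=600.0, s2=20.0, t12=30.0, X=X, Y=Y, S=S)
print(round(fi, 3), "fails" if fi >= 1.0 else "safe")
```

    In a bolt-hole bearing analysis this check would be repeated ply by ply around the hole boundary, with the stresses supplied by classical lamination theory.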

  17. Field Extension of Real Values of Physical Observables in Classical Theory can Help Attain Quantum Results

    NASA Astrophysics Data System (ADS)

    Wang, Hai; Kumar, Asutosh; Cho, Minhyung; Wu, Junde

    2018-04-01

    Physical quantities are assumed to take real values, which stems from the fact that a typical measuring instrument that measures a physical observable always yields a real number. Here we consider the question of what would happen if physical observables were allowed to assume complex values. In this paper, we show that by allowing observables in the Bell inequality to take complex values, a classical physical theory can actually attain the same upper bound on the Bell expression as quantum theory. Also, by extending the real field to the quaternionic field, we can resolve the GHZ problem using a local hidden-variable model. Furthermore, we try to build a new type of hidden-variable theory of a single qubit based on this result.
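
    The two bounds the paper contrasts can be checked directly: over all deterministic local strategies with real +/-1 outcomes, the CHSH form of the Bell expression never exceeds 2, while quantum theory attains 2*sqrt(2) (the Tsirelson bound). The sketch below verifies the classical bound by brute force; it does not reproduce the paper's complex-valued construction.

```python
# Brute-force check of the CHSH bound over deterministic local hidden-variable
# strategies, contrasted with the quantum (Tsirelson) bound 2*sqrt(2).
import itertools
import math

def chsh(a0, a1, b0, b1):
    """CHSH combination S = E00 + E01 + E10 - E11 for deterministic
    outcomes a_x, b_y in {-1, +1}."""
    return a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1

# Enumerate all 16 deterministic local strategies.
classical_max = max(
    chsh(a0, a1, b0, b1)
    for a0, a1, b0, b1 in itertools.product((-1, 1), repeat=4)
)
quantum_max = 2 * math.sqrt(2)  # Tsirelson's bound
print(classical_max, round(quantum_max, 3))
```

    Since any local hidden-variable model is a mixture of these deterministic strategies, 2 bounds the whole classical set; the paper's point is that relaxing the real-valuedness of the observables lets a classical theory reach the quantum value.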

  18. From integrability to conformal symmetry: Bosonic superconformal Toda theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo-Yu Hou; Liu Chao

    In this paper the authors study the conformal integrable models obtained from conformal reductions of WZNW theory associated with second order constraints. These models are called bosonic superconformal Toda models due to their conformal spectra and their resemblance to the usual Toda theories. From the reduction procedure they get the equations of motion and the linearized Lax equations in a generic Z gradation of the underlying Lie algebra. Then, in the special case of principal gradation, they derive the classical r matrix, fundamental Poisson relation, exchange algebra of chiral operators and find out the classical vertex operators. The result shows that their model is very similar to the ordinary Toda theories in that one can obtain various conformal properties of the model from its integrability.

  19. Sound and Vision: Using Progressive Rock To Teach Social Theory.

    ERIC Educational Resources Information Center

    Ahlkvist, Jarl A.

    2001-01-01

    Describes a teaching technique that utilizes progressive rock music to educate students about sociological theories in introductory sociology courses. Discusses the use of music when teaching about classical social theory and offers an evaluation of this teaching strategy. Includes references. (CMK)

  20. Evolution of diffusion and dissemination theory.

    PubMed

    Dearing, James W

    2008-01-01

    This article reviews how the diffusion of innovations research paradigm has changed and offers suggestions for the further development of this theory of social change. Main emphases of diffusion research studies are compared over time, with special attention to applications of diffusion theory-based concepts as types of dissemination science. A considerable degree of paradigmatic evolution is observed. The classical diffusion model focused on adopter innovativeness, individuals as the locus of decision, communication channels, and adoption as the primary outcome measures in post hoc observational study designs. The diffusion systems in question were centralized, with fidelity of implementation often assumed. Current dissemination research and practice is better characterized by tests of interventions that operationalize one or more diffusion theory-based concepts and concepts from other change approaches, involve complex organizations as the units of adoption, and focus on implementation issues. Foment characterizes dissemination and implementation research, reflecting both its interdisciplinary roots and the imperative of spreading evidence-based innovations as a basis for a new paradigm of translational studies of dissemination science.

  1. DNA Methylation and Sex Allocation in the Parasitoid Wasp Nasonia vitripennis.

    PubMed

    Cook, Nicola; Pannebakker, Bart A; Tauber, Eran; Shuker, David M

    2015-10-01

    The role of epigenetics in the control and evolution of behavior is being increasingly recognized. Here we test whether DNA methylation influences patterns of adaptive sex allocation in the parasitoid wasp Nasonia vitripennis. Female N. vitripennis allocate offspring sex broadly in line with local mate competition (LMC) theory. However, recent theory has highlighted how genomic conflict may influence sex allocation under LMC, conflict that requires parent-of-origin information to be retained by alleles through some form of epigenetic signal. We manipulated whole-genome DNA methylation in N. vitripennis females using the hypomethylating agent 5-aza-2'-deoxycytidine. Across two replicated experiments, we show that disruption of DNA methylation does not ablate the facultative sex allocation response of females, as sex ratios still vary with cofoundress number as in the classical theory. However, sex ratios are generally shifted upward when DNA methylation is disrupted. Our data are consistent with predictions from genomic conflict over sex allocation theory and suggest that sex ratios may be closer to the optimum for maternally inherited alleles.
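
    The "classical theory" the abstract refers to is Hamilton's local mate competition result: with n cofoundresses laying eggs on a patch, the unbeatable proportion of sons is (n - 1) / (2n), rising towards 1/2 as competition among brothers relaxes. A minimal sketch of that classical prediction only (the paper's epigenetic manipulation is experimental, not captured here):

```python
# Hamilton's local mate competition (LMC) sex-ratio prediction: the evolutionarily
# stable proportion of male offspring for n cofoundresses on a patch.

def lmc_sex_ratio(n):
    """Predicted proportion of sons for n cofoundresses (n >= 1).
    n = 1 gives the extreme female bias of a single foundress; large n
    approaches the Fisherian 1:1 ratio."""
    return (n - 1) / (2 * n)

for n in (1, 2, 5, 10):
    print(n, round(lmc_sex_ratio(n), 3))
```

    The experiments in this record test whether females still track this cofoundress-number curve after genome-wide DNA methylation is disrupted; the reported result is that the curve persists but is shifted upward overall.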

  2. Niels Bohr as philosopher of experiment: Does decoherence theory challenge Bohr's doctrine of classical concepts?

    NASA Astrophysics Data System (ADS)

    Camilleri, Kristian; Schlosshauer, Maximilian

    2015-02-01

    Niels Bohr's doctrine of the primacy of "classical concepts" is arguably his most criticized and misunderstood view. We present a new, careful historical analysis that makes clear that Bohr's doctrine was primarily an epistemological thesis, derived from his understanding of the functional role of experiment. A hitherto largely overlooked disagreement between Bohr and Heisenberg about the movability of the "cut" between measuring apparatus and observed quantum system supports the view that, for Bohr, such a cut did not originate in dynamical (ontological) considerations, but rather in functional (epistemological) considerations. As such, both the motivation and the target of Bohr's doctrine of classical concepts are of a fundamentally different nature than what is understood as the dynamical problem of the quantum-to-classical transition. Our analysis suggests that, contrary to claims often found in the literature, Bohr's doctrine is not, and cannot be, at odds with proposed solutions to the dynamical problem of the quantum-classical transition that were pursued by several of Bohr's followers and culminated in the development of decoherence theory.

  3. Bifurcation Analysis of an Electrostatically Actuated Nano-Beam Based on Modified Couple Stress Theory

    NASA Astrophysics Data System (ADS)

    Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman

    2017-12-01

    In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered for the modeling of the deformable electrode of the NEMS. The governing differential equation of motion is derived using the Hamiltonian principle based on modified couple stress theory (MCST), a non-classical theory that accounts for length-scale effects. The nonlinear partial differential equation of motion is discretized into nonlinear Duffing-type ODEs using the Galerkin method. Static and dynamic pull-in instabilities obtained by both the classical theory and MCST are compared. At the second stage of the analysis, a shooting technique is utilized to obtain the frequency response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is determined by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under AC harmonic actuation, together with size-dependency effects, is investigated.

  4. Finite conformal quantum gravity and spacetime singularities

    NASA Astrophysics Data System (ADS)

    Modesto, Leonardo; Rachwał, Lesław

    2017-12-01

    We show that a class of finite quantum non-local gravitational theories is conformally invariant at classical as well as at quantum level. This is actually a range of conformal anomaly-free theories in the spontaneously broken phase of the Weyl symmetry. At classical level we show how the Weyl conformal invariance is able to tame all the spacetime singularities that plague not only Einstein gravity, but also local and weakly non-local higher derivative theories. The latter statement is proved by a singularity theorem that applies to a large class of weakly non-local theories. Therefore, we are entitled to look for a solution of the spacetime singularity puzzle in a missed symmetry of nature, namely the Weyl conformal symmetry. Following the seminal paper by Narlikar and Kembhavi, we provide an explicit construction of singularity-free black hole exact solutions in a class of conformally invariant theories.

  5. Brane Physics in M-theory

    NASA Astrophysics Data System (ADS)

    Argurio, Riccardo

    1998-07-01

    The thesis begins with an introduction to M-theory (at a graduate student's level), starting from perturbative string theory and proceeding to dualities, D-branes and finally Matrix theory. The following chapter treats, in a self-contained way, general classical p-brane solutions. Black and extremal branes are reviewed, along with their semi-classical thermodynamics. We then focus on intersecting extremal branes, the intersection rules being derived both with and without the explicit use of supersymmetry. The last three chapters comprise more advanced aspects of brane physics, such as the dynamics of open branes, the little theories on the world-volume of branes and how the four dimensional Schwarzschild black hole can be mapped to an extremal configuration of branes, thus allowing for a statistical interpretation of its entropy. The original results were already reported in hep-th/9701042, hep-th/9704190, hep-th/9710027 and hep-th/9801053.

  6. Calculating the spontaneous magnetization and defining the Curie temperature using a positive-feedback model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, R. G., E-mail: rgh@doe.carleton.ca

    2014-01-21

    A positive-feedback mean-field modification of the classical Brillouin magnetization theory provides an explanation of the apparent persistence of the spontaneous magnetization beyond the conventional Curie temperature—the little-understood “tail” phenomenon that occurs in many ferromagnetic materials. The classical theory is unable to resolve this apparent anomaly. The modified theory incorporates the temperature-dependent quantum-scale hysteretic and mesoscopic domain-scale anhysteretic magnetization processes and includes the effects of demagnetizing and exchange fields. It is found that the thermal behavior of the reversible and irreversible segments of the hysteresis loops, as predicted by the theory, is a key to the presence or absence of the “tails.” The theory, which permits arbitrary values of the quantum spin number J, generally provides quantitative agreement with the thermal variations of both the spontaneous magnetization and the shape of the hysteresis loop.
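
    The classical theory being modified here can be solved self-consistently: in reduced units the Weiss mean-field magnetization obeys m = B_J(3J/(J+1) * (Tc/T) * m), where B_J is the Brillouin function, and this has a nonzero root only below the Curie temperature — which is exactly why the classical theory cannot produce a "tail" above Tc. A minimal fixed-point sketch of that classical baseline:

```python
# Classical Weiss mean-field magnetization via the Brillouin function.
# Reduced magnetization m(T/Tc) is the fixed point of the self-consistency
# condition; above Tc only m = 0 survives (no "tail" in the classical theory).
import math

def brillouin(J, x):
    """Brillouin function B_J(x) = a*coth(a*x) - b*coth(b*x),
    with a = (2J+1)/(2J), b = 1/(2J)."""
    if x == 0.0:
        return 0.0
    a = (2 * J + 1) / (2 * J)
    b = 1 / (2 * J)
    return a / math.tanh(a * x) - b / math.tanh(b * x)

def magnetization(J, t, m0=0.9, iters=500):
    """Reduced magnetization at reduced temperature t = T / Tc, found by
    fixed-point iteration of m = B_J(3J/(J+1) * m / t)."""
    m = m0
    for _ in range(iters):
        m = brillouin(J, 3 * J / (J + 1) * m / t)
        if abs(m) < 1e-6:   # paramagnetic phase: only the m = 0 solution
            return 0.0
    return m

print(round(magnetization(J=0.5, t=0.5), 3))  # ordered phase, m > 0
print(magnetization(J=0.5, t=1.5))            # above Tc, m = 0
```

    For J = 1/2 the Brillouin function reduces to tanh, recovering the textbook m = tanh(m/t) curve; the paper's positive-feedback modification alters this self-consistency so that a finite magnetization can survive past the conventional Tc.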

  7. Continuum theory of fibrous tissue damage mechanics using bond kinetics: application to cartilage tissue engineering.

    PubMed

    Nims, Robert J; Durney, Krista M; Cigan, Alexander D; Dusséaux, Antoine; Hung, Clark T; Ateshian, Gerard A

    2016-02-06

    This study presents a damage mechanics framework that employs observable state variables to describe damage in isotropic or anisotropic fibrous tissues. In this mixture theory framework, damage is tracked by the mass fraction of bonds that have broken. Anisotropic damage is subsumed in the assumption that multiple bond species may coexist in a material, each having its own damage behaviour. This approach recovers the classical damage mechanics formulation for isotropic materials, but does not appeal to a tensorial damage measure for anisotropic materials. In contrast with the classical approach, the use of observable state variables for damage allows direct comparison of model predictions to experimental damage measures, such as biochemical assays or Raman spectroscopy. Investigations of damage in discrete fibre distributions demonstrate that the resilience to damage increases with the number of fibre bundles; idealizing fibrous tissues using continuous fibre distribution models precludes the modelling of damage. This damage framework was used to test and validate the hypothesis that growth of cartilage constructs can lead to damage of the synthesized collagen matrix due to excessive swelling caused by synthesized glycosaminoglycans. Therefore, alternative strategies must be implemented in tissue engineering studies to prevent collagen damage during the growth process.

  8. Continuum theory of fibrous tissue damage mechanics using bond kinetics: application to cartilage tissue engineering

    PubMed Central

    Nims, Robert J.; Durney, Krista M.; Cigan, Alexander D.; Hung, Clark T.; Ateshian, Gerard A.

    2016-01-01

    This study presents a damage mechanics framework that employs observable state variables to describe damage in isotropic or anisotropic fibrous tissues. In this mixture theory framework, damage is tracked by the mass fraction of bonds that have broken. Anisotropic damage is subsumed in the assumption that multiple bond species may coexist in a material, each having its own damage behaviour. This approach recovers the classical damage mechanics formulation for isotropic materials, but does not appeal to a tensorial damage measure for anisotropic materials. In contrast with the classical approach, the use of observable state variables for damage allows direct comparison of model predictions to experimental damage measures, such as biochemical assays or Raman spectroscopy. Investigations of damage in discrete fibre distributions demonstrate that the resilience to damage increases with the number of fibre bundles; idealizing fibrous tissues using continuous fibre distribution models precludes the modelling of damage. This damage framework was used to test and validate the hypothesis that growth of cartilage constructs can lead to damage of the synthesized collagen matrix due to excessive swelling caused by synthesized glycosaminoglycans. Therefore, alternative strategies must be implemented in tissue engineering studies to prevent collagen damage during the growth process. PMID:26855751

  9. General Linearized Theory of Quantum Fluctuations around Arbitrary Limit Cycles

    NASA Astrophysics Data System (ADS)

    Navarrete-Benlloch, Carlos; Weiss, Talitha; Walter, Stefan; de Valcárcel, Germán J.

    2017-09-01

    The theory of Gaussian quantum fluctuations around classical steady states in nonlinear quantum-optical systems (also known as standard linearization) is a cornerstone for the analysis of such systems. Its simplicity, together with its accuracy far from critical points or situations where the nonlinearity reaches the strong coupling regime, has turned it into a widespread technique, being the first method of choice in most works on the subject. However, such a technique finds strong practical and conceptual complications when one tries to apply it to situations in which the classical long-time solution is time dependent, a most prominent example being spontaneous limit-cycle formation. Here, we introduce a linearization scheme adapted to such situations, using the driven Van der Pol oscillator as a test bed for the method, which allows us to compare it with full numerical simulations. On a conceptual level, the scheme relies on the connection between the emergence of limit cycles and the spontaneous breaking of the symmetry under temporal translations. On the practical side, the method keeps the simplicity and linear scaling with the size of the problem (number of modes) characteristic of standard linearization, making it applicable to large (many-body) systems.
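
    The classical side of the paper's test bed is easy to reproduce: the Van der Pol oscillator x'' - mu*(1 - x^2)*x' + x = 0 settles onto the same stable limit cycle (amplitude near 2 for weak nonlinearity) regardless of initial condition, which is precisely the time-dependent long-time solution that defeats standard linearization. A minimal classical integration sketch; the paper's quantum linearization scheme is not reproduced here.

```python
# Van der Pol oscillator integrated with classical fourth-order Runge-Kutta,
# illustrating spontaneous limit-cycle formation from a small perturbation.

def vdp_rhs(state, mu):
    x, v = state
    return (v, mu * (1 - x * x) * v - x)

def rk4_step(state, mu, dt):
    # One classical RK4 step for the two-component state (x, v).
    k1 = vdp_rhs(state, mu)
    k2 = vdp_rhs((state[0] + dt / 2 * k1[0], state[1] + dt / 2 * k1[1]), mu)
    k3 = vdp_rhs((state[0] + dt / 2 * k2[0], state[1] + dt / 2 * k2[1]), mu)
    k4 = vdp_rhs((state[0] + dt * k3[0], state[1] + dt * k3[1]), mu)
    return (state[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]),
            state[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]))

def limit_cycle_amplitude(mu=0.2, x0=0.1, dt=0.01, t_total=200.0):
    """Integrate past the transient, then record max |x| over the final stretch."""
    state = (x0, 0.0)
    steps = int(t_total / dt)
    amp = 0.0
    for i in range(steps):
        state = rk4_step(state, mu, dt)
        if i > steps // 2:   # discard the transient half of the run
            amp = max(amp, abs(state[0]))
    return amp

amp = limit_cycle_amplitude()
print(round(amp, 2))
```

    Because every trajectory forgets its initial amplitude, fluctuations along the cycle (the phase direction) are not pulled back — the broken time-translation symmetry that the paper's adapted linearization scheme is built around.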

  10. Recent developments in bimetric theory

    NASA Astrophysics Data System (ADS)

    Schmidt-May, Angnis; von Strauss, Mikael

    2016-05-01

    This review is dedicated to recent progress in the field of classical, interacting, massive spin-2 theories, with a focus on ghost-free bimetric theory. We will outline its history and its development as a nontrivial extension and generalisation of nonlinear massive gravity. We present a detailed discussion of the consistency proofs of both theories, before we review Einstein solutions to the bimetric equations of motion in vacuum as well as the resulting mass spectrum. We introduce couplings to matter and then discuss the general relativity and massive gravity limits of bimetric theory, which correspond to decoupling the massive or the massless spin-2 field from the matter sector, respectively. More general classical solutions are reviewed and the present status of bimetric cosmology is summarised. An interesting corner in the bimetric parameter space which could potentially give rise to a nonlinear theory for partially massless spin-2 fields is also discussed. Relations to higher-curvature theories of gravity are explained and finally we give an overview of possible extensions of the theory and review its formulation in terms of vielbeins.

  11. Classical Aerodynamic Theory

    NASA Technical Reports Server (NTRS)

    Jones, R. T. (Compiler)

    1979-01-01

    A collection of papers on modern theoretical aerodynamics is presented. Included are theories of incompressible potential flow and research on the aerodynamic forces on wing and wing sections of aircraft and on airship hulls.

  12. Rational noncompliance with prescribed medical treatment.

    PubMed

    Stewart, Douglas O; DeMarco, Joseph P

    2010-09-01

    Despite the attention that patient noncompliance has received from medical researchers, patient noncompliance remains poorly understood and difficult to alter. With a better theory of patient noncompliance, both greater success in achieving compliance and greater respect for patient decision making are likely. The theory presented, which uses a microeconomic approach, bridges a gap in the extant literature that has so far ignored the contributions of this classic perspective on decision making involving the tradeoff of costs and benefits. The model also generates a surprising conclusion: that patients are typically acting rationally when they refuse to comply with certain treatments. However, compliance is predicted to rise with increased benefits and reduced costs. The prediction that noncompliance is rational is especially true in chronic conditions at the point that treatment begins to move closer to the medically ideal treatment level. Although the details of this theory have not been tested empirically, it is well supported by existing prospective and retrospective studies.

  13. The effects of moral judgment and moral identity on moral behavior: an empirical examination of the moral individual.

    PubMed

    Reynolds, Scott J; Ceranic, Tara L

    2007-11-01

    Recognizing limitations in classic cognitive moral development theory, several scholars have drawn from theories of identity to suggest that moral behavior results from both moral judgments and moral identity. The authors conducted 2 survey-based studies with more than 500 students and managers to test this argument. Results demonstrated that moral identity and moral judgments both independently influenced moral behavior. In addition, in situations in which social consensus regarding the moral behavior was not high, moral judgments and moral identity interacted to shape moral behavior. This interaction effect indicated that those who viewed themselves as moral individuals pursued the most extreme alternatives (e.g., never cheating, regularly cheating)--a finding that affirms the motivational power of a moral identity. The authors conclude by considering the implications of this research for both theory and practice. (c) 2007 APA

  14. Self-organizing map models of language acquisition

    PubMed Central

    Li, Ping; Zhao, Xiaowei

    2013-01-01

    Connectionist models have had a profound impact on theories of language. While most early models were inspired by the classic parallel distributed processing architecture, recent models of language have explored various other types of models, including self-organizing models for language acquisition. In this paper, we aim to provide a review of the latter type of models and highlight a number of simulation experiments that we have conducted based on these models. We show that self-organizing connectionist models can provide significant insights into long-standing debates in both monolingual and bilingual language development. We suggest future directions in which these models can be extended, to better connect with behavioral and neural data, and to make clear predictions in testing relevant psycholinguistic theories. PMID:24312061

  15. The origin of the moon and the single-impact hypothesis. I

    NASA Technical Reports Server (NTRS)

    Benz, W.; Slattery, W. L.; Cameron, A. G. W.

    1986-01-01

    One of the newer ideas regarding the origin of the moon is concerned with a single-impact hypothesis. It is pointed out that this theory has the advantage of overcoming most of the difficulties with the classical theories. The angular momentum of the earth-moon system can easily be obtained by varying the initial conditions of the impact. A series of three-dimensional numerical simulations of the collision between the earth and an object of about 1/10 its mass is presented. Different impact velocities, impact parameters, and initial internal energies are considered. Attention is given to assumptions, the equation of state, numerical techniques utilizing the momentum equation and the energy conservation equation, tests, and initial conditions and units.

  16. Chiral vacuum fluctuations in quantum gravity.

    PubMed

    Magueijo, João; Benincasa, Dionigi M T

    2011-03-25

    We examine tensor perturbations around a de Sitter background within the framework of Ashtekar's variables and its cousins parameterized by the Immirzi parameter γ. At the classical level we recover standard cosmological perturbation theory, with illuminating insights. Quantization leads to real novelties. In the low energy limit we find a second quantized theory of gravitons which displays different vacuum fluctuations for right and left gravitons. Nonetheless right and left gravitons have the same (positive) energies, resolving a number of paradoxes suggested in the literature. The right-left asymmetry of the vacuum fluctuations depends on γ and the ordering of the Hamiltonian constraint, and it would leave a distinctive imprint in the polarization of the cosmic microwave background, thus opening quantum gravity to observational test.

  17. Contextual Advantage for State Discrimination

    NASA Astrophysics Data System (ADS)

    Schmid, David; Spekkens, Robert W.

    2018-02-01

    Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.

  18. Scalar gravitational waves in the effective theory of gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mottola, Emil

    As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux, which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.

  19. Classical continuum theory limits to determine the size-dependency of mechanical properties of GaN NWs

    NASA Astrophysics Data System (ADS)

    Zamani Kouhpanji, Mohammad Reza; Behzadirad, Mahmoud; Busani, Tito

    2017-12-01

    We used the stable strain gradient theory including acceleration gradients to investigate the classical and nonclassical mechanical properties of gallium nitride (GaN) nanowires (NWs). We predicted the static length scales, Young's modulus, and shear modulus of the GaN NWs from the experimental data. Combining these results with atomic simulations, we also found the dynamic length scale of the GaN NWs. Young's modulus, shear modulus, and the static and dynamic length scales were found to be 318 GPa, 131 GPa, 8 nm, and 8.9 nm, respectively, usable for describing the static and dynamic behavior of GaN NWs with diameters from a few nm up to bulk dimensions. Furthermore, the experimental data were analyzed with classical continuum theory (CCT) and compared with the available literature to illustrate the size-dependency of the mechanical properties of GaN NWs. This practice resolves previously published discrepancies that arose from the limitations of CCT in determining the mechanical properties of GaN NWs and their size-dependency.

  20. Generalized quantum Fokker-Planck, diffusion, and Smoluchowski equations with true probability distribution functions.

    PubMed

    Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar

    2002-05-01

    Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).

  1. Scalar gravitational waves in the effective theory of gravity

    DOE PAGES

    Mottola, Emil

    2017-07-10

    As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux, which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.

  2. The evolution of galaxies. III - Metal-enhanced star formation

    NASA Technical Reports Server (NTRS)

    Talbot, R. J., Jr.; Arnett, W. D.

    1973-01-01

    The problem of the paucity of low-metal-abundance low-mass stars is discussed. One alternative to the variable-initial-mass-function (VIMF) solution is proposed. It is shown that this solution - metal-enhanced star formation - satisfies the classical test which prompted the VIMF hypothesis. Furthermore, with no additional parameters it provides improved fits to other tests - e.g., inhomogeneities in the abundances in young stars, concordance of all nucleo-cosmochronologies, and a required yield of heavy-element production which is consistent with current stellar evolution theory. In this model the age of the Galaxy is 18.6 plus or minus 5.7 b.y.

  3. An appraisal of the classic forest succession paradigm with the shade tolerance index

    Treesearch

    Jean Lienard; Ionut Florescu; Nikolay Strigul

    2015-01-01

    We revisit the classic theory of forest succession that relates shade tolerance and species replacement and assess its validity to understand patch-mosaic patterns of forested ecosystems of the USA. We introduce a macroscopic parameter called the “shade tolerance index” and compare it to the classic continuum index in southern Wisconsin forests. We exemplify shade...

  4. Noncommutative gauge theory for Poisson manifolds

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Schupp, Peter; Wess, Julius

    2000-09-01

    A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem.

  5. The Foundations of Einstein's Theory of Gravitation

    NASA Astrophysics Data System (ADS)

    Freundlich, Erwin; Brose, Translated by Henry L.; Einstein, Preface by Albert; Turner, Introduction by H. H.

    2011-06-01

    Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.

  6. Historical Scientific Models and Theories as Resources for Learning and Teaching: The Case of Friction

    ERIC Educational Resources Information Center

    Besson, Ugo

    2013-01-01

    This paper presents a history of research and theories on sliding friction between solids. This history is divided into four phases: from Leonardo da Vinci to Coulomb and the establishment of classical laws of friction; the theories of lubrication and Tomlinson's theory of friction (1850-1930); the theories of wear, the Bowden and Tabor's…

  7. Postbuckling response of long thick plates loaded in compression including higher order transverse shearing effects

    NASA Technical Reports Server (NTRS)

    Stein, Manuel; Sydow, P. Daniel; Librescu, Liviu

    1990-01-01

    Buckling and postbuckling results are presented for compression-loaded simply-supported aluminum plates and composite plates with a symmetric lay-up of thin + or - 45 deg plies composed of many layers. Buckling results for aluminum plates of finite length are given for various length-to-width ratios. Asymptotes to the curves based on buckling results give N(sub xcr) for plates of infinite length. Postbuckling results for plates with transverse shearing flexibility are compared to results from classical theory for various width-to-thickness ratios. Characteristic curves indicating the average longitudinal direct stress resultant as a function of the applied displacements are calculated based on four different theories: Classical von Karman theory using the Kirchhoff assumptions, first-order shear deformation theory, higher-order shear deformation theory, and 3-D flexibility theory. Present results indicate that the 3-D flexibility theory gives the lowest buckling loads. The higher-order shear deformation theory has fewer unknowns than the 3-D flexibility theory but does not take into account through-the-thickness effects. The figures presented show that small differences occur in the average longitudinal direct stress resultants from the four theories that are functions of applied end-shortening displacement.
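    For context, the classical (Kirchhoff) baseline against which the shear-deformation theories are compared reduces, for a long simply supported plate in compression, to the textbook formula N_cr = k·π²·D/b², with bending stiffness D = E·t³/(12(1 - ν²)) and k = 4 as the long-plate asymptote. A minimal sketch; the material values below are hypothetical and not taken from the paper:

```python
import math

def classical_buckling_load(E, t, b, nu, k=4.0):
    """Classical (Kirchhoff) buckling load per unit width of a long,
    simply supported plate: N_cr = k * pi^2 * D / b^2, where
    D = E * t^3 / (12 * (1 - nu^2)) is the bending stiffness."""
    D = E * t ** 3 / (12.0 * (1.0 - nu ** 2))
    return k * math.pi ** 2 * D / b ** 2

# Hypothetical aluminum plate: E in Pa, thickness and width in m
N_cr = classical_buckling_load(E=70e9, t=0.002, b=0.25, nu=0.33)
```

    The cubic dependence on thickness is why transverse-shear corrections matter most for thick plates, which is the regime the abstract addresses.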

  8. Cognitive-Behavioral Therapy. Second Edition. Theories of Psychotherapy Series

    ERIC Educational Resources Information Center

    Craske, Michelle G.

    2017-01-01

    In this revised edition of "Cognitive-Behavioral Therapy," Michelle G. Craske discusses the history, theory, and practice of this commonly practiced therapy. Cognitive-behavioral therapy (CBT) originated in the science and theory of classical and instrumental conditioning when cognitive principles were adopted following dissatisfaction…

  9. A Transferrable Belief Model Representation for Physical Security of Nuclear Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Gerts

    This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials as well as more broad application to nuclear non-proliferation automated decision making theory. A review of the fundamental heuristic and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of the violation of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
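    The combination machinery behind the TBM can be illustrated with the unnormalized conjunctive rule for two belief mass functions, in which mass assigned to the empty set quantifies conflict between sources. A minimal sketch with invented sensor reports (the 'secure'/'breached' frame and the mass values are illustrative assumptions, not from the report):

```python
from itertools import product

def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive combination of two mass functions,
    as used in the transferable belief model: mass may land on the
    empty set, measuring the conflict between the two sources."""
    out = {}
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b                       # intersection of focal sets
        out[c] = out.get(c, 0.0) + wa * wb
    return out

# Two hypothetical sensors reporting on a site's state
S = frozenset({'secure'})
B = frozenset({'breached'})
SB = S | B                              # total ignorance
m1 = {S: 0.6, SB: 0.4}                  # sensor 1 leans 'secure'
m2 = {B: 0.7, SB: 0.3}                  # sensor 2 leans 'breached'
m = conjunctive_combine(m1, m2)         # m[frozenset()] is the conflict
```

    A high empty-set mass is exactly the kind of signal a decision layer could use to flag disagreement between a sensor and a human judgment.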

  10. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  11. There is no fitness but fitness, and the lineage is its bearer

    PubMed Central

    2016-01-01

    Inclusive fitness has been the cornerstone of social evolution theory for more than a half-century and has matured as a mathematical theory in the past 20 years. Yet surprisingly for a theory so central to an entire field, some of its connections to evolutionary theory more broadly remain contentious or underappreciated. In this paper, we aim to emphasize the connection between inclusive fitness and modern evolutionary theory through the following fact: inclusive fitness is simply classical Darwinian fitness, averaged over social, environmental and demographic states that members of a gene lineage experience. Therefore, inclusive fitness is neither a generalization of classical fitness, nor does it belong exclusively to the individual. Rather, the lineage perspective emphasizes that evolutionary success is determined by the effect of selection on all biological and environmental contexts that a lineage may experience. We argue that this understanding of inclusive fitness based on gene lineages provides the most illuminating and accurate picture and avoids pitfalls in interpretation and empirical applications of inclusive fitness theory. PMID:26729925

  12. Genetic algorithm based approach to investigate doped metal oxide materials: Application to lanthanide-doped ceria

    NASA Astrophysics Data System (ADS)

    Hooper, James; Ismail, Arif; Giorgi, Javier B.; Woo, Tom K.

    2010-06-01

    A genetic algorithm (GA)-inspired method to effectively map out low-energy configurations of doped metal oxide materials is presented. Specialized mating and mutation operations that do not alter the identity of the parent metal oxide have been incorporated to efficiently sample the metal dopant and oxygen vacancy sites. The search algorithms have been tested on lanthanide-doped ceria (L=Sm,Gd,Lu) with various dopant concentrations. Using both classical and first-principles density-functional-theory (DFT) potentials, we have shown the methodology reproduces the results of recent systematic searches of doped ceria at low concentrations (3.2% L2O3 ) and identifies low-energy structures of concentrated samarium-doped ceria (3.8% and 6.6% L2O3 ) which relate to the experimental and theoretical findings published thus far. We introduce a tandem classical/DFT GA algorithm in which an inexpensive classical potential is first used to generate a fit gene pool of structures to enhance the overall efficiency of the computationally demanding DFT-based GA search.
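    The identity-preserving operators described above can be sketched with a toy mutation-only evolutionary search: moving a dopant to a vacant site keeps the dopant count, and hence the parent oxide's composition, fixed. Everything here (the 1-D ring lattice and the pair-repulsion energy) is a stand-in for the paper's classical and DFT potentials, not a reproduction of them:

```python
import random

N_SITES = 20       # toy 1-D ring of cation sites
N_DOPANTS = 4

def toy_energy(cfg):
    """Hypothetical pair-repulsion energy: dopants prefer to spread
    out on the ring (a stand-in for a real interatomic potential)."""
    sites = sorted(cfg)
    e = 0.0
    for i in range(len(sites)):
        for j in range(i + 1, len(sites)):
            d = abs(sites[i] - sites[j])
            d = min(d, N_SITES - d)      # periodic (ring) distance
            e += 1.0 / d
    return e

def mutate(cfg):
    """Move one dopant to a vacant site: composition is preserved,
    mirroring the identity-preserving GA operations."""
    cfg = set(cfg)
    cfg.remove(random.choice(sorted(cfg)))
    cfg.add(random.choice([s for s in range(N_SITES) if s not in cfg]))
    return frozenset(cfg)

def ga_search(pop_size=16, generations=60, seed=1):
    random.seed(seed)
    population = [frozenset(random.sample(range(N_SITES), N_DOPANTS))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=toy_energy)
        survivors = population[:pop_size // 2]   # truncation selection
        population = survivors + [mutate(c) for c in survivors]
    return min(population, key=toy_energy)

best = ga_search()
```

    The tandem classical/DFT idea in the abstract corresponds to running a search like this with the cheap energy first, then re-ranking the surviving gene pool with the expensive potential.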

  13. Non-classical continuum theory for solids incorporating internal rotations and rotations of Cosserat theories

    NASA Astrophysics Data System (ADS)

    Surana, K. S.; Joy, A. D.; Reddy, J. N.

    2017-03-01

    This paper presents a non-classical continuum theory in Lagrangian description for solids in which the conservation and the balance laws are derived by incorporating both the internal rotations arising from the Jacobian of deformation and the rotations of Cosserat theories at a material point. In particular, in this non-classical continuum theory, we have (i) the usual displacements (\varvec{u}), (ii) three internal rotations ({}_i\varvec{Θ}) about the axes of a triad whose axes are parallel to the x-frame, arising from the Jacobian of deformation (and completely defined by its skew-symmetric part), and (iii) three additional rotations ({}_e\varvec{Θ}) about the axes of the same triad located at each material point as additional degrees of freedom, referred to as Cosserat rotations. This gives rise to \varvec{u} and {}_e\varvec{Θ} as six degrees of freedom at a material point. The internal rotations ({}_i\varvec{Θ}), often neglected in classical continuum mechanics, exist in all deforming solid continua as they are due to the Jacobian of deformation. When the internal rotations {}_i\varvec{Θ} are resisted by the deforming matter, a conjugate moment tensor arises that, together with {}_i\varvec{Θ}, may result in energy storage and/or dissipation, which must be accounted for in the conservation and the balance laws. The Cosserat rotations {}_e\varvec{Θ} also result in a conjugate moment tensor which, together with {}_e\varvec{Θ}, may likewise result in energy storage and/or dissipation. The main focus of the paper is a consistent derivation of the conservation and balance laws that incorporate the aforementioned physics and the associated constitutive theories for thermoelastic solids.
The mathematical model derived here has closure, and the constitutive theories derived using two alternate approaches are in agreement with each other as well as with the condition resulting from the entropy inequality. Material coefficients introduced in the constitutive theories are clearly defined and discussed.
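    The internal rotations the abstract attributes to the Jacobian of deformation are carried by its skew-symmetric part. A minimal numeric illustration of that symmetric/skew decomposition in the small-deformation setting (the gradient values are hypothetical, and this is only the kinematic splitting, not the paper's balance laws):

```python
def split_sym_skew(J):
    """Split a 3x3 displacement gradient into its symmetric part
    (strain) and skew-symmetric part (internal rotations):
    J = sym + skew."""
    sym = [[0.5 * (J[i][j] + J[j][i]) for j in range(3)] for i in range(3)]
    skew = [[0.5 * (J[i][j] - J[j][i]) for j in range(3)] for i in range(3)]
    return sym, skew

# Hypothetical small displacement gradient at a material point
J = [[0.001, 0.004, 0.000],
     [-0.002, 0.000, 0.003],
     [0.000, -0.001, 0.002]]

sym, skew = split_sym_skew(J)
# Axial vector of the skew part: the three internal rotations iΘ
theta = (skew[2][1], skew[0][2], skew[1][0])
```

    In classical continuum mechanics only `sym` enters the constitutive theory; the point of the paper is that `skew` (and the Cosserat rotations) carry conjugate moments as well.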

  14. Action and entanglement in gravity and field theory.

    PubMed

    Neiman, Yasha

    2013-12-27

    In nongravitational quantum field theory, the entanglement entropy across a surface depends on the short-distance regularization. Quantum gravity should not require such regularization, and it has been conjectured that the entanglement entropy there is always given by the black hole entropy formula evaluated on the entangling surface. We show that these statements have precise classical counterparts at the level of the action. Specifically, we point out that the action can have a nonadditive imaginary part. In gravity, the latter is fixed by the black hole entropy formula, while in nongravitating theories it is arbitrary. From these classical facts, the entanglement entropy conjecture follows by heuristically applying the relation between actions and wave functions.

  15. Inelastic black hole scattering from charged scalar amplitudes

    NASA Astrophysics Data System (ADS)

    Luna, Andrés; Nicholson, Isobel; O'Connell, Donal; White, Chris D.

    2018-03-01

    We explain how the lowest-order classical gravitational radiation produced during the inelastic scattering of two Schwarzschild black holes in General Relativity can be obtained from a tree scattering amplitude in gauge theory coupled to scalar fields. The gauge calculation is related to gravity through the double copy. We remove unwanted scalar forces which can occur in the double copy by introducing a massless scalar in the gauge theory, which is treated as a ghost in the link to gravity. We hope these methods are a step towards a direct application of the double copy at higher orders in classical perturbation theory, with the potential to greatly streamline gravity calculations for phenomenological applications.

  16. The Effect of Mass, Wind Angle, and Erection Technique on the Aeroelastic Behaviour of a Cable-Stayed Bridge Model (Effet de la Masse, de L’Angle du Vent et de la Technique D’Erection sur le Comportement Aeroelastique d’une Maquette de Pont a Haubans).

    DTIC Science & Technology

    1987-09-01

    response. An estimate of the buffeting response for the two cases is presented in Figure 4, using the theory of Irwin (Reference 7). Data acquisition was...values were obtained using the log decrement method by exciting the bridge in one mode and observing the decay of the response. Classical theory would...added mass or structural damping level. The addition of inertia to the deck would tend to lower the response according to classical vibration theory

  17. An entropy method for induced drag minimization

    NASA Technical Reports Server (NTRS)

    Greene, George C.

    1989-01-01

    A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
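    The classical scaling the abstract contrasts against is the standard lifting-line result C_Di = C_L² / (π e AR), where e = 1 corresponds to the elliptic spanwise loading and no Reynolds-number dependence appears. A small sketch of that baseline (the numeric inputs are illustrative):

```python
import math

def induced_drag_coefficient(CL, aspect_ratio, e=1.0):
    """Classical lifting-line induced drag: C_Di = CL^2 / (pi * e * AR).
    e = 1.0 is the elliptic circulation distribution; note there is
    no Reynolds number anywhere in this expression."""
    return CL ** 2 / (math.pi * e * aspect_ratio)

# Illustrative wing: CL = 0.5 at aspect ratio 8
cdi = induced_drag_coefficient(CL=0.5, aspect_ratio=8.0)
```

    The entropy-based theory keeps the CL² and 1/AR scalings but adds a Reynolds-number dependence and shifts the optimum loading away from elliptic.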

  18. A novel approach to the theory of homogeneous and heterogeneous nucleation.

    PubMed

    Ruckenstein, Eli; Berim, Gersh O; Narsimhan, Ganesan

    2015-01-01

    A new approach to the theory of nucleation, formulated relatively recently by Ruckenstein, Narsimhan, and Nowakowski (see Refs. [7-16]) and developed further by Ruckenstein and other colleagues, is presented. In contrast to the classical nucleation theory, which is based on calculating the free energy of formation of a cluster of the new phase as a function of its size on the basis of macroscopic thermodynamics, the proposed theory uses the kinetic theory of fluids to calculate the condensation (W(+)) and dissociation (W(-)) rates on and from the surface of the cluster, respectively. The dissociation rate of a monomer from a cluster is evaluated from the average time spent by a surface monomer in the potential well as obtained from the solution of the Fokker-Planck equation in the phase space of position and momentum for liquid-to-solid transition and the phase space of energy for vapor-to-liquid transition. The condensation rates are calculated using traditional expressions. The knowledge of those two rates allows one to calculate the size of the critical cluster from the equality W(+)=W(-) as well as the rate of nucleation. The developed microscopic approach allows one to avoid the controversial application of classical thermodynamics to the description of nuclei which contain a few molecules. The new theory was applied to a number of cases, such as the liquid-to-solid and vapor-to-liquid phase transitions, binary nucleation, heterogeneous nucleation, nucleation on soluble particles and protein folding. The theory predicts higher nucleation rates at high saturation ratios (small critical clusters) than the classical nucleation theory for both solid-to-liquid and vapor-to-liquid transitions. As expected, at low saturation ratios for which the size of the critical cluster is large, the results of the new theory are consistent with those of the classical one.
The present approach was combined with the density functional theory to account for the density profile in the cluster. This approach was also applied to protein folding, viewed as the evolution of a cluster of native residues of spherical shape within a protein molecule, which could explain protein folding/unfolding and their dependence on temperature. Copyright © 2014 Elsevier B.V. All rights reserved.
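    The central condition W(+) = W(-) defining the critical cluster can be sketched numerically with toy rate models. The functional forms and parameter values below are assumptions for illustration only, not the paper's Fokker-Planck-derived expressions: both rates are taken to scale with the cluster surface area ~ n^(2/3), with a Kelvin-like curvature penalty on dissociation:

```python
import math

THETA = 6.0   # hypothetical dimensionless surface-energy parameter
S = 4.0       # hypothetical saturation ratio

def W_plus(n):
    """Toy condensation rate onto a cluster of n monomers."""
    return S * n ** (2.0 / 3.0)

def W_minus(n):
    """Toy dissociation rate with a curvature penalty for small n."""
    return n ** (2.0 / 3.0) * math.exp(THETA / n ** (1.0 / 3.0))

def critical_size():
    """Find n* with W_plus(n*) = W_minus(n*) by bisection.
    Below n*, clusters tend to dissolve; above it, they grow."""
    lo, hi = 1.0, 1e6
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if W_plus(mid) > W_minus(mid):
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

n_star = critical_size()   # analytic value: (THETA / ln S)**3
```

    With these forms the critical size falls as the saturation ratio grows, which is the regime where the abstract says the kinetic theory departs most from classical nucleation theory.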

  19. Branes and the Kraft-Procesi transition: classical case

    NASA Astrophysics Data System (ADS)

    Cabrera, Santiago; Hanany, Amihay

    2018-04-01

    Moduli spaces of a large set of 3 d N=4 effective gauge theories are known to be closures of nilpotent orbits. This set of theories has recently acquired a special status, due to Namikawa's theorem. As a consequence of this theorem, closures of nilpotent orbits are the simplest non-trivial moduli spaces that can be found in three dimensional theories with eight supercharges. In the early 80's mathematicians Hanspeter Kraft and Claudio Procesi characterized an inclusion relation between nilpotent orbit closures of the same classical Lie algebra. We recently [1] showed a physical realization of their work in terms of the motion of D3-branes on the Type IIB superstring embedding of the effective gauge theories. This analysis is restricted to A-type Lie algebras. The present note expands our previous discussion to the remaining classical cases: orthogonal and symplectic algebras. In order to do so we introduce O3-planes in the superstring description. We also find a brane realization for the mathematical map between two partitions of the same integer number known as collapse. Another result is that basic Kraft-Procesi transitions turn out to be described by the moduli space of orthosymplectic quivers with varying boundary conditions.

  20. Significant viscosity dependent deviations from classical van Deemter theory in liquid chromatography with porous silica monolithic columns.

    PubMed

    Nesterenko, Pavel N; Rybalko, Marina A; Paull, Brett

    2005-06-01

    Significant deviations from classical van Deemter behaviour, indicative of turbulent flow liquid chromatography, have been recorded for mobile phases of varying viscosity on porous silica monolithic columns at elevated mobile phase flow rates.
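    The classical van Deemter behaviour that the monolithic columns deviate from is H(u) = A + B/u + C·u, a plate-height curve with a single optimum velocity. A short sketch with hypothetical coefficients (the values below are for illustration, not fitted to the paper's data):

```python
import math

def plate_height(u, A, B, C):
    """Classical van Deemter equation: H(u) = A + B/u + C*u.
    u: mobile-phase linear velocity; A: eddy diffusion;
    B: longitudinal diffusion; C: mass-transfer resistance."""
    return A + B / u + C * u

# Hypothetical coefficients for illustration only
A, B, C = 1.0, 2.0, 0.05
u_opt = math.sqrt(B / C)             # velocity minimizing H
H_min = A + 2.0 * math.sqrt(B * C)   # minimum plate height
```

    In the classical picture H rises linearly (slope C) above u_opt; the deviations reported here show H flattening or falling at elevated flow rates instead.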
