ERIC Educational Resources Information Center
Yoon, So Yoon
2011-01-01
Working under classical test theory (CTT) and item response theory (IRT) frameworks, this study investigated psychometric properties of the Revised Purdue Spatial Visualization Tests: Visualization of Rotations (Revised PSVT:R). The original version, the PSVT:R, was designed by Guay (1976) to measure spatial visualization ability in…
Ethical and Stylistic Implications in Delivering Conference Papers.
ERIC Educational Resources Information Center
Enos, Theresa
1986-01-01
Analyzes shortcomings of conference papers intended for the eye rather than the ear. Referring to classical oratory, speech act theory, and cognitive theory, recommends revising papers for oral presentation by using classical disposition; deductive rather than inductive argument; formulaic repetition of words and phrases; non-inverted clause…
Mathematical model of the SH-3G helicopter
NASA Technical Reports Server (NTRS)
Phillips, J. D.
1982-01-01
A mathematical model of the Sikorsky SH-3G helicopter based on classical nonlinear, quasi-steady rotor theory was developed. The model was validated statically and dynamically by comparison with Navy flight-test data. The model incorporates ad hoc revisions which address the ideal assumptions of classical rotor theory and improve the static trim characteristics to provide a more realistic simulation, while retaining the simplicity of the classical model.
Cognitive-Behavioral Therapy. Second Edition. Theories of Psychotherapy Series
ERIC Educational Resources Information Center
Craske, Michelle G.
2017-01-01
In this revised edition of "Cognitive-Behavioral Therapy," Michelle G. Craske discusses the history, theory, and practice of this commonly practiced therapy. Cognitive-behavioral therapy (CBT) originated in the science and theory of classical and instrumental conditioning when cognitive principles were adopted following dissatisfaction…
The Role of Eigensolutions in Nonlinear Inverse Cavity-Flow-Theory. Revision.
1985-06-10
The method of Levi-Civita is applied to an isolated fully cavitating body at zero cavitation number and adapted to the solution of the inverse… problem is not thought to present much of a challenge at zero cavitation number. In this case, the classical method of Levi-Civita [7] can be…
Rational choice and the political bases of changing Israeli counterinsurgency strategy.
Brym, Robert J; Andersen, Robert
2011-09-01
Israeli counterinsurgency doctrine holds that the persistent use of credible threat and disproportionate military force results in repeated victories that eventually teach the enemy the futility of aggression. The doctrine thus endorses classical rational choice theory's claim that narrow cost-benefit calculations shape fixed action rationales. This paper assesses whether Israel's strategic practice reflects its counterinsurgency doctrine by exploring the historical record and the association between Israeli and Palestinian deaths due to low-intensity warfare. In contrast to the expectations of classical rational choice theory, the evidence suggests that institutional, cultural and historical forces routinely override simple cost-benefit calculations. Changing domestic and international circumstances periodically cause revisions in counterinsurgency strategy. Credible threat and disproportionate military force lack the predicted long-term effect. © London School of Economics and Political Science 2011.
Sheaff, R; Lloyd-Kendall, A
2000-07-01
To investigate how far English National Health Service (NHS) Personal Medical Services (PMS) contracts embody a principal-agent relationship between health authorities (HAs) and primary health care providers, especially, but not exclusively, general practices involved in the first wave (1998) of PMS pilot projects; and to consider the implications for relational and classical theories of contract. Content analysis of 71 first-wave PMS contracts. Most PMS contracts reflect current English NHS policy priorities, but few institute mechanisms to ensure that providers realise these objectives. Although PMS contracts have some classical characteristics, relational characteristics are more evident. Some characteristics match neither the classical nor the relational model. First-wave PMS contracts do not appear to embody a strong principal-agent relationship between HAs and primary health care providers. This finding offers little support for the relevance of classical theories of contract, but also implies that relational theories of contract need to be revised for quasi-market settings. Future PMS contracts will need to focus more on evidence-based processes of primary care, health outputs and patient satisfaction and less upon service inputs. PMS contracts will also need to be longer-term contracts in order to promote the 'institutional embedding' of independent general practice in the wider management systems of the NHS.
New developments in social interdependence theory.
Johnson, David W; Johnson, Roger T
2005-11-01
Social interdependence theory is a classic example of the interaction of theory, research, and practice. The premise of the theory is that the way goals are structured determines how individuals interact, which in turn creates outcomes. Since its formulation nearly 60 years ago, social interdependence theory has been modified, extended, and refined on the basis of the increasing knowledge about, and application of, the theory. Researchers have conducted over 750 research studies on the relative merits of cooperative, competitive, and individualistic efforts and the conditions under which each is appropriate. Social interdependence theory has been widely applied, especially in education and business. These applications have resulted in revisions of the theory and the generation of considerable new research. The authors critically analyze the new developments resulting from extensive research on, and wide-scale applications of, social interdependence theory.
First-order design of geodetic networks using the simulated annealing method
NASA Astrophysics Data System (ADS)
Berné, J. L.; Baselga, S.
2004-09-01
The general problem of the optimal design of a geodetic network subject to any extrinsic factors, namely the first-order design problem, can be treated as a numerical optimization problem. The classical theory of this problem and the standard optimization methods are reviewed. The innovative use of the simulated annealing method, which has been applied successfully in other fields, is then presented for this classical geodetic problem. This method, which belongs to the family of iterative heuristic techniques in operations research, uses a thermodynamic analogy with crystalline networks to offer a solution that converges probabilistically to the global optimum. The basic formulation and some examples are studied.
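The search strategy described in the abstract above can be sketched generically. The snippet below is an illustrative simulated-annealing loop on a toy one-dimensional objective, not the authors' geodetic formulation; the cost function, neighbor move, and geometric cooling schedule are all assumptions made for illustration.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Minimize `cost` by simulated annealing (generic sketch).

    A candidate move is always accepted if it lowers the cost; an uphill
    move is accepted with probability exp(-delta/T), which lets the search
    escape local minima while the temperature T is still high.
    """
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        delta = fy - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule (an arbitrary choice here)
    return best, fbest

# Toy stand-in for a network-design objective: a multimodal 1-D function.
cost = lambda x: (x - 2.0) ** 2 + math.sin(5 * x)
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_opt, f_opt = simulated_annealing(cost, step, x0=10.0)
```

In a real first-order design problem the "state" would be the set of free station coordinates and the cost a criterion on the network's covariance matrix, but the accept/reject skeleton is unchanged.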
Scaling properties of the two-dimensional randomly stirred Navier-Stokes equation.
Mazzino, Andrea; Muratore-Ginanneschi, Paolo; Musacchio, Stefano
2007-10-05
We inquire into the scaling properties of the 2D Navier-Stokes equation sustained by a force field with Gaussian statistics, white noise in time, and with a power-law correlation in momentum space of degree 2 - 2 epsilon. This is at variance with the setting usually assumed to derive Kraichnan's classical theory. We contrast accurate numerical experiments with the different predictions provided for the small epsilon regime by Kraichnan's double cascade theory and by renormalization group analysis. We give clear evidence that for all epsilon, Kraichnan's theory is consistent with the observed phenomenology. Our results call for a revision in the renormalization group analysis of (2D) fully developed turbulence.
Using Classical Test Theory and Item Response Theory to Evaluate the LSCI
NASA Astrophysics Data System (ADS)
Schlingman, Wayne M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS
2011-01-01
Analyzing the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI), this project uses both Classical Test Theory (CTT) and Item Response Theory (IRT) to investigate the LSCI itself in order to better understand what it is actually measuring. We use Classical Test Theory to form a framework of results that can be used to evaluate the effectiveness of individual questions at measuring differences in student understanding and provide further insight into the prior results presented from this data set. In the second phase of this research, we use Item Response Theory to form a theoretical model that generates parameters for a student's ability and a question's difficulty, and estimates the level of guessing. The combined results from our investigations using both CTT and IRT are used to better understand the learning that is taking place in classrooms across the country. The analysis will also allow us to evaluate the effectiveness of individual questions and determine whether the item difficulties are appropriately matched to the abilities of the students in our data set. These results may require that some questions be revised, motivating the need for further development of the LSCI. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
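An IRT model with ability, difficulty, and guessing parameters, as described in the abstract above, corresponds to the three-parameter logistic (3PL) model. A minimal sketch follows; all parameter values are invented for illustration and are not taken from the LSCI study.

```python
import math

def p_correct_3pl(theta, a, b, c):
    """Three-parameter logistic (3PL) IRT model: the probability that a
    student of ability `theta` answers an item correctly, given the item's
    discrimination `a`, difficulty `b`, and guessing floor `c`.
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items: an easy one (b = -2) vs. a hard one (b = +2),
# both seen by an average student (theta = 0) on a 5-option item
# with a plausible guessing floor of c = 0.2.
easy = p_correct_3pl(theta=0.0, a=1.0, b=-2.0, c=0.2)
hard = p_correct_3pl(theta=0.0, a=1.0, b=2.0, c=0.2)
```

Note that the curve never drops below `c`: even the hard item is answered correctly at least 20% of the time by guessing, which is exactly the feature that distinguishes the 3PL from CTT difficulty indices.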
From Data to Semantic Information
NASA Astrophysics Data System (ADS)
Floridi, Luciano
2003-06-01
There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.
Milgrom's revision of Newton's laws - Dynamical and cosmological consequences
NASA Technical Reports Server (NTRS)
Felten, J. E.
1984-01-01
Milgrom's (1983) recent revision of Newtonian dynamics was introduced to eliminate the inference that large quantities of invisible mass exist in galaxies. It is shown by simple examples that a Milgrom acceleration, in the form presented so far, implies other far-reaching changes in dynamics. The momentum of an isolated system is not conserved, and the usual theorem for center-of-mass motion of any system does not hold. Naive applications require extreme caution. The model fails to provide a complete description of particle dynamics and should be thought of as a revision of Kepler's laws rather than Newton's. The Milgrom acceleration also implies fundamental changes in cosmology. A quasi-Newtonian calculation adapted from Newtonian cosmology suggests that a 'Milgrom universe' will recollapse even if the classical closure parameter Omega is much less than unity. The solution, however, fails to satisfy the cosmological principle. Reasons for the breakdown of this calculation are examined. A new theory of gravitation will be needed before the behavior of a Milgrom universe can be predicted.
Milgrom's revision of cosmic dynamics: Amending Newton's laws or Keplers?
NASA Technical Reports Server (NTRS)
Felten, J. E.
1983-01-01
Milgrom's recent revision of Newtonian dynamics was introduced to eliminate the inference that large quantities of invisible mass exist in galaxies. Simple examples show that a Milgrom acceleration, in the form presented so far, implies other far-reaching changes in dynamics. The momentum of an isolated system is not conserved, and the usual theorem for center-of-mass motion of any system does not hold. Naive applications require extreme caution. The model fails to provide a complete description of particle dynamics and should be thought of as a revision of Kepler's laws rather than Newton's. The Milgrom acceleration also implies fundamental changes in cosmology. A quasi-Newtonian calculation adapted from Newtonian cosmology suggests that a Milgrom universe will recollapse even if the classical closure parameter Omega is less than 1. The solution, however, fails to satisfy the cosmological principle. Reasons for the breakdown of this calculation are examined. A new theory of gravitation is needed before the behavior of a Milgrom universe can be predicted.
Thermal and viscous effects on sound waves: revised classical theory.
Davis, Anthony M J; Brenner, Howard
2012-11-01
In this paper the recently developed, bi-velocity model of fluid mechanics based on the principles of linear irreversible thermodynamics (LIT) is applied to sound propagation in gases taking account of first-order thermal and viscous dissipation effects. The results are compared and contrasted with the classical Navier-Stokes-Fourier results of Pierce for this same situation cited in his textbook. Comparisons are also made with the recent analyses of Dadzie and Reese, whose molecularly based sound propagation calculations furnish results virtually identical with the purely macroscopic LIT-based bi-velocity results below, as well as being well-supported by experimental data. Illustrative dissipative sound propagation examples involving application of the bi-velocity model to several elementary situations are also provided, showing the disjoint entropy mode and the additional, evanescent viscous mode.
The retroperitoneal interfascial planes: current overview and future perspectives.
Ishikawa, Kazuo; Nakao, Shota; Nakamuro, Makoto; Huang, Tai-Ping; Nakano, Hiroshi
2016-07-01
Recently, the concept of interfascial planes has become the prevalent theory among radiologists for understanding the retroperitoneal anatomy, having replaced the classic tricompartmental theory. However, it is a little-known fact that the concept remains incomplete and includes embryological errors, which we have revised on the basis of our microscopic study. We believe that the concept not only provides a much clearer understanding of the retroperitoneal anatomy, but also allows further development in the diagnosis and treatment of retroperitoneal injuries and diseases, should it become a complete theory. We explain the history and outline of the concept of interfascial planes, correct common misunderstandings about the concept, describe the therapeutic procedures that already rely on the concept, perhaps unconsciously, and present future perspectives on the concept using our published and unpublished data. This knowledge could soon be essential to acute care physicians and surgeons.
The kinetics of root gravitropism: dual motors and sensors
NASA Technical Reports Server (NTRS)
Wolverton, Chris; Ishikawa, Hideo; Evans, Michael L.
2002-01-01
The Cholodny-Went theory of tropisms has served as a framework for investigation of root gravitropism for nearly three quarters of a century. Recent investigations using modern techniques have generated findings consistent with the classical theory, including confirmation of asymmetrical distribution of polar auxin transport carriers, molecular evidence for auxin asymmetry following gravistimulation, and generation of auxin response mutants with predictable lesions in gravitropism. Other results indicate that the classical model is inadequate to account for key features of root gravitropism. Initiation of curvature, for example, occurs outside the region of most rapid elongation and is driven by differential acceleration rather than differential inhibition of elongation. The evidence indicates that there are two motors driving root gravitropism, one of which appears not to be auxin regulated. We have recently developed technology that is capable of maintaining a constant angle of gravistimulation at any selected target region of a root while continuously monitoring growth and curvature kinetics. This review elaborates on the advantages of this new technology for analyzing gravitropism and describes applications of the technology that reveal (1) the existence of at least two phases to gravitropic motor output, even under conditions of constant stimulus input and (2) the existence of gravity sensing outside of the root cap. We propose a revised model of root gravitropism including dual sensors and dual motors interacting to accomplish root gravitropism, with only one of the systems linked to the classical Cholodny-Went theory.
Bacci, E D; Wyrwich, K W; Phillips, G A; Vollmer, T; Guo, S
2016-01-01
Investigations using classical test theory support the psychometric properties of the original version of the Multiple Sclerosis Impact Scale (MSIS-29v1), a disease-specific measure of multiple sclerosis (MS) impact (physical and psychological subscales). Later, assessments of the MSIS-29v1 in an MS community-based sample using Rasch analysis led to revisions of the instrument's response options (MSIS-29v2). The objective of this paper is to evaluate the psychometric properties of the MSIS-29v1 in a clinical trial cohort of relapsing-remitting MS patients (RRMS). Data from 600 patients with RRMS enrolled in the SELECT clinical trial were used. Assessments were performed at baseline and at Weeks 12, 24, and 52. In addition to traditional psychometric analyses, Item Response Theory (IRT) and Rasch analysis were used to evaluate the measurement properties of the MSIS-29v1. Both MSIS-29v1 subscales demonstrated strong reliability, construct validity, and responsiveness. The IRT and Rasch analysis showed overall support for response category threshold ordering, person-item fit, and item fit for both subscales. Both MSIS-29v1 subscales demonstrated robust measurement properties using classical, IRT, and Rasch techniques. Unlike previous research using a community-based sample, the MSIS-29v1 was found to be psychometrically sound to assess physical and psychological impairments in a clinical trial sample of patients with RRMS.
Decision-Making Under Risk: Integrating Perspectives From Biology, Economics, and Psychology.
Mishra, Sandeep
2014-08-01
Decision-making under risk has been variably characterized and examined in many different disciplines. However, interdisciplinary integration has not been forthcoming. Classic theories of decision-making have not been amply revised in light of greater empirical data on actual patterns of decision-making behavior. Furthermore, the meta-theoretical framework of evolution by natural selection has been largely ignored in theories of decision-making under risk in the human behavioral sciences. In this review, I critically examine four of the most influential theories of decision-making from economics, psychology, and biology: expected utility theory, prospect theory, risk-sensitivity theory, and heuristic approaches. I focus especially on risk-sensitivity theory, which offers a framework for understanding decision-making under risk that explicitly involves evolutionary considerations. I also review robust empirical evidence for individual differences and environmental/situational factors that predict actual risky decision-making that any general theory must account for. Finally, I offer steps toward integrating various theoretical perspectives and empirical findings on risky decision-making. © 2014 by the Society for Personality and Social Psychology, Inc.
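The contrast the review draws between expected-utility reasoning and risk-sensitivity theory can be made concrete with a small worked example; the payoffs and the need threshold below are entirely hypothetical.

```python
def expected_value(outcomes):
    """Mean payoff of a gamble given (payoff, probability) pairs."""
    return sum(x * p for x, p in outcomes)

def p_meets_need(outcomes, need):
    """Risk-sensitivity theory's criterion: the probability that the
    gamble's payoff reaches a required threshold, rather than its mean."""
    return sum(p for x, p in outcomes if x >= need)

safe = [(5.0, 1.0)]                # a certain payoff of 5
risky = [(10.0, 0.5), (0.0, 0.5)]  # a coin flip for 10 or 0

# Under expected value the two options are indistinguishable...
ev_safe, ev_risky = expected_value(safe), expected_value(risky)

# ...but if an organism needs at least 8 units to survive, only the
# risky option ever meets the threshold, so risk-proneness is predicted.
p_safe, p_risky = p_meets_need(safe, 8.0), p_meets_need(risky, 8.0)
```

This is the sense in which risk-sensitivity theory, unlike classic expected-utility accounts, builds the decision-maker's state (its needs) directly into the choice rule.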
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-10
... Live Swine, Pork, and Pork Products from Certain Regions Free of Classical Swine Fever in Chile and... revise an information collection associated with regulations for the importation of live swine, pork, and pork products from certain regions free of classical swine fever in Chile and Mexico and to request...
Phylogenetic escalation and decline of plant defense strategies
Agrawal, Anurag A.; Fishbein, Mark
2008-01-01
As the basal resource in most food webs, plants have evolved myriad strategies to battle consumption by herbivores. Over the past 50 years, plant defense theories have been formulated to explain the remarkable variation in abundance, distribution, and diversity of secondary chemistry and other defensive traits. For example, classic theories of enemy-driven evolutionary dynamics have hypothesized that defensive traits escalate through the diversification process. Despite the fact that macroevolutionary patterns are an explicit part of defense theories, phylogenetic analyses have not been previously attempted to disentangle specific predictions concerning (i) investment in resistance traits, (ii) recovery after damage, and (iii) plant growth rate. We constructed a molecular phylogeny of 38 species of milkweed and tested four major predictions of defense theory using maximum-likelihood methods. We did not find support for the growth-rate hypothesis. Our key finding was a pattern of phyletic decline in the three most potent resistance traits (cardenolides, latex, and trichomes) and an escalation of regrowth ability. Our neontological approach complements more common paleontological approaches to discover directional trends in the evolution of life and points to the importance of natural enemies in the macroevolution of species. The finding of macroevolutionary escalation of regrowth ability and decline of resistance provides a window into the ongoing coevolutionary dynamics between plants and herbivores and suggests a revision of classic plant defense theory. Where plants are primarily consumed by specialist herbivores, regrowth (or tolerance) may be favored over resistance traits during the diversification process. PMID:18645183
Updating energy security and environmental policy: Energy security theories revisited.
Proskuryakova, L
2018-06-18
The energy security theories are based on the premises of sufficient and reliable supply of fossil fuels at affordable prices in centralized supply systems. Policy-makers and company chief executives develop energy security strategies based on the energy security theories and definitions that dominate in the research and policy discourse. It is therefore of utmost importance that scientists revisit these theories in line with the latest changes in the energy industry: the rapid advancement of renewables and smart grid, decentralization of energy systems, new environmental and climate challenges. The study examines the classic energy security concepts (neorealism, neoliberalism, constructivism and international political economy) and assesses if energy technology changes are taken into consideration. This is done through integrative literature review, comparative analysis, identification of 'international relations' and 'energy' research discourse with the use of big data, and case studies of Germany, China, and Russia. The paper offers suggestions for revision of energy security concepts through integration of future technology considerations. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Born, Max; Wolf, Emil
1999-10-01
Principles of Optics is one of the classic science books of the twentieth century, and probably the most influential book in optics published in the past forty years. This edition has been thoroughly revised and updated, with new material covering the CAT scan, interference with broad-band light and the so-called Rayleigh-Sommerfeld diffraction theory. This edition also details scattering from inhomogeneous media and presents an account of the principles of diffraction tomography to which Emil Wolf has made a basic contribution. Several new appendices are also included. This new edition will be invaluable to advanced undergraduates, graduate students and researchers working in most areas of optics.
Bacci, Elizabeth D; Staniewska, Dorota; Coyne, Karin S; Boyer, Stacey; White, Leigh Ann; Zach, Neta; Cedarbaum, Jesse M
2016-01-01
Our objective was to examine dimensionality and item-level performance of the Amyotrophic Lateral Sclerosis Functional Rating Scale-Revised (ALSFRS-R) across time using classical and modern test theory approaches. Confirmatory factor analysis (CFA) and Item Response Theory (IRT) analyses were conducted using data from patients with amyotrophic lateral sclerosis (ALS) in the Pooled Resource Open-Access ALS Clinical Trials (PRO-ACT) database with complete ALSFRS-R data (n = 888) at three time-points (Time 0, Time 1 (6 months), Time 2 (1 year)). In this population of 888 patients, mean age was 54.6 years, 64.4% were male, and 93.7% were Caucasian. The CFA supported a four-domain structure (bulbar, gross motor, fine motor, and respiratory domains). IRT analysis within each domain revealed misfitting items and overlapping item response category thresholds at all time-points, particularly in the gross motor and respiratory domain items. Results indicate that many of the items of the ALSFRS-R may sub-optimally distinguish among varying levels of disability assessed by each domain, particularly in patients with less severe disability. Measure performance improved across time as patient disability severity increased. In conclusion, modifications to select ALSFRS-R items may improve the instrument's specificity to disability level and sensitivity to treatment effects.
The particle problem in classical gravity: a historical note on 1941
NASA Astrophysics Data System (ADS)
Galvagno, Mariano; Giribet, Gastón
2005-11-01
This historical note is mainly based on a relatively unknown paper published by Albert Einstein in Revista de la Universidad Nacional de Tucumán in 1941. Taking the ideas of this work as a leitmotiv, we review the discussions about the particle problem in the theory of gravitation within the historical context by means of the study of seminal works on the subject. The revision shows how the digressions regarding the structure of matter and the concrete problem of finding regular solutions of the pure field equations turned out to be intrinsically unified in the beginning of the programme towards a final theory of fields. The paper mentioned (Einstein 1941a Rev. Univ. Nac. Tucumán A 2 11) represents the basis of the one written by Einstein in collaboration with Wolfgang Pauli in 1943, in which, following analogous lines, the proof of the non-existence of regular particle-type solutions was generalized to the case of cylindrical geometries in Kaluza-Klein theory (Einstein and Pauli 1943 Ann. Math. 44 131). Besides, other generalizations were subsequently presented. The (non-)existence of such solutions in classical unified field theory was undoubtedly an important criterion guiding Einstein's investigations. This aspect was investigated with expertise by Jeroen van Dongen in a recent work, though restricting the scope to the particular case of Kaluza-Klein theory (van Dongen 2002 Stud. Hist. Phil. Mod. Phys. 33 185). Here, we discuss the particle problem within a more general context, presenting in this way a complement to previous reviews.
Rett syndrome diagnostic criteria: lessons from the Natural History Study.
Percy, Alan K; Neul, Jeffrey L; Glaze, Daniel G; Motil, Kathleen J; Skinner, Steven A; Khwaja, Omar; Lee, Hye-Seung; Lane, Jane B; Barrish, Judy O; Annese, Fran; McNair, Lauren; Graham, Joy; Barnes, Katherine
2010-12-01
Analysis of 819 participants enrolled in the Rett syndrome (RTT) Natural History Study validates recently revised diagnostic criteria. 765 females fulfilled 2002 consensus criteria for classic (653/85.4%) or variant (112/14.6%) RTT. All participants classified as classic RTT fulfilled each revised main criterion; supportive criteria were not uniformly present. All variant RTT participants met at least 3 of 6 main criteria in the 2002, 2 of 4 main criteria in the current format, and 5 of 11 supportive criteria in both. This analysis underscores the critical role of main criteria for classic RTT; variant RTT requires both main and supportive criteria.
ERIC Educational Resources Information Center
Yelboga, Atilla; Tavsancil, Ezel
2010-01-01
In this research, the classical test theory and generalizability theory analyses were carried out with the data obtained by a job performance scale for the years 2005 and 2006. The reliability coefficients obtained (estimated) from the classical test theory and generalizability theory analyses were compared. In classical test theory, test retest…
Ye, Zeng Jie; Liang, Mu Zi; Zhang, Hao Wei; Li, Peng Fei; Ouyang, Xue Ren; Yu, Yuan Liang; Liu, Mei Ling; Qiu, Hong Zhong
2018-06-01
Classical test theory was used to develop and validate the 25-item Resilience Scale Specific to Cancer (RS-SC) in Chinese patients with cancer. This study was designed to provide additional information about the discriminative value of the individual items tested with an item response theory analysis. A two-parameter graded response model was fitted to examine whether any of the items of the RS-SC exhibited problems with the ordering and steps of thresholds, as well as the ability of items to discriminate patients with different resilience levels using item characteristic curves. A sample of 214 Chinese patients with a cancer diagnosis was analyzed. The established three-dimension structure of the RS-SC was confirmed. Several items showed problematic thresholds or discrimination ability and require further revision. These problematic items should be refined, and a short form of the RS-SC may be feasible in clinical settings in order to reduce the burden on patients. However, the generalizability of these findings warrants further investigation.
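As an illustration of the item response theory analysis described in this abstract, the sketch below evaluates a two-parameter logistic item response function, the binary special case of the graded response model. All parameter values are hypothetical and are not taken from the RS-SC study; a flat curve (low discrimination) is the kind of problem the authors flag.

```python
import math

def item_response_prob(theta, a, b):
    """Two-parameter logistic IRT model: probability of endorsing an item
    as a function of latent trait theta, discrimination a, difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical items: a high-discrimination vs a low-discrimination item.
# The low-a curve barely separates low and high resilience levels.
for theta in (-2, 0, 2):
    p_good = item_response_prob(theta, a=2.0, b=0.0)
    p_weak = item_response_prob(theta, a=0.3, b=0.0)
    print(f"theta={theta:+d}  strong item p={p_good:.2f}  weak item p={p_weak:.2f}")
```

At the item's difficulty (theta = b) the probability is 0.5 regardless of discrimination; what differs is how steeply the curve rises around that point.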
Quantum and classical behavior in interacting bosonic systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertzberg, Mark P.
It is understood that in free bosonic theories, the classical field theory accurately describes the full quantum theory when the occupancy numbers of systems are very large. However, the situation is less understood in interacting theories, especially on time scales longer than the dynamical relaxation time. Recently there have been claims that the quantum theory deviates spectacularly from the classical theory on this time scale, even if the occupancy numbers are extremely large. Furthermore, it is claimed that the quantum theory quickly thermalizes while the classical theory does not. The evidence for these claims comes from noticing a spectacular difference in the time evolution of expectation values of quantum operators compared to the classical micro-state evolution. If true, this would have dramatic consequences for many important phenomena, including laboratory studies of interacting BECs, dark matter axions, preheating after inflation, etc. In this work we critically examine these claims. We show that in fact the classical theory can describe the quantum behavior in the high occupancy regime, even when interactions are large. The connection is that the expectation values of quantum operators in a single quantum micro-state are approximated by a corresponding classical ensemble average over many classical micro-states. Furthermore, by the ergodic theorem, a classical ensemble average of local fields with statistical translation invariance is the spatial average of a single micro-state. So the correlation functions of the quantum and classical field theories of a single micro-state approximately agree at high occupancy, even in interacting systems. Furthermore, both quantum and classical field theories can thermalize, when appropriate coarse graining is introduced, with the classical case requiring a cutoff on low occupancy UV modes. We discuss applications of our results.
Park, Min Hee; Yang, Sook Ja; Chee, Yeon Kyung
2016-01-01
The twenty-one item Stress Measurement of Female Marriage Immigrants (SMFMI) was developed to assess stress of female marriage immigrants in Korea. This study reports the psychometric properties of a revised SMFMI (SMFMI-R) for application with female marriage immigrants to Korea who were raising children. Participants were 190 female marriage immigrants from China, Vietnam, the Philippines, and other Asian countries, who were recruited using convenience sampling between November 2013 and December 2013. Survey questionnaires were translated into study participants' native languages (Chinese, Vietnamese, and English). Principal component analysis yielded nineteen items in four factors (family, parenting, cultural, and economic stress), explaining 63.5% of the variance, which was slightly better than the original scale. Confirmatory factor analysis indicated adequate fit for the four-factor model. Based on classic test theory and item response theory, strong support was provided for item discrimination, item difficulty, and internal consistency (Cronbach's alpha = 0.923). SMFMI-R scores were negatively associated with Korean proficiency and subjective economic status. The SMFMI-R is a valid, reliable, and comprehensive measure of stress for female marriage immigrants and can provide useful information to develop intervention programs for those who may be at risk for emotional stress.
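The abstract above reports internal consistency as Cronbach's alpha = 0.923. As a minimal sketch of how that coefficient is computed, the function below implements the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores); the toy data are invented and unrelated to the SMFMI-R.

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns of equal length.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Toy data: 3 items scored by 4 respondents (illustrative only).
items = [[3, 4, 2, 5], [2, 4, 2, 4], [3, 5, 1, 4]]
print(round(cronbach_alpha(items), 3))
```

Values near 1 indicate that items covary strongly relative to their individual noise, which is what a high alpha such as 0.923 reflects.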
Diagrammar in classical scalar field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cattaruzza, E., E-mail: Enrico.Cattaruzza@gmail.com; Gozzi, E., E-mail: gozzi@ts.infn.it; INFN, Sezione di Trieste
2011-09-15
In this paper we analyze perturbatively a gφ⁴ classical field theory with and without temperature. In order to do that, we make use of a path-integral approach developed some time ago for classical theories. It turns out that the diagrams appearing at the classical level are many more than at the quantum level due to the presence of extra auxiliary fields in the classical formalism. We shall show that a universal supersymmetry present in the classical path-integral mentioned above is responsible for the cancellation of various diagrams. The same supersymmetry allows the introduction of super-fields and super-diagrams which considerably simplify the calculations and make the classical perturbative calculations almost 'identical' formally to the quantum ones. Using the super-diagrams technique, we develop the classical perturbation theory up to third order. We conclude the paper with a perturbative check of the fluctuation-dissipation theorem. Highlights: We provide the Feynman diagrams of perturbation theory for a classical field theory. We give a super-formalism which links the quantum diagrams to the classical ones. We check perturbatively the fluctuation-dissipation theorem.
A Guide to Classical Liberal Scholarship. Revised Edition.
ERIC Educational Resources Information Center
Palmer, Tom G.
This booklet introduces students to a wide range of works of classical liberal scholarship. The works described can be used in researching term papers, theses, and dissertations; each book and article provides valuable insights and information that can make the difference between an "A" and a "B" paper. The tradition of classical liberalism…
Disabling Fictions: Institutionalized Delimitations of Revision.
ERIC Educational Resources Information Center
Carroll, Jeffrey
1989-01-01
Examines three contemporary taxonomies of revision as proposed by Wallace Hildick, Lester Faigley and Stephen Witte, and Sondra Perl. Uses literary and cultural theory to bridge the gap between these theories and students' revision practices. Argues that while revision may be prescriptive, it must also be subordinate to the writer's intentions and…
All in the family: a belated response to Knudson-Martin's feminist revision of Bowen theory.
Horne, K Blake; Hicks, Mary W
2002-01-01
The first formal attempt at revising Bowen theory within the marriage and family therapy literature is represented in the work of Knudson-Martin (1994). Claiming that several of the theory's concepts are defined at odds with female development, Knudson-Martin (1994) reconceptualizes and expands Bowen theory to rectify these perceived shortcomings. In turn, we address several fundamental concerns with Knudson-Martin's critique and revision of Bowen theory. An alternative representation of Bowen theory, as well as its relationship to feminist thought, is put forth. Suggestions for the field's future relationship to Bowen theory are also discussed.
Raykov, Tenko; Marcoulides, George A
2016-04-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete nature of the observed items. Two distinct observational equivalence approaches are outlined that render the item response models from corresponding classical test theory-based models, and can each be used to obtain the former from the latter models. Similarly, classical test theory models can be obtained from corresponding item response models by applying either of those approaches in reverse.
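One concrete way to see the kind of equivalence the abstract describes is to dichotomize a continuous classical-test-theory score (true score plus normal error) at a threshold: the resulting probability of a correct response follows a two-parameter normal-ogive item response curve. The sketch below checks the analytic curve against a simulation of the underlying CTT process; all parameter values are hypothetical and the construction is illustrative, not the authors' specific derivation.

```python
import math
import random

def normal_ogive(theta, lam, tau, sigma_e):
    """P(Y=1 | theta) implied by dichotomizing the CTT model
    X = lam*theta + e, with e ~ N(0, sigma_e^2), at threshold tau:
    P = Phi((lam*theta - tau) / sigma_e), a 2-parameter IRT curve."""
    z = (lam * theta - tau) / sigma_e
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Compare the analytic IRT curve with the simulated CTT process.
random.seed(0)
lam, tau, sigma_e, theta = 0.8, 0.3, 1.0, 1.0
n = 200_000
hits = sum(lam * theta + random.gauss(0, sigma_e) > tau for _ in range(n))
print(hits / n, normal_ogive(theta, lam, tau, sigma_e))
```

The simulated proportion and the analytic probability agree closely, illustrating that the "IRT" curve here is nothing more than the thresholded CTT model viewed through the discrete observed item.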
Falling paper: Navier-Stokes solutions, model of fluid forces, and center of mass elevation.
Pesavento, Umberto; Wang, Z Jane
2004-10-01
We investigate the problem of falling paper by solving the two-dimensional Navier-Stokes equations subject to the motion of a free-falling body at Reynolds numbers around 10³. The aerodynamic lift on a tumbling plate is found to be dominated by the product of linear and angular velocities rather than velocity squared, as appropriate for an airfoil. This coupling between translation and rotation provides a mechanism for a brief elevation of the center of mass near the cusp-like turning points. The Navier-Stokes solutions further provide the missing quantity in the classical theory of lift, the instantaneous circulation, and suggest a revised model for the fluid forces.
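The contrast drawn in this abstract, lift scaling as v² for an airfoil versus as the product v·ω for a tumbling plate, can be sketched as two simple scaling formulas. The coefficients CL and CR and all numerical values below are hypothetical; this shows only the scaling, not the authors' fitted force model.

```python
def airfoil_lift(rho, v, chord, CL):
    """Quasi-steady airfoil lift per unit span: L = 0.5 * rho * v**2 * chord * CL."""
    return 0.5 * rho * v**2 * chord * CL

def rotational_lift(rho, v, omega, chord, CR):
    """Rotational (Magnus-like) lift per unit span for a tumbling plate,
    proportional to the product of translational and angular velocity:
    L = CR * rho * v * omega * chord**2.  CR is a hypothetical coefficient."""
    return CR * rho * v * omega * chord**2

rho, chord = 1.2, 0.05   # air density (kg/m^3), plate chord (m) -- illustrative
v, omega = 1.0, 20.0     # translational speed (m/s), tumble rate (rad/s)
print(airfoil_lift(rho, v, chord, CL=1.0))
print(rotational_lift(rho, v, omega, chord, CR=1.0))
```

Note that doubling v doubles the rotational term but quadruples the airfoil term, which is exactly the distinction the Navier-Stokes solutions were used to resolve.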
Problem solving stages in the five square problem
Fedor, Anna; Szathmáry, Eörs; Öllinger, Michael
2015-01-01
According to the restructuring hypothesis, insight problem solving typically progresses through consecutive stages of search, impasse, insight, and search again for those who solve the task. The order of these stages was determined through self-reports of problem solvers and has never been verified behaviorally. We asked whether individual analysis of problem solving attempts of participants revealed the same order of problem solving stages as defined by the theory and whether their subjective feelings corresponded to the problem solving stages they were in. Our participants tried to solve the Five-Square problem in an online task, while we recorded the time and trajectory of their stick movements. After the task they were asked about their feelings related to insight and some of them also had the possibility of reporting impasse while working on the task. We found that the majority of participants did not follow the classic four-stage model of insight, but had more complex sequences of problem solving stages, with search and impasse recurring several times. This means that the classic four-stage model is not sufficient to describe variability on the individual level. We revised the classic model and we provide a new model that can generate all sequences found. Solvers reported insight more often than non-solvers and non-solvers reported impasse more often than solvers, as expected; but participants did not report impasse more often during behaviorally defined impasse stages than during other stages. This shows that impasse reports might be unreliable indicators of impasse. Our study highlights the importance of individual analysis of problem solving behavior to verify insight theory. PMID:26300794
From Freud to Feminist Personality Theory: Getting Here from There.
ERIC Educational Resources Information Center
Lerman, Hannah
Neither Freud's original theories nor modern revisions of psychoanalytic theory serve women well. Because assumptions about the inherent inferiority of women are embedded at the core of the structure of psychoanalytic theory, the theory cannot be adequately revised for women. A new theory is needed which would serve women's interests better.…
Relativistic timescale analysis suggests lunar theory revision
NASA Astrophysics Data System (ADS)
Deines, Steven D.; Williams, Carol A.
1995-05-01
The SI second of the atomic clock was calibrated to match the Ephemeris Time (ET) second in a mutual four year effort between the National Physical Laboratory (NPL) and the United States Naval Observatory (USNO). The ephemeris time is 'clocked' by observing the elapsed time it takes the Moon to cross two positions (usually occultation of stars relative to a position on Earth) and dividing that time span into the predicted seconds according to the lunar equations of motion. The last revision of the equations of motion was the Improved Lunar Ephemeris (ILE), which was based on E. W. Brown's lunar theory. Brown classically derived the lunar equations from a purely Newtonian gravity with no relativistic compensations. However, ET is very theory dependent and is affected by relativity, which was not included in the ILE. To investigate the relativistic effects, a new, noninertial metric for a gravitated, translationally accelerated and rotating reference frame has three sets of contributions, namely (1) Earth's velocity, (2) the static solar gravity field and (3) the centripetal acceleration from Earth's orbit. This last term can be characterized as a pseudogravitational acceleration. This metric predicts a time dilation calculated to be -0.787481 seconds in one year. The effect of this dilation would make the ET timescale run slower than had been originally determined. Interestingly, this value is within 2 percent of the average leap second insertion rate, which is the result of the divergence between International Atomic Time (TAI) and Earth's rotational time called Universal Time (UT or UT1). Because the predictions themselves are significant, regardless of the comparison to TAI and UT, the authors will be rederiving the lunar ephemeris model in the manner of Brown with the relativistic time dilation effects from the new metric to determine a revised, relativistic ephemeris timescale that could be used to determine UT free of leap second adjustments.
Jairath, Nalini N; Peden-McAlpine, Cynthia J; Sullivan, Mary C; Vessey, Judith A; Henly, Susan J
Articles from three landmark symposia on theory for nursing, published in Nursing Research in 1968-1969, served as a key underpinning for the development of nursing as an academic discipline. The current special issue on Theory and Theorizing in Nursing Science celebrates the 50th anniversary of publication of these seminal works in nursing theory. The purpose of this commentary is to consider the future of nursing theory development in light of articles published in the anniversary issue. The Editorial Team for the special issue identified core questions about continued nursing theory development, as related to the nursing metaparadigm, practice theory, big data, and doctoral education. Using a dialogue format, the editors discussed these core questions. The classic nursing metaparadigm (health, person, environment, nursing) was viewed as a continuing unifying element for the discipline but is in need of revision in today's scientific and practice climates. Practice theory and precision healthcare jointly arise from an emphasis on individualization. Big data and the methods of e-science are challenging the assumptions on which nursing theory development was originally based. Doctoral education for nursing scholarship requires changes to ensure that tomorrow's scholars are prepared to steward the discipline by advancing (not reifying) past approaches to nursing theory. Ongoing reexamination of theory is needed to clarify the domain of nursing, guide nursing science and practice, and direct and communicate the unique and essential contributions of nursing science to the broader health research effort and of nursing to healthcare.
ERIC Educational Resources Information Center
Raykov, Tenko; Marcoulides, George A.
2016-01-01
The frequently neglected and often misunderstood relationship between classical test theory and item response theory is discussed for the unidimensional case with binary measures and no guessing. It is pointed out that popular item response models can be directly obtained from classical test theory-based models by accounting for the discrete…
Fundamental theories of waves and particles formulated without classical mass
NASA Astrophysics Data System (ADS)
Fry, J. L.; Musielak, Z. E.
2010-12-01
Quantum and classical mechanics are two conceptually and mathematically different theories of physics, and yet they do use the same concept of classical mass that was originally introduced by Newton in his formulation of the laws of dynamics. In this paper, physical consequences of using the classical mass by both theories are explored, and a novel approach that allows formulating fundamental (Galilean invariant) theories of waves and particles without formally introducing the classical mass is presented. In this new formulation, the theories depend only on one common parameter called 'wave mass', which is deduced from experiments for selected elementary particles and for the classical mass of one kilogram. It is shown that quantum theory with the wave mass is independent of the Planck constant and that higher accuracy of performing calculations can be attained by such theory. Natural units in connection with the presented approach are also discussed and justification beyond dimensional analysis is given for the particular choice of such units.
The contrasting roles of Planck's constant in classical and quantum theories
NASA Astrophysics Data System (ADS)
Boyer, Timothy H.
2018-04-01
We trace the historical appearance of Planck's constant in physics, and we note that initially the constant did not appear in connection with quanta. Furthermore, we emphasize that Planck's constant can appear in both classical and quantum theories. In both theories, Planck's constant sets the scale of atomic phenomena. However, the roles played in the foundations of the theories are sharply different. In quantum theory, Planck's constant is crucial to the structure of the theory. On the other hand, in classical electrodynamics, Planck's constant is optional, since it appears only as the scale factor for the (homogeneous) source-free contribution to the general solution of Maxwell's equations. Since classical electrodynamics can be solved while taking the homogeneous source-free contribution in the solution as zero or non-zero, there are naturally two different theories of classical electrodynamics, one in which Planck's constant is taken as zero and one where it is taken as non-zero. The textbooks of classical electromagnetism present only the version in which Planck's constant is taken to vanish.
Acoustic and Seismic Dispersion in Complex Fluids and Solids
NASA Astrophysics Data System (ADS)
Goddard, Joe
2017-04-01
The first part of the present paper is the continuation of a previous work [3] on the effects of higher spatial gradients and temporal relaxation on stress and heat flux in complex fluids. In particular, the general linear theory is applied to acoustic dispersion, extending a simpler model proposed by Davis and Brenner [2]. The theory is applied to a linearized version of the Chapman-Enskog fluid [1] valid to terms of Burnett order and including Maxwell-Cattaneo relaxation of stress and heat flux on relaxation time scales τ. For this model, the dispersion relation k(ω) giving spatial wave number k as function of temporal frequency ω is a cubic in k2, in contrast to the quadratic in k2 given by the classical model and the recently proposed modification [2]. The cubic terms are shown to be important only for ωτ = O(1) where Maxwell-Cattaneo relaxation is also important. As a second part of the present work, it is shown how the above model can also be applied to isotropic solids, where both shear and pressure waves are important. Finally, consideration is given to hyperstress in micropolar continua, including both graded and micro-morphic varieties. [1] S. Chapman and T. Cowling. The mathematical theory of non-uniform gases. Cambridge University Press, [Cambridge, UK], 1960. [2] A. M. J. Davis and H. Brenner. Thermal and viscous effects on sound waves: revised classical theory. J. Acoust. Soc. Am., 132(5):2963-9, 2012. [3] J. D. Goddard. On material velocities and non-locality in the thermo-mechanics of continua. Int. J. Eng. Sci., 48(11):1279-88, 2010.
Taking-On: A Grounded Theory of Addressing Barriers in Task Completion
ERIC Educational Resources Information Center
Austinson, Julie Ann
2011-01-01
This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…
Knowledge-Directed Theory Revision
NASA Astrophysics Data System (ADS)
Ali, Kamal; Leung, Kevin; Konik, Tolga; Choi, Dongkyu; Shapiro, Dan
Using domain knowledge to speed up learning is widely accepted, but theory revision of such knowledge continues to use general syntactic operators. Using such operators for theory revision of teleoreactive logic programs is especially expensive in domains where proving a top-level goal involves playing a game. In such contexts, one should have the option to complement general theory revision with domain-specific knowledge. Using American football as an example, we use Icarus' multi-agent teleoreactive logic programming ability to encode a coach agent whose concepts correspond to faults recognized in execution of the play and whose skills correspond to making repairs in the goals of the player agents. Our results show effective learning using as few as twenty examples. We also show that structural changes made by such revision can produce performance gains that cannot be matched by doing only numeric optimization.
On the classic and modern theories of matching.
McDowell, J J
2005-07-01
Classic matching theory, which is based on Herrnstein's (1961) original matching equation and includes the well-known quantitative law of effect, is almost certainly false. The theory is logically inconsistent with known experimental findings, and experiments have shown that its central constant-k assumption is not tenable. Modern matching theory, which is based on the power function version of the original matching equation, remains tenable, although it has not been discussed or studied extensively. The modern theory is logically consistent with known experimental findings, it predicts the fact and details of the violation of the classic theory's constant-k assumption, and it accurately describes at least some data that are inconsistent with the classic theory.
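The two theories contrasted in this abstract can be written down directly: classic matching theory rests on Herrnstein's hyperbola with a constant k, while modern matching theory uses the power-function (generalized) matching relation. The sketch below uses illustrative parameter values only, not fitted data.

```python
def classic_matching(R, k, Re):
    """Herrnstein's hyperbola (classic matching theory):
    B = k * R / (R + Re), where k is assumed constant across conditions,
    R is reinforcement rate, and Re is background reinforcement."""
    return k * R / (R + Re)

def modern_matching(B2, R1, R2, a, b):
    """Power-function (generalized) matching: B1/B2 = b * (R1/R2)**a,
    so B1 = B2 * b * (R1/R2)**a, with sensitivity a and bias b."""
    return B2 * b * (R1 / R2) ** a

# Illustrative numbers only -- not drawn from any experiment.
print(classic_matching(R=60, k=100, Re=20))           # 75.0
print(modern_matching(B2=30, R1=60, R2=20, a=0.8, b=1.0))
```

The constant-k assumption means the hyperbola's asymptote must not vary with experimental conditions; the experimental finding that it does vary is the violation the abstract says the power-function form predicts.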
Classical Field Theory and the Stress-Energy Tensor
NASA Astrophysics Data System (ADS)
Swanson, Mark S.
2015-09-01
This book is a concise introduction to the key concepts of classical field theory for beginning graduate students and advanced undergraduate students who wish to study the unifying structures and physical insights provided by classical field theory without dealing with the additional complication of quantization. In that regard, there are many important aspects of field theory that can be understood without quantizing the fields. These include the action formulation, Galilean and relativistic invariance, traveling and standing waves, spin angular momentum, gauge invariance, subsidiary conditions, fluctuations, spinor and vector fields, conservation laws and symmetries, and the Higgs mechanism, all of which are often treated briefly in a course on quantum field theory. The variational form of classical mechanics and continuum field theory are both developed in the time-honored graduate level text by Goldstein et al (2001). An introduction to classical field theory from a somewhat different perspective is available in Soper (2008). Basic classical field theory is often treated in books on quantum field theory. Two excellent texts where this is done are Greiner and Reinhardt (1996) and Peskin and Schroeder (1995). Green's function techniques are presented in Arfken et al (2013).
A quantum-classical theory with nonlinear and stochastic dynamics
NASA Astrophysics Data System (ADS)
Burić, N.; Popović, D. B.; Radonjić, M.; Prvanović, S.
2014-12-01
The method of constrained dynamical systems on the quantum-classical phase space is utilized to develop a theory of quantum-classical hybrid systems. Effects of the classical degrees of freedom on the quantum part are modeled using an appropriate constraint, and the interaction also includes the effects of neglected degrees of freedom. Dynamical law of the theory is given in terms of nonlinear stochastic differential equations with Hamiltonian and gradient terms. The theory provides a successful dynamical description of the collapse during quantum measurement.
Better Instructional Design Theory: Process Improvement or Reengineering?
ERIC Educational Resources Information Center
Dick, Walter
1997-01-01
Discusses three ways that instructional design theories can change over time: (1) revision via evolution of models to reflect the outcomes that are being achieved with its current use; (2) revision to reflect current understanding of technology; and (3) complete replacement of present theory with another more powerful theory. Describes the…
NASA Astrophysics Data System (ADS)
Gao, Zhi-yu; Kang, Yu; Li, Yan-shuai; Meng, Chao; Pan, Tao
2018-04-01
Elevated-temperature flow behavior of a novel Ni-Cr-Mo-B ultra-heavy-plate steel was investigated by conducting hot compressive deformation tests on a Gleeble-3800 thermo-mechanical simulator over a temperature range of 1123 K to 1423 K, with strain rates from 0.01 s⁻¹ to 10 s⁻¹ and a height reduction of 70%. Based on the experimental results, a classic strain-compensated Arrhenius-type model, a new revised strain-compensated Arrhenius-type model, and a classic modified Johnson-Cook constitutive model were developed for predicting the high-temperature deformation behavior of the steel. The predictability of these models was comparatively evaluated in terms of statistical parameters including the correlation coefficient (R), average absolute relative error (AARE), average root mean square error (RMSE), normalized mean bias error (NMBE), and relative error. The statistical results indicate that the new revised strain-compensated Arrhenius-type model gives accurate predictions of elevated-temperature flow stress for the steel over the entire range of process conditions. However, the values predicted by the classic modified Johnson-Cook model did not agree well with the experimental values; the classic strain-compensated Arrhenius-type model tracked the deformation behavior more accurately than the modified Johnson-Cook model, but less accurately than the new revised strain-compensated Arrhenius-type model. In addition, reasons for the differences in predictability of these models are discussed in detail.
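The model comparison in this abstract rests on standard goodness-of-fit statistics. As a minimal sketch, the function below computes three of the metrics named there (R, AARE, RMSE) from measured and predicted values; the flow-stress numbers are invented for illustration, not data from the study.

```python
import math

def fit_statistics(measured, predicted):
    """Goodness-of-fit metrics for constitutive-model evaluation:
    correlation coefficient R, average absolute relative error (AARE, %),
    and root mean square error (RMSE)."""
    n = len(measured)
    me, pe = sum(measured) / n, sum(predicted) / n
    cov = sum((m - me) * (p - pe) for m, p in zip(measured, predicted))
    sm = math.sqrt(sum((m - me) ** 2 for m in measured))
    sp = math.sqrt(sum((p - pe) ** 2 for p in predicted))
    R = cov / (sm * sp)
    aare = 100 / n * sum(abs((m - p) / m) for m, p in zip(measured, predicted))
    rmse = math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)
    return R, aare, rmse

# Hypothetical flow-stress values (MPa) at a few strain levels.
measured = [120.0, 150.0, 170.0, 185.0]
predicted = [118.0, 153.0, 168.0, 188.0]
R, aare, rmse = fit_statistics(measured, predicted)
print(f"R={R:.4f}  AARE={aare:.2f}%  RMSE={rmse:.2f} MPa")
```

A high R with a low AARE and RMSE is the pattern the authors report for the revised Arrhenius-type model; R alone can be misleading because a biased model can still correlate well, which is why several metrics are compared.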
Midgley, Nicholas
2006-01-01
Psychoanalysts have long recognized the complex interaction between clinical data and formal psychoanalytic theories. While clinical data are often used to provide "evidence" for psychoanalytic paradigms, the theoretical model used by the analyst also structures what can and cannot be seen in the data. This delicate interaction between theory and clinical data can be seen in the history of interpretations of Freud's "Analysis of a Phobia in a Five-Year-Old Boy" ("Little Hans"). Freud himself revised his reading of the case in 1926, after which a number of psychoanalysts--including Melanie Klein, Jacques Lacan, and John Bowlby--reinterpreted the case in the light of their particular models of the mind. These analysts each found "evidence" for their theoretical model within this classic case study, and in doing so they illuminated aspects of the case that had previously been obscured, while also revealing a great deal about the shifting preoccupations of psychoanalysis as a field.
From Wald to Savage: homo economicus becomes a Bayesian statistician.
Giocoli, Nicola
2013-01-01
Bayesian rationality is the paradigm of rational behavior in neoclassical economics. An economic agent is deemed rational when she maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. Savage's acknowledged failure to achieve a reinterpretation of traditional inference techniques along subjectivist and behaviorist lines raises the puzzle of how a failed project in statistics could turn into such a big success in economics. Possible answers call into play the emphasis on consistency requirements in neoclassical theory and the impact of the postwar transformation of U.S. business schools. © 2012 Wiley Periodicals, Inc.
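The belief-revision step that defines Bayesian rationality here is simply Bayes's rule applied over a set of hypotheses; a minimal sketch with made-up prior and likelihood numbers:

```python
def bayes_update(prior, likelihood):
    """Revise beliefs over hypotheses by Bayes's rule.

    prior: dict hypothesis -> P(H)
    likelihood: dict hypothesis -> P(data | H)
    Returns the posterior dict P(H | data).
    """
    evidence = sum(prior[h] * likelihood[h] for h in prior)  # P(data)
    return {h: prior[h] * likelihood[h] / evidence for h in prior}

# An agent initially indifferent between two hypotheses observes data
# that is more likely under H (0.8) than under not-H (0.3):
posterior = bayes_update({"H": 0.5, "not-H": 0.5}, {"H": 0.8, "not-H": 0.3})
```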
The Situation-Specific Theory of Heart Failure Self-Care: Revised and Updated.
Riegel, Barbara; Dickson, Victoria Vaughan; Faulkner, Kenneth M
2016-01-01
Since the situation-specific theory of heart failure (HF) self-care was published in 2008, we have learned much about how and why patients with HF take care of themselves. This knowledge was used to revise and update the theory. The purpose of this article was to describe the revised, updated situation-specific theory of HF self-care. Three major revisions were made to the existing theory: (1) a new theoretical concept reflecting the process of symptom perception was added; (2) each self-care process now involves both autonomous and consultative elements; and (3) a closer link between the self-care processes and the naturalistic decision-making process is described. In the revised theory, HF self-care is defined as a naturalistic decision-making process with person, problem, and environmental factors that influence the everyday decisions made by patients and the self-care actions taken. The first self-care process, maintenance, captures those behaviors typically referred to as treatment adherence. The second self-care process, symptom perception, involves body listening, monitoring signs, as well as recognition, interpretation, and labeling of symptoms. The third self-care process, management, is the response to symptoms when they occur. A total of 5 assumptions and 8 testable propositions are specified in this revised theory. Prior research illustrates that all 3 self-care processes (ie, maintenance, symptom perception, and management) are integral to self-care. Further research is greatly needed to identify how best to help patients become experts in HF self-care.
Generalized classical and quantum signal theories
NASA Astrophysics Data System (ADS)
Rundblad, E.; Labunets, V.; Novak, P.
2005-05-01
In this paper we develop two topics and show their inter- and cross-relation. The first centers on general notions of the generalized classical signal theory on finite Abelian hypergroups. The second concerns the generalized quantum hyperharmonic analysis of quantum signals (Hermitean operators associated with classical signals). We study classical and quantum generalized convolution hypergroup algebras of classical and quantum signals.
All in the Family: A Belated Response to Knudson-Martin's Feminist Revision of Bowen Theory
ERIC Educational Resources Information Center
Horne, K. Blake; Hicks, Mary W.
2002-01-01
The first formal attempt at revising Bowen theory within the marriage and family therapy literature is represented in the work of Knudson-Martin (1994). Claiming that several of the theory's concepts are defined at odds with female development, Knudson-Martin (1994) reconceptualizes and expands Bowen theory to rectify these perceived shortcomings.…
Agnati, L F; Fuxe, K; Baluska, F; Guidolin, D
2009-08-01
Recently a revision of the cell theory has been proposed, which has several implications for both physiology and pathology. This revision is founded on adapting Julius von Sachs's old proposal (1892) of the Energide as the fundamental universal unit of eukaryotic life. This view maintains that, in most instances, the living unit is the symbiotic assemblage of the cell periphery complex organized around the plasma membrane, some peripheral semi-autonomous cytosolic organelles (such as mitochondria and plastids, which may or may not be present), and the Energide (formed by the nucleus, microtubules, and other satellite structures). A fundamental aspect is the proposal that the Energide plays a pivotal, organizing role in the entire symbiotic assemblage (see Appendix 1). The present paper discusses how the Energide paradigm implies a revision of the concept of the internal milieu: the Energide interacts with the cytoplasm, which in turn interacts with the interstitial fluid, and hence with the medium that has classically been known as the internal milieu. Some implications of this view are also presented with the help of a computational model in a mathematical appendix (Appendix 2). Finally, the relevance of the Energide concept for information handling in the central nervous system is discussed, especially in relation to the inter-Energide exchange of information.
Introduction to Classical Density Functional Theory by a Computational Experiment
ERIC Educational Resources Information Center
Jeanmairet, Guillaume; Levy, Nicolas; Levesque, Maximilien; Borgis, Daniel
2014-01-01
We propose an in silico experiment to introduce the classical density functional theory (cDFT). Density functional theories, whether quantum or classical, rely on abstract concepts that are nonintuitive; however, they are at the heart of powerful tools and active fields of research in both physics and chemistry. They led to the 1998 Nobel Prize in…
Design Equations and Criteria of Orthotropic Composite Panels
2013-05-01
NSWCCD-65-TR–2004/16A. Appendix A, Classical Laminate Theory (CLT): In Section 6 of this report, preliminary design … determined using Classical Laminate Theory (CLT) to predict equivalent stiffness characteristics and first-ply strength. Note: CLT is valid for…
k-Cosymplectic Classical Field Theories: Tulczyjew and Skinner-Rusk Formulations
NASA Astrophysics Data System (ADS)
Rey, Angel M.; Román-Roy, Narciso; Salgado, Modesto; Vilariño, Silvia
2012-06-01
The k-cosymplectic Lagrangian and Hamiltonian formalisms of first-order classical field theories are reviewed and completed. In particular, they are stated for singular and almost-regular systems. Subsequently, several alternative formulations for k-cosymplectic first-order field theories are developed: First, generalizing the construction of Tulczyjew for mechanics, we give a new interpretation of the classical field equations. Second, the Lagrangian and Hamiltonian formalisms are unified by giving an extension of the Skinner-Rusk formulation on classical mechanics.
NASA Astrophysics Data System (ADS)
Frodl, Peter
Do Quantum Mechanics Force Us to Drastically Change Our View of the World? Thoughts and Experiments after Einstein, Podolsky and Rosen. Since the advent of quantum mechanics there have been attempts to interpret it as a statistical theory of ensembles of individual classical systems. The conditions under which hidden-variable theories giving deterministic descriptions of these individual systems can be regarded as classical were formulated by Einstein, Podolsky, and Rosen in 1935: (1) physical systems are in principle separable; (2) if it is possible to predict with certainty the value of a physical quantity without disturbing the system under consideration, then there exists an element of physical reality corresponding to this physical quantity. Taken together, as Bell showed in 1964, these conditions are in principle incompatible with quantum mechanics and untenable in view of more recent experiments, which once again establish quantum mechanics as the correct theory. To understand these results, we must either give up the assumption, taken for granted in classical physics, that physical systems are separable, or revise our concept of physical reality. An examination of the concept of separability and some considerations on the problem of measuring observables show that a change in the concept of physical reality is unavoidable. The revised concept of reality should be compatible with both classical physics and quantum mechanics, so as to allow a unified picture of the world.
Model-Selection Theory: The Need for a More Nuanced Picture of Use-Novelty and Double-Counting
Steele, Katie; Werndl, Charlotte
2018-01-01
This article argues that common intuitions regarding (a) the specialness of 'use-novel' data for confirmation and (b) that this specialness implies the 'no-double-counting rule', which says that data used in 'constructing' (calibrating) a model cannot also play a role in confirming the model's predictions, are too crude. The intuitions in question are pertinent in all the sciences, but we appeal to a climate science case study to illustrate what is at stake. Our strategy is to analyse the intuitive claims in light of prominent accounts of confirmation of model predictions. We show that on the Bayesian account of confirmation, and also on the standard classical hypothesis-testing account, claims (a) and (b) are not generally true; but for some select cases, it is possible to distinguish data used for calibration from use-novel data, where only the latter confirm. The more specialized classical model-selection methods, on the other hand, uphold a nuanced version of claim (a), but this comes apart from (b), which must be rejected in favour of a more refined account of the relationship between calibration and confirmation. Thus, depending on the framework of confirmation, either the scope or the simplicity of the intuitive position must be revised. 1 Introduction 2 A Climate Case Study 3 The Bayesian Method vis-à-vis Intuitions 4 Classical Tests vis-à-vis Intuitions 5 Classical Model-Selection Methods vis-à-vis Intuitions 5.1 Introducing classical model-selection methods 5.2 Two cases 6 Re-examining Our Case Study 7 Conclusion
Melanie Klein and Repression: an examination of some unpublished Notes of 1934.
Hinshelwood, R D
2006-01-01
Fifteen pages of unpublished Notes were found in the Melanie Klein Archives dating from early 1934, a crucial moment in Klein's development. She was at this time moving away from child analysis, while also rethinking and revising her allegiance to Karl Abraham's theory of the phases of libidinal development. These Notes, entitled "Early Repression Mechanism," show Klein struggling to develop what became her characteristic theories of the depressive position and the paranoid-schizoid position. Although these Notes are precursors of the paper Klein gave later to the IPA Congress in 1934, they also show the origins of the emphasis she and her followers eventually gave to "splitting" rather than repression. The Notes give us an insight into the way that she worked clinically at the time. We see Klein's confidence develop as she diverged from the classical theories and technique. Her ideas were based on close attention to the detail of her clinical material, rather than on attacking theoretical problems directly. The Notes show her method of struggling through to her own conclusions, and they offer us a chance to grasp the roots of the subsequent controversy over Kleinian thought.
Generalized probability theories: what determines the structure of quantum theory?
NASA Astrophysics Data System (ADS)
Janotta, Peter; Hinrichsen, Haye
2014-08-01
The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.
The Process and Effects of Mass Communication. Revised Edition.
ERIC Educational Resources Information Center
Schramm, Wilbur, Ed.; Roberts, Donald F., Ed.
Composed of a mixture of old classics, new classics, reports on state of the art in important areas, and speculations about the future, this second edition of the reader in communication research provides an introduction to questions about how communication works and what it does. Papers by prominent researchers and writers in the field comprise…
de Bock, Élodie; Hardouin, Jean-Benoit; Blanchin, Myriam; Le Neel, Tanguy; Kubis, Gildas; Bonnaud-Antignac, Angélique; Dantan, Étienne; Sébille, Véronique
2016-10-01
The objective was to compare classical test theory and Rasch-family models derived from item response theory for the analysis of longitudinal patient-reported outcomes data with possibly informative intermittent missing items. A simulation study was performed in order to assess and compare the performance of classical test theory and the Rasch model in terms of bias, control of the type I error, and power of the test of a time effect. The type I error was controlled for both classical test theory and the Rasch model whether data were complete or some items were missing. Both methods were unbiased and displayed similar power with complete data. When items were missing, the Rasch model remained unbiased and displayed higher power than classical test theory. The Rasch model performed better than the classical test theory approach for the analysis of longitudinal patient-reported outcomes with possibly informative intermittent missing items, mainly in terms of power. This study highlights the interest of Rasch-based models in clinical research and epidemiology for the analysis of incomplete patient-reported outcomes data. © The Author(s) 2013.
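The Rasch model compared above gives the probability of endorsing a dichotomous item from the difference between a person parameter θ and an item difficulty b. A minimal sketch of that response model (the paper's simulation design is not reproduced here; the parameter values below are illustrative):

```python
import math
import random

def rasch_prob(theta, b):
    """Rasch (one-parameter logistic) probability of endorsing an item:
    P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def simulate_responses(theta, difficulties, rng):
    """Simulate one person's dichotomous item responses under the Rasch model."""
    return [1 if rng.random() < rasch_prob(theta, b) else 0 for b in difficulties]

rng = random.Random(0)
responses = simulate_responses(theta=0.5, difficulties=[-1.0, 0.0, 1.0], rng=rng)
```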
Telling and Not-Telling: A Classic Grounded Theory of Sharing Life-Stories
ERIC Educational Resources Information Center
Powers, Trudy Lee
2013-01-01
This study of "Telling and Not-Telling" was conducted using the classic grounded theory methodology (Glaser 1978, 1992, 1998; Glaser & Strauss, 1967). This unique methodology systematically and inductively generates conceptual theories from data. The goal is to discover theory that explains, predicts, and provides practical…
ERIC Educational Resources Information Center
David, Jane L.; Cuban, Larry
2010-01-01
"Cutting Through the Hype: The Essential Guide to School Reform" is a revised, expanded, and updated version of the classic work by Jane L. David and Larry Cuban. It offers balanced analyses of 23 currently popular school reform strategies, from teacher performance pay and putting mayors in charge to turnaround schools and data-driven instruction.…
Towards classical spectrum generating algebras for f-deformations
NASA Astrophysics Data System (ADS)
Kullock, Ricardo; Latini, Danilo
2016-01-01
In this paper we revise the classical analog of f-oscillators, a generalization of q-oscillators given in Man'ko et al. (1997) [8], in the framework of classical spectrum generating algebras (SGA) introduced in Kuru and Negro (2008) [9]. We write down the deformed Poisson algebra characterizing the entire family of non-linear oscillators and construct its general solution algebraically. The latter, covering the full range of f-deformations, shows an energy dependence both in the amplitude and the frequency of the motion.
Risitano, Salvatore; Sabatini, Luigi; Atzori, Francesco; Massè, Alessandro; Indelli, Pier Francesco
2018-06-01
Periprosthetic joint infection (PJI) is a serious complication of total knee arthroplasty (TKA) and represents one of the most common causes of revision. The challenge for surgeons treating an infected TKA is to quickly obtain an infection-free joint in order to re-implant, when possible, a new TKA. Recent literature confirms the role of local antibiotic-loaded beads as a strong bactericidal agent, allowing higher antibiotic elution when compared with antibiotic-loaded spacers alone. Unfortunately, classical polymethylmethacrylate (PMMA) beads may allow bacterial adhesion and secondary development of antibiotic resistance, and eventually require surgical removal once the antibiotics have eluted. This article describes a novel surgical technique using static, custom-made antibiotic-loaded spacers augmented by calcium sulphate antibiotic-impregnated beads to improve the success rate of revision TKA in the setting of PJI. The use of calcium sulphate beads has several potential benefits, including longer sustained local antibiotic release when compared with classical PMMA beads and, being resorbable, not requiring accessory surgical interventions.
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
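As a concrete example of the finite-size effects this review surveys: for a binary symmetric channel, the well-known second-order ("normal") approximation expresses the best achievable coding rate at block length n and error probability ε as roughly C − sqrt(V/n)·Q⁻¹(ε), where C is capacity and V the channel dispersion. A sketch under that assumption (the specific parameter values are illustrative):

```python
import math
from statistics import NormalDist

def binary_entropy(p):
    """Binary entropy h(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_normal_approx_rate(p, n, eps):
    """Second-order (normal) approximation of the maximal coding rate for a
    binary symmetric channel with crossover probability p, block length n,
    and target error probability eps."""
    capacity = 1 - binary_entropy(p)                         # C, bits/channel use
    dispersion = p * (1 - p) * math.log2((1 - p) / p) ** 2   # channel dispersion V
    q_inv = NormalDist().inv_cdf(1 - eps)                    # Q^{-1}(eps)
    return capacity - math.sqrt(dispersion / n) * q_inv

# At finite block length the achievable rate falls short of capacity,
# and the gap shrinks like 1/sqrt(n):
rate = bsc_normal_approx_rate(p=0.11, n=2000, eps=1e-3)
```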
Koopman-von Neumann formulation of classical Yang-Mills theories: I
NASA Astrophysics Data System (ADS)
Carta, P.; Gozzi, E.; Mauro, D.
2006-03-01
In this paper we present the Koopman-von Neumann (KvN) formulation of classical non-Abelian gauge field theories. In particular we shall explore the functional (or classical path integral) counterpart of the KvN method. In the quantum path integral quantization of Yang-Mills theories concepts like gauge-fixing and Faddeev-Popov determinant appear in a quite natural way. We will prove that these same objects are needed also in this classical path integral formulation for Yang-Mills theories. We shall also explore the classical path integral counterpart of the BFV formalism and build all the associated universal and gauge charges. These last are quite different from the analog quantum ones and we shall show the relation between the two. This paper lays the foundation of this formalism which, due to the many auxiliary fields present, is rather heavy. Applications to specific topics outlined in the paper will appear in later publications.
a Classical Isodual Theory of Antimatter and its Prediction of Antigravity
NASA Astrophysics Data System (ADS)
Santilli, Ruggero Maria
An inspection of the contemporary physics literature reveals that, while matter is treated at all levels of study, from Newtonian mechanics to quantum field theory, antimatter is solely treated at the level of second quantization. For the purpose of initiating the restoration of full equivalence in the treatment of matter and antimatter in due time, and as the classical foundations of an axiomatically consistent inclusion of gravitation in unified gauge theories recently appeared elsewhere, in this paper we present a classical representation of antimatter which begins at the primitive Newtonian level with corresponding formulations at all subsequent levels. By recalling that charge conjugation of particles into antiparticles is antiautomorphic, the proposed theory of antimatter is based on a new map, called isoduality, which is also antiautomorphic (and more generally, antiisomorphic), yet it is applicable beginning at the classical level and then persists at the quantum level where it becomes equivalent to charge conjugation. We therefore present, apparently for the first time, the classical isodual theory of antimatter, we identify the physical foundations of the theory as being the novel isodual Galilean, special and general relativities, and we show the compatibility of the theory with all available classical experimental data on antimatter. We identify the classical foundations of the prediction of antigravity for antimatter in the field of matter (or vice-versa) without any claim on its validity, and defer its resolution to specifically identified experiments. We identify the novel, classical, isodual electromagnetic waves which are predicted to be emitted by antimatter, the so-called space-time machine based on a novel non-Newtonian geometric propulsion, and other implications of the theory. 
We also introduce, apparently for the first time, the isodual space and time inversions and show that they are nontrivially different from the conventional ones, thus offering a possibility for the future resolution of whether far-away galaxies and quasars are made up of matter or of antimatter. The paper ends by indicating that these studies are in their infancy and by listing some of the open problems. To avoid a prohibitive length, the paper is restricted to the classical treatment, while studies on operator profiles are treated elsewhere.
Classical Mechanics: A Modern Introduction
NASA Astrophysics Data System (ADS)
McCall, Martin W.
2000-12-01
Classical Mechanics is a clear introduction to the subject, combining a user-friendly style with an authoritative approach, whilst requiring minimal prerequisite mathematics - only elementary calculus and simple vectors are presumed. The text starts with a careful look at Newton's Laws, before applying them in one dimension to oscillations and collisions. More advanced applications - including gravitational orbits, rigid body dynamics and mechanics in rotating frames - are deferred until after the limitations of Newton's inertial frames have been highlighted through an exposition of Einstein's Special Relativity. The examples given throughout are often unusual for an elementary text, although they are made accessible through discussion and diagrams. Complete revision summaries are given at the end of each chapter, together with problems designed to be both illustrative and challenging. Features: * Comprehensive introduction to classical mechanics and relativity * Many novel examples, e.g. stability of the universe, falling cats, cricket bats and snooker * Includes many problems with numerical answers * Revision notes at the end of each chapter
ERIC Educational Resources Information Center
Gee, James Paul
2007-01-01
The author begins his classic book with "I want to talk about video games--yes, even violent video games--and say some positive things about them." With this simple but explosive statement, one of America's most well-respected educators looks seriously at the good that can come from playing video games. In this revised edition, new games like…
Dragesund, Tove; Strand, Liv Inger; Grotle, Margreth
2018-02-01
The Body Awareness Rating Questionnaire (BARQ) is a self-report questionnaire aimed at capturing how people with long-lasting musculoskeletal pain reflect on their own body awareness. Methods based on classical test theory were applied to the development of the instrument and resulted in 4 subscales. However, the scales were not correlated, and construct validity might be questioned. The primary purpose of this study was to explore the possibility of developing a unidimensional scale from items initially collected for the BARQ using Rasch analysis. A secondary purpose was to investigate the test-retest reliability of a revised version of the BARQ. This was a methodological study. Rasch and reliability analyses were performed for 3 samples of participants with long-lasting musculoskeletal pain. The first Rasch analysis was carried out on 66 items generated for the original BARQ and scored by 300 participants. The items supported by the first analysis were scored by a new group of 127 participants and analyzed in a second Rasch analysis. For the test-retest reliability analysis, 48 participants scored the revised BARQ items twice within 1 week. The 2-step Rasch analysis resulted in a unidimensional 12-item revised version of the BARQ with a 4-point response scale (scores from 0 to 36). It showed a good fit to the Rasch model, with acceptable internal consistency, satisfactory fit residuals, and no disordered thresholds. Test-retest reliability was high, with an intraclass correlation coefficient of .83 (95% CI = .71-.89) and a smallest detectable change of 6.3 points. The small sample size in the second Rasch analysis was a study limitation. The revised BARQ is a unidimensional and feasible measurement of body awareness, recommended for use in the context of body-mind physical therapy approaches for musculoskeletal conditions. © 2017 American Physical Therapy Association
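The reliability figures reported in this abstract are tied together by standard formulas: the standard error of measurement is SEM = SD·sqrt(1 − ICC), and the smallest detectable change is SDC = 1.96·sqrt(2)·SEM. A minimal sketch assuming those conventional formulas (the sample SD below is a made-up illustrative value, not taken from the study):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the sample SD and test-retest ICC."""
    return sd * math.sqrt(1.0 - icc)

def smallest_detectable_change(sd, icc):
    """Smallest detectable change at the 95% level: 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# Illustrative values: the reported ICC of .83 and a hypothetical SD of 5.5:
sdc = smallest_detectable_change(sd=5.5, icc=0.83)
```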
The Prediction of Item Parameters Based on Classical Test Theory and Latent Trait Theory
ERIC Educational Resources Information Center
Anil, Duygu
2008-01-01
In this study, the predictive power of item characteristics based on experts' judgments, for conditions in which try-out administrations cannot be conducted, was examined against item characteristics computed under classical test theory and the two-parameter logistic model of latent trait theory. The study was carried out on 9914 randomly selected students…
Any Ontological Model of the Single Qubit Stabilizer Formalism must be Contextual
NASA Astrophysics Data System (ADS)
Lillystone, Piers; Wallman, Joel J.
Quantum computers allow us to easily solve some problems classical computers find hard. Non-classical improvements in computational power should be due to some non-classical property of quantum theory. Contextuality, a more general notion of non-locality, is a necessary, but not sufficient, resource for quantum speed-up. Proofs of contextuality can be constructed for the classically simulable stabilizer formalism. Previous proofs of stabilizer contextuality are known for 2 or more qubits, for example the Mermin-Peres magic square. In the work presented we extend these results and prove that any ontological model of the single qubit stabilizer theory must be contextual, as defined by R. Spekkens, and give a relation between our result and the Mermin-Peres square. By demonstrating that contextuality is present in the qubit stabilizer formalism we provide further insight into the contextuality present in quantum theory. Understanding the contextuality of classical sub-theories will allow us to better identify the physical properties of quantum theory required for computational speed up. This research was supported by CIFAR, the Government of Ontario, and the Government of Canada through NSERC and Industry Canada.
NASA Astrophysics Data System (ADS)
Oblow, E. M.
1982-10-01
An evaluation was made of the mathematical and economic basis for conversion processes in the Long-term Energy Analysis Program (LEAP) energy economy model. Conversion processes are the main modeling subunit in LEAP used to represent energy conversion industries and are supposedly based on the classical economic theory of the firm. Questions about uniqueness and existence of LEAP solutions and their relation to classical equilibrium economic theory prompted the study. An analysis of classical theory and LEAP model equations was made to determine their exact relationship. The conclusions drawn from this analysis were that LEAP theory is not consistent with the classical theory of the firm. Specifically, the capacity factor formalism used by LEAP does not support a classical interpretation in terms of a technological production function for energy conversion processes. The economic implications of this inconsistency are suboptimal process operation and short term negative profits in years where plant operation should be terminated. A new capacity factor formalism, which retains the behavioral features of the original model, is proposed to resolve these discrepancies.
ERIC Educational Resources Information Center
Lange, Elizabeth
2015-01-01
This article argues that sociology has been a foundational discipline for the field of adult education, but it has been largely implicit, until recently. This article contextualizes classical theories of sociology within contemporary critiques, reviews the historical roots of sociology and then briefly introduces the classical theories…
Stott, Clifford; Drury, John
2016-04-01
This article explores the origins and ideology of classical crowd psychology, a body of theory reflected in contemporary popularised understandings such as of the 2011 English 'riots'. This article argues that during the nineteenth century, the crowd came to symbolise a fear of 'mass society' and that 'classical' crowd psychology was a product of these fears. Classical crowd psychology pathologised, reified and decontextualised the crowd, offering the ruling elites a perceived opportunity to control it. We contend that classical theory misrepresents crowd psychology and survives in contemporary understanding because it is ideological. We conclude by discussing how classical theory has been supplanted in academic contexts by an identity-based crowd psychology that restores the meaning to crowd action, replaces it in its social context and in so doing transforms theoretical understanding of 'riots' and the nature of the self. © The Author(s) 2016.
An empirical test of Rogers' original and revised theory of correlates in adolescents.
Yarcheski, A; Mahon, N E
1991-12-01
The purpose of this study was to examine Rogers' original and revised theory of correlates in adolescents. The correlates were measured by Perceived Field Motion, Human Field Rhythms, Creativity, Sentience, Fast Tempo, and Waking Periods. The original theory was tested with data obtained from samples of early (n = 116), middle (n = 116), and late (n = 116) adolescents. The revised theory was tested in a fourth selectively combined sample of adolescents, aged 12 to 21 (n = 89). Data were collected in classroom settings. Although the findings did not support either theory, they did indicate that: (1) four of the six correlates studied performed as correlates when examined in three discrete phases of adolescence, as determined by chronological age, (2) the means of the individual correlates increased slightly in frequency levels developmentally, and (3) the correlates emerged at different frequency levels when examined in adolescents, aged 12 to 21.
Influence of an asymmetric ring on the modeling of an orthogonally stiffened cylindrical shell
NASA Technical Reports Server (NTRS)
Rastogi, Naveen; Johnson, Eric R.
1994-01-01
Structural models are examined for the influence of a ring with an asymmetrical cross section on the linear elastic response of an orthogonally stiffened cylindrical shell subjected to internal pressure. The first structural model employs classical theory for the shell and stiffeners. The second model employs transverse shear deformation theories for the shell and stringer and classical theory for the ring. Closed-end pressure vessel effects are included. Interacting line load intensities are computed in the stiffener-to-skin joints for an example problem having the dimensions of the fuselage of a large transport aircraft. Classical structural theory is found to exaggerate the asymmetric response compared to the transverse shear deformation theory.
Interacting charges and the classical electron radius
NASA Astrophysics Data System (ADS)
De Luca, Roberto; Di Mauro, Marco; Faella, Orazio; Naddeo, Adele
2018-03-01
The equation of motion of a point charge q repelled by a fixed point-like charge Q is derived and studied. In solving this problem, useful concepts in classical and relativistic kinematics, in Newtonian mechanics and in non-linear ordinary differential equations are reviewed. The validity of the approximations is discussed from the physical point of view. In particular, the classical electron radius emerges naturally from the requirement that the initial distance is large enough for the non-relativistic approximation to be valid. The relevance of this topic for undergraduate physics teaching is pointed out.
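A minimal numerical sketch of the energy-conservation argument behind this abstract (a simplified release-from-rest variant with SI CODATA constants, not the paper's own derivation): a charge e released from rest at distance r0 from a fixed charge e reaches an asymptotic speed fixed by the initial potential energy, and the non-relativistic treatment requires r0 to be much larger than the classical electron radius.

```python
import math

# SI constants (CODATA recommended values)
E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
M_E = 9.1093837015e-31       # electron mass, kg
C = 299792458.0              # speed of light, m/s

def classical_electron_radius():
    """r_e = e^2 / (4*pi*eps0 * m_e * c^2), about 2.82e-15 m."""
    return E_CHARGE**2 / (4 * math.pi * EPS0 * M_E * C**2)

def asymptotic_beta(r0):
    """v_inf / c for an electron released from rest at distance r0 from a
    fixed charge e.  Energy conservation gives
    (1/2) m v^2 = e^2 / (4*pi*eps0*r0), i.e. beta = sqrt(2 * r_e / r0),
    so beta << 1 (non-relativistic validity) demands r0 >> r_e."""
    return math.sqrt(2 * classical_electron_radius() / r0)

r_e = classical_electron_radius()
print(f"r_e = {r_e:.4e} m, beta at 1 Angstrom = {asymptotic_beta(1e-10):.3e}")
```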
Personalisation: The Emerging "Revised" Code of Education?
ERIC Educational Resources Information Center
Hartley, David
2007-01-01
In England, a "revised" educational code appears to be emerging. It centres upon the concept of "personalisation". Its basis is less in educational theory, more in contemporary marketing theory. Personalisation can be regarded in two ways. First, it provides the rationale for a new mode of public-service delivery, one which…
ERIC Educational Resources Information Center
Magno, Carlo
2009-01-01
The present report demonstrates the difference between the classical test theory (CTT) and item response theory (IRT) approaches using actual test data from junior high school chemistry students. CTT and IRT were compared across two samples and two forms of test on their item difficulty, internal consistency, and measurement errors. The specific…
ERIC Educational Resources Information Center
Guler, Nese; Gelbal, Selahattin
2010-01-01
In this study, classical test theory and generalizability theory were used to determine the reliability of scores obtained from a measurement tool of mathematics success. Twenty-four open-ended mathematics questions from TIMSS-1999 were administered to 203 students in the 2007 spring semester. The internal consistency of the scores was found to be 0.92. For…
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
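To make the CTT/IRT contrast concrete, here is a minimal sketch of the two-parameter logistic (2PL) item response function, in which an item's discrimination a and difficulty b are modeled as parameters of the item itself rather than of a particular sample (function name and toy values are illustrative, not from the article):

```python
import math

def icc_2pl(theta, a, b):
    """2PL item characteristic curve: probability of a correct response
    given latent ability theta, item discrimination a, and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# At theta == b the success probability is exactly 0.5, whatever a is.
assert abs(icc_2pl(0.7, 1.3, 0.7) - 0.5) < 1e-12
# The curve is monotone in ability: higher theta, higher probability.
assert icc_2pl(1.0, 1.0, 0.0) > icc_2pl(-1.0, 1.0, 0.0)
```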
Cappelleri, Joseph C; Jason Lundy, J; Hays, Ron D
2014-05-01
The US Food and Drug Administration's guidance for industry document on patient-reported outcomes (PRO) defines content validity as "the extent to which the instrument measures the concept of interest" (FDA, 2009, p. 12). According to Strauss and Smith (2009), construct validity "is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity" (p. 7). Hence, both qualitative and quantitative information are essential in evaluating the validity of measures. We review classical test theory and item response theory (IRT) approaches to evaluating PRO measures, including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized "difficulty" (severity) order of items is represented by observed responses. If a researcher has few qualitative data and wants to get preliminary information about the content validity of the instrument, then descriptive assessments using classical test theory should be the first step. As the sample size grows during subsequent stages of instrument development, confidence in the numerical estimates from Rasch and other IRT models (as well as those of classical test theory) would also grow. Classical test theory and IRT can be useful in providing a quantitative assessment of items and scales during the content-validity phase of PRO-measure development. Depending on the particular type of measure and the specific circumstances, the classical test theory and/or the IRT should be considered to help maximize the content validity of PRO measures. Copyright © 2014 Elsevier HS Journals, Inc. All rights reserved.
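As a rough illustration of the classical-test-theory descriptives mentioned in this abstract, the sketch below computes item "difficulty" (proportion endorsing/correct) and the item-rest correlation for a dichotomous scale; the response matrix is invented toy data, not from the review.

```python
from statistics import mean

def item_difficulties(matrix):
    """CTT item 'difficulty': proportion of respondents scoring 1 per item."""
    n = len(matrix)
    return [sum(row[j] for row in matrix) / n for j in range(len(matrix[0]))]

def pearson(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den if den else 0.0

def item_rest_correlations(matrix):
    """Correlation of each item with the total score excluding that item."""
    out = []
    for j in range(len(matrix[0])):
        item = [row[j] for row in matrix]
        rest = [sum(row) - row[j] for row in matrix]
        out.append(pearson(item, rest))
    return out

# Toy 0/1 response matrix: 4 persons x 3 items (invented Guttman-like data).
data = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(item_difficulties(data))       # [0.75, 0.5, 0.25]
print(item_rest_correlations(data))  # all positive for this pattern
```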
Constrained variational calculus for higher order classical field theories
NASA Astrophysics Data System (ADS)
Campos, Cédric M.; de León, Manuel; Martín de Diego, David
2010-11-01
We develop an intrinsic geometrical setting for higher order constrained field theories. As a main tool we use an appropriate generalization of the classical Skinner-Rusk formalism. Some examples of applications are studied, in particular to the geometrical description of optimal control theory for partial differential equations.
Chance, determinism and the classical theory of probability.
Vasudevan, Anubav
2018-02-01
This paper situates the metaphysical antinomy between chance and determinism in the historical context of some of the earliest developments in the mathematical theory of probability. Since Hacking's seminal work on the subject, it has been a widely held view that the classical theorists of probability were guilty of an unwitting equivocation between a subjective, or epistemic, interpretation of probability, on the one hand, and an objective, or statistical, interpretation, on the other. While there is some truth to this account, I argue that the tension at the heart of the classical theory of probability is not best understood in terms of the duality between subjective and objective interpretations of probability. Rather, the apparent paradox of chance and determinism, when viewed through the lens of the classical theory of probability, manifests itself in a much deeper ambivalence on the part of the classical probabilists as to the rational commensurability of causal and probabilistic reasoning. Copyright © 2017 Elsevier Ltd. All rights reserved.
Maintenance of Summer Monsoon Circulations: A Planetary-Scale Perspective.
NASA Astrophysics Data System (ADS)
Chen, Tsing-Chang
2003-06-01
The monsoon circulation, which is generally considered to be driven by the landmass-ocean thermal contrast, like a gigantic land-sea breeze circulation, exhibits a phase reversal in its vertical structure; a monsoon high aloft over a continental thermal low is juxtaposed with a midoceanic trough underlaid by an oceanic anticyclone. This classic monsoon circulation model is well matched by the monsoon circulation depicted with the observational data prior to the First Global Atmospheric Research Program (GARP) Global Experiment (FGGE). However, synthesizing findings of the global circulation portrayed with the post-FGGE data, it was found that some basic features of major monsoon circulations in Asia, North America, South America, and Australia differ from those of the classic monsoon circulation model. Therefore, a revision of the classic monsoon theory is suggested. With four different wave regimes selected to fit the horizontal dimensions of these monsoon circulations, basic features common to all four major monsoons are illustrated in terms of diagnostic analyses of the velocity potential maintenance equation (which relates diabatic heating and velocity potential) and the streamfunction budget (which links velocity potential and streamfunction) in these wave regimes. It is shown that a monsoon circulation is actually driven by the east-west differential heating and maintained dynamically by a balance between a vorticity source and advection. This dynamic balance is reflected by a spatial quadrature relationship between the monsoon divergent circulation and the monsoon high (low) at upper (lower) levels.
Atomic Scale Imaging of Nucleation and Growth Trajectories of an Interfacial Bismuth Nanodroplet.
Li, Yingxuan; Bunes, Benjamin R; Zang, Ling; Zhao, Jie; Li, Yan; Zhu, Yunqing; Wang, Chuanyi
2016-02-23
Because of the lack of experimental evidence, much confusion still exists on the nucleation and growth dynamics of a nanostructure, particularly of metal. The situation is even worse for nanodroplets because it is more difficult to induce the formation of a nanodroplet while imaging the dynamic process with atomic resolution. Here, taking advantage of an electron beam to induce the growth of Bi nanodroplets on a SrBi2Ta2O9 platelet under a high resolution transmission electron microscope (HRTEM), we directly observed the detailed growth pathways of Bi nanodroplets from the earliest stage of nucleation that were previously inaccessible. Atomic scale imaging reveals that the dynamics of nucleation involves a much more complex trajectory than previously predicted based on classical nucleation theory (CNT). The monatomic Bi layer was first formed in the nucleation process, which induced the formation of the prenucleated clusters. Following that, critical nuclei for the nanodroplets formed both directly from the addition of atoms to the prenucleated clusters by the classical growth process and indirectly through transformation of an intermediate liquid film based on the Stranski-Krastanov growth mode, in which the liquid film was induced by the self-assembly of the prenucleated clusters. Finally, the growth of the Bi nanodroplets advanced through the classical pathway and sudden droplet coalescence. This study allows us to visualize the critical steps in the nucleation process of an interfacial nanodroplet, which suggests a revision of the perspective of CNT.
Hojaij, C R
1984-12-01
Organic Brain Syndrome (OBS) is an expression found in the Diagnostic and Statistical Manual of Mental Disorders, belonging to the great chapter of Organic Mental Disorders. With this meaning, it has been used in psychiatric centers outside the United States. Beginning with a review of the major aspects of OBS, a critical revision is formulated from the methodological and conceptual viewpoints of psychopathology. To that end, classic authors from Bonhoeffer to Weitbrecht are reviewed.
Dressing the post-Newtonian two-body problem and classical effective field theory
NASA Astrophysics Data System (ADS)
Kol, Barak; Smolkin, Michael
2009-12-01
We apply a dressed perturbation theory to better organize and economize the computation of high orders of the 2-body effective action of an inspiralling post-Newtonian (PN) gravitating binary. We use the effective field theory approach with the nonrelativistic field decomposition (NRG fields). For that purpose we develop quite generally the dressing theory of a nonlinear classical field theory coupled to pointlike sources. We introduce dressed charges and propagators, but unlike the quantum theory there are no dressed bulk vertices. The dressed quantities are found to obey recursive integral equations which succinctly encode parts of the diagrammatic expansion, and are the classical version of the Schwinger-Dyson equations. Actually, the classical equations are somewhat stronger since they involve only finitely many quantities, unlike the quantum theory. Classical diagrams are shown to factorize exactly when they contain nonlinear worldline vertices, and we classify all the possible topologies of irreducible diagrams for low loop numbers. We apply the dressing program to our post-Newtonian case of interest. The dressed charges consist of the dressed energy-momentum tensor after a nonrelativistic decomposition, and we compute all dressed charges (in the harmonic gauge) appearing up to 2PN in the 2-body effective action (and more). We determine the irreducible skeleton diagrams up to 3PN and we employ the dressed charges to compute several terms beyond 2PN.
Bosonic Loop Diagrams as Perturbative Solutions of the Classical Field Equations in ϕ4-Theory
NASA Astrophysics Data System (ADS)
Finster, Felix; Tolksdorf, Jürgen
2012-05-01
Solutions of the classical ϕ4-theory in Minkowski space-time are analyzed in a perturbation expansion in the nonlinearity. Using the language of Feynman diagrams, the solution of the Cauchy problem is expressed in terms of tree diagrams which involve the retarded Green's function and have one outgoing leg. In order to obtain general tree diagrams, we set up a "classical measurement process" in which a virtual observer of a scattering experiment modifies the field and detects suitable energy differences. By adding a classical stochastic background field, we even obtain all loop diagrams. The expansions are compared with the standard Feynman diagrams of the corresponding quantum field theory.
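The tree-level iteration described in this abstract can be sketched schematically (with a conventional normalization of the coupling; the paper's normalization may differ). The field equation and its retarded integral form are

```latex
(\Box + m^2)\,\phi = -\lambda\,\phi^3, \qquad
\phi(x) = \phi_0(x) \;-\; \lambda \int \mathrm{d}^4y\; G_{\mathrm{ret}}(x,y)\,\phi(y)^3 ,
```

where $\phi_0$ solves the free Cauchy problem and $G_{\mathrm{ret}}$ is the retarded Green's function. Iterating $\phi^{(n+1)} = \phi_0 - \lambda \int G_{\mathrm{ret}}\,(\phi^{(n)})^3$ from $\phi^{(0)} = \phi_0$ generates, order by order in $\lambda$, exactly the tree diagrams with retarded propagators and one outgoing leg.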
How Settings Change People: Applying Behavior Setting Theory to Consumer-Run Organizations
ERIC Educational Resources Information Center
Brown, Louis D.; Shepherd, Matthew D.; Wituk, Scott A.; Meissen, Greg
2007-01-01
Self-help initiatives stand as a classic context for organizational studies in community psychology. Behavior setting theory stands as a classic conception of organizations and the environment. This study explores both, applying behavior setting theory to consumer-run organizations (CROs). Analysis of multiple data sets from all CROs in Kansas…
ERIC Educational Resources Information Center
Langdale, John A.
The construct of "organizational climate" was explicated and various ways of operationalizing it were reviewed. A survey was made of the literature pertinent to the classical-human relations dimension of environmental quality. As a result, it was hypothesized that the appropriateness of the classical and human-relations master plans is moderated…
The evolving Planck mass in classically scale-invariant theories
NASA Astrophysics Data System (ADS)
Kannike, K.; Raidal, M.; Spethmann, C.; Veermäe, H.
2017-04-01
We consider classically scale-invariant theories with non-minimally coupled scalar fields, where the Planck mass and the hierarchy of physical scales are dynamically generated. The classical theories possess a fixed point, where scale invariance is spontaneously broken. In these theories, however, the Planck mass becomes unstable in the presence of explicit sources of scale invariance breaking, such as non-relativistic matter and cosmological constant terms. We quantify the constraints on such classical models from Big Bang Nucleosynthesis that lead to an upper bound on the non-minimal coupling and require trans-Planckian field values. We show that quantum corrections to the scalar potential can stabilise the fixed point close to the minimum of the Coleman-Weinberg potential. The time-averaged motion of the evolving fixed point is strongly suppressed, thus the limits on the evolving gravitational constant from Big Bang Nucleosynthesis and other measurements do not presently constrain this class of theories. Field oscillations around the fixed point, if not damped, contribute to the dark matter density of the Universe.
1988-01-01
antiquity about his own campaigns, provides information on Britain and its early inhabitants and also records Caesar’s successful campaigns in Britain... the Carthaginians’ early success, the famous Battle of Cannae, and Rome’s victory over Hannibal at Zama. Reading this book offers a classical... Machiavelli’s life and times. Oman, Charles William Chadwick, Sir. The Art of War in the Middle Ages, A.D. 378-1515. Revised and edited by John H. Beeler
Zhao, Junning; Ye, Zuguang
2012-08-01
The toxic classification of traditional Chinese medicine, a contribution of traditional Chinese medicine (TCM) to the recognition of medicinal toxicity and the rational use of medicinal materials by Chinese people, is now a major issue for safe medication and for the sustainable development and internationalization of Chinese medicine. In this article, the origination and development of toxic classification theory are summarized and analyzed. Because toxic classification is an urgent issue related to TCM industrialization, modernization and internationalization, this article makes a systematic analysis of the nature and connotation of toxic classification, as well as of risk control for the TCM industry due to medicinal toxicity. Based on toxicity studies, this article makes some recommendations on the toxic classification of Chinese medicinal materials for the revision of the China Pharmacopeia (volume I). From the aspect of scientific research, a new technical guideline for research on the toxic classification of Chinese medicine should be formulated based on new biological toxicity test technologies such as Microtox and ADME/Tox, because the present classification based on acute toxicity in mice/rats can no longer meet the modern development of Chinese medicine. The evaluation system and technical SOP of TCM toxic classification should also be established, balancing TCM features and superiority with international requirements. From the aspect of medicine management, the list of toxic medicines and their risk classification should be further improved by the competent government authorities according to scientific research. In the China Pharmacopeia (volume I), descriptions such as strong toxicity, toxicity or mild toxicity should be abandoned when describing medicine nature and flavor. This revision might help promote the sustainable development and internationalization of TCM and enhance the competitive capacity of Chinese medicine in both domestic and international markets.
However, description of strong toxicity, toxicity or mild toxicity might be used when making cautions for the medicine, stating that the description is based on Chinese classic works. In this way, TCM traditional theory might be inherited and features of Chinese medicine maintained and reflected. Besides, modern findings should be added to the cautions, including dose-response relationship, toxic mechanism, and toxic elements. The traditional toxic descriptions and modern findings, as a whole, can make the caution clear and scientific, and then promote safe medication and TCM modernization and internationalization.
DOE R&D Accomplishments Database
Weinberg, Alvin M.; Noderer, L. C.
1951-05-15
The large scale release of nuclear energy in a uranium fission chain reaction involves two essentially distinct physical phenomena. On the one hand there are the individual nuclear processes such as fission, neutron capture, and neutron scattering. These are essentially quantum mechanical in character, and their theory is non-classical. On the other hand, there is the process of diffusion -- in particular, diffusion of neutrons, which is of fundamental importance in a nuclear chain reaction. This process is classical; insofar as the theory of the nuclear chain reaction depends on the theory of neutron diffusion, the mathematical study of chain reactions is an application of classical, not quantum mechanical, techniques.
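The classical diffusion process referred to here is, in its simplest one-group form (standard textbook notation, not necessarily the report's own symbols),

```latex
\frac{1}{v}\,\frac{\partial \phi}{\partial t}
  \;=\; D\,\nabla^2 \phi \;-\; \Sigma_a\,\phi \;+\; \nu\,\Sigma_f\,\phi ,
```

where $\phi$ is the neutron flux, $v$ the neutron speed, $D$ the diffusion coefficient, $\Sigma_a$ and $\Sigma_f$ the macroscopic absorption and fission cross sections, and $\nu$ the mean number of neutrons released per fission. The chain reaction is critical when the stationary form of this equation admits a nonnegative flux solution, which is a purely classical boundary-value problem even though the cross sections themselves are quantum mechanical in origin.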
Classical conformality in the Standard Model from Coleman’s theory
NASA Astrophysics Data System (ADS)
Kawana, Kiyoharu
2016-09-01
The classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman’s theory of baby universes.
NASA Astrophysics Data System (ADS)
Baumeler, Ämin; Feix, Adrien; Wolf, Stefan
2014-10-01
Quantum theory in a global spacetime gives rise to nonlocal correlations, which cannot be explained causally in a satisfactory way; this motivates the study of theories with reduced global assumptions. Oreshkov, Costa, and Brukner [Nat. Commun. 3, 1092 (2012), 10.1038/ncomms2076] proposed a framework in which quantum theory is valid locally but where, at the same time, no global spacetime, i.e., predefined causal order, is assumed beyond the absence of logical paradoxes. It was shown for the two-party case, however, that a global causal order always emerges in the classical limit. Quite naturally, it has been conjectured that the same also holds in the multiparty setting. We show that, counter to this belief, classical correlations locally compatible with classical probability theory exist that allow for deterministic signaling between three or more parties incompatible with any predefined causal order.
A post-classical theory of enamel biomineralization… and why we need one.
Simmer, James P; Richardson, Amelia S; Hu, Yuan-Yuan; Smith, Charles E; Ching-Chun Hu, Jan
2012-09-01
Enamel crystals are unique in shape, orientation and organization. They are hundreds of thousands times longer than they are wide, run parallel to each other, are oriented with respect to the ameloblast membrane at the mineralization front and are organized into rod or interrod enamel. The classical theory of amelogenesis postulates that extracellular matrix proteins shape crystallites by specifically inhibiting ion deposition on the crystal sides, orient them by binding multiple crystallites and establish higher levels of crystal organization. Elements of the classical theory are supported in principle by in vitro studies; however, the classical theory does not explain how enamel forms in vivo. In this review, we describe how amelogenesis is highly integrated with ameloblast cell activities and how the shape, orientation and organization of enamel mineral ribbons are established by a mineralization front apparatus along the secretory surface of the ameloblast cell membrane.
ERIC Educational Resources Information Center
Lejuez, C. W.; Hopko, Derek R.; Acierno, Ron; Daughters, Stacey B.; Pagoto, Sherry L.
2011-01-01
Following from the seminal work of Ferster, Lewinsohn, and Jacobson, as well as theory and research on the Matching Law, Lejuez, Hopko, LePage, Hopko, and McNeil developed a reinforcement-based depression treatment that was brief, uncomplicated, and tied closely to behavioral theory. They called this treatment the brief behavioral activation…
Cross-Talk in Comp Theory: A Reader. Second Edition, Revised and Updated.
ERIC Educational Resources Information Center
Villanueva, Victor, Ed.
This revised and updated resource contains a total of 43 essays that serve to initiate graduate students and more experienced teachers into the theories that inform composition studies. Under Section One--The Givens in Our Conversations: The Writing Process--are these essays: "Teach Writing as a Process Not Product" (Donald M. Murray);…
Revising an Extension Education Website for Limited Resource Audiences Using Social Marketing Theory
ERIC Educational Resources Information Center
Francis, Sarah L.; Martin, Peggy; Taylor, Kristin
2011-01-01
Spend Smart Eat Smart (SSES), a unique website combining nutrition and food buying education for limited resource audiences (LRAs), was revised using social marketing theory to make it more appealing and relevant to LRAs (25-40 years). Focus groups and surveys identified the needs and preferences of LRAs. Needs were cooking, basic health, and…
Revision and Validation of the Revised Teacher Beliefs Survey.
ERIC Educational Resources Information Center
Benjamin, Jane
This study revised the Teacher Beliefs Survey (S. Wooley and A. Wooley, 1999; TBS), an instrument to assess teachers' beliefs related to constructivist and behaviorist theories of learning, and then studied the validity of the revised TBS. Drawing on a literature review, researchers added items for the existing constructs of the TBS and added a new…
Statistical mechanics in the context of special relativity. II.
Kaniadakis, G
2005-09-01
The special relativity laws emerge as one-parameter (light speed) generalizations of the corresponding laws of classical physics. These generalizations, imposed by the Lorentz transformations, affect both the definition of the various physical observables (e.g., momentum, energy, etc.), as well as the mathematical apparatus of the theory. Here, following the general lines of [Phys. Rev. E 66, 056125 (2002)], we show that the Lorentz transformations impose also a proper one-parameter generalization of the classical Boltzmann-Gibbs-Shannon entropy. The obtained relativistic entropy permits us to construct a coherent and self-consistent relativistic statistical theory, preserving the main features of the ordinary statistical theory, which is recovered in the classical limit. The predicted distribution function is a one-parameter continuous deformation of the classical Maxwell-Boltzmann distribution and has a simple analytic form, showing power law tails in accordance with the experimental evidence. Furthermore, this statistical mechanics can be obtained as the stationary case of a generalized kinetic theory governed by an evolution equation obeying the H theorem and reproducing the Boltzmann equation of the ordinary kinetics in the classical limit.
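The one-parameter deformation described in this abstract can be illustrated with the κ-exponential that underlies the Kaniadakis statistics (a sketch based on the standard published form; the paper's conventions may differ): it reduces to the ordinary exponential as κ → 0 and develops power-law tails for κ ≠ 0, matching the claimed classical limit and power-law behaviour.

```python
import math

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential:
    exp_k(x) = (sqrt(1 + k^2 x^2) + k*x) ** (1/k), with exp_0(x) = exp(x)."""
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

# Classical limit: kappa -> 0 recovers the ordinary exponential ...
assert abs(exp_kappa(1.0, 1e-6) - math.e) < 1e-9
# ... while for kappa != 0 the large-x behaviour is a power law ~ (2*k*x)^(1/k):
# doubling x multiplies exp_kappa by ~2^(1/k) rather than squaring it.
ratio = exp_kappa(200.0, 0.5) / exp_kappa(100.0, 0.5)
assert abs(ratio - 4.0) < 0.01
```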
Leading-order classical Lagrangians for the nonminimal standard-model extension
NASA Astrophysics Data System (ADS)
Reis, J. A. A. S.; Schreck, M.
2018-03-01
In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.
JOURNAL SCOPE GUIDELINES: Paper classification scheme
NASA Astrophysics Data System (ADS)
2005-06-01
This scheme is used to clarify the journal's scope and enable authors and readers to more easily locate the appropriate section for their work. For each of the sections listed in the scope statement we suggest some more detailed subject areas which help define that subject area. These lists are by no means exhaustive and are intended only as a guide to the type of papers we envisage appearing in each section. We acknowledge that no classification scheme can be perfect and that there are some papers which might be placed in more than one section. We are happy to provide further advice on paper classification to authors upon request (please email jphysa@iop.org).
1. Statistical physics: numerical and computational methods; statistical mechanics, phase transitions and critical phenomena; quantum condensed matter theory; Bose-Einstein condensation; strongly correlated electron systems; exactly solvable models in statistical mechanics; lattice models, random walks and combinatorics; field-theoretical models in statistical mechanics; disordered systems, spin glasses and neural networks; nonequilibrium systems; network theory.
2. Chaotic and complex systems: nonlinear dynamics and classical chaos; fractals and multifractals; quantum chaos; classical and quantum transport; cellular automata; granular systems and self-organization; pattern formation; biophysical models.
3. Mathematical physics: combinatorics; algebraic structures and number theory; matrix theory; classical and quantum groups, symmetry and representation theory; Lie algebras, special functions and orthogonal polynomials; ordinary and partial differential equations; difference and functional equations; integrable systems; soliton theory; functional analysis and operator theory; inverse problems; geometry, differential geometry and topology; numerical approximation and analysis; geometric integration; computational methods.
4. Quantum mechanics and quantum information theory: coherent states; eigenvalue problems; supersymmetric quantum mechanics; scattering theory; relativistic quantum mechanics; semiclassical approximations; foundations of quantum mechanics and measurement theory; entanglement and quantum nonlocality; geometric phases and quantum tomography; quantum tunnelling; decoherence and open systems; quantum cryptography, communication and computation; theoretical quantum optics.
5. Classical and quantum field theory: quantum field theory; gauge and conformal field theory; quantum electrodynamics and quantum chromodynamics; Casimir effect; integrable field theory; random matrix theory applications in field theory; string theory and its developments; classical field theory and electromagnetism; metamaterials.
6. Fluid and plasma theory: turbulence; fundamental plasma physics; kinetic theory; magnetohydrodynamics and multifluid descriptions; strongly coupled plasmas; one-component plasmas; non-neutral plasmas; astrophysical and dusty plasmas.
Quasi-Static Analysis of Round LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement with voltage in the quasi-static limit of round LaRC Thunder Actuators. The problem is treated with classical lamination theory and Von Karman non-linear analysis. In the case of classical lamination theory exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
Quasi-Static Analysis of LaRC THUNDER Actuators
NASA Technical Reports Server (NTRS)
Campbell, Joel F.
2007-01-01
An analytic approach is developed to predict the shape and displacement as a function of voltage in the quasi-static limit of LaRC Thunder Actuators. The problem is treated with classical lamination theory and von Kármán non-linear analysis. In the case of classical lamination theory, exact analytic solutions are found. It is shown that classical lamination theory is insufficient to describe the physical situation for large actuators but is sufficient for very small actuators. Numerical results are presented for the non-linear analysis and compared with experimental measurements. Snap-through behavior, bifurcation, and stability are presented and discussed.
The Discovery of Subatomic Particles Revised Edition
NASA Astrophysics Data System (ADS)
Weinberg, Steven
2003-09-01
This commentary on the discovery of the atom's constituents provides an historical account of key events in the physics of the twentieth century that led to the discoveries of the electron, proton and neutron. Steven Weinberg introduces the fundamentals of classical physics that played crucial roles in these discoveries. Connections are shown throughout the book between the historic discoveries of subatomic particles and contemporary research at the frontiers of physics, including the most current discoveries of new elementary particles. Steven Weinberg was Higgins Professor of Physics at Harvard before moving to The University of Texas at Austin, where he founded its Theory Group. At Texas he holds the Josey Regental Chair of Science and is a member of the Physics and Astronomy Departments. His research has spanned a broad range of topics in quantum field theory, elementary particle physics, and cosmology, and has been honored with numerous awards, including the Nobel Prize in Physics, the National Medal of Science, the Heinemann Prize in Mathematical Physics, the Cresson Medal of the Franklin Institute, the Madison Medal of Princeton University, and the Oppenheimer Prize. In addition to the well-known treatise, Gravitation and Cosmology, he has written several books for general readers, including the prize-winning The First Three Minutes (now translated into 22 foreign languages), and most recently Dreams of a Final Theory (Pantheon Books, 1993). He has also written a textbook, The Quantum Theory of Fields, Vol. I, Vol. II, and Vol. III (Cambridge).
Navigating the grounded theory terrain. Part 2.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
In this paper, the choice of classic grounded theory is discussed and justified in the context of the first author's PhD research, entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research into, and limited understanding of, the effect of psychosocial interventions on people with dementia. The first author considered classic grounded theory a suitable research methodology because it is held to be ideal for areas of research where the social processes at work are little understood. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development using constant comparison and memoing; methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper argues that researchers new to grounded theory must visit, become familiar with, and understand the various grounded theory options. They will then be able to apply their chosen methodology consistently and critically, develop theory rigorously, and ultimately better defend their final methodological destinations.
Geometric Algebra for Physicists
NASA Astrophysics Data System (ADS)
Doran, Chris; Lasenby, Anthony
2007-11-01
Preface; Notation; 1. Introduction; 2. Geometric algebra in two and three dimensions; 3. Classical mechanics; 4. Foundations of geometric algebra; 5. Relativity and spacetime; 6. Geometric calculus; 7. Classical electrodynamics; 8. Quantum theory and spinors; 9. Multiparticle states and quantum entanglement; 10. Geometry; 11. Further topics in calculus and group theory; 12. Lagrangian and Hamiltonian techniques; 13. Symmetry and gauge theory; 14. Gravitation; Bibliography; Index.
Automated revision of CLIPS rule-bases
NASA Technical Reports Server (NTRS)
Murphy, Patrick M.; Pazzani, Michael J.
1994-01-01
This paper describes CLIPS-R, a theory revision system for the revision of CLIPS rule-bases. CLIPS-R may be used for a variety of knowledge-base revision tasks, such as refining a prototype system, adapting an existing system to slightly different operating conditions, or improving an operational system that makes occasional errors. We present a description of how CLIPS-R revises rule-bases, and an evaluation of the system on three rule-bases.
A Grounded Theory of Text Revision Processes Used by Young Adolescents Who Are Deaf
ERIC Educational Resources Information Center
Yuknis, Christina
2014-01-01
This study examined the revising processes used by 8 middle school students who are deaf or hard-of-hearing as they composed essays for their English classes. Using grounded theory, interviews with students and teachers in one middle school, observations of the students engaging in essay creation, and writing samples were collected for analysis.…
The Nature of Quantum Truth: Logic, Set Theory, & Mathematics in the Context of Quantum Theory
NASA Astrophysics Data System (ADS)
Frey, Kimberly
The purpose of this dissertation is to construct a radically new type of mathematics whose underlying logic differs from the ordinary classical logic used in standard mathematics, and which we feel may be more natural for applications in quantum mechanics. Specifically, we begin by constructing a first order quantum logic, the development of which closely parallels that of ordinary (classical) first order logic --- the essential differences are in the nature of the logical axioms, which, in our construction, are motivated by quantum theory. After showing that the axiomatic first order logic we develop is sound and complete (with respect to a particular class of models), this logic is then used as a foundation on which to build (axiomatic) mathematical systems --- and we refer to the resulting new mathematics as "quantum mathematics." As noted above, the hope is that this form of mathematics is more natural than classical mathematics for the description of quantum systems, and will enable us to address some foundational aspects of quantum theory which are still troublesome --- e.g. the measurement problem --- as well as possibly even inform our thinking about quantum gravity. After constructing the underlying logic, we investigate properties of several mathematical systems --- e.g. axiom systems for abstract algebras, group theory, linear algebra, etc. --- in the presence of this quantum logic. In the process, we demonstrate that the resulting quantum mathematical systems have some strange, but very interesting features, which indicates a richness in the structure of mathematics that is classically inaccessible. Moreover, some of these features do indeed suggest possible applications to foundational questions in quantum theory. We continue our investigation of quantum mathematics by constructing an axiomatic quantum set theory, which we show satisfies certain desirable criteria. 
Ultimately, we hope that such a set theory will lead to a foundation for quantum mathematics in a sense which parallels the foundational role of classical set theory in classical mathematics. One immediate application of the quantum set theory we develop is to provide a foundation on which to construct quantum natural numbers, which are the quantum analog of the classical counting numbers. It turns out that in a special class of models, there exists a 1-1 correspondence between the quantum natural numbers and bounded observables in quantum theory whose eigenvalues are (ordinary) natural numbers. This 1-1 correspondence is remarkably satisfying, and not only gives us great confidence in our quantum set theory, but indicates the naturalness of such models for quantum theory itself. We go on to develop a Peano-like arithmetic for these new "numbers," as well as consider some of its consequences. Finally, we conclude by summarizing our results, and discussing directions for future work.
Nucleation theory - Is replacement free energy needed? [Error analysis of capillary approximation]
NASA Technical Reports Server (NTRS)
Doremus, R. H.
1982-01-01
It has been suggested that the classical theory of nucleation of a liquid from its vapor as developed by Volmer and Weber (1926) needs modification with a factor referred to as the replacement free energy and that the capillary approximation underlying the classical theory is in error. Here, the classical nucleation equation is derived from fluctuation theory, Gibbs's result for the reversible work to form a critical nucleus, and the rate of collision of gas molecules with a surface. The capillary approximation is not used in the derivation. The chemical potential of small drops is then considered, and it is shown that the capillary approximation can be derived from thermodynamic equations. The results show that no corrections to Volmer's equation are needed.
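The capillary approximation discussed above can be made concrete with a short numerical sketch of classical (Volmer-type) nucleation theory: the critical radius and Gibbs barrier follow from the surface tension and the supersaturation drive. The material constants (roughly water near room temperature) and the supersaturation values below are illustrative assumptions, not taken from the paper.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def critical_nucleus(sigma, v_mol, T, S):
    """Kelvin critical radius and Gibbs barrier of classical nucleation
    theory in the capillary approximation.

    sigma : surface tension of the bulk liquid, J/m^2
    v_mol : volume per molecule in the liquid, m^3
    T     : temperature, K
    S     : supersaturation ratio p/p_eq (> 1)
    """
    dmu = K_B * T * math.log(S)                  # free-energy drive per molecule
    r_star = 2.0 * sigma * v_mol / dmu           # critical radius
    dG_star = 16.0 * math.pi * sigma**3 * v_mol**2 / (3.0 * dmu**2)
    return r_star, dG_star

# Illustrative inputs, roughly water vapour at 293 K (assumed values).
r4, g4 = critical_nucleus(sigma=0.072, v_mol=3.0e-29, T=293.0, S=4.0)
r8, g8 = critical_nucleus(sigma=0.072, v_mol=3.0e-29, T=293.0, S=8.0)
```

The sub-nanometre critical radius this yields is exactly the regime where the validity of the capillary approximation is debated, which is the point at issue in the abstract.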
Effective model hierarchies for dynamic and static classical density functional theories
NASA Astrophysics Data System (ADS)
Majaniemi, S.; Provatas, N.; Nonomura, M.
2010-09-01
The origin and methodology of deriving effective model hierarchies are presented with applications to solidification of crystalline solids. In particular, it is discussed how the form of the equations of motion and the effective parameters on larger scales can be obtained from the more microscopic models. It will be shown that tying together the dynamic structure of the projection operator formalism with static classical density functional theories can lead to incomplete (mass) transport properties even though the linearized hydrodynamics on large scales is correctly reproduced. To facilitate a more natural way of binding together the dynamics of the macrovariables and classical density functional theory, a dynamic generalization of density functional theory based on the nonequilibrium generating functional is suggested.
Brewin, Chris R; Burgess, Neil
2014-03-01
Three recent studies (Pearson, 2012; Pearson, Ross, & Webster, 2012) purported to test the revised dual representation theory of posttraumatic stress disorder (Brewin, Gregory, Lipton, & Burgess, 2010) by manipulating the amount of additional information accompanying traumatic stimulus materials and assessing the effect on subsequent intrusive memories. Here we point out that these studies involve a misunderstanding of the meaning of "contextual" within the theory, such that the manipulation would be unlikely to have had the intended effect and the results are ambiguous with respect to the theory. Past and future experimental tests of the theory are discussed. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
Using extant literature in a grounded theory study: a personal account.
Yarwood-Ross, Lee; Jack, Kirsten
2015-03-01
To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.
Classical BV Theories on Manifolds with Boundary
NASA Astrophysics Data System (ADS)
Cattaneo, Alberto S.; Mnev, Pavel; Reshetikhin, Nicolai
2014-12-01
In this paper we extend the classical BV framework to gauge theories on spacetime manifolds with boundary. In particular, we connect the BV construction in the bulk with the BFV construction on the boundary and we develop its extension to strata of higher codimension in the case of manifolds with corners. We present several examples including electrodynamics, Yang-Mills theory and topological field theories coming from the AKSZ construction, in particular, the Chern-Simons theory, the BF theory, and the Poisson sigma model. This paper is the first step towards developing the perturbative quantization of such theories on manifolds with boundary in a way consistent with gluing.
An Examination of the Flynn Effect in the National Intelligence Test in Estonia
ERIC Educational Resources Information Center
Shiu, William
2012-01-01
This study examined the Flynn Effect (FE; i.e., the rise in IQ scores over time) in Estonia from Scale B of the National Intelligence Test using both classical test theory (CTT) and item response theory (IRT) methods. Secondary data from two cohorts (1934, n = 890 and 2006, n = 913) of students were analyzed, using both classical test theory (CTT)…
A meta-science for a global bioethics and biomedicine.
Basser, David S
2017-11-07
As suggested by Shook and Giordano, understanding and therefore addressing the urgent international governance issues around globalizing bio-medical/technology research and applications is limited by the perception of the underlying science. A philosophical methodology is used, based on novel and classical philosophical reflection upon existent literature, clinical wisdom and narrative theory, to discover a meta-science and telos of humankind for the development of a relevant and defensible global biomedical bioethics. In this article, through pondering an integrative systems approach, I propose a biomedical model that may provide Western biomedicine with leadership and interesting insight into the unity beyond the artificial boundaries of its traditional divisions and the limit between physiological and pathological situations (health and disease). A unified biomedicine, as scientific foundation, might then provide the basis for dissolution of similar reflected boundaries within bioethics. A principled and communitarian cosmopolitan bioethics may then be synonymous with a recently proposed principled and communitarian cosmopolitan neuroethics based on a novel objective meta-ethics. In an attempt to help facilitate equal and inclusive participation in inter-, multi-, and transdisciplinary intercultural discourse regarding the aforementioned international governance issues, I offer: (1) a meta-science derived through considering the general behaviour of activity, plasticity and balance in biology and; (2) a novel thought framework to encourage and enhance the ability for self-evaluation, self-criticism, and self-revision aimed at broadening perspective, as well as acknowledging and responding to the strengths and limitations of extant knowledge. Through classical philosophical reflection, I evolve a theory of medicine to discover a telos of humankind which in turn provides an 'internal' moral grounding for a proposed global biomedical bioethics.
NASA Astrophysics Data System (ADS)
Sun, Xiao-Wei; Liu, Zi-Jiang; Quan, Wei-Long; Song, Ting; Khenata, Rabah; Bin-Omran, Saad
2018-05-01
Using the revised Perdew-Burke-Ernzerhof generalized gradient approximation based on first-principles plane-wave pseudopotential density functional theory, the high-pressure structural phase transition of LiF is explored. From the analysis of Gibbs free energies, we find that no phase transition occurs for LiF in the presented pressure range from 0 to 1000 GPa, and this result is consistent with the theoretical prediction obtained via ab initio calculations [N.A. Smirnov, Phys. Rev. B 83 (2011) 014109]. Using the classical molecular dynamics technique with effective pair potentials which consist of the Coulomb, dispersion, and repulsion interaction, the melting phase diagram of LiF is determined. The obtained normalized volumes under pressure are in good agreement with our density functional theory results and the available experimental data. Meanwhile, with the help of the quasi-harmonic Debye model in which the phononic effects are considered, the thermodynamic properties of interest, including the volume thermal expansion coefficient, isothermal bulk modulus and its first and second pressure derivatives, heat capacity at constant volume, entropy, Debye temperature, and Grüneisen parameter of LiF are predicted systematically. All the properties of LiF with the stable NaCl-type structure in the temperature range of 0-4900 K and the pressure up to 1000 GPa are summarized.
Bukhvostov-Lipatov model and quantum-classical duality
NASA Astrophysics Data System (ADS)
Bazhanov, Vladimir V.; Lukyanov, Sergei L.; Runov, Boris A.
2018-02-01
The Bukhvostov-Lipatov model is an exactly soluble model of two interacting Dirac fermions in 1 + 1 dimensions. The model describes weakly interacting instantons and anti-instantons in the O(3) non-linear sigma model. In our previous work [arXiv:1607.04839] we have proposed an exact formula for the vacuum energy of the Bukhvostov-Lipatov model in terms of special solutions of the classical sinh-Gordon equation, which can be viewed as an example of a remarkable duality between integrable quantum field theories and integrable classical field theories in two dimensions. Here we present a complete derivation of this duality based on the classical inverse scattering transform method, traditional Bethe ansatz techniques and analytic theory of ordinary differential equations. In particular, we show that the Bethe ansatz equations defining the vacuum state of the quantum theory also define connection coefficients of an auxiliary linear problem for the classical sinh-Gordon equation. Moreover, we also present details of the derivation of the non-linear integral equations determining the vacuum energy and other spectral characteristics of the model in the case when the vacuum state is filled by 2-string solutions of the Bethe ansatz equations.
Whitley, Heather D.; Scullard, Christian R.; Benedict, Lorin X.; ...
2014-12-04
Here, we present a discussion of kinetic theory treatments of linear electrical and thermal transport in hydrogen plasmas, for a regime of interest to inertial confinement fusion applications. In order to assess the accuracy of one of the more involved of these approaches, classical Lenard-Balescu theory, we perform classical molecular dynamics simulations of hydrogen plasmas using 2-body quantum statistical potentials and compute both electrical and thermal conductivity from out particle trajectories using the Kubo approach. Our classical Lenard-Balescu results employing the identical statistical potentials agree well with the simulations.
Further Development of an Optimal Design Approach Applied to Axial Magnetic Bearings
NASA Technical Reports Server (NTRS)
Bloodgood, V. Dale, Jr.; Groom, Nelson J.; Britcher, Colin P.
2000-01-01
Classical design methods for magnetic bearings and magnetic suspension systems have always had their limitations. Because of this, the overall effectiveness of a design has always relied heavily on the skill and experience of the individual designer. This paper combines two approaches that have been developed to aid the accuracy and efficiency of magnetostatic design. The first approach integrates classical magnetic circuit theory with modern optimization theory to increase design efficiency. The second approach uses loss factors to increase the accuracy of classical magnetic circuit theory. As an example, an axial magnetic thrust bearing is designed for minimum power.
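As a rough illustration of the classical magnetic circuit theory the paper builds on, the sketch below estimates the air-gap flux density and Maxwell force of an idealized two-gap electromagnet, neglecting core reluctance, leakage and fringing (the very idealizations the paper's loss factors correct). The geometry and numbers are illustrative assumptions, not the paper's bearing design.

```python
import math

MU0 = 4e-7 * math.pi  # permeability of free space, H/m

def gap_flux_density(n_turns, current, gap):
    """Ideal magnetic-circuit estimate of the air-gap flux density for an
    n_turns coil carrying current across two air gaps of length gap each,
    neglecting core reluctance, leakage and fringing."""
    return MU0 * n_turns * current / (2.0 * gap)

def axial_force(B, area):
    """Maxwell pull on the armature: B^2 * area / (2 mu0) per pole face,
    summed over the two faces of area `area`."""
    return B**2 * area / MU0
```

Because B scales linearly with current, the force scales with the square of the current; minimizing coil power subject to a required force is then a well-posed optimization problem, which is the setting the paper augments with loss factors.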
Quantum theory for 1D X-ray free electron laser
NASA Astrophysics Data System (ADS)
Anisimov, Petr M.
2018-06-01
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. We exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
Plasmon mass scale and quantum fluctuations of classical fields on a real time lattice
NASA Astrophysics Data System (ADS)
Kurkela, Aleksi; Lappi, Tuomas; Peuron, Jarkko
2018-03-01
Classical real-time lattice simulations play an important role in understanding non-equilibrium phenomena in gauge theories and are used in particular to model the prethermal evolution of heavy-ion collisions. Above the Debye scale the classical Yang-Mills (CYM) theory can be matched smoothly to kinetic theory. First we study the limits of the quasiparticle picture of the CYM fields by determining the plasmon mass of the system using three different methods. Then we argue that one needs a numerical calculation of a system of classical gauge fields and small linearized fluctuations, which correspond to quantum fluctuations, in a way that keeps the separation between the two manifest. We demonstrate and test an implementation of an algorithm with linearized fluctuations, showing that the linearization indeed works and that Gauss's law is conserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nomura, Yasunori; Salzetta, Nico; Sanches, Fabio
We study the Hilbert space structure of classical spacetimes under the assumption that entanglement in holographic theories determines semiclassical geometry. We show that this simple assumption has profound implications; for example, a superposition of classical spacetimes may lead to another classical spacetime. Despite its unconventional nature, this picture admits the standard interpretation of superpositions of well-defined semiclassical spacetimes in the limit that the number of holographic degrees of freedom becomes large. We illustrate these ideas using a model for the holographic theory of cosmological spacetimes.
Classical theory of radiating strings
NASA Technical Reports Server (NTRS)
Copeland, Edmund J.; Haws, D.; Hindmarsh, M.
1990-01-01
The divergent part of the self force of a radiating string coupled to gravity, an antisymmetric tensor and a dilaton in four dimensions are calculated to first order in classical perturbation theory. While this divergence can be absorbed into a renormalization of the string tension, demanding that both it and the divergence in the energy momentum tensor vanish forces the string to have the couplings of compactified N = 1 D = 10 supergravity. In effect, supersymmetry cures the classical infinities.
Emergence of a classical Universe from quantum gravity and cosmology.
Kiefer, Claus
2012-09-28
I describe how we can understand the classical appearance of our world from a universal quantum theory. The essential ingredient is the process of decoherence. I start with a general discussion in ordinary quantum theory and then turn to quantum gravity and quantum cosmology. There is a whole hierarchy of classicality from the global gravitational field to the fluctuations in the cosmic microwave background, which serve as the seeds for the structure in the Universe.
Classical gluon and graviton radiation from the bi-adjoint scalar double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Prabhu, Siddharth G.; Thompson, Jedidiah O.
2017-09-01
We find double-copy relations between classical radiating solutions in Yang-Mills theory coupled to dynamical color charges and their counterparts in a cubic bi-adjoint scalar field theory which interacts linearly with particles carrying bi-adjoint charge. The particular color-to-kinematics replacements we employ are motivated by the Bern-Carrasco-Johansson double-copy correspondence for on-shell amplitudes in gauge and gravity theories. They are identical to those recently used to establish relations between classical radiating solutions in gauge theory and in dilaton gravity. Our explicit bi-adjoint solutions are constructed to second order in a perturbative expansion, and map under the double copy onto gauge theory solutions which involve at most cubic gluon self-interactions. If the correspondence is found to persist to higher orders in perturbation theory, our results suggest the possibility of calculating gravitational radiation from colliding compact objects, directly from a scalar field with vastly simpler (purely cubic) Feynman vertices.
Combinatorial Market Processing for Multilateral Coordination
2005-09-01
In the classical auction theory literature, most of the attention is focused on one-sided, single-item auctions [86]. There is now a growing body of...
...Programming in Infinite-dimensional Spaces: Theory and Applications, Wiley, 1987. [3] K. J. Arrow, "An extension of the basic theorems of classical ...
...Commodities, Princeton University Press, 1969. [43] D. Friedman and J. Rust, The Double Auction Market: Institutions, Theories, and Evidence, Addison
ERIC Educational Resources Information Center
Boyer, Timothy H.
1985-01-01
The classical vacuum of physics is not empty, but contains a distinctive pattern of electromagnetic fields. Discovery of the vacuum, thermal spectrum, classical electron theory, zero-point spectrum, and effects of acceleration are discussed. Connection between thermal radiation and the classical vacuum reveals unexpected unity in the laws of…
Random walk in generalized quantum theory
NASA Astrophysics Data System (ADS)
Martin, Xavier; O'Connor, Denjoe; Sorkin, Rafael D.
2005-01-01
One can view quantum mechanics as a generalization of classical probability theory that provides for pairwise interference among alternatives. Adopting this perspective, we “quantize” the classical random walk by finding, subject to a certain condition of “strong positivity”, the most general Markovian, translationally invariant “decoherence functional” with nearest neighbor transitions.
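The classical baseline that the paper "quantizes" can be sketched in a few lines: a symmetric nearest-neighbour Markov walk evolved by its transfer rule, whose variance grows by exactly one site-squared per step. This is only the classical random walk, not the decoherence-functional construction itself; the lattice size and step count are arbitrary choices for illustration.

```python
import numpy as np

def step(p):
    """One step of the symmetric nearest-neighbour random walk on a ring:
    half of the probability at each site moves to each neighbour."""
    return 0.5 * (np.roll(p, 1) + np.roll(p, -1))

n_sites, n_steps = 101, 50       # ring large enough that no mass wraps around
p = np.zeros(n_sites)
p[n_sites // 2] = 1.0            # walker starts at the centre site
for _ in range(n_steps):
    p = step(p)

offsets = np.arange(n_sites) - n_sites // 2
variance = float((p * offsets**2).sum())   # diffusive spreading: variance = n_steps
```

The quantum generalization replaces this single probability distribution with a decoherence functional that allows pairwise interference between walk histories, subject to the strong-positivity condition the abstract describes.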
Neo-classical theory of competition or Adam Smith's hand as mathematized ideology
NASA Astrophysics Data System (ADS)
McCauley, Joseph L.
2001-10-01
Orthodox economic theory (utility maximization, rational agents, efficient markets in equilibrium) is based on arbitrarily postulated, nonempirical notions. The disagreement between economic reality and a key feature of neo-classical economic theory was criticized empirically by Osborne. I show that the orthodox theory is internally self-inconsistent for the very reason suggested by Osborne: lack of invertibility of demand and supply as functions of price to obtain price as a function of supply and demand. The reason for the noninvertibility arises from nonintegrable excess demand dynamics, a feature of their theory completely ignored by economists.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banik, Manik, E-mail: manik11ju@gmail.com
Steering is one of the most counterintuitive non-classical features of bipartite quantum systems, first noticed by Schrödinger in the early days of quantum theory. On the other hand, measurement incompatibility is another non-classical feature of quantum theory, initially pointed out by Bohr. Recently, Quintino et al. [Phys. Rev. Lett. 113, 160402 (2014)] and Uola et al. [Phys. Rev. Lett. 113, 160403 (2014)] have investigated the relation between these two distinct non-classical features. They have shown that a set of measurements is not jointly measurable (i.e., incompatible) if and only if they can be used for demonstrating Schrödinger-Einstein-Podolsky-Rosen steering. The concept of steering has been generalized to more general abstract tensor product theories rather than just Hilbert space quantum mechanics. In this article, we discuss that the notion of measurement incompatibility can be extended to general probability theories. Further, we show that the connection between steering and measurement incompatibility holds in a broader class of tensor product theories rather than just quantum theory.
What is Quantum Mechanics? A Minimal Formulation
NASA Astrophysics Data System (ADS)
Friedberg, R.; Hohenberg, P. C.
2018-03-01
This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.
Theory of mind deficit in adult patients with congenital heart disease.
Chiavarino, Claudia; Bianchino, Claudia; Brach-Prever, Silvia; Riggi, Chiara; Palumbo, Luigi; Bara, Bruno G; Bosco, Francesca M
2015-10-01
This article provides the first assessment of theory of mind, that is, the ability to reason about mental states, in adult patients with congenital heart disease. Patients with congenital heart disease and matched healthy controls were administered classical theory of mind tasks and a semi-structured interview which provides a multidimensional evaluation of theory of mind (Theory of Mind Assessment Scale). The patients with congenital heart disease performed worse than the controls on the Theory of Mind Assessment Scale, whereas they did as well as the control group on the classical theory-of-mind tasks. These findings provide the first evidence that adults with congenital heart disease may display specific impairments in theory of mind. © The Author(s) 2013.
Internal construct validity of the Shirom-Melamed Burnout Questionnaire (SMBQ)
2012-01-01
Background: Burnout is a mental condition defined as a result of continuous and long-term stress exposure, particularly related to psychosocial factors at work. This paper examines the psychometric properties of the Shirom-Melamed Burnout Questionnaire (SMBQ) to validate its use in a clinical setting. Methods: Data from both a clinical (n = 319) and a general population (n = 319) sample of health care and social insurance workers were included in the study. Data were analysed using both classical and modern test theory approaches, including confirmatory factor analysis (CFA) and Rasch analysis. Results: Of the 638 people recruited into the study, 416 (65%) were working full or part time. Data from the SMBQ failed the CFA and initially failed to satisfy Rasch model expectations. After the removal of four of the original items measuring tension, and after accommodating local dependency in the data, model expectations were met. As such, the total score from the revised scale is a sufficient statistic for ascertaining burnout, and an interval-scale transformation is available. The scale as a whole was perfectly targeted to the joint sample. A cut point of 4.4 for severe burnout was chosen at the intersection of the distributions of the clinical and general population samples. Conclusion: A revised 18-item version of the SMBQ satisfies modern measurement standards. Using its cut point, it offers the opportunity to identify potential clinical cases of burnout. PMID:22214479
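The Rasch analysis mentioned above rests on a simple logistic measurement model. As a minimal sketch (the dichotomous case, with hypothetical trait and difficulty values; the SMBQ itself uses polytomous rating-scale items):

```python
import math

def rasch_p(theta, b):
    """Dichotomous Rasch model: probability of endorsing an item of
    difficulty b for a person with trait (burnout) level theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative check: when the trait level equals the item difficulty,
# the endorsement probability is exactly 0.5.
p = rasch_p(theta=1.2, b=1.2)
```

In the Rasch family the raw total score is a sufficient statistic for the trait level, which is the property the abstract invokes for the revised 18-item scale.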
Horváth, Gábor; Buchta, Krisztián; Varjú, Dezsö
2003-06-01
It is a well-known phenomenon that when we look into the water with two aerial eyes, both the apparent position and the apparent shape of underwater objects are different from the real ones because of refraction at the water surface. Earlier studies of the refraction-distorted structure of the underwater binocular visual field of aerial observers were restricted to either vertically or horizontally oriented eyes. We investigate a generalized version of this problem: We calculate the position of the binocular image point of an underwater object point viewed by two arbitrarily positioned aerial eyes, including oblique orientations of the eyes relative to the flat water surface. Assuming that binocular image fusion is performed by appropriate vergent eye movements to bring the object's image onto the foveas, the structure of the underwater binocular visual field is computed and visualized in different ways as a function of the relative positions of the eyes. We show that a revision of certain earlier treatments of the aerial imaging of underwater objects is necessary. We analyze and correct some widespread erroneous or incomplete representations of this classical geometric optical problem that occur in different textbooks. Improving the theory of aerial binocular imaging of underwater objects, we demonstrate that the structure of the underwater binocular visual field of aerial observers distorted by refraction is more complex than has been thought previously.
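The simplest special case of the geometry discussed above is paraxial, near-vertical viewing, where refraction at the flat surface lifts an underwater point by the ratio of the refractive indices. A minimal sketch (the textbook monocular approximation, not the paper's general oblique binocular treatment; `apparent_depth` is a hypothetical helper name):

```python
def apparent_depth(true_depth, n_water=1.33):
    """Paraxial (near-vertical viewing) approximation: refraction at a
    flat water surface makes an underwater point appear at a depth of
    true_depth / n_water. For oblique viewing directions no such closed
    form applies and the refraction point must be found numerically."""
    return true_depth / n_water

d_apparent = apparent_depth(1.0)  # a 1 m deep object appears ~0.75 m deep
```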
Storey, Jennifer E; Hart, Stephen D; Cooke, David J; Michie, Christine
2016-04-01
The Hare Psychopathy Checklist-Revised (PCL-R; Hare, 2003) is a commonly used psychological test for assessing traits of psychopathic personality disorder. Despite the abundance of research using the PCL-R, the vast majority of studies have used samples of convenience rather than systematic methods to minimize sampling bias and maximize the generalizability of findings. This potentially complicates the interpretation of test scores and research findings, including the "norms" for offenders from the United States and Canada included in the PCL-R manual. In the current study, we evaluated the psychometric properties of PCL-R scores for all male offenders admitted to a regional reception center of the Correctional Service of Canada during a 1-year period (n = 375). Because offenders were admitted for assessment prior to institutional classification, they comprise a sample that was heterogeneous with respect to correctional risks and needs yet representative of all offenders in that region of the service. We examined the distribution of PCL-R scores, classical test theory indices of its structural reliability, the factor structure of test items, and the external correlates of test scores. The findings were highly consistent with those typically reported in previous studies. We interpret these results as indicating that it is unlikely that any sampling limitations of past research using the PCL-R resulted in findings that were, overall, strongly biased or unrepresentative. (c) 2016 APA, all rights reserved.
Classical Physics and the Bounds of Quantum Correlations.
Frustaglia, Diego; Baltanás, José P; Velázquez-Ahumada, María C; Fernández-Prieto, Armando; Lujambio, Aintzane; Losada, Vicente; Freire, Manuel J; Cabello, Adán
2016-06-24
A unifying principle explaining the numerical bounds of quantum correlations remains elusive, despite the efforts devoted to identifying it. Here, we show that these bounds are indeed not exclusive to quantum theory: for any abstract correlation scenario with compatible measurements, models based on classical waves produce probability distributions indistinguishable from those of quantum theory and, therefore, share the same bounds. We demonstrate this finding by implementing classical microwaves that propagate along meter-size transmission-line circuits and reproduce the probabilities of three emblematic quantum experiments. Our results show that the "quantum" bounds would also occur in a classical universe without quanta. The implications of this observation are discussed.
Open or closed? Dirac, Heisenberg, and the relation between classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Bokulich, Alisa
2004-09-01
This paper describes a long-standing, though little known, debate between Dirac and Heisenberg over the nature of scientific methodology, theory change, and intertheoretic relations. Following Heisenberg's terminology, their disagreements can be summarized as a debate over whether the classical and quantum theories are "open" or "closed." A close examination of this debate sheds new light on the philosophical views of two of the great founders of quantum theory.
The role of a posteriori mathematics in physics
NASA Astrophysics Data System (ADS)
MacKinnon, Edward
2018-05-01
The calculus that co-evolved with classical mechanics relied on definitions of functions and differentials that accommodated physical intuitions. In the early nineteenth century mathematicians began the rigorous reformulation of calculus and eventually succeeded in putting almost all of mathematics on a set-theoretic foundation. Physicists traditionally ignore this rigorous mathematics. Physicists often rely on a posteriori math, a practice of using physical considerations to determine mathematical formulations. This is illustrated by examples from classical and quantum physics. A justification of such practice stems from a consideration of the role of phenomenological theories in classical physics and effective theories in contemporary physics. This relates to the larger question of how physical theories should be interpreted.
NASA Technical Reports Server (NTRS)
Paquette, John A.; Nuth, Joseph A., III
2011-01-01
Classical nucleation theory has been used in models of dust nucleation in circumstellar outflows around oxygen-rich asymptotic giant branch stars. One objection to the application of classical nucleation theory (CNT) to astrophysical systems of this sort is that an equilibrium distribution of clusters (assumed by CNT) is unlikely to exist in such conditions due to the low collision rate of condensable species. A model of silicate grain nucleation and growth was modified to evaluate the effect of a nucleation flux orders of magnitude below the equilibrium value. The results show that a lack of chemical equilibrium has only a small effect on the ultimate grain distribution.
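The equilibrium cluster distribution assumed by CNT enters through the critical radius and nucleation barrier. A sketch of those standard CNT expressions (illustrative parameter values, not the silicate data used in the paper):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def cnt_barrier(sigma, v_mol, T, S):
    """Classical nucleation theory: critical radius r* (m) and
    free-energy barrier dG* (J) at supersaturation ratio S > 1.
    sigma: surface tension (J/m^2); v_mol: monomer volume (m^3).
    Parameter values below are illustrative only."""
    ln_s = math.log(S)
    r_star = 2.0 * sigma * v_mol / (K_B * T * ln_s)
    dg_star = 16.0 * math.pi * sigma**3 * v_mol**2 / (3.0 * (K_B * T * ln_s)**2)
    return r_star, dg_star

# The barrier falls steeply as the supersaturation grows:
r1, g1 = cnt_barrier(sigma=0.5, v_mol=2e-29, T=1000.0, S=2.0)
r2, g2 = cnt_barrier(sigma=0.5, v_mol=2e-29, T=1000.0, S=10.0)
```

Note the identity dG* = (4/3) pi sigma r*^2, which ties the two quantities together and is a quick consistency check on any implementation.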
S-Duality, Deconstruction and Confinement for a Marginal Deformation of N=4 SUSY Yang-Mills
NASA Astrophysics Data System (ADS)
Dorey, Nick
2004-08-01
We study an exactly marginal deformation of N = 4 SUSY Yang-Mills with gauge group U(N) using field theory and string theory methods. The classical theory has a Higgs branch for rational values of the deformation parameter. We argue that the quantum theory also has an S-dual confining branch which cannot be seen classically. The low-energy effective theory on these branches is a six-dimensional non-commutative gauge theory with sixteen supercharges. Confinement of magnetic and electric charges, on the Higgs and confining branches respectively, occurs due to the formation of BPS-saturated strings in the low-energy theory. The results also suggest a new way of deconstructing Little String Theory as a large-N limit of a confining gauge theory in four dimensions.
High-pressure phase transitions - Examples of classical predictability
NASA Astrophysics Data System (ADS)
Celebonovic, Vladan
1992-09-01
The applicability of the Savic and Kasanin (1962-1967) classical theory of dense matter to laboratory experiments requiring estimates of high-pressure phase transitions was examined by determining phase-transition pressures for a set of 19 chemical substances (including elements, hydrocarbons, metal oxides, and salts) for which experimental data were available. A comparison between the experimental transition points and those predicted by the Savic-Kasanin theory showed that the theory can be used for estimating values of transition pressures. The results also support conclusions obtained in previous astronomical applications of the Savic-Kasanin theory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
We present the fundamentals of a prequantum model with hidden variables of the classical field type. In some sense this is the comeback of classical wave mechanics. Our approach can also be considered as the incorporation of quantum mechanics into classical signal theory. All quantum averages (including correlations of entangled systems) can be represented as classical signal averages and correlations.
Uniting the Spheres: Modern Feminist Theory and Classic Texts in AP English
ERIC Educational Resources Information Center
Drew, Simao J. A.; Bosnic, Brenda G.
2008-01-01
High school teachers Simao J. A. Drew and Brenda G. Bosnic help familiarize students with gender role analysis and feminist theory. Students examine classic literature and contemporary texts, considering characters' historical, literary, and social contexts while expanding their understanding of how patterns of identity and gender norms exist and…
Aesthetic Creativity: Insights from Classical Literary Theory on Creative Learning
ERIC Educational Resources Information Center
Hellstrom, Tomas Georg
2011-01-01
This paper addresses the subject of textual creativity by drawing on work done in classical literary theory and criticism, specifically new criticism, structuralism and early poststructuralism. The question of how readers and writers engage creatively with the text is closely related to educational concerns, though they are often thought of as…
ERIC Educational Resources Information Center
Bazaldua, Diego A. Luna; Lee, Young-Sun; Keller, Bryan; Fellers, Lauren
2017-01-01
The performance of various classical test theory (CTT) item discrimination estimators has been compared in the literature using both empirical and simulated data, resulting in mixed results regarding the preference of some discrimination estimators over others. This study analyzes the performance of various item discrimination estimators in CTT:…
Louis Guttman's Contributions to Classical Test Theory
ERIC Educational Resources Information Center
Zimmerman, Donald W.; Williams, Richard H.; Zumbo, Bruno D.; Ross, Donald
2005-01-01
This article focuses on Louis Guttman's contributions to the classical theory of educational and psychological tests, one of the lesser known of his many contributions to quantitative methods in the social sciences. Guttman's work in this field provided a rigorous mathematical basis for ideas that, for many decades after Spearman's initial work,…
Generalization of the Activated Complex Theory of Reaction Rates. II. Classical Mechanical Treatment
DOE R&D Accomplishments Database
Marcus, R. A.
1964-01-01
In its usual classical form activated complex theory assumes a particular expression for the kinetic energy of the reacting system -- one associated with a rectilinear motion along the reaction coordinate. The derivation of the rate expression given in the present paper is based on the general kinetic energy expression.
NASA Astrophysics Data System (ADS)
Yang, Chen
2018-05-01
The transition from classical theories to quantum theories has attracted much interest. This paper demonstrates the analogy between the electromagnetic potentials and wave-like dynamic variables, with their connections to quantum theory, for audiences at the advanced undergraduate level and above. In the first part, the counterpart relations in classical electrodynamics (e.g. the gauge transform and Lorenz condition) and classical mechanics (e.g. the Legendre transform and free-particle condition) are presented. These relations lead to similar governing equations for the field variables and dynamic variables. The Lorenz gauge, scalar potential, and vector potential manifest a one-to-one similarity to the action, Hamiltonian, and momentum, respectively. In the second part, the connections between the classical pictures of the electromagnetic field and the particle and the quantum picture are presented. By characterising the states of the electromagnetic field and the particle via their corresponding variables, their evolution pictures manifest the same (isomorphic) algebraic structure. Subsequently, the pictures of the electromagnetic field and particle are compared to the quantum picture and their interconnections are given. A brief summary of the obtained results is presented at the end of the paper.
Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory
NASA Astrophysics Data System (ADS)
Chruściński, Dariusz
2013-03-01
Using the quantum analog of conditional probability and the classical Bayes theorem, we discuss some aspects of particular entanglement-breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of the Perron-Frobenius theorem, we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.
Operator Formulation of Classical Mechanics.
ERIC Educational Resources Information Center
Cohn, Jack
1980-01-01
Discusses the construction of an operator formulation of classical mechanics which is directly concerned with wave packets in configuration space and is more similar to that of conventional quantum theory than other extant operator formulations of classical mechanics. (Author/HM)
Epistemic View of Quantum States and Communication Complexity of Quantum Channels
NASA Astrophysics Data System (ADS)
Montina, Alberto
2012-09-01
The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previously known upper bound was 1.85 bits.
ERIC Educational Resources Information Center
Epstein, Kitty Kelly
2012-01-01
The revised edition of "A Different View of Urban Schools" updates a unique story about the realities of urban education in America and provides new insights on the origin of urban education issues; the route to a diverse and effective teaching force; and the impact of federal legislation and corporate involvement on urban schools. Dr. Epstein's…
Extended theory of harmonic maps connects general relativity to chaos and quantum mechanism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Gang; Duan, Yi-Shi
2017-07-20
General relativity and quantum mechanics are two separate rules of modern physics explaining how nature works. Both theories are accurate, but a direct connection between the two has not yet been clarified. Recently, researchers have blurred the line between classical and quantum physics by connecting chaos and entanglement. Here, we show that Duan's extended HM (harmonic map) theory, which admits the solutions of general relativity, can also admit the solutions of the classic chaos equations and even the Schrödinger equation of quantum physics, suggesting that the extended theory of harmonic maps may act as a universal theory of physics.
Computation in generalised probabilistic theories
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Barrett, Jonathan
2015-08-01
From the general difficulty of simulating quantum systems using classical systems, and in particular the existence of an efficient quantum algorithm for factoring, it is likely that quantum computation is intrinsically more powerful than classical computation. At present, the best upper bound known for the power of quantum computation is that BQP ⊆ AWPP, where AWPP is a classical complexity class (known to be included in PP, hence in PSPACE). This work investigates limits on computational power that are imposed by simple physical, or information-theoretic, principles. To this end, we define a circuit-based model of computation in a class of operationally defined theories more general than quantum theory, and ask: what is the minimal set of physical assumptions under which the above inclusions still hold? We show that, given only an assumption of tomographic locality (roughly, that multipartite states and transformations can be characterized by local measurements), efficient computations are contained in AWPP. This inclusion still holds even without assuming a basic notion of causality (where the notion is, roughly, that probabilities for outcomes cannot depend on future measurement choices). Following Aaronson, we extend the computational model by allowing post-selection on measurement outcomes. Aaronson showed that the corresponding quantum complexity class, PostBQP, is equal to PP. Given only the assumption of tomographic locality, the inclusion in PP still holds for post-selected computation in general theories. Hence, in a world with post-selection, quantum theory is optimal for computation in the space of all operational theories. We then consider whether one can obtain relativized complexity results for general theories. It is not obvious how to define a sensible notion of a computational oracle in the general framework that reduces to the standard notion in the quantum case. Nevertheless, it is possible to define computation relative to a 'classical oracle'. We then show that there exists a classical oracle relative to which efficient computation in any theory satisfying the causality assumption does not include NP.
Quantum correlations and dynamics from classical random fields valued in complex Hilbert spaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khrennikov, Andrei
2010-08-15
One of the crucial differences between mathematical models of classical and quantum mechanics (QM) is the use of the tensor product of the state spaces of subsystems as the state space of the corresponding composite system. (To describe an ensemble of classical composite systems, one uses random variables taking values in the Cartesian product of the state spaces of subsystems.) We show that, nevertheless, it is possible to establish a natural correspondence between the classical and the quantum probabilistic descriptions of composite systems. Quantum averages for composite systems (including entangled ones) can be represented as averages with respect to classical random fields. It is essentially what Albert Einstein dreamed of. QM is represented as classical statistical mechanics with infinite-dimensional phase space. While the mathematical construction is completely rigorous, its physical interpretation is a complicated problem. We present the basic physical interpretation of prequantum classical statistical field theory in Sec. II. However, this is only the first step toward a real physical theory.
Helitzer, Deborah L; Sussman, Andrew L; Hoffman, Richard M; Getrich, Christina M; Warner, Teddy D; Rhyne, Robert L
2014-08-01
Conceptual frameworks (CF) have historically been used to develop program theory. We re-examine the literature about the role of CF in this context, specifically how they can be used to create descriptive and prescriptive theories, as building blocks for a program theory. Using a case example of colorectal cancer screening intervention development, we describe the process of developing our initial CF, the methods used to explore the constructs in the framework and revise the framework for intervention development. We present seven steps that guided the development of our CF: (1) assemble the "right" research team, (2) incorporate existing literature into the emerging CF, (3) construct the conceptual framework, (4) diagram the framework, (5) operationalize the framework: develop the research design and measures, (6) conduct the research, and (7) revise the framework. A revised conceptual framework depicted more complicated inter-relationships of the different predisposing, enabling, reinforcing, and system-based factors. The updated framework led us to generate program theory and serves as the basis for designing future intervention studies and outcome evaluations. A CF can build a foundation for program theory. We provide a set of concrete steps and lessons learned to assist practitioners in developing a CF. Copyright © 2014 Elsevier Ltd. All rights reserved.
Opening Switch Research on a Plasma Focus VI.
1988-02-26
Sausage Instability in the Plasma Focus. In this section the classical Kruskal-Schwarzschild theory for the sausage mode is applied to the pinch phase. The effects of (1) the shape of the pinch, (2) axial flow of plasma, and (3) self-generated magnetic fields are also presented. The Kruskal-Schwarzschild theory is the classical MHD theory for the m=0 mode in a plasma supported by a magnetic field against gravity; this is the well-known Kruskal-Schwarzschild instability.
Nanoscale Capillary Flows in Alumina: Testing the Limits of Classical Theory.
Lei, Wenwen; McKenzie, David R
2016-07-21
Anodic aluminum oxide (AAO) membranes have well-formed cylindrical channels, as small as 10 nm in diameter, in a close packed hexagonal array. The channels in AAO membranes simulate very small leaks that may be present for example in an aluminum oxide device encapsulation. The 10 nm alumina channel is the smallest that has been studied to date for its moisture flow properties and provides a stringent test of classical capillary theory. We measure the rate at which moisture penetrates channels with diameters in the range of 10 to 120 nm with moist air present at 1 atm on one side and dry air at the same total pressure on the other. We extend classical theory for water leak rates at high humidities by allowing for variable meniscus curvature at the entrance and show that the extended theory explains why the flow increases greatly when capillary filling occurs and enables the contact angle to be determined. At low humidities our measurements for air-filled channels agree well with theory for the interdiffusive flow of water vapor in air. The flow rate of water-filled channels is one order of magnitude less than expected from classical capillary filling theory and is coincidentally equal to the helium flow rate, validating the use of helium leak testing for evaluating moisture flows in aluminum oxide leaks.
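Classical capillary-filling theory predicts the humidity threshold for condensation in a cylindrical channel via the Kelvin equation. A sketch under that classical assumption (bulk water properties and a fixed contact angle; the paper's extended model allows variable meniscus curvature, and `filling_humidity` is a hypothetical helper name):

```python
import math

def filling_humidity(radius_m, contact_angle_deg=0.0, T=298.0,
                     gamma=0.072, v_m=1.8e-5):
    """Kelvin-equation estimate of the relative humidity p/p0 above
    which a cylindrical channel of the given radius fills by capillary
    condensation. gamma: water surface tension (J/m^2); v_m: molar
    volume of water (m^3/mol). Classical-theory sketch only."""
    R_GAS = 8.314  # gas constant, J/(mol K)
    cos_t = math.cos(math.radians(contact_angle_deg))
    return math.exp(-2.0 * gamma * v_m * cos_t / (radius_m * R_GAS * T))

rh_5nm = filling_humidity(5e-9)    # 10 nm diameter channel
rh_60nm = filling_humidity(60e-9)  # 120 nm diameter channel
```

Smaller channels fill at lower humidity, which is why the 10 nm channels provide the most stringent test of the classical theory.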
Revised Geometric Measure of Entanglement in Infinite Dimensional Multipartite Quantum Systems
NASA Astrophysics Data System (ADS)
Wang, Yinzhu; Wang, Danxia; Huang, Li
2018-05-01
In Cao and Wang (J. Phys. A: Math. Theor. 40, 3507-3542, 2007), the revised geometric measure of entanglement (RGME) for states in finite-dimensional bipartite quantum systems was proposed. Furthermore, in Cao and Wang (Commun. Theor. Phys. 51(4), 613-620, 2009), the authors obtained the revised geometric measure of entanglement for multipartite states, including the three-qubit GHZ state, W state, and the generalized Smolin state in the presence of noise, as well as the two-mode squeezed thermal state, and defined the Gaussian geometric entanglement measure. In this paper, we generalize the RGME to infinite-dimensional multipartite quantum systems and prove that this measure satisfies the necessary properties of a well-defined entanglement measure, including monotonicity under local operations and classical communication.
Savazzi, Filippo; Risplendi, Francesca; Mallia, Giuseppe; Harrison, Nicholas M; Cicero, Giancarlo
2018-04-05
Graphene oxide (GO) is a versatile 2D material whose properties can be tuned by changing the type and concentration of oxygen-containing functional groups attached to its surface. However, detailed knowledge of how the chemo-physical features of this material depend on its chemical composition is largely lacking. We combine classical molecular dynamics and density functional theory simulations to predict the structural and electronic properties of GO at a low degree of oxidation and suggest a revision of the Lerf-Klinowski model. We find that layer deformation is larger for samples containing high concentrations of epoxy groups and that correspondingly the band gap increases. Targeted chemical modification of the GO surface appears to be an effective route to tailor the electronic properties of the monolayer for given applications. Our simulations also show that the chemical shift of the C-1s XPS peak allows one to unambiguously characterize GO composition, resolving the peak-attribution uncertainty often encountered in experiments.
Hamilton-Jacobi theory in multisymplectic classical field theories
NASA Astrophysics Data System (ADS)
de León, Manuel; Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso; Vilariño, Silvia
2017-09-01
The geometric framework for the Hamilton-Jacobi theory developed in the studies of Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 3(7), 1417-1458 (2006)], Cariñena et al. [Int. J. Geom. Methods Mod. Phys. 13(2), 1650017 (2015)], and de León et al. [Variations, Geometry and Physics (Nova Science Publishers, New York, 2009)] is extended for multisymplectic first-order classical field theories. The Hamilton-Jacobi problem is stated for the Lagrangian and the Hamiltonian formalisms of these theories as a particular case of a more general problem, and the classical Hamilton-Jacobi equation for field theories is recovered from this geometrical setting. Particular and complete solutions to these problems are defined and characterized in several equivalent ways in both formalisms, and the equivalence between them is proved. The use of distributions in jet bundles that represent the solutions to the field equations is the fundamental tool in this formulation. Some examples are analyzed and, in particular, the Hamilton-Jacobi equation for non-autonomous mechanical systems is obtained as a special case of our results.
Properties of the Boltzmann equation in the classical approximation
Epelbaum, Thomas; Gelis, François; Tanji, Naoto; ...
2014-12-30
We examine the Boltzmann equation with elastic point-like scalar interactions in two different versions of the classical approximation. Solving the Boltzmann equation numerically with the unapproximated collision term poses no problem, which allows one to study the effect of the ultraviolet cutoff in these approximations. This cutoff dependence in the classical approximations of the Boltzmann equation is closely related to the non-renormalizability of the classical statistical approximation of the underlying quantum field theory. The kinetic theory setup considered here allows one to study the dependence on the ultraviolet cutoff in a much simpler way, since one also has access to the non-approximated result for comparison.
Experimental Observation of Two Features Unexpected from the Classical Theories of Rubber Elasticity
NASA Astrophysics Data System (ADS)
Nishi, Kengo; Fujii, Kenta; Chung, Ung-il; Shibayama, Mitsuhiro; Sakai, Takamasa
2017-12-01
Although the elastic modulus of a Gaussian chain network is thought to be successfully described by classical theories of rubber elasticity, such as the affine and phantom models, verification experiments are largely lacking owing to difficulties in precisely controlling the network structure. We prepared well-defined model polymer networks experimentally and measured the elastic modulus G for a broad range of polymer concentrations and connectivity probabilities p. In our experiment, we observed two features that were distinct from those predicted by classical theories. First, we observed the critical behavior G ~ |p - pc|^1.95 near the sol-gel transition, where pc is the sol-gel transition point. This scaling law differs from the prediction of classical theories but can be explained by an analogy between the electric conductivity of resistor networks and the elasticity of polymer networks. Furthermore, we found that the experimental G-p relations in the region above C* did not follow the affine or phantom theories. Instead, all the G/G0-p curves fell onto a single master curve when G was normalized by the elastic modulus at p = 1, G0. We show that the effective medium approximation for Gaussian chain networks explains this master curve.
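A critical exponent such as the 1.95 in G ~ |p - pc|^1.95 is typically extracted by a log-log fit near the transition. A minimal sketch on noise-free synthetic data (not the paper's measurements; `fit_exponent` is a hypothetical helper name):

```python
import math

def fit_exponent(ps, gs, pc):
    """Least-squares slope of log G versus log|p - pc|, i.e. the
    critical exponent t in G ~ |p - pc|^t. Assumes p > pc and G > 0."""
    xs = [math.log(abs(p - pc)) for p in ps]
    ys = [math.log(g) for g in gs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Noise-free synthetic data generated with t = 1.95 and pc = 0.5:
pc = 0.5
ps = [0.6, 0.7, 0.8, 0.9, 1.0]
gs = [(p - pc) ** 1.95 for p in ps]
t = fit_exponent(ps, gs, pc)  # recovers ~1.95
```

In practice pc is not known exactly, so it is usually treated as a second fit parameter, which widens the uncertainty on the exponent.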
ERIC Educational Resources Information Center
MacMillan, Peter D.
2000-01-01
Compared classical test theory (CTT), generalizability theory (GT), and multifaceted Rasch model (MFRM) approaches to detecting and correcting for rater variability using responses of 4,930 high school students graded by 3 raters on 9 scales. The MFRM approach identified far more raters as different than did the CTT analysis. GT and Rasch…
Marshaling Resources: A Classic Grounded Theory Study of Online Learners
ERIC Educational Resources Information Center
Yalof, Barbara
2012-01-01
Students who enroll in online courses comprise one quarter of an increasingly diverse student body in higher education today. Yet, it is not uncommon for an online program to lose over 50% of its enrolled students prior to graduation. This study used a classic grounded theory qualitative methodology to investigate the persistent problem of…
ERIC Educational Resources Information Center
Gotsch-Thomson, Susan
1990-01-01
Describes how gender is integrated into a classical social theory course by including a female theorist in the reading assignments and using "The Handmaid's Tale" by Margaret Atwood as the basis for class discussion. Reviews the course objectives and readings; describes the process of the class discussions; and provides student…
Traffic Flow Theory - A State-of-the-Art Report: Revised Monograph on Traffic Flow Theory
DOT National Transportation Integrated Search
2002-04-13
This publication is an update and expansion of the Transportation Research Board (TRB) Special Report 165, "Traffic Flow Theory," published in 1975. This updating was undertaken on recommendation of the TRB's Committee on Traffic Flow Theory and Char...
The Development of Bayesian Theory and Its Applications in Business and Bioinformatics
NASA Astrophysics Data System (ADS)
Zhang, Yifei
2018-03-01
Bayesian Theory originated in an essay by the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian Statistics has come to play a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian Statistics has been improved and refined, and it can now be used to solve problems that Classical Statistics failed to solve. This paper summarizes the history, concepts, and applications of Bayesian Statistics in five parts: the history of Bayesian Statistics, the weaknesses of Classical Statistics, Bayesian Theory, and its development and applications. The first two parts compare Bayesian Statistics and Classical Statistics from a macroscopic perspective, and the last three parts focus on Bayesian Theory specifically, from introducing particular Bayesian concepts to outlining their development and, finally, their applications.
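The core of the theory the abstract traces back to Bayes' essay is Bayes' rule for updating a prior belief with evidence. A minimal sketch, with hypothetical numbers:

```python
# Bayes' rule: posterior probability of a hypothesis H after observing
# positive evidence E. All numbers are hypothetical, for illustration.

def posterior(prior, sensitivity, false_positive_rate):
    """P(H | E) = P(E | H) * P(H) / P(E)."""
    p_evidence = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_evidence

# A rare condition (1% prior) and a 95%-sensitive, 5%-false-positive test:
p = posterior(0.01, 0.95, 0.05)
print(round(p, 3))  # -> 0.161: far below the test's sensitivity
```

This base-rate effect, where the posterior stays low despite strong evidence, is exactly the kind of problem where the Bayesian treatment departs from a naive classical reading of the test's error rates.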
Reframing developmental biology and building evolutionary theory's new synthesis.
Tauber, Alfred I
2010-01-01
Gilbert and Epel present a new approach to developmental biology: embryogenesis must be understood within the full context of the organism's environment. Instead of an insular embryo following a genetic blueprint, this revised program maintains that embryogenesis is subject to inputs from the environment that generate novel genetic variation with dynamic consequences for development. Beyond allelic variation of structural genes and of regulatory loci, plasticity-derived epigenetic variation completes the triad of the major types of variation required for evolution. Developmental biology and ecology, disciplines that have previously been regarded as distinct, are presented here as fully integrated under the rubric of "eco-devo," and from this perspective, which highlights how the environment not only selects variation but also helps construct it, another synthesis with evolutionary biology must be made: "eco-evo-devo." This second integration has enormous implications for expanding evolutionary theory, inasmuch as the Modern Synthesis (Provine 1971), which combined classical genetics and Darwinism in the mid-20th century, did not account for the role of development in evolution. The eco-evo-devo synthesis thus portends a major theoretical inflection in evolutionary biology. Following a description of these scientific developments, comment is offered as to how this new integrated approach might be understood within the larger shifts in contemporary biology.
The Receding Animal: Theorizing Anxiety and Attachment in Psychoanalysis from Freud to Imre Hermann.
Marinelli, Lydia; Mayer, Andreas
2016-03-01
Animals played an important role in the formation of psychoanalysis as a theoretical and therapeutic enterprise. They are at the core of texts such as Freud's famous case histories of Little Hans, the Rat Man, and the Wolf Man. The infantile anxiety triggered by animals provided the essential link between the psychology of individual neuroses and the ambivalent status of the "totem" animal in so-called primitive societies in Freud's attempt to construct an anthropological basis for the Oedipus complex in Totem and Taboo. In the following, we track the status of animals as objects of indirect observation as they appear in Freud's classical texts and in later revisionist accounts such as Otto Rank's Trauma of Birth and Imre Hermann's work on the clinging instinct. In the 1920s and 1930s, the Freudian conception of patients' animal phobias was substantially revised within Hermann's original psychoanalytic theory of instincts, which draws heavily upon ethological observations of primates. Although such a reformulation remains grounded in the idea of "archaic" animal models for human development, it allows, to some extent, the speculative elements of Freud's later instinct theory (notably the death instinct) to be empiricized, and it leads to a more embodied account of psychoanalytic practice.
NASA Astrophysics Data System (ADS)
Mojahedi, Mahdi; Shekoohinejad, Hamidreza
2018-02-01
In this paper, the temperature distribution in a continuous and pulsed end-pumped Nd:YAG rod crystal is determined using nonclassical and classical heat conduction theories. To find the temperature distribution in the crystal, heat transfer differential equations with boundary conditions are derived based on the non-Fourier model, and the temperature distribution of the crystal is obtained by an analytical method. Then, by transferring the non-Fourier differential equations to matrix equations using the finite element method, the temperature and stress at every point of the crystal are calculated in the time domain. According to the results, a comparison between classical and nonclassical theories is presented to investigate rupture power values. In continuous end pumping with equal input powers, non-Fourier theory predicts greater temperature and stress than Fourier theory. It also shows that crystal rupture power decreases as the relaxation time increases. In contrast, in the single rectangular pulsed end-pumping condition with equal input power, Fourier theory indicates higher temperature and stress than non-Fourier theory. It is also observed that, as the relaxation time increases, the maximum temperature and stress decrease.
Knowing-It-All but Still Learning: Perceptions of One's Own Knowledge and Belief Revision
ERIC Educational Resources Information Center
Hagá, Sara; Olson, Kristina R.
2017-01-01
Lay theories suggest that people who are overconfident in their knowledge are less likely to revise that knowledge when someone else offers an alternative belief. Similarly, one might assume that people who "are" willing to revise their beliefs might not be very confident in their knowledge to begin with. Two studies with children ages…
Gambini, R; Pullin, J
2000-12-18
We consider general relativity with a cosmological constant as a perturbative expansion around a completely solvable diffeomorphism invariant field theory. This theory is the lambda --> infinity limit of general relativity. This allows an explicit perturbative computational setup in which the quantum states of the theory and the classical observables can be explicitly computed. An unexpected relationship arises at a quantum level between the discrete spectrum of the volume operator and the allowed values of the cosmological constant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lusanna, Luca
2004-08-19
The four (electromagnetic, weak, strong, and gravitational) interactions are described by singular Lagrangians and by the Dirac-Bergmann theory of Hamiltonian constraints. As a consequence, a subset of the original configuration variables are gauge variables, not determined by the equations of motion. Only at the Hamiltonian level is it possible to separate the gauge variables from the deterministic physical degrees of freedom, the Dirac observables, and to formulate a well-posed Cauchy problem for them in both special and general relativity. The requirement of causality then dictates the choice of retarded solutions at the classical level. However, both the problems of the classical theory of the electron, leading to the choice of (1/2)(retarded + advanced) solutions, and the regularization of quantum field theory, leading to the Feynman propagator, introduce anticipatory aspects. The determination of the relativistic Darwin potential as a semi-classical approximation to the Lienard-Wiechert solution for particles with Grassmann-valued electric charges, regularizing the Coulomb self-energies, shows that these anticipatory effects live beyond the semi-classical approximation (tree level) in the form of radiative corrections, at least for the electromagnetic interaction. Talk and 'best contribution' at The Sixth International Conference on Computing Anticipatory Systems CASYS'03, Liege, August 11-16, 2003.
The dynamical mass of a classical Cepheid variable star in an eclipsing binary system.
Pietrzyński, G; Thompson, I B; Gieren, W; Graczyk, D; Bono, G; Udalski, A; Soszyński, I; Minniti, D; Pilecki, B
2010-11-25
Stellar pulsation theory provides a means of determining the masses of pulsating classical Cepheid supergiants; it is the pulsation that causes their luminosity to vary. Such pulsational masses are found to be smaller than the masses derived from stellar evolution theory: this is the Cepheid mass discrepancy problem, for which a solution is missing. An independent, accurate dynamical mass determination for a classical Cepheid variable star (as opposed to type-II Cepheids, low-mass stars with a very different evolutionary history) in a binary system is needed in order to determine which is correct. The accuracy of previous efforts to establish a dynamical Cepheid mass from Galactic single-lined non-eclipsing binaries was typically about 15-30% (refs 6, 7), which is not good enough to resolve the mass discrepancy problem. In spite of many observational efforts, no firm detection of a classical Cepheid in an eclipsing double-lined binary had hitherto been reported. Here we report the discovery of a classical Cepheid in a well-detached, double-lined eclipsing binary in the Large Magellanic Cloud. We determine the mass to a precision of 1% and show that it agrees with its pulsation mass, providing strong evidence that pulsation theory correctly and precisely predicts the masses of classical Cepheids.
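The dynamical mass in a double-lined eclipsing binary follows from Kepler's third law, M1 + M2 = 4*pi^2 * a^3 / (G * P^2), once the orbit is solved. The sketch below is a generic sanity check with Earth-like orbital parameters (it recovers roughly one solar mass); the values are not those of the actual LMC system.

```python
# Kepler's third law as used for dynamical mass determinations in
# binaries. Constants are rounded; the example orbit is Earth-like,
# purely as a sanity check, not the LMC Cepheid system.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
AU = 1.496e11          # astronomical unit, m
YEAR = 3.156e7         # year, s
MSUN = 1.989e30        # solar mass, kg

def total_mass(a_m, p_s):
    """Total binary mass (kg) from semimajor axis a_m and period p_s."""
    return 4.0 * math.pi**2 * a_m**3 / (G * p_s**2)

m = total_mass(AU, YEAR)
print(round(m / MSUN, 2))  # -> 1.0
```

In practice the radial-velocity amplitudes of the two components and the eclipse geometry pin down a and the inclination, which is what makes the 1% precision quoted above possible.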
Phase-Sensitive Coherence and the Classical-Quantum Boundary in Ghost Imaging
NASA Technical Reports Server (NTRS)
Erkmen, Baris I.; Hardy, Nicholas D.; Venkatraman, Dheera; Wong, Franco N. C.; Shapiro, Jeffrey H.
2011-01-01
The theory of partial coherence has a long and storied history in classical statistical optics. The vast majority of this work addresses fields that are statistically stationary in time, hence their complex envelopes only have phase-insensitive correlations. The quantum optics of squeezed-state generation, however, depends on nonlinear interactions producing baseband field operators with phase-insensitive and phase-sensitive correlations. Utilizing quantum light to enhance imaging has been a topic of considerable current interest, much of it involving biphotons, i.e., streams of entangled-photon pairs. Biphotons have been employed for quantum versions of optical coherence tomography, ghost imaging, holography, and lithography. However, their seemingly quantum features have been mimicked with classical-state light, raising the question of where the classical-quantum boundary lies. We have shown, for the case of Gaussian-state light, that this boundary is intimately connected to the theory of phase-sensitive partial coherence. Here we present that theory, contrasting it with the familiar case of phase-insensitive partial coherence, and use it to elucidate the classical-quantum boundary of ghost imaging. We show, both theoretically and experimentally, that classical phase-sensitive light produces ghost images most closely mimicking those obtained with biphotons, and we derive the spatial resolution, image contrast, and signal-to-noise ratio of a standoff-sensing ghost imager, taking into account target-induced speckle.
Chess Revision: Acquiring the Rules of Chess Variants through FOL Theory Revision from Examples
NASA Astrophysics Data System (ADS)
Muggleton, Stephen; Paes, Aline; Santos Costa, Vítor; Zaverucha, Gerson
The game of chess has been a major testbed for research in artificial intelligence, since it requires focus on intelligent reasoning. In particular, several challenges arise for machine learning systems when inducing a model describing the legal moves of chess, including the collection of examples, the learning of a model correctly representing the official rules of the game, covering all the branches and restrictions of the correct moves, and the comprehensibility of such a model. Moreover, the game of chess has inspired the creation of numerous variants, ranging from faster to more challenging to regional versions of the game. The question arises whether it is possible to take advantage of an initial classifier of chess as a starting point for obtaining classifiers for the different variants. We approach this problem as an instance of theory revision from examples. The initial classifier of chess is inspired by a FOL theory approved by a chess expert, and the examples are defined as sequences of moves within a game. Starting from a standard revision system, we argue that abduction and negation are also required to best address this problem. Experimental results show the effectiveness of our approach.
Framework based on communicability and flow to analyze complex network dynamics
NASA Astrophysics Data System (ADS)
Gilson, M.; Kouvaris, N. E.; Deco, G.; Zamora-López, G.
2018-05-01
Graph theory constitutes a widely used and established field providing powerful tools for the characterization of complex networks. The intricate topology of networks can also be investigated by means of the collective dynamics observed in the interactions of self-sustained oscillations (synchronization patterns) or propagation-like processes such as random walks. However, networks are often inferred from real data that form dynamic systems, which are different from those employed to reveal their topological characteristics. This stresses the necessity for a theoretical framework dedicated to the mutual relationship between the structure and dynamics of complex networks, as two sides of the same coin. Here we propose a rigorous framework based on the network response over time (i.e., the Green function) to study interactions between nodes across time. For this purpose we define the flow that describes the interplay between the network connectivity and external inputs. This multivariate measure relates to the concepts of graph communicability and the map equation. We illustrate our theory using the multivariate Ornstein-Uhlenbeck process, which describes stable and non-conservative dynamics, but the formalism can be adapted to other local dynamics for which the Green function is known. We provide applications to classical network examples, such as small-world ring and hierarchical networks. Our theory defines a comprehensive framework that is canonically related to directed and weighted networks, thus paving the way to revise the standards for network analysis, from pairwise interactions between nodes to the global properties of networks, including community detection.
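For linear dynamics, the "network response over time" is the matrix exponential of the Jacobian, which plays the role of the Green function. The sketch below uses a hypothetical three-node chain with leaky coupling (Jacobian J = A - I) and a truncated Taylor series; it illustrates the general Green-function idea, not the paper's specific flow measure.

```python
# Green-function view of network response for linear dynamics dx/dt = J x:
# the response at node j to an input at node i after time t is exp(t*J)[j][i].
# Hypothetical 3-node chain 0-1-2 with leaky coupling J = A - I.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_exp(J, t, terms=30):
    """exp(t*J) by truncated Taylor series (adequate for small matrices)."""
    n = len(J)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    term = [row[:] for row in result]
    for k in range(1, terms):
        # term <- term * (t*J / k), i.e. the k-th Taylor term (t*J)^k / k!
        term = mat_mul(term, [[t * x / k for x in row] for row in J])
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

A = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]                      # chain adjacency
J = [[A[i][j] - (i == j) for j in range(3)] for i in range(3)]
G = mat_exp(J, t=1.0)
# An input at node 0 reaches node 2 only through node 1, so the direct
# neighbour responds more strongly than the two-step neighbour:
assert G[1][0] > G[2][0] > 0.0
```

This matrix-exponential response is the same object underlying graph communicability, which is why the abstract can relate its flow measure to that concept.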
Bertrand's theorem and virial theorem in fractional classical mechanics
NASA Astrophysics Data System (ADS)
Yu, Rui-Yan; Wang, Towe
2017-09-01
Fractional classical mechanics is the classical counterpart of fractional quantum mechanics. The central force problem in this theory is investigated. Bertrand's theorem is generalized and the virial theorem is revisited, both in three spatial dimensions. In order to produce stable, closed, non-circular orbits, the inverse-square law and Hooke's law should be modified in fractional classical mechanics.
The Tensile Strength of Liquid Nitrogen
NASA Astrophysics Data System (ADS)
Huang, Jian
1992-01-01
The tensile strength of liquids has been a puzzling subject. On the one hand, classical nucleation theory has met great success in predicting the nucleation rates of superheated liquids. On the other hand, most reported experimental values of the tensile strength for different liquids are far below the prediction of classical nucleation theory. In this study, homogeneous nucleation in liquid nitrogen and its tensile strength have been investigated. Different approaches for determining the pressure amplitude were studied carefully. It is shown that Raman-Nath theory, as modified by the introduction of an effective interaction length, can be used to determine the pressure amplitude in the focal plane of a focusing ultrasonic transducer. The results obtained from different diffraction orders are consistent and in good agreement with other approaches, including Debye's theory and solving the KZK equation. The measurement of the tensile strength was carried out in a high-pressure stainless steel dewar. A high-intensity ultrasonic wave was focused into a small volume of liquid nitrogen over a short time period. A probe laser beam passes through the focal region of a concave spherical transducer with a small aperture angle, and the transmitted light is detected with a photodiode. The pressure amplitude at the focus is calculated based on the acoustic power radiated into the liquid. In the experiment, the electrical signal on the transducer is gated at its resonance frequency with gate widths of 20 μs to 0.2 ms over a temperature range from 77 K to near 100 K. The calculated pressure amplitude is in agreement with the prediction of classical nucleation theory for nucleation rates from 10^6 to 10^11 bubbles/(cm^3 s). This work provides experimental evidence that the validity of classical nucleation theory can be extended to negative pressures up to -90 atm. Liquid nitrogen is only the second cryogenic liquid found to reach the tensile strength predicted by classical nucleation theory.
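For reference, the standard classical-nucleation-theory rate against which such measurements are compared has the form J = J0 * exp(-16*pi*sigma^3 / (3*kB*T*dP^2)). The sketch below uses rough illustrative values for the surface tension sigma and prefactor J0; they are assumptions, not the study's fitted parameters.

```python
# Classical nucleation theory (CNT) rate sketch. sigma (surface tension)
# and J0 (kinetic prefactor) are rough illustrative values only.
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def cnt_rate(sigma, T, dP, J0=1e38):
    """Homogeneous nucleation rate: J0 * exp(-16*pi*sigma^3/(3*kB*T*dP^2))."""
    barrier = 16.0 * math.pi * sigma**3 / (3.0 * dP**2)  # critical-bubble work
    return J0 * math.exp(-barrier / (KB * T))

# The rate is extremely sensitive to the tension dP, which is why a modest
# change in negative pressure sweeps J across many decades:
low, high = cnt_rate(8.8e-3, 77.0, 8.0e6), cnt_rate(8.8e-3, 77.0, 9.5e6)
assert high > low > 0.0
```

This steep dependence on dP is what makes the measured onset of nucleation a sharp probe of the tensile strength.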
Nonequilibrium dynamics of the O(N) model on dS3 and AdS crunches
NASA Astrophysics Data System (ADS)
Kumar, S. Prem; Vaganov, Vladislav
2018-03-01
We study the nonperturbative quantum evolution of the interacting O(N) vector model at large N, formulated on a spatial two-sphere, with time-dependent couplings which diverge at finite time. This model, the so-called "E-frame" theory, is related via a conformal transformation to the interacting O(N) model in three-dimensional global de Sitter spacetime with time-independent couplings. We show that with a purely quartic, relevant deformation, the quantum evolution of the E-frame model is regular even when the classical theory is rendered singular at the end of time by the diverging coupling. Time evolution drives the E-frame theory to the large-N Wilson-Fisher fixed point as the classical coupling diverges. We study the quantum evolution numerically for a variety of initial conditions and demonstrate the finiteness of the energy at the classical "end of time." With an additional (time-dependent) mass deformation, quantum backreaction lowers the mass, with a putative smooth time evolution only possible in the limit of infinite quartic coupling. We discuss the relevance of these results for the resolution of crunch singularities in AdS geometries dual to E-frame theories with a classical gravity dual.
Semenov, Alexander; Babikov, Dmitri
2015-12-17
The mixed quantum classical theory, MQCT, for inelastic scattering of two molecules is developed, in which the internal (rotational, vibrational) motion of both collision partners is treated with quantum mechanics, and the molecule-molecule scattering (translational motion) is described by classical trajectories. The resultant MQCT formalism includes a system of coupled differential equations for quantum probability amplitudes, and the classical equations of motion in the mean-field potential. Numerical tests of this theory are carried out for several of the most important rotational state-to-state transitions in the N2 + H2 system, in a broad range of collision energies. Apart from scattering resonances (at low collision energies), excellent agreement with full-quantum results is obtained, including the excitation thresholds, the maxima of cross sections, and even some smaller features, such as slight oscillations of energy dependencies. Most importantly, at higher energies the results of MQCT are nearly identical to the full-quantum results, which makes this approach a good alternative to full-quantum calculations, which become computationally expensive at higher collision energies and for heavier collision partners. Extensions of this theory to include vibrational transitions or general asymmetric-top rotor (polyatomic) molecules are relatively straightforward.
Classical theory of atom-surface scattering: The rainbow effect
NASA Astrophysics Data System (ADS)
Miret-Artés, Salvador; Pollak, Eli
2012-07-01
The scattering of heavy atoms and molecules from surfaces is oftentimes dominated by classical mechanics. A large body of experiments have gathered data on the angular distributions of the scattered species, their energy loss distribution, sticking probability, dependence on surface temperature and more. For many years these phenomena have been considered theoretically in the framework of the “washboard model” in which the interaction of the incident particle with the surface is described in terms of hard wall potentials. Although this class of models has helped in elucidating some of the features it left open many questions such as: true potentials are clearly not hard wall potentials, it does not provide a realistic framework for phonon scattering, and it cannot explain the incident angle and incident energy dependence of rainbow scattering, nor can it provide a consistent theory for sticking. In recent years we have been developing a classical perturbation theory approach which has provided new insight into the dynamics of atom-surface scattering. The theory includes both surface corrugation as well as interaction with surface phonons in terms of harmonic baths which are linearly coupled to the system coordinates. This model has been successful in elucidating many new features of rainbow scattering in terms of frictions and bath fluctuations or noise. It has also given new insight into the origins of asymmetry in atomic scattering from surfaces. New phenomena deduced from the theory include friction induced rainbows, energy loss rainbows, a theory of super-rainbows, and more. In this review we present the classical theory of atom-surface scattering as well as extensions and implications for semiclassical scattering and the further development of a quantum theory of surface scattering. Special emphasis is given to the inversion of scattering data into information on the particle-surface interactions.
Bojowald, Martin
2008-01-01
Quantum gravity is expected to be necessary in order to understand situations in which classical general relativity breaks down. In particular in cosmology one has to deal with initial singularities, i.e., the fact that the backward evolution of a classical spacetime inevitably comes to an end after a finite amount of proper time. This presents a breakdown of the classical picture and requires an extended theory for a meaningful description. Since small length scales and high curvatures are involved, quantum effects must play a role. Not only the singularity itself but also the surrounding spacetime is then modified. One particular theory is loop quantum cosmology, an application of loop quantum gravity to homogeneous systems, which removes classical singularities. Its implications can be studied at different levels. The main effects are introduced into effective classical equations, which allow one to avoid the interpretational problems of quantum theory. They give rise to new kinds of early-universe phenomenology with applications to inflation and cyclic models. To resolve classical singularities and to understand the structure of geometry around them, the quantum description is necessary. Classical evolution is then replaced by a difference equation for a wave function, which allows an extension of quantum spacetime beyond classical singularities. One main question is how these homogeneous scenarios are related to full loop quantum gravity, which can be dealt with at the level of distributional symmetric states. Finally, the new structure of spacetime arising in loop quantum gravity and its application to cosmology sheds light on more general issues, such as the nature of time. Supplementary material is available for this article at 10.12942/lrr-2008-4.
Contemporary Translation Theories. Second Revised Edition. Topics in Translation 21.
ERIC Educational Resources Information Center
Gentzler, Edwin
This book traces the growth of translation theory from its traditional roots through the recent proliferation of theories, fueled by feminist, poststructuralist, and postcolonial research. It examines five new approaches: the North American translation workshop, the science of translation, early translation studies, polysystem theory,…
ERIC Educational Resources Information Center
Zhang, Fuhui; Schunn, Christian D.; Baikadi, Alok
2017-01-01
Building upon self-regulated learning theories, we examined the nature of student writing goals and the relationship of these writing goals to revision alone and in combination with two other important sources of students' self-regulated revision--peer comments on their writing, and reflections for their own writing obtained from reviewing others'…
Using Qualitative Inquiry to Promote Organizational Intelligence
ERIC Educational Resources Information Center
Kimball, Ezekiel; Loya, Karla I.
2017-01-01
Framed by Terenzini's revision of his classic "On the nature of institutional research" article, this chapter offers concluding thoughts on the way in which technical/analytical, issues, and contextual types of awarenesses appeared across chapters in this volume. Moreover, it outlines how each chapter demonstrated how qualitative inquiry…
The Crossett Story, Revised: Updating a Forestry Classic
Don C. Bragg; James M. Guldin; Michael G. Shelton
2003-01-01
Abstract: The Crossett Story slide show was developed in 1980 to detail the history of logging, field forestry, and research centered on the USDA Forest Service's Crossett Experimental Forest (CEF). However, science and technology have advanced considerably over the last several decades and the regulatory environment has...
Aging Theories for Establishing Safe Life Spans of Airborne Critical Structural Components
NASA Technical Reports Server (NTRS)
Ko, William L.
2003-01-01
New aging theories have been developed to establish the safe life span of airborne critical structural components such as B-52B aircraft pylon hooks for carrying air-launch drop-test vehicles. The new aging theories use an equivalent-constant-amplitude loading spectrum to represent the actual random loading spectrum with the same damaging effect. The crack growth due to random load cycling during the first flight is calculated using the half-cycle theory and then extrapolated to the crack growth of all subsequent flights. The predictions of the new aging theories (the finite difference aging theory and the closed-form aging theory) are compared with the classical flight-test life theory and the previously developed Ko first- and second-order aging theories. The new aging theories predict considerably fewer safe flights than the classical aging theory, and slightly fewer than the Ko first- and second-order aging theories, owing to the inclusion of all higher-order terms.
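Cycle-by-cycle crack-growth bookkeeping of the kind the abstract describes can be sketched with the textbook Paris law, da/dN = C*(dK)^m with dK = dS*sqrt(pi*a). This is a generic stand-in, not the Ko half-cycle theory itself; C, m, the stress range, and the crack lengths below are all hypothetical.

```python
# Generic flight-by-flight crack-growth extrapolation using the textbook
# Paris law -- a stand-in illustration, NOT the Ko half-cycle theory.
# C, m, dS, and the crack lengths are hypothetical values.
import math

def flights_to_critical(a0, a_crit, dS, cycles_per_flight, C=1e-9, m=3.0):
    """Count flights until the crack grows from a0 to its critical length."""
    a, flights = a0, 0
    while a < a_crit:
        for _ in range(cycles_per_flight):
            dK = dS * math.sqrt(math.pi * a)   # stress-intensity range
            a += C * dK ** m                   # growth this cycle
        flights += 1
    return flights

n = flights_to_critical(a0=1e-3, a_crit=5e-3, dS=100.0, cycles_per_flight=50)
print(n)  # growth accelerates with a, so later flights consume life faster
```

The point of the equivalent-constant-amplitude idea is that one representative stress range dS per cycle can stand in for the random spectrum, which is what makes this kind of extrapolation from the first flight tractable.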
On the co-creation of classical and modern physics.
Staley, Richard
2005-12-01
While the concept of "classical physics" has long framed our understanding of the environment from which modern physics emerged, it has consistently been read back into a period in which the physicists concerned initially considered their work in quite other terms. This essay explores the shifting currency of the rich cultural image of the classical/modern divide by tracing empirically different uses of "classical" within the physics community from the 1890s to 1911. A study of fin-de-siècle addresses shows that the earliest general uses of the concept proved controversial. Our present understanding of the term was in large part shaped by its incorporation (in different ways) within the emerging theories of relativity and quantum theory--where the content of "classical" physics was defined by proponents of the new. Studying the diverse ways in which Boltzmann, Larmor, Poincaré, Einstein, Minkowski, and Planck invoked the term "classical" will help clarify the critical relations between physicists' research programs and their use of worldview arguments in fashioning modern physics.
Contact stresses in gear teeth: A new method of analysis
NASA Technical Reports Server (NTRS)
Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.
1991-01-01
A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and existing applications of the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. The basic theory and the algorithms are presented here, and several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.
Brassey, Charlotte A.; Margetts, Lee; Kitchener, Andrew C.; Withers, Philip J.; Manning, Phillip L.; Sellers, William I.
2013-01-01
Classic beam theory is frequently used in biomechanics to model the stress behaviour of vertebrate long bones, particularly when creating intraspecific scaling models. Although methodologically straightforward, classic beam theory requires complex irregular bones to be approximated as slender beams, and the errors associated with simplifying complex organic structures to such an extent are unknown. Alternative approaches, such as finite element analysis (FEA), while much more time-consuming to perform, require no such assumptions. This study compares the results obtained using classic beam theory with those from FEA to quantify the beam theory errors and to provide recommendations about when a full FEA is essential for reasonable biomechanical predictions. High-resolution computed tomographic scans of eight vertebrate long bones were used to calculate diaphyseal stress owing to various loading regimes. Under compression, FEA values of minimum principal stress (σmin) were on average 142 per cent (±28% s.e.) larger than those predicted by beam theory, with deviation between the two models correlated to shaft curvature (two-tailed p = 0.03, r2 = 0.56). Under bending, FEA values of maximum principal stress (σmax) and beam theory values differed on average by 12 per cent (±4% s.e.), with deviation between the models significantly correlated to cross-sectional asymmetry at midshaft (two-tailed p = 0.02, r2 = 0.62). In torsion, assuming maximum stress values occurred at the location of minimum cortical thickness brought beam theory and FEA values closest in line, and in this case FEA values of τtorsion were on average 14 per cent (±5% s.e.) higher than beam theory. Therefore, FEA is the preferred modelling solution when estimates of absolute diaphyseal stress are required, although values calculated by beam theory for bending may be acceptable in some situations. PMID:23173199
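As a rough illustration of the beam-theory side of this comparison, the diaphyseal bending stress that the abstract contrasts with FEA follows from the standard flexure formula sigma = M*c/I. The sketch below applies it to an idealized hollow elliptical cross-section; the shape, dimensions, and applied moment are hypothetical assumptions for illustration, not values from the study:

```python
import math

def ellipse_I(a, b):
    # Second moment of area of a solid ellipse about its horizontal
    # centroidal axis; a, b are the horizontal and vertical semi-axes (m)
    return math.pi / 4.0 * a * b**3

def bending_stress(M, a_out, b_out, a_in, b_in):
    # Classic beam theory: sigma_max = M * c / I, where c is the distance
    # from the neutral axis to the outer fiber (here b_out) and I is the
    # hollow section's second moment of area
    I = ellipse_I(a_out, b_out) - ellipse_I(a_in, b_in)
    return M * b_out / I

# Hypothetical long-bone midshaft: 20 x 15 mm outer, 15 x 10 mm inner
# semi-axes, 50 N·m bending moment
sigma = bending_stress(50.0, 0.02, 0.015, 0.015, 0.010)
```

The study's point is that this slender-beam idealization can deviate from FEA by roughly 12 per cent in bending, with the error growing with cross-sectional asymmetry at midshaft.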
Generalized Quantum Theory of Bianchi IX Cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James
2003-04-01
We apply sum-over-histories generalized quantum theory to the closed homogeneous minisuperspace Bianchi IX cosmological model. We sketch how the probabilities in decoherent sets of alternative, coarse-grained histories of this model universe are calculated. We consider, in particular, the probabilities for classical evolution in a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not, illustrating the prediction that these universes will evolve in an approximately classical manner with a probability near unity.
Generalized mutual information and Tsirelson's bound
NASA Astrophysics Data System (ADS)
Wakakuwa, Eyuri; Murao, Mio
2014-12-01
We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the "no-supersignalling condition" (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
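The information-theoretic derivation via the GMI chain rule is beyond a short sketch, but the bound itself is easy to illustrate numerically. The snippet below uses the textbook singlet-state correlation E(a,b) = -cos(a-b) and the standard optimal CHSH measurement angles, which are general quantum-mechanics facts assumed here, not details taken from this record:

```python
import math

def E(a, b):
    # Quantum correlation of spin measurements along angles a and b
    # for a singlet state (standard result, assumed for illustration)
    return -math.cos(a - b)

def chsh(a, ap, b, bp):
    # CHSH combination |E(a,b) + E(a,b') + E(a',b) - E(a',b')|;
    # any local classical model is bounded by 2
    return abs(E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp))

# Optimal quantum angles saturate Tsirelson's bound 2*sqrt(2)
S = chsh(0.0, math.pi / 2, math.pi / 4, -math.pi / 4)
```

Quantum theory reaches S = 2*sqrt(2) but no higher; the abstract's result is that this ceiling follows from the chain rule of the GMI together with information causality.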
Generalized mutual information and Tsirelson's bound
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wakakuwa, Eyuri; Murao, Mio
2014-12-04
We introduce a generalization of the quantum mutual information between a classical system and a quantum system into the mutual information between a classical system and a system described by general probabilistic theories. We apply this generalized mutual information (GMI) to a derivation of Tsirelson's bound from information causality, and prove that Tsirelson's bound can be derived from the chain rule of the GMI. By using the GMI, we formulate the 'no-supersignalling condition' (NSS), that the assistance of correlations does not enhance the capability of classical communication. We prove that NSS is never violated in any no-signalling theory.
Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
"Fathers" and "sons" of theories in cell physiology: the membrane theory.
Matveev, V V; Wheatley, D N
2005-12-16
The last 50 years in the history of the life sciences are remarkable for a new and important feature that looks like a great threat to their future. The profound specialization that dominates quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is the unity of two elements, the experimental data and the theory that explains them. The "fathers" of science were, classically, the creators of new ideas and theories. They were the true experts on their own theories. Only they had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology have yet to be based on strict mathematical proofs. The same is not true of the sons. As a result of massive specialization, modern experts operate in very confined spaces. They formulate particular rules far from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to discuss matters on a broader theoretical level today? How can a classical theory--for example, the membrane one--be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the defects of membrane theory? As a result, "global" theories have few critics and little control. Owing to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for "Membrane Theory", to which we will largely refer throughout this article.
Knipfer, T; Fei, J; Gambetta, G A; Shackel, K A; Matthews, M A
2014-10-21
The cell-pressure-probe is a unique tool to study plant water relations in situ. Inaccuracy in the estimation of cell volume (νo) is the major source of error in the calculation of both cell volumetric elastic modulus (ε) and cell hydraulic conductivity (Lp). Estimates of νo and Lp can be obtained with the pressure-clamp (PC) and pressure-relaxation (PR) methods. In theory, both methods should result in comparable νo and Lp estimates, but this has not been the case. In this study, the existing νo-theories for the PC and PR methods were reviewed and clarified. A revised νo-theory was developed that is equally valid for the PC and PR methods. The revised theory was used to determine νo for two extreme scenarios of solute mixing between the experimental cell and the sap in the pressure-probe microcapillary. Using a fully automated cell-pressure-probe (ACPP) on leaf epidermal cells of Tradescantia virginiana, the validity of the revised theory was tested with experimental data. Calculated νo values from both methods were in the range of optically determined νo (1.1-5.0 nL) for T. virginiana. However, the PC method produced a systematically lower (21%) calculated νo compared to the PR method. Effects of solute mixing could only explain a potential error in calculated νo of <3%. For both methods, this discrepancy in νo was almost identical to the discrepancy in the measured ratio of ΔV/ΔP (total change in microcapillary sap volume versus corresponding change in cell turgor) of 19%, which is a fundamental parameter in calculating νo. It followed from the revised theory that the ratio of ΔV/ΔP was inversely related to the solute reflection coefficient. This highlighted that treating the experimental cell as an ideal osmometer in both methods is potentially not correct. Effects of non-ideal osmotic behavior by transmembrane solute movement may be minimized in the PR as compared to the PC method. Copyright © 2014 Elsevier Ltd. All rights reserved.
Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten
2017-06-01
This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable, and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.
Reformulating Non-Monotonic Theories for Inference and Updating
NASA Technical Reports Server (NTRS)
Grosof, Benjamin N.
1992-01-01
We aim to help build programs that do large-scale, expressive non-monotonic reasoning (NMR): especially, 'learning agents' that store, and revise, a body of conclusions while continually acquiring new, possibly defeasible, premise beliefs. Currently available procedures for forward inference and belief revision are exhaustive, and thus impractical: they compute the entire non-monotonic theory, then re-compute from scratch upon updating with new axioms. These methods are thus badly intractable. In most theories of interest, even backward reasoning is combinatoric (at least NP-hard). Here, we give theoretical results for prioritized circumscription that show how to reformulate default theories so as to make forward inference be selective, as well as concurrent; and to restrict belief revision to a part of the theory. We elaborate a detailed divide-and-conquer strategy. We develop concepts of structure in NM theories, by showing how to reformulate them in a particular fashion: to be conjunctively decomposed into a collection of smaller 'part' theories. We identify two well-behaved special cases that are easily recognized in terms of syntactic properties: disjoint appearances of predicates, and disjoint appearances of individuals (terms). As part of this, we also definitionally reformulate the global axioms, one by one, in addition to applying decomposition. We identify a broad class of prioritized default theories, generalizing default inheritance, for which our results especially bear fruit. For this asocially monadic class, decomposition permits reasoning to be localized to individuals (ground terms), and reduced to propositional. Our reformulation methods are implementable in polynomial time, and apply to several other NM formalisms beyond circumscription.
The Institution of Sociological Theory in Canada.
Guzman, Cinthya; Silver, Daniel
2018-02-01
Using theory syllabi and departmental data collected for three academic years, this paper investigates the institutional practice of theory in sociology departments across Canada. In particular, it examines the position of theory within the sociological curriculum, and how this varies among universities. Taken together, our analyses indicate that theory remains deeply institutionalized at the core of sociological education and Canadian sociologists' self-understanding; that theorists as a whole show some coherence in how they define themselves, but differ in various ways, especially along lines of region, intellectual background, and gender; that despite these differences, the classical versus contemporary heuristic largely cuts across these divides, as does the strongly ingrained position of a small group of European authors as classics of the discipline as a whole. Nevertheless, who is a classic remains an unsettled question, alternatives to the "classical versus contemporary" heuristic do exist, and theorists' syllabi reveal diverse "others" as potential candidates. Our findings show that the field of sociology is neither marked by universal agreement nor by absolute division when it comes to its theoretical underpinnings. To the extent that they reveal a unified field, the findings suggest that unity lies more in a distinctive form than in a distinctive content, which defines the space and structure of the field of sociology. © 2018 Canadian Sociological Association/La Société canadienne de sociologie.
On the effective field theory of intersecting D3-branes
NASA Astrophysics Data System (ADS)
Abbaspur, Reza
2018-05-01
We study the effective field theory of two intersecting D3-branes with one common dimension along the lines recently proposed in ref. [1]. We introduce a systematic way of deriving the classical effective action to arbitrary orders in perturbation theory. Using a proper renormalization prescription to handle logarithmic divergencies arising at all orders in the perturbation series, we recover the first order renormalization group equation of ref. [1] plus an infinite set of higher order equations. We show the consistency of the higher order equations with the first order one and hence interpret the first order result as an exact RG flow equation in the classical theory.
NASA Technical Reports Server (NTRS)
Zeng, X. C.; Stroud, D.
1989-01-01
The previously developed Ginzburg-Landau theory for calculating the crystal-melt interfacial tension of bcc elements is extended to treat the classical one-component plasma (OCP), the charged fermion system, and the Bose crystal. For the OCP, a direct application of the theory of Shih et al. (1987) yields for the surface tension 0.0012(Z-squared e-squared/a-cubed), where Ze is the ionic charge and a is the radius of the ionic sphere. The Bose crystal-melt interface is treated by a quantum extension of the classical density-functional theory, using the Feynman formalism to estimate the relevant correlation functions. The theory is applied to the metastable He-4 solid-superfluid interface at T = 0, with a resulting surface tension of 0.085 erg/sq cm, in reasonable agreement with the value extrapolated from the measured surface tension of the bcc solid in the range 1.46-1.76 K. These results suggest that the density-functional approach is a satisfactory mean-field theory for estimating the equilibrium properties of liquid-solid interfaces, given knowledge of the uniform phases.
NASA Astrophysics Data System (ADS)
Brynjolfsson, Ari
2002-04-01
Einstein's general theory of relativity assumes that photons don't change frequency as they move from Sun to Earth. This assumption is correct in classical physics. All experiments proving general relativity are in the domain of classical physics. These include the tests by Pound et al. of the gravitational redshift of 14.4 keV photons; the rocket experiments by Vessot et al.; the Galileo solar redshift experiments by Krisher et al.; the gravitational deflection of light experiments by Riveros and Vucetich; and the delay of echoes of radar signals passing close to the Sun as observed by Shapiro et al. Bohr's correspondence principle assures that the quantum mechanical theory of general relativity agrees with Einstein's classical theory when frequency and gravitational field gradient approach zero, or when photons cannot interact with the gravitational field. When we treat photons as quantum mechanical particles, we find that the gravitational force on photons is reversed (antigravity). This modified theory contradicts the equivalence principle, but is consistent with all experiments. Solar lines and distant stars are redshifted in accordance with the author's plasma redshift theory. These changes result in a beautiful, consistent cosmology.
Dahl, Gerhard
2016-10-01
The now available unabridged correspondence between Freud and Abraham leads to a re-evaluation of the significance of Abraham's work. The author proposes the thesis that clinical observations by Karl Abraham of the ambivalence of object relations and the destructive-sadistic aspects of orality had an important influence on the advancement of psychoanalytical theory. The phantasy problem of the Wolf Man and the question of the pathogenic relevance of early actual, or merely imagined, traumata led Freud to doubt the validity of his theory. He attempted repeatedly to solve this problem using libido theory, but failed because of his problematic conception of oral erotics. The pathogenic effect of presymbolic traumatizations cannot be demonstrated scientifically because of the still underdeveloped brain in the early stage of the child's development. Consequently, the important empirical evidence for a scientific neurosis theory could not be provided. A revision of the theory of the instincts thus became necessary. With Abraham's clinical contributions and other pathologic evidence, Freud was, with some reservation, forced to modify his idea of oral erotics by ascribing to it the status of a merely constructed and fictive phase of oral organization. A solution was eventually facilitated via recognition of non-erotic aggression and destruction, thereby opening libido theory to fundamental revisions. Driven by the desire to develop a scientific theory, Freud initially had, in his first theory of the instincts, assumed a strongly causal-deterministic view of psychic function. His third revision of the theory of the instincts, Beyond the Pleasure Principle, including the death instinct hypothesis, considered the hermeneutic aspect of psychoanalytic theory, which had previously existed only implicitly in his theory.
Further development of the death instinct hypothesis by Melanie Klein and her successors abandoned quantitative-economic and causal-deterministic principles, and instead focused on the practical utility of the psychoanalytic theory. Copyright © 2016 Institute of Psychoanalysis.
ERIC Educational Resources Information Center
Wilson, Mark; Allen, Diane D.; Li, Jun Corser
2006-01-01
This paper compares the approach and resultant outcomes of item response models (IRMs) and classical test theory (CTT). First, it reviews basic ideas of CTT, and compares them to the ideas about using IRMs introduced in an earlier paper. It then applies a comparison scheme based on the AERA/APA/NCME "Standards for Educational and…
ERIC Educational Resources Information Center
Culpepper, Steven Andrew
2013-01-01
A classic topic in the fields of psychometrics and measurement has been the impact of the number of scale categories on test score reliability. This study builds on previous research by further articulating the relationship between item response theory (IRT) and classical test theory (CTT). Equations are presented for comparing the reliability and…
ERIC Educational Resources Information Center
Mason, Brandon; Smithey, Martha
2012-01-01
This study examines Merton's Classical Strain Theory (1938) as a causative factor in intimate partner violence among college students. We theorize that college students experience general life strain and cumulative strain as they pursue the goal of a college degree. We test this strain on the likelihood of using intimate partner violence. Strain…
ERIC Educational Resources Information Center
Schlingman, Wayne M.; Prather, Edward E.; Wallace, Colin S.; Brissenden, Gina; Rudolph, Alexander L.
2012-01-01
This paper is the first in a series of investigations into the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI). In this paper, we use classical test theory to form a framework of results that will be used to evaluate individual item difficulties, item discriminations, and the overall reliability of the…
Classical closure theory and Lam's interpretation of epsilon-RNG
NASA Technical Reports Server (NTRS)
Zhou, YE
1995-01-01
Lam's phenomenological epsilon-renormalization group (RNG) model is quite different from the other members of that group. It does not make use of the correspondence principle and the epsilon-expansion procedure. We demonstrate that Lam's epsilon-RNG model is essentially the physical space version of the classical closure theory in spectral space and consider the corresponding treatment of the eddy viscosity and energy backscatter.
New variables for classical and quantum gravity
NASA Technical Reports Server (NTRS)
Ashtekar, Abhay
1986-01-01
A Hamiltonian formulation of general relativity based on certain spinorial variables is introduced. These variables simplify the constraints of general relativity considerably and enable one to imbed the constraint surface in the phase space of Einstein's theory into that of Yang-Mills theory. The imbedding suggests new ways of attacking a number of problems in both classical and quantum gravity. Some illustrative applications are discussed.
ERIC Educational Resources Information Center
Sussman, Joshua; Beaujean, A. Alexander; Worrell, Frank C.; Watson, Stevie
2013-01-01
Item response models (IRMs) were used to analyze Cross Racial Identity Scale (CRIS) scores. Rasch analysis scores were compared with classical test theory (CTT) scores. The partial credit model demonstrated a high goodness of fit and correlations between Rasch and CTT scores ranged from 0.91 to 0.99. CRIS scores are supported by both methods.…
Conveying the Complex: Updating U.S. Joint Systems Analysis Doctrine with Complexity Theory
2013-12-10
Quantum kinetic theory of the filamentation instability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bret, A.; Haas, F.
2011-07-15
The quantum electromagnetic dielectric tensor for a multi-species plasma is re-derived from the gauge-invariant Wigner-Maxwell system and presented under a form very similar to the classical one. The resulting expression is then applied to a quantum kinetic theory of the electromagnetic filamentation instability. Comparison is made with the quantum fluid theory including a Bohm pressure term and with the cold classical plasma result. A number of analytical expressions are derived for the cutoff wave vector, the largest growth rate, and the most unstable wave vector.
A classical density-functional theory for describing water interfaces.
Hughes, Jessica; Krebs, Eric J; Roundy, David
2013-01-14
We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.
Geometry, topology, and string theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Varadarajan, Uday
A variety of scenarios are considered which shed light upon the uses and limitations of classical geometric and topological notions in string theory. The primary focus is on situations in which D-brane or string probes of a given classical space-time see the geometry quite differently than one might naively expect. In particular, situations in which extra dimensions, non-commutative geometries as well as other non-local structures emerge are explored in detail. Further, a preliminary exploration of such issues in Lorentzian space-times with non-trivial causal structures within string theory is initiated.
Summerhill School. A New View of Childhood.
ERIC Educational Resources Information Center
Neill, A. S.; Lamb, Albert, Ed.
This revised and expanded version of the 1960 classic "Summerhill," edited by Albert Lamb, portrays Summerhill School throughout its development. The book reveals A. S. Neill's fundamental belief in the self-regulated school in which children make their own rules and determine for themselves how much they will study. Neill's commitment…
Glenn Ligon: Re-Visioning Change
ERIC Educational Resources Information Center
Rhoades, Mindi; Sanders, Jim
2007-01-01
Glenn Ligon is a multifaceted artist working across multiple media, including painting, sculpture, printmaking, photography, video, and digital media. He is a conceptual artist, often working to include text with visuals and as visuals in his work. He appropriates text from classic authors, like Homer, from runaway slave broadsides, from Richard…
Epidemic models with an infected-infectious period
NASA Astrophysics Data System (ADS)
Méndez, Vicenç
1998-03-01
The introduction of an infective-infectious period on the geographic spread of epidemics is considered in two different models. The classical evolution equations arising in the literature are generalized and the existence of epidemic wave fronts is revised. The asymptotic speed is obtained and improves previous results for the Black Death plague.
ERIC Educational Resources Information Center
Chiu, Chun-Yu; Seo, Hyojeong; Turnbull, Ann P.; Summer, Jean Ann
2017-01-01
The Beach Center Family Quality of Life Scale is an internationally validated instrument for measuring family outcomes. To revise the scale for better alignment with the Family Quality of Life theory, the authors excluded non-outcome items in this revision. In this study, we examined reliability and validity of the revised scale (i.e., the FQoL…
Semiclassical theory of electronically nonadiabatic transitions in molecular collision processes
NASA Technical Reports Server (NTRS)
Lam, K. S.; George, T. F.
1979-01-01
An introductory account of the semiclassical theory of the S-matrix for molecular collision processes is presented, with special emphasis on electronically nonadiabatic transitions. This theory is based on the incorporation of classical mechanics with quantum superposition, and in practice makes use of the analytic continuation of classical mechanics into the complex time domain. The relevant concepts of molecular scattering theory and related dynamical models are described, and the formalism is developed and illustrated with a simple example - collinear collision of the A+BC type. The theory is then extended to include the effects of laser-induced nonadiabatic transitions. Two bound-continuum processes, collisional ionization and collision-induced emission, also amenable to the same general semiclassical treatment, are discussed.
Classical theory of atomic collisions - The first hundred years
NASA Astrophysics Data System (ADS)
Grujić, Petar V.
2012-05-01
Classical calculations of atomic processes started in 1911 with Rutherford's famous evaluation of the differential cross section for α particles scattered on foil atoms [1]. The success of these calculations was soon overshadowed by the rise of Quantum Mechanics in 1925 and its triumphal success in describing processes at the atomic and subatomic levels. It was generally recognized that the classical approach should be inadequate, and it was neglected until 1953, when the famous paper by Gregory Wannier appeared, in which the threshold law for the behaviour of the single-ionization cross section under electron impact was derived. All later calculations and experimental studies confirmed the law derived by purely classical theory. The next step was taken by Ian Percival and collaborators in the 1960s, who developed a general classical three-body computer code, which was used by many researchers in evaluating various atomic processes like ionization, excitation, detachment, dissociation, etc. Another approach was pursued by Michal Gryzinski from Warsaw, who started a far-reaching programme for treating atomic particles and processes as purely classical objects [2]. Though often criticized for overestimating the domain of the classical theory, the results of his group were able to match many experimental data. The Belgrade group pursued the classical approach using both analytical and numerical calculations, studying a number of atomic collisions, in particular near-threshold processes. The Riga group, led by Modris Gailitis [3], contributed considerably to the field, as did Valentin Ostrovsky and coworkers from Saint Petersburg, who developed powerful analytical methods within purely classical mechanics [4]. We shall make an overview of these approaches and show some of the remarkable results, which were subsequently confirmed by semiclassical and quantum mechanical calculations, as well as by the experimental evidence.
Finally we discuss the theoretical and epistemological background of the classical calculations and explain why these turned out so successful, despite the essentially quantum nature of the atomic and subatomic systems.
NASA Technical Reports Server (NTRS)
Ioannou, Petros J.; Lindzen, Richard S.
1993-01-01
Classical tidal theory is applied to the atmospheres of the outer planets. The tidal geopotential due to satellites of the outer planets is discussed, and the solution of Laplace's tidal equation for Hough modes appropriate to tides on the outer planets is examined. The vertical structure of tidal modes is described, noting that only relatively high-order meridional mode numbers can propagate vertically with growing amplitude. Expected magnitudes for tides in the visible atmosphere of Jupiter are discussed. The classical theory is extended to planetary interiors, taking the effects of sphericity and self-gravity into account. The thermodynamic structure of Jupiter is described and the WKB theory of the vertical structure equation is presented. The regions for which inertial, gravity, and acoustic oscillations are possible are delineated. The case of a planet with a neutral interior is treated, discussing the various atmospheric boundary conditions and showing that the tidal response is small.
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
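The contrast between the two control principles can be sketched in a few lines. In the classical approach a vehicle regulates toward one desired gap at a fixed time headway, whereas in the three-phase approach any gap inside an indifference range is acceptable. The parameter values and function names below are hypothetical illustrations, not taken from Kerner's models:

```python
def classical_desired_gap(v, T=1.5, d0=2.0):
    # Classical ACC rule: a single desired gap (m) at fixed time
    # headway T (s) plus a standstill distance d0 (m); hypothetical values
    return d0 + v * T

def three_phase_acceptable(gap, v, T_safe=1.0, T_max=3.0, d0=2.0):
    # Three-phase-style rule (schematic): there is no unique desired time
    # headway; any gap within an indifference range between a safe gap and
    # a larger synchronization gap is acceptable, so the controller need
    # not accelerate or brake inside this range
    return d0 + v * T_safe <= gap <= d0 + v * T_max

# At 20 m/s, the classical controller steers toward exactly 32 m,
# while the three-phase rule accepts any gap between 22 m and 62 m
```

The wider acceptance range is what allows the three-phase controller to avoid continually adjusting speed, which is related to the smaller disturbances at bottlenecks claimed in the abstract.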
Urns and Chameleons: two metaphors for two different types of measurements
NASA Astrophysics Data System (ADS)
Accardi, Luigi
2013-09-01
The awareness of the physical possibility of models of space alternative to the Euclidean one began to emerge towards the end of the 19th century. At the end of the 20th century a similar awareness emerged concerning the physical possibility of models of the laws of chance alternative to the classical probabilistic model (the Kolmogorov model). In geometry, the mathematical construction of several non-Euclidean models of space preceded by about a century their application in physics, which came with the theory of relativity. In physics the opposite situation took place. While the first examples of non-Kolmogorov probabilistic models emerged in quantum physics approximately one century ago, at the beginning of the 1900s, the awareness that this new mathematical formalism reflected a new mathematical model of the laws of chance had to wait until the early 1980s. In this long interval the classical and the new probabilistic models were both used in the description and interpretation of quantum phenomena and interfered negatively with each other because of the absence, for many decades, of a mathematical theory that clearly delimited their respective domains of application. The result of this interference was the emergence of the so-called "paradoxes of quantum theory". For several decades there have been many attempts to solve these paradoxes, giving rise to what K. Popper baptized "the great quantum muddle": a debate which has been at the core of the philosophy of science for more than 50 years. These attempts, however, have led to contradictions between the two fundamental theories of contemporary physics: quantum theory and the theory of relativity.
Quantum probability identifies the reason for the emergence of non-Kolmogorov models, and therefore of the so-called paradoxes of quantum theory, in the difference between passive measurements that read "pre-existent properties" (the urn metaphor) and measurements that read "a response to an interaction" (the chameleon metaphor). The non-trivial point is that one can prove that, while the urn scheme cannot lead to empirical data outside classical probability, response-based measurements can give rise to non-classical statistics. The talk will include entirely classical examples of non-classical statistics and potential applications to economic, sociological, or biomedical phenomena.
De Tiège, Alexis; Van de Peer, Yves; Braeckman, Johan; Tanghe, Koen B
2017-11-22
Although classical evolutionary theory, i.e., population genetics and the Modern Synthesis, was already implicitly 'gene-centred', the organism was, in practice, still generally regarded as the individual unit of which a population is composed. The gene-centred approach to evolution only reached a logical conclusion with the advent of the gene-selectionist or gene's eye view in the 1960s and 1970s. Whereas classical evolutionary theory can only work with (genotypically represented) fitness differences between individual organisms, gene-selectionism is capable of working with fitness differences among genes within the same organism and genome. Here, we explore the explanatory potential of 'intra-organismic' and 'intra-genomic' gene-selectionism, i.e., of a behavioural-ecological 'gene's eye view' on genetic, genomic and organismal evolution. First, we give a general outline of the framework and how it complements the, to some extent, still 'organism-centred' approach of classical evolutionary theory. Second, we give a more in-depth assessment of its explanatory potential for biological evolution, i.e., for Darwin's 'common descent with modification' or, more specifically, for 'historical continuity or homology with modular evolutionary change' as it has been studied by evolutionary developmental biology (evo-devo) during the last few decades. In contrast with classical evolutionary theory, evo-devo focuses on 'within-organism' developmental processes. Given the capacity of gene-selectionism to adopt an intra-organismal gene's eye view, we outline the relevance of the latter model for evo-devo. Overall, we aim for conceptual integration between the gene's eye view on the one hand, and more organism-centred evolutionary models (both classical evolutionary theory and evo-devo) on the other.
Clerc, Daryl G
2016-07-21
An ab initio approach was used to study the molecular-level interactions that connect gene mutation to changes in an organism's phenotype. The study provides new insights into the evolutionary process and presents a simplification whereby changes in phenotypic properties may be studied in terms of the binding affinities of the chemical interactions affected by mutation, rather than by correlation to the genes. The study also reports the role that nonlinear effects play in the progression of organs, and how those effects relate to the classical theory of evolution. Results indicate that the classical theory of evolution occurs as a special case within the ab initio model, a case having two attributes. The first attribute: proteins and promoter regions are not shared among organs. The second attribute: continuous limiting behavior exists in the physical properties of organs as well as in the binding affinity of the associated chemical interactions, with respect to displacements in the chemical properties of proteins and promoter regions induced by mutation. Outside of the special case, second-order coupling contributions are significant and nonlinear effects play an important role, a result corroborated by analyses of published activity levels in binding and transactivation assays. Further, gradations in the state of perfection of an organ may be small or large depending on the type of mutation, and are not necessarily closely separated as maintained by the classical theory. Results also indicate that organs progress with varying degrees of interdependence, the likelihood of successful mutation decreases with increasing complexity of the affected chemical system, and differences between the ab initio model and the classical theory increase with increasing complexity of the organism. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
Conceptual Revolution of the 20th Century Leading to One Grand Unified Concept -- The Quantum Vacuum
NASA Astrophysics Data System (ADS)
Sreekantan, B. V.
2014-07-01
Concepts and the relations between concepts are the basis for all our scientific understanding and explanation of the wide variety of constituents and phenomena in nature. Some of the fundamental concepts like space, time, matter, radiation, causality, etc. had remained unchanged for almost four hundred years from the dawn of science. However, all of these underwent a drastic transformation in the 20th century, for two reasons. First, in the light of certain experimental findings, two radical theories, namely the theory of relativity and quantum mechanics, replaced the classical theory that had dominated since Newton's time. Second, the science-technology spiral resulted in the discovery of many new features of the universe, both on the micro scale and on the mega scale. There was an exponential increase in our knowledge, and these new facts could not be fitted into the old concepts. Apart from drastic revision, many new concepts had to be brought in. Despite all this, one very encouraging trend has been to discern a holistic synthesis and unification of the different concepts -- an endeavor that has been helped by experiments over a wide scale of energies and distances and, most importantly, by theoretical insights triggered by mathematical underpinnings. These developments in physics and astrophysics are pointing to one grand concept, namely, the "quantum vacuum" endowed with certain special properties, as the substratum from which all the constituents of the universe as well as the processes of the universe emerge, including the creation of the universe itself. This is the view, at least, of some scientists. In this brief article the essence of these approaches toward unification is highlighted. Perhaps the life sciences can take a cue from these developments in the physical sciences.
The Basics: What's Essential about Theory for Community Development Practice?
ERIC Educational Resources Information Center
Hustedde, Ronald J.; Ganowicz, Jacek
2002-01-01
Relates three classical theories (structural functionalism, conflict theory, symbolic interactionism) to fundamental concerns of community development (structure, power, and shared meaning). Links these theories to Giddens' structuration theory, which connects macro and micro structures and community influence on change through cultural norms.…
Short stem survival after osteonecrosis of the femoral head.
Schnurr, Christoph; Loucif, Anissa; Patzer, Theresa; Schellen, Bernd; Beckmann, Johannes; Eysel, Peer
2018-04-01
Short stems were developed as a bone-conserving alternative especially for the young hip arthroplasty patient. Patients suffering from osteonecrosis of the femoral head are frequently younger than primary arthritis patients. The outcome of short stems in these patients remains unclear. The aim of our study was to compare mid-term survival of short stems after osteonecrosis of the femoral head (ONFH) and primary arthritis. Data on short stem implantations over a 10-year period were collected. Demographic data and X-ray measurements before and after surgery were recorded. Indication for operation was determined from medical records and X-rays. Patients were asked by post about any revision. Reason for revision was identified by analysis of operation protocols. Short stem revision rates were analyzed using Kaplan-Meier charts, comparing 212 ONFH patients (231 operations) and 1284 primary arthritis patients (1455 operations). Follow-up time averaged 5.3 and 6 years and was complete for 92% (ONFH) and 94% (primary arthritis) of the patients. ONFH patients were significantly younger (53 years vs. 59 years, p < 0.001) and more frequently male (55 vs. 42%, p < 0.001). The total revision rate did not differ between the two groups (8 years: 4.2 vs. 5.6%, p = ns). A trend towards more stem revisions was detected for ONFH patients (3 vs. 1.8%, p = ns). The aseptic stem loosening rate was significantly elevated for osteonecrosis patients (8 years: 2.6 vs. 0.7%, p = 0.013). Our study showed elevated short stem loosening rates after ONFH. Similar results are published for classic cementless stems. The question of which stem is best for the young osteonecrosis patient cannot be answered yet. Consecutive studies directly comparing loosening rates of short and classic cementless stems in young osteonecrosis patients are required.
On classical and quantum dynamics of tachyon-like fields and their cosmological implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimitrijević, Dragoljub D., E-mail: ddrag@pmf.ni.ac.rs; Djordjević, Goran S., E-mail: ddrag@pmf.ni.ac.rs; Milošević, Milan, E-mail: ddrag@pmf.ni.ac.rs
2014-11-24
We consider a class of tachyon-like potentials, motivated by string theory, D-brane dynamics, and inflation theory, in the context of classical and quantum mechanics. A formalism for describing the dynamics of tachyon fields in the spatially homogeneous, one-dimensional, classical and quantum mechanical limit is proposed. A few models with concrete potentials are considered. Additionally, possibilities for p-adic and adelic generalization of these models are discussed. Classical actions and corresponding quantum propagators, in the Feynman path integral approach, are calculated in a form invariant under a change of the background number fields, i.e. on both archimedean and nonarchimedean spaces. Looking for a quantum origin of inflation, the relevance of p-adic and adelic generalizations is briefly discussed.
The Classical Theory of Light Colors: a Paradigm for Description of Particle Interactions
NASA Astrophysics Data System (ADS)
Mazilu, Nicolae; Agop, Maricel; Gatu, Irina; Iacob, Dan Dezideriu; Butuc, Irina; Ghizdovat, Vlad
2016-06-01
Color is an interaction property: a property of the interaction of light with matter. Classically speaking, it is therefore akin to the forces. But while forces engendered the mechanical view of the world, colors generated the optical view. One of the modern concepts of the interaction between the fundamental particles of matter, quantum chromodynamics, aims to fill the gap between mechanics and optics in a specific description of the strong interactions. We show here that this modern description of particle interactions has ties with both the classical and quantum theories of light, regardless of the connection between forces and colors. In a word, light is a universal model in the description of matter. The description involves classical Yang-Mills fields related to color.
Handbook of Qualitative Research. Second Edition.
ERIC Educational Resources Information Center
Denzin, Norman K., Ed.; Lincoln, Yvonna S., Ed.
This handbook's second edition represents the state of the art for the theory and practice of qualitative inquiry. It features eight new topics, including autoethnography, critical race theory, applied ethnography, queer theory, and "testimonio." Every chapter in the handbook has been thoroughly revised and updated. The book…
NASA Astrophysics Data System (ADS)
Ivanov, Sergey V.; Buzykin, Oleg G.
2016-12-01
A classical approach is applied to calculate pressure-broadening coefficients of CO2 vibration-rotational spectral lines perturbed by Ar. Three types of spectra are examined: electric dipole (infrared) absorption, and isotropic and anisotropic Raman Q branches. Simple and explicit formulae of the classical impact theory are used along with exact 3D Hamilton equations for CO2-Ar molecular motion. The calculations utilize the vibrationally independent, most accurate ab initio potential energy surface (PES) of Hutson et al., expanded in a Legendre polynomial series up to lmax = 24. A new, improved algorithm of classical rotational frequency selection is applied. The dependences of CO2 half-widths on rotational quantum number J up to J = 100 are computed for temperatures between 77 and 765 K and compared with available experimental data as well as with the results of fully quantum dynamical calculations performed on the same PES. To make the picture complete, the predictions of two independent variants of the semi-classical Robert-Bonamy formalism for dipole absorption lines are included. This method, however, demonstrated poor accuracy at almost all temperatures. On the contrary, classical broadening coefficients are in excellent agreement both with measurements and with quantum results at all temperatures. The classical impact theory in its present variant can quickly and accurately produce the pressure-broadening coefficients of spectral lines of linear molecules for any J value (including high Js) using a full-dimensional ab initio-based PES, in cases where other computational methods are either extremely time consuming (like the quantum close-coupling method) or give erroneous results (like semi-classical methods).
The polymer physics of single DNA confined in nanochannels.
Dai, Liang; Renner, C Benjamin; Doyle, Patrick S
2016-06-01
In recent years, applications and experimental studies of DNA in nanochannels have stimulated the investigation of the polymer physics of DNA in confinement. Recent advances in the physics of confined polymers, using DNA as a model polymer, have moved beyond the classic Odijk theory for strong confinement and the classic blob theory for weak confinement. In this review, we present the current understanding of the behaviors of confined polymers while briefly reviewing the classic theories. Three aspects of confined DNA are presented: static, dynamic, and topological properties. The relevant simulation methods are also summarized. In addition, comparisons of confined DNA with DNA under tension and DNA in semidilute solution are made to emphasize universal behaviors. Finally, an outlook on possible future research for confined DNA is given. Copyright © 2015 Elsevier B.V. All rights reserved.
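As a reference point for the classic regimes named above, the two standard scaling results can be stated in their textbook forms. Symbols are conventional (channel size D, persistence length l_p, contour length L, effective chain width w) and are not reproduced from the review itself:

```latex
% Odijk regime (strong confinement, D << l_p): the chain deflects off the
% walls with a characteristic deflection length
\lambda \simeq \left( D^{2}\, l_p \right)^{1/3}, \qquad D \ll l_p ,
% de Gennes blob regime (weak confinement, D >> l_p): the chain extension
% along the channel scales as
R_{\parallel} \simeq L \left( \frac{w\, l_p}{D^{2}} \right)^{1/3}, \qquad D \gg l_p .
```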
Classical and non-classical effective medium theories: New perspectives
NASA Astrophysics Data System (ADS)
Tsukerman, Igor
2017-05-01
Future research in electrodynamics of periodic electromagnetic composites (metamaterials) can be expected to produce sophisticated homogenization theories valid for any composition and size of the lattice cell. The paper outlines a promising path in that direction, leading to non-asymptotic and nonlocal homogenization models, and highlights aspects of homogenization that are often overlooked: the finite size of the sample and the role of interface boundaries. Classical theories (e.g. Clausius-Mossotti, Maxwell Garnett), while originally derived from a very different set of ideas, fit well into the proposed framework. Nonlocal effects can be included in the model, making order-of-magnitude accuracy improvements possible. One future challenge is to determine what effective parameters can or cannot be obtained for a given set of constituents of a metamaterial lattice cell, thereby delineating the possible from the impossible in metamaterial design.
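The Maxwell Garnett mixing rule cited above as a classical theory has a simple closed form. A minimal sketch for spherical inclusions of permittivity eps_i at volume fraction f in a host of permittivity eps_m follows; this is the standard textbook formula, not the paper's non-asymptotic model:

```python
def maxwell_garnett(eps_i, eps_m, f):
    """Effective permittivity of a dilute suspension of spheres in a host
    medium (classical Maxwell Garnett mixing rule)."""
    num = eps_i * (1 + 2 * f) + 2 * eps_m * (1 - f)
    den = eps_i * (1 - f) + eps_m * (2 + f)
    return eps_m * num / den
```

The rule interpolates correctly between the limits: at f = 0 it returns the host permittivity, at f = 1 the inclusion permittivity.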
Lee, Tak Yan
2011-01-01
This is a theoretical paper with an aim to construct an integrated conceptual framework for the prevention of adolescents' use and abuse of psychotropic drugs. This paper first reports the subjective reasons for adolescents' drug use and abuse in Hong Kong and reviews the theoretical underpinnings. Theories of drug use and abuse, including neurological, pharmacological, genetic predisposition, psychological, and sociological theories, were reviewed. It provides a critical re-examination of crucial factors that support the construction of a conceptual framework for primary prevention of adolescents' drug use and abuse building on, with minor revision, the model of victimization and substance abuse among women presented by Logan et al. This revised model provides a comprehensive and coherent framework synthesized from theories of drug abuse. This paper then provides empirical support for integrating a positive youth development perspective in the revised model. It further explains how the 15 empirically sound constructs identified by Catalano et al. and used in a positive youth development program, the Project P.A.T.H.S., relate generally to the components of the revised model to formulate an integrated positive youth development conceptual framework for primary prevention of adolescent drug use. Theoretical and practical implications as well as limitations and recommendations are discussed. PMID:22194671
ERIC Educational Resources Information Center
Kim, Sooyeon; Livingston, Samuel A.
2017-01-01
The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…
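Under CTT, alternate-forms reliability is the correlation between scores on two parallel forms, which for parallel forms equals var(T) / (var(T) + var(E)). A minimal simulation sketch follows; the design is illustrative and far simpler than the 3-stage MST in the study:

```python
import random

def simulate_alternate_forms(n=20000, var_true=1.0, var_err=0.25, seed=7):
    """Simulate two parallel forms: each examinee has one true score T and
    independent errors on each form. The Pearson correlation between the
    two form scores estimates the CTT reliability var_true/(var_true+var_err).
    """
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        t = rng.gauss(0.0, var_true ** 0.5)
        xs.append(t + rng.gauss(0.0, var_err ** 0.5))
        ys.append(t + rng.gauss(0.0, var_err ** 0.5))
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

# With var_true = 1.0 and var_err = 0.25, the true reliability is 0.8,
# and the simulated correlation should land close to it.
```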
Wang, Wei; Takeda, Mitsuo
2006-09-01
A new concept of vector and tensor densities is introduced into the general coherence theory of vector electromagnetic fields that is based on energy and energy-flow coherence tensors. Related coherence conservation laws are presented in the form of continuity equations that provide new insights into the propagation of second-order correlation tensors associated with stationary random classical electromagnetic fields.
Application of ply level analysis to flexural wave propagation
NASA Astrophysics Data System (ADS)
Valisetty, R. R.; Rehfield, L. W.
1988-10-01
A brief survey is presented of the shear deformation theories of laminated plates. It indicates that there are certain non-classical influences that affect bending-related behavior in the same way as do the transverse shear stresses. They include bending- and stretching-related section warping with the concomitant non-classical surface-parallel stress contributions, and the transverse normal stress. A bending theory gives significantly improved performance if these non-classical effects are incorporated. The heterogeneous shear deformations that are characteristic of laminates with highly dissimilar materials, however, require that attention be paid to the modeling of local rotations. In this paper, it is shown that a ply level analysis can be used to model such disparate shear deformations. Here, equilibrium of each layer is analyzed separately. Earlier applications of this analysis include free-edge laminate stresses. It is now extended to the study of flexural wave propagation in laminates. A recently developed homogeneous plate theory is used as the ply level model. Due consideration is given to the non-classical influences, and no shear correction factors are introduced extraneously in this theory. The results for the lowest flexural mode of travelling planar harmonic waves indicate that this approach is competitive and yields better results for certain laminates.
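For context, the classical (Kirchhoff) thin-plate result that shear deformation theories correct can be written in its standard isotropic form; the symbols here are conventional (bending stiffness D, Young's modulus E, thickness h, Poisson's ratio nu, density rho) and are not taken from the paper:

```latex
% Classical thin-plate flexural dispersion relation:
D\, k^{4} = \rho h\, \omega^{2},
\qquad D = \frac{E h^{3}}{12\,(1-\nu^{2})},
\qquad \omega = k^{2} \sqrt{D / (\rho h)} .
```

Because omega grows as k squared without bound, the classical theory overpredicts phase speed at short wavelengths; shear deformation and the non-classical influences discussed above bound the high-wavenumber behavior.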
A theoretical model of the determinants of mortality.
Tourangeau, Ann E
2005-01-01
Outcome research in nursing has been criticized for being atheoretical. Although there has been research investigating patient mortality as an outcome, there has been little discussion of models or theories of nursing-related determinants of mortality for hospitalized patients. Yet unnecessary patient mortality is an important patient safety outcome. This article describes the development of a beginning theory of the determinants of patient mortality, culminating in a revised mortality model. Conclusions address plans for further testing and refinement of the revised mortality model. Further, the utility of the proposed model in practice is discussed.
Geometric Theory of Reduction of Nonlinear Control Systems
NASA Astrophysics Data System (ADS)
Elkin, V. I.
2018-02-01
The foundations of a differential geometric theory of nonlinear control systems are described on the basis of categorical concepts (isomorphism, factorization, restrictions) by analogy with classical mathematical theories (of linear spaces, groups, etc.).
Comment on Gallistel: behavior theory and information theory: some parallels.
Nevin, John A
2012-05-01
In this article, Gallistel proposes information theory as an approach to some enduring problems in the study of operant and classical conditioning. Copyright © 2012 Elsevier B.V. All rights reserved.
Quid pro quo: a mechanism for fair collaboration in networked systems.
Santos, Agustín; Fernández Anta, Antonio; López Fernández, Luis
2013-01-01
Collaboration may be understood as the execution of coordinated tasks (in the most general sense) by groups of users, who cooperate for achieving a common goal. Collaboration is a fundamental assumption and requirement for the correct operation of many communication systems. The main challenge when creating collaborative systems in a decentralized manner is dealing with the fact that users may behave in selfish ways, trying to obtain the benefits of the tasks but without participating in their execution. In this context, Game Theory has been instrumental to model collaborative systems and the task allocation problem, and to design mechanisms for optimal allocation of tasks. In this paper, we revise the classical assumptions of these models and propose a new approach to this problem. First, we establish a system model based on heterogeneous nodes (users, players), and propose a basic distributed mechanism so that, when a new task appears, it is assigned to the most suitable node. The classical technique for compensating a node that executes a task is the use of payments (which in most networks are hard or impossible to implement). Instead, we propose a distributed mechanism for the optimal allocation of tasks without payments. We prove this mechanism to be robust even in the presence of independent selfish or rationally limited players. Additionally, our model is based on very weak assumptions, which makes the proposed mechanisms amenable to implementation in networked systems (e.g., the Internet).
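The flavor of a payment-free allocation rule can be illustrated with a toy "quid pro quo" scheme: assign each new task to the capable node that has contributed least so far, so participation balances out over time. This sketch only illustrates the idea; it is not the mechanism proved robust in the paper:

```python
def assign_task(capable_nodes, tasks_done):
    """Pick the capable node with the fewest completed tasks (ties broken
    by node id) and record the assignment. No payments are exchanged;
    fairness comes from balancing contributions over time."""
    node = min(capable_nodes, key=lambda n: (tasks_done.get(n, 0), n))
    tasks_done[node] = tasks_done.get(node, 0) + 1
    return node

# Six tasks among three always-capable nodes end up evenly spread.
done = {}
order = [assign_task(["a", "b", "c"], done) for _ in range(6)]
```

A selfish node could still misreport its capability; the paper's contribution is precisely a mechanism whose incentives make truthful participation rational, which this sketch does not attempt to model.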
Poznanski, Roman R
2010-02-01
An assumption commonly used in cable theory is revised by taking into account electrical amplification due to intracellular capacitive effects in passive dendritic cables. A generalized cable equation for a cylindrical volume representation of a dendritic segment is derived from Maxwell's equations under the following assumptions: (i) the electric-field polarization is restricted longitudinally along the cable length; (ii) extracellular isopotentiality; (iii) quasielectrostatic conditions; and (iv) a homogeneous medium with constant conductivity and permittivity. The generalized cable equation is identical to Barenblatt's equation arising in the theory of infiltration in fissured strata, with a known analytical solution expressed in terms of a definite integral involving a modified Bessel function and the solution to a linear one-dimensional classical cable equation. Its solution is used to determine the impact of thermal noise on voltage attenuation with distance at any particular time. A regular perturbation expansion for the membrane potential about the linear one-dimensional classical cable equation solution is derived in terms of a Green's function in order to describe the dynamics of free charge within the Debye layer of endogenous structures in passive dendritic cables. The asymptotic value of the first perturbative term is explicitly evaluated for small values of time to predict how the slowly fluctuating (in the submillisecond range) electric field attributed to intracellular capacitive effects alters the amplitude of the membrane potential. It was found that capacitive effects are almost negligible for cables with electrotonic lengths L > 0.5, but contribute up to 10% of the signal for cables with electrotonic lengths in the range between 0.25
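The linear one-dimensional classical cable equation about which the perturbation expansion is taken has a standard form; the symbols below are the conventional ones (space constant lambda, membrane time constant tau_m) rather than notation copied from the paper:

```latex
% Classical passive cable equation:
\lambda^{2} \frac{\partial^{2} V}{\partial x^{2}}
  = \tau_{m} \frac{\partial V}{\partial t} + V ,
% with dimensionless electrotonic coordinates
X = x / \lambda , \qquad L = \ell / \lambda .
```

The electrotonic length L referred to in the abstract (e.g. L > 0.5) is the physical cable length measured in units of the space constant lambda.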
Jerosch-Herold, Christina; Chester, Rachel; Shepstone, Lee
2017-09-01
Study Design: Cross-sectional secondary analysis of a prospective cohort study. Background: The shortened version of the Disabilities of the Arm, Shoulder and Hand questionnaire (QuickDASH) is a widely used outcome measure that has been extensively evaluated using classical test theory. Rasch model analysis can identify strengths and weaknesses of rating scales and goes beyond classical test theory approaches. It uses a mathematical model to test the fit between the observed data and expected responses and converts ordinal-level scores into interval-level measurement. Objective: To test the structural validity of the QuickDASH using Rasch analysis. Methods: A prospective cohort study of 1030 patients with shoulder pain provided baseline data. Rasch analysis was conducted to (1) assess how the QuickDASH fits the Rasch model, (2) identify sources of misfit, and (3) explore potential solutions to these. Results: There was evidence of multidimensionality and significant misfit to the Rasch model (χ² = 331.09, P<.001). Two items had disordered threshold responses with strong floor effects. Response bias was detected in most items for age and sex. Rescoring resulted in ordered thresholds; however, the 11-item scale still did not meet the expectations of the Rasch model. Conclusion: Rasch model analysis of the QuickDASH has identified a number of problems that cannot be easily detected using traditional analyses. While revisions to the QuickDASH resulted in better fit, a "shoulder-specific" version is not advocated at present. Caution is needed when interpreting results of the QuickDASH outcome measure, as it does not meet the criteria for interval-level measurement and shows significant response bias by age and sex. J Orthop Sports Phys Ther 2017;47(9):664-672. Epub 13 Jul 2017. doi:10.2519/jospt.2017.7288.
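The Rasch model underlying such an analysis relates the probability of endorsing an item to the difference between person ability theta and item difficulty b, both on one logit scale. The sketch below is the simplest dichotomous case; the QuickDASH items are polytomous, so a study like this one would use a polytomous extension (e.g. a partial credit model), which is not shown here:

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b)).
    Interval-level measurement follows because theta and b share a single
    logit scale; fit analysis compares these expected probabilities with
    observed responses."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

When ability equals difficulty, the endorsement probability is exactly one half; it rises toward 1 as theta exceeds b and falls toward 0 as b exceeds theta.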
Liberty, Authority, and Character Cultivation: John Stuart Mill's Revised Liberal Theories.
ERIC Educational Resources Information Center
Kim, Ki Su
1988-01-01
The article examines educational changes recommended by Mill in his liberal political theories to point out some of the attempts of liberals to adjust themselves to changing historical circumstances. (CB)
Cultivation Theory and Research: A Conceptual Critique.
ERIC Educational Resources Information Center
Potter, W. James
1993-01-01
Presents a critical analysis of how cultivation (long-term formation of perceptions and beliefs about the world as a result of exposure to media) has been conceptualized in theory and research. Analyzes the construct of television exposure. Suggests revisions for conceptualizing the existing theory and extending it. (RS)
Culturally Responsive Teaching in the Context of Mathematics: A Grounded Theory Case Study
ERIC Educational Resources Information Center
Bonner, Emily P.; Adams, Thomasenia L.
2012-01-01
In this grounded theory case study, four interconnected, foundational cornerstones of culturally responsive mathematics teaching (CRMT), communication, knowledge, trust/relationships, and constant reflection/revision, were systematically unearthed to develop an initial working theory of CRMT that directly informs classroom practice. These…
ERIC Educational Resources Information Center
Hample, Dale; Dallinger, Judith M.
1995-01-01
Describes theoretical connections among field theory, defensiveness, attributions, and taking conflict personally (TCP). Revises the original version of a multidimensional TCP scale, measuring six TCP subscales: direct personalization, persecution feelings, stress reaction, positive relational effects, negative relational effects, and like/dislike…
Representational Realism, Closed Theories and the Quantum to Classical Limit
NASA Astrophysics Data System (ADS)
de Ronde, Christian
In this chapter, we discuss the representational realist stance as a pluralist ontic approach to inter-theoretic relationships. Our stance stresses the fact that physical theories require the necessary consideration of a conceptual level of discourse which determines and configures the specific field of phenomena discussed by each particular theory. We criticize the orthodox line of research which has grounded the analysis of QM in two (Bohrian) metaphysical presuppositions, accepted in the present as dogmas that all interpretations must follow. We also examine how the orthodox project of "bridging the gap" between the quantum and the classical domains has constrained the possibilities of research, producing only a limited set of interpretational problems which focus only on the justification of "classical reality" and exclude the possibility of analyzing non-classical conceptual representations of QM. The representational realist stance introduces two new problems, namely, the superposition problem and the contextuality problem, which consider explicitly the conceptual representation of orthodox QM beyond the mere reference to mathematical structures and measurement outcomes. In the final part of the chapter, we revisit, from the representational realist perspective, the quantum to classical limit and the orthodox claim that this inter-theoretic relation can be explained through the principle of decoherence.
[Discussion on six errors of formulas corresponding to syndromes in using the classic formulas].
Bao, Yan-ju; Hua, Bao-jin
2012-12-01
The theory of formulas corresponding to syndromes is one of the characteristics of the Treatise on Cold Damage and Miscellaneous Diseases (Shanghan Zabing Lun) and one of the main principles in applying classic prescriptions. Following the principle of formulas corresponding to syndromes is important for achieving a therapeutic effect. However, some practitioners find that the actual clinical effect falls far short of expectations. Six errors in the use of classic prescriptions under the theory of formulas corresponding to syndromes are the most important causes to consider: attending only to local syndromes while neglecting the whole; attending only to formulas corresponding to syndromes while neglecting the pathogenesis; attending only to syndromes while neglecting pulse diagnosis; attending only to a single prescription while neglecting combined prescriptions; attending only to classic prescriptions while neglecting modern formulas; and attending only to the formulas while neglecting drug dosage. Therefore, in the clinical application of classic prescriptions and the theory of formulas corresponding to syndromes, it is necessary to consider not only the patient's clinical syndromes but also the combination of the main syndrome and the pathogenesis. In addition, comprehensive syndrome differentiation, modern formulas, current prescriptions, combined prescriptions, and appropriate drug dosage all help to avoid clinical errors and improve clinical effects.
Influences on and Limitations of Classical Test Theory Reliability Estimates.
ERIC Educational Resources Information Center
Arnold, Margery E.
It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…
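One of the internal-consistency estimates named above, Cronbach's alpha, illustrates the point that reliability is a property of scores, not of "the test" alone: the same items yield different coefficients on different samples. A minimal sketch follows; the score matrix is hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability for an (n_persons, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-person, 4-item score matrix
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 4, 4, 5],
                   [1, 2, 1, 2],
                   [3, 3, 4, 3]])
alpha = cronbach_alpha(scores)
```

Running the same function on a different sample of respondents would give a different alpha, which is exactly the abstract's point.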
A Comparison of Kinetic Energy and Momentum in Special Relativity and Classical Mechanics
ERIC Educational Resources Information Center
Riggs, Peter J.
2016-01-01
Kinetic energy and momentum are indispensable dynamical quantities in both the special theory of relativity and in classical mechanics. Although momentum and kinetic energy are central to understanding dynamics, the differences between their relativistic and classical notions have not always received adequate treatment in undergraduate teaching.…
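The difference the abstract points to can be made concrete by computing both notions side by side: classical p = mv and KE = mv²/2 versus relativistic p = γmv and KE = (γ − 1)mc². A minimal sketch (SI units; the specific speeds chosen are illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gamma(v: float) -> float:
    """Lorentz factor for speed v."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

def classical(m: float, v: float):
    """Classical momentum and kinetic energy."""
    return m * v, 0.5 * m * v**2

def relativistic(m: float, v: float):
    """Relativistic momentum and kinetic energy."""
    g = gamma(v)
    return g * m * v, (g - 1.0) * m * C**2

m = 1.0  # kg
# At 1% of c the classical and relativistic values nearly coincide...
p_c, ke_c = classical(m, 0.01 * C)
p_r, ke_r = relativistic(m, 0.01 * C)
# ...while at 90% of c they diverge substantially (gamma ≈ 2.29)
P_c, KE_c = classical(m, 0.9 * C)
P_r, KE_r = relativistic(m, 0.9 * C)
```

The low-speed agreement and high-speed divergence is the standard correspondence-limit behaviour the abstract says often gets short shrift in teaching.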
A Comparative Analysis of Three Unique Theories of Organizational Learning
ERIC Educational Resources Information Center
Leavitt, Carol C.
2011-01-01
The purpose of this paper is to present three classical theories on organizational learning and conduct a comparative analysis that highlights their strengths, similarities, and differences. Two of the theories (experiential learning theory and adaptive-generative learning theory) represent the thinking of the cognitive perspective, while…
NASA Astrophysics Data System (ADS)
Hwang, Jai-Chan; Noh, Hyerim
2005-03-01
We present cosmological perturbation theory based on generalized gravity theories including string theory correction terms and a tachyonic complication. The classical evolution as well as the quantum generation processes in these varieties of gravity theories are presented in unified forms. These apply both to the scalar- and tensor-type perturbations. Analyses are made based on the curvature variable in two different gauge conditions often used in the literature in Einstein’s gravity; these are the curvature variables in the comoving (or uniform-field) gauge and the zero-shear gauge. Applications to generalized slow-roll inflation and its consequent power spectra are derived in unified forms which include a wide range of inflationary scenarios based on Einstein’s gravity and others.
Infinite derivative gravity: non-singular cosmology & blackhole solutions
NASA Astrophysics Data System (ADS)
Mazumdar, A.
Both Einstein’s theory of General Relativity and Newton’s theory of gravity suffer a catastrophe at short distances and small time scales. The blackhole singularity and the cosmological Big Bang singularity highlight that current theories of gravity are an incomplete description at early times and small distances. I will discuss how one can potentially resolve these fundamental problems at the classical and quantum levels. In particular, I will discuss infinite derivative theories of gravity, where gravitational interactions become weaker in the ultraviolet, thereby resolving some of the classical singularities, such as the Big Bang and Schwarzschild singularities, for compact non-singular objects with mass up to 10²⁵ grams. In this lecture, I will discuss quantum aspects of infinite derivative gravity and a few aspects which can make the theory asymptotically free in the UV.
Psychodrama: group psychotherapy through role playing.
Kipper, D A
1992-10-01
The theory and the therapeutic procedure of classical psychodrama are described along with brief illustrations. Classical psychodrama and sociodrama stemmed from role theory, enactments, "tele," the reciprocity of choices, and the theory of spontaneity-robopathy and creativity. The discussion focuses on key concepts such as the therapeutic team, the structure of the session, transference and reality, countertransference, the here-and-now and the encounter, the group-as-a-whole, resistance and difficult clients, and affect and cognition. Also described are the neoclassical approaches of psychodrama, action methods, and clinical role playing, and the significance of the concept of behavioral simulation in group psychotherapy.
Second Language Studies Standard Course of Study and Grade Level Competencies, K-12. Revised
ERIC Educational Resources Information Center
North Carolina Department of Public Instruction, 2004
2004-01-01
The North Carolina Second Language Standard Course of Study establishes competency goals and objectives directing the teaching and learning of foreign language, heritage language, and classical language in North Carolina. This document sets high expectations for all students, supports an extended sequence of language learning, and takes into…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-03
... hours per response. Respondents: Federal animal health officials of the Governments of Brazil, Chile... DEPARTMENT OF AGRICULTURE Animal and Plant Health Inspection Service [Docket No. APHIS-2013-0074... Live Swine, Pork, and Pork Products from Certain Regions Free of Classical Swine Fever in Brazil, Chile...
Classics: A Guide to Reference Sources. [Revised].
ERIC Educational Resources Information Center
McGill Univ., Montreal (Quebec). McLennan Library.
The emphasis of this bibliographical guide is on Greek and Roman culture, history, language, literature, and archaeology. It largely omits philosophy, numismatics, and theology, and does not include the Middle Ages. The organization of the guide is by type of reference source. Specific subjects, therefore, may be covered in more than one section.…
Universal Design in Higher Education: From Principles to Practice. Second Edition
ERIC Educational Resources Information Center
Burgstahler, Sheryl E., Ed.
2015-01-01
This second edition of the classic "Universal Design in Higher Education" is a comprehensive, up-to-the-minute guide for creating fully accessible college and university programs. The second edition has been thoroughly revised and expanded, and it addresses major recent changes in universities and colleges, the law, and technology. As…
The Public/Private Divide in Higher Education: A Global Revision
ERIC Educational Resources Information Center
Marginson, Simon
2007-01-01
Our common understandings of the public/private distinction in higher education are drawn from neo-classical economics and/or statist political philosophy. However, the development of competition and markets at the national level, and the new potentials for private and public goods created by globalisation in higher education, have exposed…
Particle in a Box: An Experiential Environment for Learning Introductory Quantum Mechanics
ERIC Educational Resources Information Center
Anupam, Aditya; Gupta, Ridhima; Naeemi, Azad; JafariNaimi, Nassim
2018-01-01
Quantum mechanics (QM) is a foundational subject in many science and engineering fields. It is difficult to teach, however, as it requires a fundamental revision of the assumptions and laws of classical physics and probability. Furthermore, introductory QM courses and texts predominantly focus on the mathematical formulations of the subject and…
Urban Teaching: The Essentials. Third Edition
ERIC Educational Resources Information Center
Weiner, Lois; Jerome, Daniel
2016-01-01
This significantly revised edition will help prospective and new city teachers navigate the realities of city teaching. Now the classic introduction to urban teaching, this book explains how global, national, state, and local reforms have impacted what teachers need to know to not only survive, but to do their jobs well. The Third Edition melds…
Cappelleri, Joseph C.; Lundy, J. Jason; Hays, Ron D.
2014-01-01
Introduction The U.S. Food and Drug Administration’s patient-reported outcome (PRO) guidance document defines content validity as “the extent to which the instrument measures the concept of interest” (FDA, 2009, p. 12). “Construct validity is now generally viewed as a unifying form of validity for psychological measurements, subsuming both content and criterion validity” (Strauss & Smith, 2009, p. 7). Hence both qualitative and quantitative information are essential in evaluating the validity of measures. Methods We review classical test theory and item response theory approaches to evaluating PRO measures including frequency of responses to each category of the items in a multi-item scale, the distribution of scale scores, floor and ceiling effects, the relationship between item response options and the total score, and the extent to which hypothesized “difficulty” (severity) order of items is represented by observed responses. Conclusion Classical test theory and item response theory can be useful in providing a quantitative assessment of items and scales during the content validity phase of patient-reported outcome measures. Depending on the particular type of measure and the specific circumstances, either one or both approaches should be considered to help maximize the content validity of PRO measures. PMID:24811753
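Several of the classical-test-theory checks listed in the Methods above (floor and ceiling effects, and the relationship between item responses and the total score) can be sketched in a few lines. The function name and data below are hypothetical, not from the paper.

```python
import numpy as np

def ctt_summary(responses: np.ndarray, min_score: int, max_score: int) -> dict:
    """Basic classical-test-theory checks for an (n_persons, n_items) matrix."""
    totals = responses.sum(axis=1)
    lo = min_score * responses.shape[1]
    hi = max_score * responses.shape[1]
    return {
        "floor_pct": float((totals == lo).mean() * 100),    # % at minimum total
        "ceiling_pct": float((totals == hi).mean() * 100),  # % at maximum total
        # corrected item-total correlation: item vs total of the *other* items
        "item_total_r": [
            float(np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1])
            for j in range(responses.shape[1])
        ],
    }

rng = np.random.default_rng(0)
data = rng.integers(1, 6, size=(50, 4))  # hypothetical 5-point Likert items
out = ctt_summary(data, min_score=1, max_score=5)
```

Large floor or ceiling percentages, or near-zero item-total correlations, would flag items for review during the content-validity phase.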
Spinning particles, axion radiation, and the classical double copy
NASA Astrophysics Data System (ADS)
Goldberger, Walter D.; Li, Jingping; Prabhu, Siddharth G.
2018-05-01
We extend the perturbative double copy between radiating classical sources in gauge theory and gravity to the case of spinning particles. We construct, to linear order in spins, perturbative radiating solutions to the classical Yang-Mills equations sourced by a set of interacting color charges with chromomagnetic dipole spin couplings. Using a color-to-kinematics replacement rule proposed earlier by one of the authors, these solutions map onto radiation in a theory of interacting particles coupled to massless fields that include the graviton, a scalar (dilaton) ϕ, and the Kalb-Ramond axion field B_μν. Consistency of the double copy imposes constraints on the parameters of the theory on both the gauge and gravity sides of the correspondence. In particular, the color charges carry a chromomagnetic interaction which, in d = 4, corresponds to a gyromagnetic ratio equal to Dirac's value g = 2. The color-to-kinematics map implies that on the gravity side, the bulk theory of the fields (ϕ, g_μν, B_μν) has interactions which match those of d-dimensional "string gravity," as is the case both in the BCJ double copy of pure gauge theory scattering amplitudes and the KLT relations between the tree-level S-matrix elements of open and closed string theory.
Lamb wave extraction of dispersion curves in micro/nano-plates using couple stress theories
NASA Astrophysics Data System (ADS)
Ghodrati, Behnam; Yaghootian, Amin; Ghanbar Zadeh, Afshin; Mohammad-Sedighi, Hamid
2018-01-01
In this paper, Lamb wave propagation in homogeneous and isotropic non-classical micro/nano-plates is investigated. To consider the effect of material microstructure on wave propagation, three size-dependent models, namely the indeterminate, modified, and consistent couple stress theories, are used to extract the dispersion equations. In these theories, a parameter called the 'characteristic length' accounts for the size of the material microstructure in the governing equations. To generalize the parametric studies and examine the effects of thickness, propagation wavelength, and characteristic length on the behavior of miniature plate structures, the governing equations are nondimensionalized by defining appropriate dimensionless parameters. The dispersion curves for phase and group velocities are then plotted over a wide frequency-thickness range to study Lamb wave propagation with microstructure effects at very high frequencies. The results show that the couple stress theories predict more rigidity than the classical theory for Cosserat-type materials: in a plate of constant thickness, increasing the thickness-to-characteristic-length ratio drives the results toward the classical theory, while reducing this ratio significantly increases the wave propagation speed in the plate. In addition, it is demonstrated that the velocity of high-frequency Lamb waves converges to the dispersive Rayleigh wave velocity.
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory, or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
Lesaint, Florian; Sigaud, Olivier; Flagel, Shelly B; Robinson, Terry E; Khamassi, Mehdi
2014-02-01
Reinforcement Learning has greatly influenced models of conditioning, providing powerful explanations of acquired behaviour and underlying physiological observations. However, in recent autoshaping experiments in rats, variation in the form of Pavlovian conditioned responses (CRs) and associated dopamine activity, have questioned the classical hypothesis that phasic dopamine activity corresponds to a reward prediction error-like signal arising from a classical Model-Free system, necessary for Pavlovian conditioning. Over the course of Pavlovian conditioning using food as the unconditioned stimulus (US), some rats (sign-trackers) come to approach and engage the conditioned stimulus (CS) itself - a lever - more and more avidly, whereas other rats (goal-trackers) learn to approach the location of food delivery upon CS presentation. Importantly, although both sign-trackers and goal-trackers learn the CS-US association equally well, only in sign-trackers does phasic dopamine activity show classical reward prediction error-like bursts. Furthermore, neither the acquisition nor the expression of a goal-tracking CR is dopamine-dependent. Here we present a computational model that can account for such individual variations. We show that a combination of a Model-Based system and a revised Model-Free system can account for the development of distinct CRs in rats. Moreover, we show that revising a classical Model-Free system to individually process stimuli by using factored representations can explain why classical dopaminergic patterns may be observed for some rats and not for others depending on the CR they develop. In addition, the model can account for other behavioural and pharmacological results obtained using the same, or similar, autoshaping procedures. Finally, the model makes it possible to draw a set of experimental predictions that may be verified in a modified experimental protocol. 
We suggest that further investigation of factored representations in computational neuroscience studies may be useful.
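The "classical Model-Free system" and its "reward prediction error-like signal" referred to above are standard temporal-difference constructs. A minimal TD(0) sketch follows (this is the textbook mechanism, not the authors' revised factored model; the CS/US episode structure below is a hypothetical simplification of an autoshaping trial):

```python
def td_update(V, state, reward, next_state, alpha=0.1, gamma_=0.9):
    """One TD(0) step; returns the prediction error (the dopamine-like signal)."""
    v_next = V.get(next_state, 0.0) if next_state is not None else 0.0
    delta = reward + gamma_ * v_next - V.get(state, 0.0)  # prediction error
    V[state] = V.get(state, 0.0) + alpha * delta
    return delta

# Pavlovian-style episodes: a CS (e.g. lever) reliably followed by a food US
V = {}
us_errors = []
for _ in range(200):
    td_update(V, "CS", 0.0, "US")                    # CS onset: no reward yet
    us_errors.append(td_update(V, "US", 1.0, None))  # food delivery
```

Over training, the prediction error at reward delivery shrinks toward zero while the CS value grows, the classical burst-transfer pattern that, per the abstract, is seen in sign-trackers but not goal-trackers.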
Soliton Gases and Generalized Hydrodynamics
NASA Astrophysics Data System (ADS)
Doyon, Benjamin; Yoshimura, Takato; Caux, Jean-Sébastien
2018-01-01
We show that the equations of generalized hydrodynamics (GHD), a hydrodynamic theory for integrable quantum systems at the Euler scale, emerge in full generality in a family of classical gases, which generalize the gas of hard rods. In this family, the particles, upon colliding, jump forward or backward by a distance that depends on their velocities, reminiscent of classical soliton scattering. This provides a "molecular dynamics" for GHD: a numerical solver which is efficient, flexible, and which applies to the presence of external force fields. GHD also describes the hydrodynamics of classical soliton gases. We identify the GHD of any quantum model with that of the gas of its solitonlike wave packets, thus providing a remarkable quantum-classical equivalence. The theory is directly applicable, for instance, to integrable quantum chains and to the Lieb-Liniger model realized in cold-atom experiments.
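The hard-rod collision rule that this family of gases generalizes can be illustrated with a two-particle sketch: equal-mass rods exchange velocities at contact, which is equivalent to the transmitted trajectories jumping by the rod length. This is ordinary hard-rod dynamics only; the velocity-dependent jumps of the generalized models in the paper are not implemented here.

```python
def evolve_two_rods(x1, v1, x2, v2, a, t_end):
    """Two hard rods of length a on a line, with x1 < x2.
    At contact (gap = a) they exchange velocities; stream freely otherwise."""
    # time until the gap between the rods closes to the rod length a
    if v1 <= v2:
        return x1 + v1 * t_end, v1, x2 + v2 * t_end, v2  # never collide
    t_c = (x2 - x1 - a) / (v1 - v2)
    if t_c > t_end:
        return x1 + v1 * t_end, v1, x2 + v2 * t_end, v2  # no collision yet
    y1, y2 = x1 + v1 * t_c, x2 + v2 * t_c  # positions at contact
    dt = t_end - t_c
    # exchange velocities, then stream for the remaining time
    return y1 + v2 * dt, v2, y2 + v1 * dt, v1

x1, v1, x2, v2 = evolve_two_rods(0.0, 1.0, 2.0, -1.0, a=0.5, t_end=2.0)
```

Because equal-mass exchange just relabels the free trajectories, a tagged particle effectively "jumps" by a at each collision, the picture the GHD molecular-dynamics solver builds on.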
ERIC Educational Resources Information Center
Christian, Brittany; Yezierski, Ellen
2012-01-01
Science is always changing. Its very nature requires that scientists constantly revise theories to make sense of new observations. As they learn science, students are also constantly revising how they make sense of their observations, which requires comparisons with what they already know to process new information. A teacher can take advantage of…
ERIC Educational Resources Information Center
Jackson, A. T.
1973-01-01
Reviews theoretical and experimental fundamentals of Einstein's theory of general relativity. Indicates that recent development of the theory of the continually expanding universe may lead to revision of the space-time continuum of the finite and unbounded universe. (CC)
Topological and Orthomodular Modeling of Context in Behavioral Science
NASA Astrophysics Data System (ADS)
Narens, Louis
2017-02-01
Two non-boolean methods are discussed for modeling context in behavioral data and theory. The first is based on intuitionistic logic, which is similar to classical logic except that not every event has a complement. Its probability theory is also similar to classical probability theory except that the definition of probability function needs to be generalized to unions of events instead of applying only to unions of disjoint events. The generalization is needed, because intuitionistic event spaces may not contain enough disjoint events for the classical definition to be effective. The second method develops a version of quantum logic for its underlying probability theory. It differs in a variety of ways from the Hilbert space logic used in quantum mechanics as a foundation for quantum probability theory. John von Neumann and others have commented on the lack of a relative frequency approach and a rational foundation for this probability theory. This article argues that its version of quantum probability theory does not have such issues. The method based on intuitionistic logic is useful for modeling cognitive interpretations that vary with context, for example, the mood of the decision maker, the context produced by the influence of other items in a choice experiment, etc. The method based on this article's quantum logic is useful for modeling probabilities across contexts, for example, how probabilities of events from different experiments are related.
A classical density functional theory of ionic liquids.
Forsman, Jan; Woodward, Clifford E; Trulsson, Martin
2011-04-28
We present a simple, classical density functional approach to the study of simple models of room temperature ionic liquids. Dispersion attractions as well as ion correlation effects and excluded volume packing are taken into account. The oligomeric structure, common to many ionic liquid molecules, is handled by a polymer density functional treatment. The theory is evaluated by comparisons with simulations, with an emphasis on the differential capacitance, an experimentally measurable quantity of significant practical interest.
Generalized quantum theory of recollapsing homogeneous cosmologies
NASA Astrophysics Data System (ADS)
Craig, David; Hartle, James B.
2004-06-01
A sum-over-histories generalized quantum theory is developed for homogeneous minisuperspace type A Bianchi cosmological models, focusing on the particular example of the classically recollapsing Bianchi type-IX universe. The decoherence functional for such universes is exhibited. We show how the probabilities of decoherent sets of alternative, coarse-grained histories of these model universes can be calculated. We consider in particular the probabilities for classical evolution defined by a suitable coarse graining. For a restricted class of initial conditions and coarse grainings we exhibit the approximate decoherence of alternative histories in which the universe behaves classically and those in which it does not. For these situations we show that the probability is near unity for the universe to recontract classically if it expands classically. We also determine the relative probabilities of quasiclassical trajectories for initial states of WKB form, recovering for such states a precise form of the familiar heuristic “J·dΣ” rule of quantum cosmology, as well as a generalization of this rule to generic initial states.
NASA Astrophysics Data System (ADS)
Schwörer, Magnus; Lorenzen, Konstantin; Mathias, Gerald; Tavan, Paul
2015-03-01
Recently, a novel approach to hybrid quantum mechanics/molecular mechanics (QM/MM) molecular dynamics (MD) simulations has been suggested [Schwörer et al., J. Chem. Phys. 138, 244103 (2013)]. Here, the forces acting on the atoms are calculated by grid-based density functional theory (DFT) for a solute molecule and by a polarizable molecular mechanics (PMM) force field for a large solvent environment composed of several 10³-10⁵ molecules as negative gradients of a DFT/PMM hybrid Hamiltonian. The electrostatic interactions are efficiently described by a hierarchical fast multipole method (FMM). Adopting recent progress of this FMM technique [Lorenzen et al., J. Chem. Theory Comput. 10, 3244 (2014)], which particularly entails a strictly linear scaling of the computational effort with the system size, and adapting this revised FMM approach to the computation of the interactions between the DFT and PMM fragments of a simulation system, here, we show how one can further enhance the efficiency and accuracy of such DFT/PMM-MD simulations. The resulting gain of total performance, as measured for alanine dipeptide (DFT) embedded in water (PMM) by the product of the gains in efficiency and accuracy, amounts to about one order of magnitude. We also demonstrate that the jointly parallelized implementation of the DFT and PMM-MD parts of the computation enables the efficient use of high-performance computing systems. The associated software is available online.
VizieR Online Data Catalog: Type II Cepheid and RR Lyrae variables (Feast+, 2008)
NASA Astrophysics Data System (ADS)
Feast, M. W.; Laney, C. D.; Kinman, T. D.; van Leeuwen, F.; Whitelock, P. A.
2008-10-01
Infrared and optical absolute magnitudes are derived for the type II Cepheids kappa Pav and VY Pyx using revised Hipparcos parallaxes and for kappa Pav, V553 Cen and SW Tau from pulsational parallaxes. Revised Hipparcos and HST parallaxes for RR Lyrae agree satisfactorily and are combined in deriving absolute magnitudes. Phase-corrected J, H and Ks mags are given for 142 Hipparcos RR Lyraes based on Two-Micron All-Sky Survey observations. Pulsation and trigonometrical parallaxes for classical Cepheids are compared to establish the best value for the projection factor (p) used in pulsational analyses. (3 data files).
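The step from a trigonometric parallax to an absolute magnitude uses the standard distance-modulus relation. This is textbook astronomy, not the paper's specific reduction (which also involves phase corrections and pulsational parallaxes); the example star is hypothetical.

```python
import math

def absolute_magnitude(apparent_mag, parallax_mas):
    """Absolute magnitude from a trigonometric parallax in milliarcseconds,
    neglecting extinction and Lutz-Kelker-type bias corrections."""
    distance_pc = 1000.0 / parallax_mas          # parsecs
    return apparent_mag - 5.0 * math.log10(distance_pc) + 5.0

# Hypothetical star: apparent magnitude 7.0 at parallax 10 mas (100 pc)
M = absolute_magnitude(7.0, 10.0)
```

For the small parallaxes typical of Cepheids, the bias corrections set aside here become important, which is why revised Hipparcos and HST parallaxes matter so much in this kind of calibration.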
Hydrostatic figure of the earth: Theory and results
NASA Technical Reports Server (NTRS)
Khan, M. A.
1973-01-01
The complete development of the mathematical theory of hydrostatic equilibrium for the earth is recounted. Modifications of the first order theory are given along with the subsequent extension to the second order. In addition, the equations are presented which resulted from a revision of the second order theory to suit the new applications and data types of the post-artificial earth satellite era.
ERIC Educational Resources Information Center
Natker, Elana; Baker, Susan S.; Auld, Garry; McGirr, Kathryn; Sutherland, Barbara; Cason, Katherine L.
2015-01-01
The project reported here served to assess a curriculum for EFNEP to ensure theory compliance and content validity. Adherence to Adult Learning Theory and Social Cognitive Theory tenets was determined. A curriculum assessment tool was developed and used by five reviewers to assess initial and revised versions of the curriculum. T-tests for…
NASA Astrophysics Data System (ADS)
Huyskens, P.; Kapuku, F.; Colemonts-Vandevyvere, C.
1990-09-01
In liquids the partners of H bonds constantly change. As a consequence, the entities observed by IR spectroscopy are not the same as those considered for thermodynamic properties; for the latter, the H bonds are shared by all the molecules. The thermodynamic "monomeric fraction" γ, the time fraction during which an alcohol molecule is vaporizable, is the square root of the spectroscopic monomeric fraction, and is the fraction of molecules which, during a time interval of 10⁻¹⁴ s, have their hydroxylic proton and their lone pairs free. The classical thermodynamic treatments of Mecke and Prigogine consider the spectroscopic entities as real thermodynamic entities. Opposed to this, the mobile order theory considers all the formal molecules as equal but with a reduction of the entropy due to the fact that, during a fraction 1-γ of the time, the OH proton follows a neighbouring oxygen atom on its journey through the liquid. Mobile order theory and the classic multicomponent treatment lead, in binary mixtures of the associated substance A with the inert substance S, to expressions of the chemical potentials μ_A and μ_S that are fundamentally different. However, the differences become very important only when the molar volumes V̄_S and V̄_A differ by a factor larger than 2. As a consequence, the equations of the classic theory can still fit the experimental vapour pressure data of mixtures of liquid alcohols and liquid alkanes. However, the solubilities of solid alkanes in water, for which V̄_S > 3 V̄_A, are only correctly predicted by the mobile order theory.
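The square-root relation between the two monomeric fractions stated above can be written compactly (the subscripts are added here for clarity; the abstract denotes the thermodynamic fraction simply by γ):

```latex
\gamma_{\mathrm{thermo}} \;=\; \sqrt{\gamma_{\mathrm{spec}}}
\qquad\Longleftrightarrow\qquad
\gamma_{\mathrm{spec}} \;=\; \gamma_{\mathrm{thermo}}^{\,2}
```

So whenever the spectroscopic monomeric fraction is small, the thermodynamically relevant fraction of vaporizable molecules is considerably larger, which is the crux of the disagreement with the Mecke-Prigogine treatments.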
From Foucault to Freire through Facebook: Toward an Integrated Theory of mHealth
ERIC Educational Resources Information Center
Bull, Sheana; Ezeanochie, Nnamdi
2016-01-01
Objective: To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. Method: A secondary review of research syntheses and meta-analyses…
Devil is in the details: Using logic models to investigate program process.
Peyton, David J; Scicchitano, Michael
2017-12-01
Theory-based logic models are commonly developed as part of requirements for grant funding. As tools for communicating complex social programs, theory-based logic models are effective visual aids. However, after initial development, they are often abandoned and remain in their initial form despite changes in the program process. This paper examines the potential benefits of committing time and resources to revising the initial theory-driven logic model and developing detailed logic models that describe key activities, so as to accurately reflect the program and assist in effective program management. The authors use a funded special education teacher preparation program to exemplify the utility of drill-down logic models. The paper concludes with lessons learned from the iterative revision process and suggests how the process can lead to more flexible and calibrated program management. Copyright © 2017 Elsevier Ltd. All rights reserved.
Where Do Epigenetics and Developmental Origins Take the Field of Developmental Psychopathology?
Nigg, Joel T
2016-04-01
The time is ripe for upgrading or rethinking the assumed paradigms for how we study developmental psychopathology. The classic transactional models appear robust but need specification in terms of biological and psychosocial processes. That specification is increasingly tractable due to developments in genetics, epigenetics, the measurement of psychosocial processes, and theory and data on developmental origins of health and disease. This essay offers a high-level view of where the field has been and where it may be going in regard to nosology and conceptions of etiology. Remarks seek to consider rapidly evolving contexts not only for children, but also for the science itself due to progress in our field and in neighboring fields. Illustrations are provided as to how syndromal nosology can be enriched and advanced by careful integration with biologically relevant behavioral dimensions and application of quantitative methods. It is concluded that a revised, forward-looking, transactional model of abnormal child psychology will incorporate prenatal and postnatal developmental programming, epigenetic mechanisms and their associated genotype x environment interactions, and inflammatory processes as a potential common mediator influencing numerous health and mental health conditions.
Exploratory Item Classification Via Spectral Graph Clustering
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Xu, Gongjun; Ying, Zhiliang
2017-01-01
Large-scale assessments are supported by a large item pool. An important task in test development is to assign items to scales that measure different characteristics of individuals, and a popular approach is cluster analysis of items. Classical methods in cluster analysis, such as hierarchical clustering, the K-means method, and latent-class analysis, often incur a high computational overhead and have difficulty handling missing data, especially in the presence of high-dimensional responses. In this article, the authors propose a spectral clustering algorithm for exploratory item cluster analysis. The method is computationally efficient, effective for data with missing or incomplete responses, easy to implement, and often outperforms traditional clustering algorithms in the context of high dimensionality. The spectral clustering algorithm is based on graph theory, a branch of mathematics that studies the properties of graphs. The algorithm first constructs a graph of items, characterizing the similarity structure among items. It then extracts item clusters based on the graphical structure, grouping similar items together. The proposed method is evaluated through simulations and an application to the revised Eysenck Personality Questionnaire. PMID:29033476
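The two-step procedure the abstract describes — build an item-similarity graph, then extract clusters from its structure — can be sketched with a normalized graph Laplacian. This is a generic spectral-clustering sketch, not the authors' exact algorithm, and the similarity matrix at the bottom is made up for illustration:

```python
import numpy as np

def spectral_item_clusters(similarity, n_clusters):
    """Generic spectral clustering of items from a similarity matrix:
    build the symmetric normalized graph Laplacian, embed items with its
    bottom eigenvectors, then group the embedded items with plain k-means."""
    W = np.asarray(similarity, dtype=float)
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.where(d > 0, d, 1.0))
    L = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(L)           # eigenvalues in ascending order
    X = eigvecs[:, :n_clusters]              # spectral embedding of the items
    X = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)

    # Deterministic farthest-point initialization, then Lloyd iterations.
    centers = [X[0]]
    for _ in range(1, n_clusters):
        d2 = np.min(((X[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d2)])
    centers = np.array(centers)
    for _ in range(50):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                            else centers[k] for k in range(n_clusters)])
    return labels

# Hypothetical similarity matrix with two obvious item blocks.
W = np.array([[1.0, 0.9, 0.8, 0.0, 0.0, 0.0],
              [0.9, 1.0, 0.7, 0.0, 0.0, 0.0],
              [0.8, 0.7, 1.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 1.0, 0.9, 0.6],
              [0.0, 0.0, 0.0, 0.9, 1.0, 0.8],
              [0.0, 0.0, 0.0, 0.6, 0.8, 1.0]])
labels = spectral_item_clusters(W, 2)
```

The missing-data advantage the abstract mentions comes from the graph construction step: similarities can be computed pairwise over whichever responses are jointly observed, so the Laplacian never needs a complete response matrix.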
Generalizability Theory and Classical Test Theory
ERIC Educational Resources Information Center
Brennan, Robert L.
2011-01-01
Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…
Theories of the Alcoholic Personality.
ERIC Educational Resources Information Center
Cox, W. Miles
Several theories of the alcoholic personality have been devised to determine the relationship between the clusters of personality characteristics of alcoholics and their abuse of alcohol. The oldest and probably best known theory is the dependency theory, formulated in the tradition of classical psychoanalysis, which associates the alcoholic's…
The Giffen Effect: A Note on Economic Purposes.
ERIC Educational Resources Information Center
Williams, William D.
1990-01-01
Describes the Giffen effect: demand for a commodity increases as price increases. Explains how applying control theory eliminates the paradox that the Giffen effect presents to classic economics supply and demand theory. Notes the differences in how conventional demand theory and control theory treat consumer behavior. (CH)
Personality Theories for the 21st Century
ERIC Educational Resources Information Center
McCrae, Robert R.
2011-01-01
Classic personality theories, although intriguing, are outdated. The five-factor model of personality traits reinvigorated personality research, and the resulting findings spurred a new generation of personality theories. These theories assign a central place to traits and acknowledge the crucial role of evolved biology in shaping human…
Continuous Time in Consistent Histories
NASA Astrophysics Data System (ADS)
Savvidou, Konstantina
1999-12-01
We discuss the case of histories labelled by a continuous time parameter in the History Projection Operator consistent-histories quantum theory. We describe how the appropriate representation of the history algebra may be chosen by requiring the existence of projection operators that represent propositions about time averages of the energy. We define the action operator for the consistent histories formalism, as the quantum analogue of the classical action functional, for the simple harmonic oscillator case. We show that the action operator is the generator of two types of time transformations that may be related to the two laws of time-evolution of the standard quantum theory: the 'state-vector reduction' and the unitary time-evolution. We construct the corresponding classical histories and demonstrate their relevance to the quantum histories; we demonstrate how the requirement of the temporal logic structure of the theory is sufficient for the definition of classical histories. Furthermore, we show the relation of the action operator to the decoherence functional which describes the dynamics of the system. Finally, the discussion is extended to give a preliminary account of quantum field theory in this approach to the consistent histories formalism.
Effects of Extrinsic Mortality on the Evolution of Aging: A Stochastic Modeling Approach
Shokhirev, Maxim Nikolaievich; Johnson, Adiv Adam
2014-01-01
The evolutionary theories of aging are useful for gaining insights into the complex mechanisms underlying senescence. Classical theories argue that high levels of extrinsic mortality should select for the evolution of shorter lifespans and earlier peak fertility. Non-classical theories, in contrast, posit that an increase in extrinsic mortality could select for the evolution of longer lifespans. Although numerous studies support the classical paradigm, recent data challenge classical predictions, finding that high extrinsic mortality can select for the evolution of longer lifespans. To further elucidate the role of extrinsic mortality in the evolution of aging, we implemented a stochastic, agent-based, computational model. We used a simulated annealing optimization approach to predict which model parameters predispose populations to evolve longer or shorter lifespans in response to increased levels of predation. We report that longer lifespans evolved in the presence of rising predation if the cost of mating is relatively high and if energy is available in excess. Conversely, we found that dramatically shorter lifespans evolved when mating costs were relatively low and food was relatively scarce. We also analyzed the effects of increased predation on various parameters related to density dependence and energy allocation. Longer and shorter lifespans were accompanied by increased and decreased investments of energy into somatic maintenance, respectively. Similarly, earlier and later maturation ages were accompanied by increased and decreased energetic investments into early fecundity, respectively. Higher predation significantly decreased the total population size, enlarged the shared resource pool, and redistributed energy reserves for mature individuals. These results both corroborate and refine classical predictions, demonstrating a population-level trade-off between longevity and fecundity and identifying conditions that produce both classical and non-classical lifespan effects. PMID:24466165
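The simulated-annealing parameter search the abstract mentions follows a standard accept/reject loop: always accept improvements, and accept worse moves with a probability that shrinks as the temperature cools. The sketch below uses a toy one-dimensional objective in place of the paper's agent-based fitness evaluation; all names and constants are illustrative:

```python
import math
import random

def simulated_annealing(objective, initial, neighbor, steps=5000,
                        t0=1.0, cooling=0.999, seed=1):
    """Minimal simulated-annealing loop: propose a neighboring parameter
    set, always accept improvements, and accept worse moves with
    probability exp(-delta / T) while T cools geometrically."""
    rng = random.Random(seed)
    current = best = initial
    f_cur = f_best = objective(initial)
    t = t0
    for _ in range(steps):
        cand = neighbor(current, rng)
        f_cand = objective(cand)
        delta = f_cand - f_cur
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current, f_cur = cand, f_cand
            if f_cur < f_best:
                best, f_best = current, f_cur
        t *= cooling           # geometric cooling schedule
    return best, f_best

# Toy stand-in for the model-fitting step: minimize (x - 3)^2.
best, f = simulated_annealing(lambda x: (x - 3.0) ** 2, 0.0,
                              lambda x, rng: x + rng.uniform(-0.5, 0.5))
```

In the paper's setting the objective would instead run the stochastic agent-based simulation and score how the evolved lifespan responds to predation, with the annealer exploring the model-parameter space.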
Assessing the quantum physics impacts on future x-ray free-electron lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmitt, Mark J.; Anisimov, Petr Mikhaylovich
A new quantum mechanical theory of x-ray free electron lasers (XFELs) has been successfully developed that has placed LANL at the forefront of the understanding of quantum effects in XFELs. Our quantum theory describes the interaction of relativistic electrons with x-ray radiation in the periodic magnetic field of an undulator using the same mathematical formalism as classical XFEL theory. This places classical and quantum treatments on the same footing and allows for a continuous transition from one regime to the other, eliminating the disparate analytical approaches previously used. Moreover, Dr. Anisimov, the architect of this new theory, is now considered a resource in the international FEL community for assessing quantum effects in XFELs.
Revising Amartya Sen's Capability Approach to Education for Ethical Development
ERIC Educational Resources Information Center
Mok, Kwangsu; Jeong, Wongyu
2016-01-01
The purpose of this paper is to examine whether Amartya Sen's capability approach can suggest an appropriate theory of education for ethical development. Many advocates of Sen's capability approach insist that his approach is superior to rival theories of education, including the human capital theory. This is because Sen emphasizes the purpose and…
Re-Visioning Action: Participatory Action Research and Indigenous Theories of Change
ERIC Educational Resources Information Center
Tuck, Eve
2009-01-01
This article observes that participatory action research (PAR), by nature of being collaborative, necessitates making explicit theories of change that may have otherwise gone unseen or unexamined. The article explores the limits of the reform/revolution paradox on actions and theories of change in PAR. Citing examples from two recent youth PAR…
Identity Disorder and Career Counseling Theory: Recommendations for Conceptualization.
ERIC Educational Resources Information Center
O'Brien, Michael T.
The Diagnostic and Statistical Manual of Mental Disorders III Revised rubric of identity disorder is linked to career theory and research findings on vocational identity, career indecisiveness, vocational maturity, and to the theories of Erikson and Kohut. Identity disorder has been found in career counseling clients. It appears that the brief…
Rodrigues, Johannes; Müller, Mathias; Mühlberger, Andreas; Hewig, Johannes
2018-01-01
Frontal asymmetry has been investigated over the past 30 years, and several theories have been developed about its meaning. The original theory of Davidson and its diversification by Harmon-Jones & Allen allocated approach motivation to relative left frontal brain activity and withdrawal motivation to relative right frontal brain activity. Hewig and colleagues extended this theory by adding bilateral frontal activation representing a biological correlate of the behavioral activation system if actual behavior is shown. Wacker and colleagues formulated a theory related to the revised reinforcement sensitivity theory by Gray & McNaughton. Here, relative left frontal brain activation represents the revised behavioral activation system and behavior, while relative right frontal brain activation represents the revised behavioral inhibition system, representing the experience of conflict. These theories were investigated with a newly developed paradigm where participants were able to move around freely in a virtual T maze via joystick while having their EEG recorded. Analyzing the influence of frontal brain activation during this virtual reality task on observable behavior for 30 participants, we found more relative left frontal brain activation during approach behavior and more relative right frontal brain activation for withdrawal behavior of any kind. Additionally, there was more bilateral frontal brain activation when participants were engaged in behavior compared to doing nothing. Hence, this study provides evidence for the idea that frontal asymmetry stands for behavioral approach or avoidance motivation, and bilateral frontal activation stands for behavior. Additionally, observable behavior is not only determined by frontal asymmetry, but also by relevant traits. © 2017 Society for Psychophysiological Research.
Discovery and Entropy in the Revision of Technical Reports.
ERIC Educational Resources Information Center
Marder, Daniel
A useful device in revising technical reports is the metaphor of entropy, which refers to the amount of disorder that is present in a system. Applied to communication theory, high entropy would correspond to increased amounts of unfamiliar or useless information in a text. Since entropy in rhetorical systems increases with the unfamiliarity of…
ERIC Educational Resources Information Center
Wylie, Ruth C.
This volume of the revised edition describes and evaluates measurement methods, research designs, and procedures which have been or might appropriately be used in self-concept research. Working from the perspective that self-concept or phenomenal personality theories can be scientifically investigated, methodological flaws and questionable…
NP-hardness of decoding quantum error-correction codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Le Gall, François
2011-05-01
Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as that of their classical counterparts. Instead, decoding QECCs can be very different from decoding classical codes due to the degeneracy property. Intuitively, one expects degeneracy to simplify decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no considerably fast decoding algorithm exists for general quantum decoding problems and suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.
On quantum effects in a theory of biological evolution.
Martin-Delgado, M A
2012-01-01
We construct a descriptive toy model that considers quantum effects on biological evolution starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as classical worlds for evolution to take place. However, in more natural scenarios, the rate of evolution depends on the degree of entanglement present in quantum organisms with respect to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.
The Value of Item Response Theory in Clinical Assessment: A Review
ERIC Educational Resources Information Center
Thomas, Michael L.
2011-01-01
Item response theory (IRT) and related latent variable models represent modern psychometric theory, the successor to classical test theory in psychological assessment. Although IRT has become prevalent in the measurement of ability and achievement, its contributions to clinical domains have been less extensive. Applications of IRT to clinical…
NASA Astrophysics Data System (ADS)
Rincón, Ángel; Panotopoulos, Grigoris
2018-01-01
We study for the first time the stability against scalar perturbations, and we compute the spectrum of quasinormal modes of three-dimensional charged black holes in Einstein-power-Maxwell nonlinear electrodynamics assuming running couplings. Adopting the sixth order Wentzel-Kramers-Brillouin (WKB) approximation we investigate how the running of the couplings changes the spectrum of the classical theory. Our results show that all modes corresponding to nonvanishing angular momentum are unstable both in the classical theory and with the running of the couplings, while the fundamental mode can be stable or unstable depending on the running parameter and the electric charge.
ERIC Educational Resources Information Center
Allen, Janet S.
1996-01-01
Explains how a teacher came to develop her own version of the whole language approach through her experimentation with remedial students in the 1970s. Makes a case for student research and inquiry into issues that matter to them personally, in lieu of traditional research of classic writers. (TB)
Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition
ERIC Educational Resources Information Center
Dillman, Don A.; Smyth, Jolene D.; Christian, Lean Melani
2014-01-01
For over two decades, Dillman's classic text on survey design has aided both students and professionals in effectively planning and conducting mail, telephone, and, more recently, Internet surveys. The new edition is thoroughly updated and revised, and covers all aspects of survey research. It features expanded coverage of mobile phones, tablets,…
ERIC Educational Resources Information Center
Bastedo, Michael N., Ed.; Altbach, Philip G., Ed.; Gumport, Patricia J., Ed.
2016-01-01
First published in 1999, "American Higher Education in the Twenty First Century" offered a comprehensive introduction to the central issues facing American colleges and universities. This thoroughly revised edition brings the classic volume up to date. The contributors have rewritten every chapter to address major changes in higher…
Education and Identity. Second Edition. The Jossey-Bass Higher and Adult Education Series.
ERIC Educational Resources Information Center
Chickering, Arthur W.; Reisser, Linda
Developing policies and practices to create higher education environments that will foster broad-based development of human talent and potentials is the focus of this fully revised and updated edition, which adds findings from the last 25 years to a classic work. The volume begins with "A Current Theoretical Context for Student Development," which…
Silk Roads or Steppe Roads? The Silk Roads in World History.
ERIC Educational Resources Information Center
Christian, David
2000-01-01
Explores the prehistory of the Silk Roads, reexamines their structure and history in the classical era, and explores shifts in their geography in the last one thousand years. Explains that a revised understanding of the Silk Roads demonstrates how the Afro-Eurasian land mass has been linked by networks of exchange since the Bronze Age. (CMK)
What Is and What Can Be: How a Liminal Position Can Change Learning and Teaching in Higher Education
ERIC Educational Resources Information Center
Cook-Sather, Alison; Alter, Zanny
2011-01-01
In this article we analyze what happens when undergraduate students are positioned as pedagogical consultants in a faculty development program. Drawing on their spoken and written perspectives, and using the classical anthropological concept of "liminality," we illustrate how these student consultants revise their relationships with their teachers…
Constraints on primordial magnetic fields from inflation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Daniel; Kobayashi, Takeshi, E-mail: drgreen@cita.utoronto.ca, E-mail: takeshi.kobayashi@sissa.it
2016-03-01
We present generic bounds on magnetic fields produced from cosmic inflation. By investigating field bounds on the vector potential, we constrain both the quantum mechanical production of magnetic fields and their classical growth in a model-independent way. For classical growth, we show that only if the reheating temperature is as low as T_reh ≲ 10² MeV can magnetic fields of 10⁻¹⁵ G be produced on Mpc scales in the present universe. For purely quantum mechanical scenarios, even stronger constraints are derived. Our bounds on classical and quantum mechanical scenarios apply to generic theories of inflationary magnetogenesis with a two-derivative time kinetic term for the vector potential. In both cases, the magnetic field strength is limited by the gravitational back-reaction of the electric fields that are produced simultaneously. As an example of quantum mechanical scenarios, we construct vector field theories whose time diffeomorphisms are spontaneously broken, and explore magnetic field generation in theories with a variable speed of light. Transitions of quantum vector field fluctuations into classical fluctuations are also analyzed in the examples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gurvits, L.
2002-01-01
Classical matching theory can be defined in terms of matrices with nonnegative entries. The notion of a positive operator, central in quantum theory, is a natural generalization of matrices with nonnegative entries. Based on this point of view, we introduce a definition of perfect quantum (operator) matching. We show that the new notion inherits many 'classical' properties, but not all of them; the new notion goes somewhat beyond matroids. For separable bipartite quantum states, this new notion coincides with the full-rank property of the intersection of two corresponding geometric matroids. In the classical situation, permanents are naturally associated with perfect matchings. We introduce an analog of permanents for positive operators, called the quantum permanent, and show how this generalization of the permanent is related to quantum entanglement. Besides many other things, quantum permanents provide new rational inequalities necessary for the separability of bipartite quantum states. Using quantum permanents, we give a deterministic poly-time algorithm to solve the hidden matroid intersection problem and indicate some 'classical' complexity difficulties associated with quantum entanglement. Finally, we prove that the weak membership problem for the convex set of separable bipartite density matrices is NP-hard.
Ghost imaging of phase objects with classical incoherent light
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shirai, Tomohiro; Setaelae, Tero; Friberg, Ari T.
2011-10-15
We describe an optical setup for performing spatial Fourier filtering in ghost imaging with classical incoherent light. This is achieved by a modification of the conventional geometry for lensless ghost imaging. It is shown on the basis of classical coherence theory that with this technique one can realize what we call phase-contrast ghost imaging to visualize pure phase objects.
Quantum-mechanical machinery for rational decision-making in classical guessing game
NASA Astrophysics Data System (ADS)
Bang, Jeongho; Ryu, Junghee; Pawłowski, Marcin; Ham, Byoung S.; Lee, Jinhyoung
2016-02-01
In quantum game theory, one of the most intriguing and important questions is, “Is it possible to get quantum advantages without any modification of the classical game?” The answer to this question has so far largely been negative: it has usually been thought that a change of the classical game setting is unavoidable for obtaining quantum advantages. However, we give an affirmative answer here, focusing on the decision-making process (which we call ‘reasoning’) used to generate the best strategy, which may occur internally, e.g., in the player’s brain. To show this, we consider a classical guessing game. We then define a one-player reasoning problem in the context of decision-making theory, where the machinery processes are designed to simulate classical and quantum reasoning. In this setting, we present a scenario where a rational player is able to make better use of his or her weak preferences due to quantum reasoning, without any altering or resetting of the classically defined game. We also argue in further analysis that quantum reasoning may make the player fail, and even make the situation worse, due to inappropriate preferences.
Speech-Act and Text-Act Theory: "Theme-ing" in Freshman Composition.
ERIC Educational Resources Information Center
Horner, Winifred B.
In contrast to a speech-act theory that is limited by a simple speaker/hearer relationship, a text-act theory of written language allows for the historical or personal context of a writer and reader, both in the written work itself and in the act of reading. This theory can be applied to theme writing, essay examinations, and revision in the…
ERIC Educational Resources Information Center
Boonma, Malai; Phaiboonnugulkij, Malinee
2014-01-01
This article argues for the need to set out the theoretical framework of Multiple Intelligences (MI) theory and to answer doubts about its role in foreign language teaching. The article addresses the application of MI theory following various sources from Howard Gardner and the authors who revised this theory for use in the…
Toda theories as contractions of affine Toda theories
NASA Astrophysics Data System (ADS)
Aghamohammadi, A.; Khorrami, M.; Shariati, A.
1996-02-01
Using a contraction procedure, we obtain Toda theories and their structures, from affine Toda theories and their corresponding structures. By structures, we mean the equation of motion, the classical Lax pair, the boundary term for half line theories, and the quantum transfer matrix. The Lax pair and the transfer matrix so obtained, depend nontrivially on the spectral parameter.
Comparing the Effectiveness of SPSS and EduG Using Different Designs for Generalizability Theory
ERIC Educational Resources Information Center
Teker, Gulsen Tasdelen; Guler, Nese; Uyanik, Gulden Kaya
2015-01-01
Generalizability theory (G theory) provides a broad conceptual framework for social sciences such as psychology and education, and a comprehensive construct for numerous measurement events by using analysis of variance, a strong statistical method. G theory, as an extension of both classical test theory and analysis of variance, is a model which…
An Approach to Biased Item Identification Using Latent Trait Measurement Theory.
ERIC Educational Resources Information Center
Rudner, Lawrence M.
Because it is a true score model employing item parameters which are independent of the examined sample, item characteristic curve theory (ICC) offers several advantages over classical measurement theory. In this paper an approach to biased item identification using ICC theory is described and applied. The ICC theory approach is attractive in that…
Theoretical Studies of Low Frequency Instabilities in the Ionosphere. Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dimant, Y. S.
2003-08-20
The objective of the current project is to provide a theoretical basis for better understanding of numerous radar and rocket observations of density irregularities and related effects in the lower equatorial and high-latitude ionospheres. The research focused on: (1) continuing efforts to develop a theory of nonlinear saturation of the Farley-Buneman instability; (2) revision of the kinetic theory of the electron-thermal instability at low altitudes; (3) studying the effects of strong anomalous electron heating in the high-latitude electrojet; (4) analytical and numerical studies of the combined Farley-Buneman/ion-thermal instabilities in the E-region ionosphere; (5) studying the effect of dust charging in Polar Mesospheric Clouds.
Khrennikov, Andrei
2011-09-01
We propose a model of quantum-like (QL) processing of mental information. This model is based on quantum information theory. However, in contrast to models of the "quantum physical brain" reducing mental activity (at least at the highest level) to quantum physical phenomena in the brain, our model matches well with the basic neuronal paradigm of cognitive science. QL information processing is based (surprisingly) on classical electromagnetic signals induced by the joint activity of neurons. This novel approach to quantum information is based on a representation of quantum mechanics as a version of classical signal theory, which was recently elaborated by the author. The brain uses the QL representation (QLR) for working with abstract concepts; concrete images are described by classical information theory. The two processes, classical and QL, are performed in parallel. Moreover, information is actively transmitted from one representation to another. A QL concept given in our model by a density operator can generate a variety of concrete images given by temporal realizations of the corresponding (Gaussian) random signal. This signal has a covariance operator coinciding with the density operator encoding the abstract concept under consideration. The presence of various temporal scales in the brain plays the crucial role in creation of the QLR in the brain. Moreover, in our model electromagnetic noise produced by neurons is a source of superstrong QL correlations between processes in different spatial domains in the brain; the binding problem is solved on the QL level, but with the aid of the classical background fluctuations. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
The Heart Trumps the Head: Desirability Bias in Political Belief Revision
2017-01-01
Understanding how individuals revise their political beliefs has important implications for society. In a preregistered study (N = 900), we experimentally separated the predictions of 2 leading theories of human belief revision—desirability bias and confirmation bias—in the context of the 2016 U.S. presidential election. Participants indicated who they desired to win, and who they believed would win, the election. Following confrontation with evidence that was either consistent or inconsistent with their desires or beliefs, they again indicated who they believed would win. We observed a robust desirability bias—individuals updated their beliefs more if the evidence was consistent (vs. inconsistent) with their desired outcome. This bias was independent of whether the evidence was consistent or inconsistent with their prior beliefs. In contrast, we found limited evidence of an independent confirmation bias in belief updating. These results have implications for the relevant psychological theories and for political belief revision in practice. PMID:28557511
Rett syndrome: revised diagnostic criteria and nomenclature.
Neul, Jeffrey L; Kaufmann, Walter E; Glaze, Daniel G; Christodoulou, John; Clarke, Angus J; Bahi-Buisson, Nadia; Leonard, Helen; Bailey, Mark E S; Schanen, N Carolyn; Zappella, Michele; Renieri, Alessandra; Huppke, Peter; Percy, Alan K
2010-12-01
Rett syndrome (RTT) is a severe neurodevelopmental disease that affects approximately 1 in 10,000 live female births and is often caused by mutations in Methyl-CpG-binding protein 2 (MECP2). Despite distinct clinical features, the accumulation of clinical and molecular information in recent years has generated considerable confusion regarding the diagnosis of RTT. The purpose of this work was to revise and clarify the 2002 consensus criteria for the diagnosis of RTT in anticipation of treatment trials. RettSearch members, representing the majority of the international clinical RTT specialists, participated in an iterative process to come to a consensus on revised and simplified clinical diagnostic criteria for RTT. The clinical criteria required for the diagnosis of classic and atypical RTT were clarified and simplified. Guidelines for the diagnosis and molecular evaluation of specific variant forms of RTT were developed. These revised criteria provide clarity regarding the key features required for the diagnosis of RTT and reinforce the concept that RTT is a clinical diagnosis based on distinct clinical criteria, independent of molecular findings. We recommend that these criteria and guidelines be utilized in any proposed clinical research.
Transfer function modeling of damping mechanisms in viscoelastic plates
NASA Technical Reports Server (NTRS)
Slater, J. C.; Inman, D. J.
1991-01-01
This work formulates a method for the modeling of material damping characteristics in plates. The Sophie Germain equation of classical plate theory is modified to incorporate hysteresis effects represented by complex stiffness using the transfer function approach proposed by Golla and Hughes (1985). However, the procedure is not limited to this representation. The governing characteristic equation is decoupled through separation of variables, yielding a solution similar to that of undamped classical plate theory and allowing solution of the steady-state as well as the transient response problem.
NASA Astrophysics Data System (ADS)
Wang, Hai; Kumar, Asutosh; Cho, Minhyung; Wu, Junde
2018-04-01
Physical quantities are assumed to take real values, which stems from the fact that a typical measuring instrument that measures a physical observable always yields a real number. Here we consider the question of what would happen if physical observables were allowed to assume complex values. In this paper, we show that by allowing observables in the Bell inequality to take complex values, a classical physical theory can actually attain the same upper bound of the Bell expression as quantum theory. Also, by extending the real field to the quaternionic field, we can resolve the GHZ problem using a local hidden-variable model. Furthermore, we try to build a new type of hidden-variable theory of a single qubit based on this result.
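For reference, the quantum upper bound of the Bell-CHSH expression that the abstract refers to (Tsirelson's bound 2√2, versus 2 for ordinary real-valued classical models) can be reproduced with a few lines of linear algebra. This is a minimal sketch using the singlet state and the standard optimal measurement settings, not the paper's complex-valued construction:

```python
import numpy as np

# Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Singlet state (|01> - |10>)/sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def corr(A, B):
    """Expectation value <psi| A (x) B |psi>."""
    return (psi.conj() @ np.kron(A, B) @ psi).real

# Tsirelson-optimal settings: Alice measures Z, X; Bob measures
# the diagonal combinations.
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = abs(corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1))
print(S)  # 2*sqrt(2) ≈ 2.828
```

Each correlator here equals ±1/√2, so the four terms add coherently to 2√2, exceeding the classical bound of 2.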
From integrability to conformal symmetry: Bosonic superconformal Toda theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bo-Yu Hou; Liu Chao
In this paper the authors study the conformal integrable models obtained from conformal reductions of WZNW theory associated with second order constraints. These models are called bosonic superconformal Toda models due to their conformal spectra and their resemblance to the usual Toda theories. From the reduction procedure they get the equations of motion and the linearized Lax equations in a generic Z gradation of the underlying Lie algebra. Then, in the special case of principal gradation, they derive the classical r matrix, fundamental Poisson relation, exchange algebra of chiral operators and find out the classical vertex operators. The result shows that their model is very similar to the ordinary Toda theories in that one can obtain various conformal properties of the model from its integrability.
Sound and Vision: Using Progressive Rock To Teach Social Theory.
ERIC Educational Resources Information Center
Ahlkvist, Jarl A.
2001-01-01
Describes a teaching technique that utilizes progressive rock music to educate students about sociological theories in introductory sociology courses. Discusses the use of music when teaching about classical social theory and offers an evaluation of this teaching strategy. Includes references. (CMK)
NASA Astrophysics Data System (ADS)
Camilleri, Kristian; Schlosshauer, Maximilian
2015-02-01
Niels Bohr's doctrine of the primacy of "classical concepts" is arguably his most criticized and misunderstood view. We present a new, careful historical analysis that makes clear that Bohr's doctrine was primarily an epistemological thesis, derived from his understanding of the functional role of experiment. A hitherto largely overlooked disagreement between Bohr and Heisenberg about the movability of the "cut" between measuring apparatus and observed quantum system supports the view that, for Bohr, such a cut did not originate in dynamical (ontological) considerations, but rather in functional (epistemological) considerations. As such, both the motivation and the target of Bohr's doctrine of classical concepts are of a fundamentally different nature than what is understood as the dynamical problem of the quantum-to-classical transition. Our analysis suggests that, contrary to claims often found in the literature, Bohr's doctrine is not, and cannot be, at odds with proposed solutions to the dynamical problem of the quantum-classical transition that were pursued by several of Bohr's followers and culminated in the development of decoherence theory.
Force-field functor theory: classical force-fields which reproduce equilibrium quantum distributions
Babbush, Ryan; Parkhill, John; Aspuru-Guzik, Alán
2013-01-01
Feynman and Hibbs were the first to variationally determine an effective potential whose associated classical canonical ensemble approximates the exact quantum partition function. We examine the existence of a map between the local potential and an effective classical potential which matches the exact quantum equilibrium density and partition function. The usefulness of such a mapping rests in its ability to readily improve Born-Oppenheimer potentials for use with classical sampling. We show that such a map is unique and must exist. To explore the feasibility of using this result to improve classical molecular mechanics, we numerically produce a map from a library of randomly generated one-dimensional potential/effective potential pairs then evaluate its performance on independent test problems. We also apply the map to simulate liquid para-hydrogen, finding that the resulting radial pair distribution functions agree well with path integral Monte Carlo simulations. The surprising accessibility and transferability of the technique suggest a quantitative route to adapting Born-Oppenheimer potentials, with a motivation similar in spirit to the powerful ideas and approximations of density functional theory. PMID:24790954
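The Feynman-Hibbs idea the abstract starts from can be stated concretely: to lowest order, the effective classical potential adds a quantum smearing correction, V_eff(x) ≈ V(x) + (βħ²/24m) V″(x). The following sketch (illustrative only, in reduced units with ħ = m = 1, and a quartic double-well chosen by us) shows the lowest-order correction, not the paper's exact map:

```python
import numpy as np

hbar = m = 1.0          # reduced units (illustrative choice)
beta = 2.0              # inverse temperature

def V(x):               # a quartic double-well, chosen for illustration
    return 0.25 * x**4 - 0.5 * x**2

def V_second(x):        # analytic second derivative of V
    return 3.0 * x**2 - 1.0

def V_eff(x):
    """Lowest-order Feynman-Hibbs effective classical potential."""
    return V(x) + (hbar**2 * beta / (24.0 * m)) * V_second(x)

# The correction raises the potential where curvature is positive
# (well bottoms) and lowers it where curvature is negative (barrier top).
x = np.linspace(-2, 2, 401)
print(V_eff(0.0) - V(0.0))   # negative at the barrier top: -beta/24 here
```

Sampling a classical canonical ensemble on V_eff then approximates the quantum equilibrium density; the paper's functor sharpens this lowest-order map into an exact one.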
Quantum theory for 1D X-ray free electron laser
Anisimov, Petr Mikhaylovich
2017-09-19
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed-upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
NASA Astrophysics Data System (ADS)
Rezaei Kivi, Araz; Azizi, Saber; Norouzi, Peyman
2017-12-01
In this paper, the nonlinear size-dependent static and dynamic behavior of an electrostatically actuated nano-beam is investigated. A fully clamped nano-beam is considered for the modeling of the deformable electrode of the NEMS. The governing differential equation of motion is derived using Hamilton's principle based on the modified couple stress theory (MCST), a non-classical theory that accounts for length-scale effects. The nonlinear partial differential equation of motion is discretized to nonlinear Duffing-type ODEs using the Galerkin method. Static and dynamic pull-in instabilities obtained by both the classical theory and the MCST are compared. At the second stage of the analysis, a shooting technique is utilized to obtain the frequency-response curve and to capture the periodic solutions of the motion; the stability of the periodic solutions is assessed by Floquet theory. The nonlinear dynamic behavior of the deformable electrode under AC harmonic actuation, together with size dependency, is investigated.
Finite conformal quantum gravity and spacetime singularities
NASA Astrophysics Data System (ADS)
Modesto, Leonardo; Rachwał, Lesław
2017-12-01
We show that a class of finite quantum non-local gravitational theories is conformally invariant at classical as well as at quantum level. This is actually a range of conformal anomaly-free theories in the spontaneously broken phase of the Weyl symmetry. At classical level we show how the Weyl conformal invariance is able to tame all the spacetime singularities that plague not only Einstein gravity, but also local and weakly non-local higher derivative theories. The latter statement is proved by a singularity theorem that applies to a large class of weakly non-local theories. Therefore, we are entitled to look for a solution of the spacetime singularity puzzle in a missed symmetry of nature, namely the Weyl conformal symmetry. Following the seminal paper by Narlikar and Kembhavi, we provide an explicit construction of singularity-free black hole exact solutions in a class of conformally invariant theories.
NASA Astrophysics Data System (ADS)
Argurio, Riccardo
1998-07-01
The thesis begins with an introduction to M-theory (at a graduate student's level), starting from perturbative string theory and proceeding to dualities, D-branes and finally Matrix theory. The following chapter gives a self-contained treatment of general classical p-brane solutions. Black and extremal branes are reviewed, along with their semi-classical thermodynamics. We then focus on intersecting extremal branes, the intersection rules being derived both with and without the explicit use of supersymmetry. The last three chapters comprise more advanced aspects of brane physics, such as the dynamics of open branes, the little theories on the world-volume of branes and how the four dimensional Schwarzschild black hole can be mapped to an extremal configuration of branes, thus allowing for a statistical interpretation of its entropy. The original results were already reported in hep-th/9701042, hep-th/9704190, hep-th/9710027 and hep-th/9801053.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, R. G., E-mail: rgh@doe.carleton.ca
2014-01-21
A positive-feedback mean-field modification of the classical Brillouin magnetization theory provides an explanation of the apparent persistence of the spontaneous magnetization beyond the conventional Curie temperature—the little understood “tail” phenomenon that occurs in many ferromagnetic materials. The classical theory is unable to resolve this apparent anomaly. The modified theory incorporates the temperature-dependent quantum-scale hysteretic and mesoscopic domain-scale anhysteretic magnetization processes and includes the effects of demagnetizing and exchange fields. It is found that the thermal behavior of the reversible and irreversible segments of the hysteresis loops, as predicted by the theory, is a key to the presence or absence of the “tails.” The theory, which permits arbitrary values of the quantum spin number J, generally provides a quantitative agreement with the thermal variations of both the spontaneous magnetization and the shape of the hysteresis loop.
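The classical mean-field construction that the abstract modifies can be sketched in a few lines: for spin J = 1/2 the Brillouin function reduces to tanh, and the reduced spontaneous magnetization solves the self-consistency equation m = tanh(m/t) with t = T/T_C. This fixed-point sketch is the textbook classical theory, not the paper's positive-feedback modification:

```python
import math

def spontaneous_magnetization(t, tol=1e-12):
    """Solve m = tanh(m / t) by fixed-point iteration, for 0 < t < 1."""
    m = 1.0                      # start from saturation
    while True:
        m_new = math.tanh(m / t)
        if abs(m_new - m) < tol:
            return m_new
        m = m_new

m_half = spontaneous_magnetization(0.5)
print(round(m_half, 4))   # ≈ 0.9575 at half the Curie temperature
```

In this classical model m vanishes identically for t ≥ 1; the "tail" above T_C reported in the paper is precisely what this self-consistency equation cannot produce without the positive-feedback term.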
Item Response Modeling with Sum Scores
ERIC Educational Resources Information Center
Johnson, Timothy R.
2013-01-01
One of the distinctions between classical test theory and item response theory is that the former focuses on sum scores and their relationship to true scores, whereas the latter concerns item responses and their relationship to latent scores. Although item response theory is often viewed as the richer of the two theories, sum scores are still…
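The distinction the abstract draws can be made concrete: classical test theory scores a response pattern by its sum, while an item response model scores each item through its own response function and infers a latent ability. A minimal sketch using a two-parameter logistic (2PL) model with invented item parameters (our illustration, not the article's model):

```python
import math

# Hypothetical item parameters: (discrimination a, difficulty b)
items = [(1.0, -1.0), (1.5, 0.0), (0.8, 1.0)]
responses = [1, 1, 0]          # one examinee's right/wrong pattern

# Classical test theory: the observed score is simply the sum.
sum_score = sum(responses)

def p_correct(theta, a, b):
    """2PL item response function: P(correct | latent ability theta)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def log_likelihood(theta):
    return sum(
        math.log(p_correct(theta, a, b)) if r == 1
        else math.log(1.0 - p_correct(theta, a, b))
        for (a, b), r in zip(items, responses)
    )

# Crude grid-search maximum-likelihood estimate of the latent score.
theta_hat = max((t / 100.0 for t in range(-400, 401)),
                key=log_likelihood)
print(sum_score, round(theta_hat, 2))
```

Note that two examinees with the same sum score but different response patterns generally receive different latent-ability estimates, which is exactly the gap between the two theories that the article examines.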
Test Theories, Educational Priorities and Reliability of Public Examinations in England
ERIC Educational Resources Information Center
Baird, Jo-Anne; Black, Paul
2013-01-01
Much has already been written on the controversies surrounding the use of different test theories in educational assessment. Other authors have noted the prevalence of classical test theory over item response theory in practice. This Special Issue draws together articles based upon work conducted on the Reliability Programme for England's…
Recent developments in bimetric theory
NASA Astrophysics Data System (ADS)
Schmidt-May, Angnis; von Strauss, Mikael
2016-05-01
This review is dedicated to recent progress in the field of classical, interacting, massive spin-2 theories, with a focus on ghost-free bimetric theory. We will outline its history and its development as a nontrivial extension and generalisation of nonlinear massive gravity. We present a detailed discussion of the consistency proofs of both theories, before we review Einstein solutions to the bimetric equations of motion in vacuum as well as the resulting mass spectrum. We introduce couplings to matter and then discuss the general relativity and massive gravity limits of bimetric theory, which correspond to decoupling the massive or the massless spin-2 field from the matter sector, respectively. More general classical solutions are reviewed and the present status of bimetric cosmology is summarised. An interesting corner in the bimetric parameter space which could potentially give rise to a nonlinear theory for partially massless spin-2 fields is also discussed. Relations to higher-curvature theories of gravity are explained and finally we give an overview of possible extensions of the theory and review its formulation in terms of vielbeins.
ERIC Educational Resources Information Center
Chang, Liang-Te; And Others
A study was conducted to develop the electronic technical competencies of duty and task analysis by using a revised DACUM (Developing a Curriculum) method, a questionnaire survey, and a fuzzy synthesis operation. The revised DACUM process relied on inviting electronics trade professionals to analyze electronic technology for entry-level…
ERIC Educational Resources Information Center
Saifer, Steffen
Based on sound developmentally appropriate theory, this revised guide is designed to help early childhood teachers deal with common problems that arise in all aspects of their work. Following an introduction and a list of the 20 most important principles for successful preschool teaching, the guide is divided into nine parts. Part 1 addresses…
Revisioning Premodern Fine Art as Popular Visual Culture
ERIC Educational Resources Information Center
Duncum, Paul
2014-01-01
Employing the concept of a rhetoric of emotions, European Premodern fine art is revisioned as popular culture. From ancient times, the rhetoric of emotion was one of the principal concepts informing the theory and practice of all forms of European cultural production, including the visual arts, until it was gradually displaced during the 1700s and…
Peer Scaffolding Behaviors Emerging in Revising a Written Task: A Microgenetic Analysis
ERIC Educational Resources Information Center
Ranjbar, Naser; Ghonsooly, Behzad
2017-01-01
Vygotsky's writings on Sociocultural Theory (SCT) of mind, his concept of Zone of Proximal Development (ZPD) and its related metaphor, scaffolding, serve as the theoretical basis for the study of peer collaboration. This paper aimed at examining the effects of peer-scaffolding on EFL writing ability and finding out how revising techniques are…
ERIC Educational Resources Information Center
Wilson, Steven R.; Aleman, Carlos G.; Leatham, Geoff B.
1998-01-01
Challenges and revises politeness theory by analyzing potential implications for both parties' face when the logical preconditions for seeking compliance are framed by specific influence goals. Tests undergraduate students' imagining asking favors, giving advice, and enforcing obligations with same-sex friends. Finds perceived face threats varied…
NASA Technical Reports Server (NTRS)
Jones, R. T. (Compiler)
1979-01-01
A collection of papers on modern theoretical aerodynamics is presented. Included are theories of incompressible potential flow and research on the aerodynamic forces on wing and wing sections of aircraft and on airship hulls.
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
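The quantum success probability against which the abstract's noncontextual bound is compared is the Helstrom bound: for states ρ0, ρ1 with priors p and 1-p, the optimal minimum-error success probability is ½(1 + ‖p·ρ0 − (1−p)·ρ1‖₁). A minimal sketch for two nonorthogonal qubit states (our illustration of the standard bound, not the paper's noncontextuality inequality):

```python
import numpy as np

def helstrom_success(rho0, rho1, p=0.5):
    """Optimal minimum-error discrimination probability (Helstrom bound)."""
    gamma = p * rho0 - (1 - p) * rho1
    trace_norm = np.abs(np.linalg.eigvalsh(gamma)).sum()
    return 0.5 * (1.0 + trace_norm)

def pure(ket):
    """Density matrix of a (normalized) pure state."""
    ket = np.asarray(ket, dtype=complex)
    ket = ket / np.linalg.norm(ket)
    return np.outer(ket, ket.conj())

# Two nonorthogonal qubit states with overlap |<psi0|psi1>| = 1/sqrt(2).
rho0 = pure([1, 0])
rho1 = pure([np.cos(np.pi / 4), np.sin(np.pi / 4)])

p_succ = helstrom_success(rho0, rho1)
print(round(p_succ, 4))  # ≈ 0.8536
```

For equal priors and pure states with overlap c this reduces to ½(1 + √(1 − c²)); the paper's contribution is an upper bound strictly below this quantum value for any noncontextual ontological model.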
Scalar gravitational waves in the effective theory of gravity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mottola, Emil
As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.
NASA Astrophysics Data System (ADS)
Zamani Kouhpanji, Mohammad Reza; Behzadirad, Mahmoud; Busani, Tito
2017-12-01
We used the stable strain gradient theory including acceleration gradients to investigate the classical and nonclassical mechanical properties of gallium nitride (GaN) nanowires (NWs). We predicted the static length scales, Young's modulus, and shear modulus of the GaN NWs from the experimental data. Combining these results with atomic simulations, we also found the dynamic length scale of the GaN NWs. Young's modulus, shear modulus, and the static and dynamic length scales were found to be 318 GPa, 131 GPa, 8 nm, and 8.9 nm, respectively, applicable to describing the static and dynamic behavior of GaN NWs with diameters from a few nm up to bulk dimensions. Furthermore, the experimental data were analyzed with classical continuum theory (CCT) and compared with the available literature to illustrate the size dependency of the mechanical properties of GaN NWs. This resolves previously published discrepancies that arose from the limitations of the CCT used to determine the mechanical properties of GaN NWs and their size dependency.
Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar
2002-05-01
Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).
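The classical machinery that the abstract maps quantum Brownian motion onto has a familiar Markovian special case. As a toy illustration (ours, not the paper's non-Markovian derivation), an overdamped Langevin simulation of a harmonic well in the Smoluchowski limit reproduces the equipartition variance kT/k:

```python
import numpy as np

rng = np.random.default_rng(1)

# Overdamped classical Langevin dynamics in a harmonic well:
# gamma * dx = -k x dt + sqrt(2 gamma kT) dW   (Smoluchowski limit).
gamma, k, kT, dt = 1.0, 1.0, 1.0, 0.01
n_steps = 200_000

x, xs = 0.0, []
for _ in range(n_steps):
    x += (-k * x / gamma) * dt \
         + np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
    xs.append(x)

# Equipartition: the stationary variance <x^2> should approach kT / k.
var = np.var(xs[n_steps // 10:])   # discard the initial transient
print(round(var, 2))  # ≈ 1.0
```

The paper's point is that, after the c-number mapping, the quantum problem can be analyzed with exactly this kind of classical stochastic calculus, generalized to colored (non-Markovian) noise.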
Density-functional theory simulation of large quantum dots
NASA Astrophysics Data System (ADS)
Jiang, Hong; Baranger, Harold U.; Yang, Weitao
2003-10-01
Kohn-Sham spin-density functional theory provides an efficient and accurate model to study electron-electron interaction effects in quantum dots, but its application to large systems is a challenge. Here an efficient method for the simulation of quantum dots using density-functional theory is developed; it includes the particle-in-the-box representation of the Kohn-Sham orbitals, an efficient conjugate-gradient method to directly minimize the total energy, a Fourier convolution approach for the calculation of the Hartree potential, and a simplified multigrid technique to accelerate the convergence. We test the methodology in a two-dimensional model system and show that numerical studies of large quantum dots with several hundred electrons become computationally affordable. In the noninteracting limit, the classical dynamics of the system we study can be continuously varied from integrable to fully chaotic. The qualitative difference in the noninteracting classical dynamics has an effect on the quantum properties of the interacting system: integrable classical dynamics leads to higher-spin states and a broader distribution of spacing between Coulomb blockade peaks.
Scalar gravitational waves in the effective theory of gravity
Mottola, Emil
2017-07-10
As a low energy effective field theory, classical General Relativity receives an infrared relevant modification from the conformal trace anomaly of the energy-momentum tensor of massless, or nearly massless, quantum fields. The local form of the effective action associated with the trace anomaly is expressed in terms of a dynamical scalar field that couples to the conformal factor of the spacetime metric, allowing it to propagate over macroscopic distances. Linearized around flat spacetime, this semi-classical EFT admits scalar gravitational wave solutions in addition to the transversely polarized tensor waves of the classical Einstein theory. The amplitude of the scalar wave modes, as well as their energy and energy flux which are positive and contain a monopole moment, are computed. As a result, astrophysical sources for scalar gravitational waves are considered, with the excited gluonic condensates in the interiors of neutron stars in merger events with other compact objects likely to provide the strongest burst signals.
An appraisal of the classic forest succession paradigm with the shade tolerance index
Jean Lienard; Ionut Florescu; Nikolay Strigul
2015-01-01
We revisit the classic theory of forest succession that relates shade tolerance and species replacement and assess its validity to understand patch-mosaic patterns of forested ecosystems of the USA. We introduce a macroscopic parameter called the "shade tolerance index" and compare it to the classic continuum index in southern Wisconsin forests. We exemplify shade...
Noncommutative gauge theory for Poisson manifolds
NASA Astrophysics Data System (ADS)
Jurčo, Branislav; Schupp, Peter; Wess, Julius
2000-09-01
A noncommutative gauge theory is associated to every Abelian gauge theory on a Poisson manifold. The semi-classical and full quantum version of the map from the ordinary gauge theory to the noncommutative gauge theory (Seiberg-Witten map) is given explicitly to all orders for any Poisson manifold in the Abelian case. In the quantum case the construction is based on Kontsevich's formality theorem.
The Foundations of Einstein's Theory of Gravitation
NASA Astrophysics Data System (ADS)
Freundlich, Erwin; Brose, Henry L. (Translator); Einstein, Albert (Preface); Turner, H. H. (Introduction)
2011-06-01
Introduction; 1. The special theory of relativity as a stepping-stone to the general theory of relativity; 2. Two fundamental postulates in the mathematical formulation of physical laws; 3. Concerning the fulfilment of the two postulates; 4. The difficulties in the principles of classical mechanics; 5. Einstein's theory of gravitation; 6. The verification of the new theory by actual experience; Appendix; Index.
ERIC Educational Resources Information Center
Besson, Ugo
2013-01-01
This paper presents a history of research and theories on sliding friction between solids. This history is divided into four phases: from Leonardo da Vinci to Coulomb and the establishment of the classical laws of friction; the theories of lubrication and Tomlinson's theory of friction (1850-1930); the theories of wear and Bowden and Tabor's…
NASA Technical Reports Server (NTRS)
Stein, Manuel; Sydow, P. Daniel; Librescu, Liviu
1990-01-01
Buckling and postbuckling results are presented for compression-loaded simply-supported aluminum plates and composite plates with a symmetric lay-up of thin + or - 45 deg plies composed of many layers. Buckling results for aluminum plates of finite length are given for various length-to-width ratios. Asymptotes to the curves based on buckling results give the critical load N(sub xcr) for plates of infinite length. Postbuckling results for plates with transverse shearing flexibility are compared to results from classical theory for various width-to-thickness ratios. Characteristic curves indicating the average longitudinal direct stress resultant as a function of the applied displacements are calculated based on four different theories: classical von Karman theory using the Kirchhoff assumptions, first-order shear deformation theory, higher-order shear deformation theory, and 3-D flexibility theory. Present results indicate that the 3-D flexibility theory gives the lowest buckling loads. The higher-order shear deformation theory has fewer unknowns than the 3-D flexibility theory but does not take into account through-the-thickness effects. The figures presented show that small differences occur in the average longitudinal direct stress resultants from the four theories as functions of the applied end-shortening displacement.
Kalsched, Donald E
2015-09-01
This paper explores the evolution of Michael Fordham's ideas concerning 'defences of the self', including his application of this concept to a group of 'difficult' adult patients in his famous 1974 paper by the same name. After tracing the relevance of Fordham's ideas to my own discovery of a 'self-care system' in the psychological material of early trauma patients (Kalsched ), I describe how Fordham's seminal notions might be revisioned in light of contemporary relational theory as well as early attachment theory and affective neuroscience. These revisionings involve an awareness that the severe woundings of early unremembered trauma are not transformable through interpretation but will inevitably be repeated in the transference, leading to mutual 'enactments' between the analytic partners and, hopefully, to a new outcome. A clinical example of one such mutual enactment between the author and his patient is provided. The paper concludes with reflections on the clinical implications of this difficult case and what it means to become a 'real person' to our patients. Finally, Jung's alchemical views on transference are shown to be useful analogies in our understanding of the necessary mutuality in the healing process with these patients. © 2015, The Society of Analytical Psychology.
The kinematics of the Scorpius-Centaurus OB association from Gaia DR1
NASA Astrophysics Data System (ADS)
Wright, Nicholas J.; Mamajek, Eric E.
2018-05-01
We present a kinematic study of the Scorpius-Centaurus (Sco-Cen) OB association (Sco OB2) using Gaia DR1 parallaxes and proper motions. Our goal is to test the classical theory that OB associations are the expanded remnants of dense and compact star clusters disrupted by processes such as residual gas expulsion. Gaia astrometry is available for 258 out of 433 members of the association, with revised Hipparcos astrometry used for the remainder. We use these data to confirm that the three subgroups of Sco-Cen are gravitationally unbound and have non-isotropic velocity dispersions, suggesting that they have not had time to dynamically relax. We also explore the internal kinematics of the subgroups to search for evidence of expansion. We test Blaauw's classical linear model of expansion, search for velocity trends along the Galactic axes, compare the expanding and non-expanding convergence points, perform traceback analysis assuming both linear trajectories and using an epicycle approximation, and assess the evidence for expansion in proper motions corrected for virtual expansion/contraction. None of these methods provides coherent evidence for expansion of the subgroups, with no evidence to suggest that the subgroups had a more compact configuration in the past. We find evidence for kinematic substructure within the subgroups that supports the view that they were not formed by the disruption of individual star clusters. We conclude that Sco-Cen was likely to have been born highly substructured, with multiple small-scale star formation events contributing to the overall OB association, and not as a single, monolithic burst of clustered star formation.
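Blaauw's linear expansion model that the abstract tests states that, for a freely expanding association, velocity grows linearly with position along each axis, with the slope equal to the inverse expansion age. The idea can be sketched on mock data (entirely our illustration, with arbitrary units, not the paper's Gaia analysis):

```python
import numpy as np

rng = np.random.default_rng(2)

# Blaauw's linear model: v = x / t_exp along one axis, i.e. velocity
# proportional to position, plus an internal velocity dispersion.
n, t_exp = 200, 10.0                     # stars, expansion age (arbitrary units)
x = rng.uniform(-20, 20, n)              # positions
v = x / t_exp + rng.normal(0, 0.5, n)    # velocities with dispersion

# A least-squares slope significantly above zero signals expansion,
# and its inverse estimates the expansion age.
slope = np.polyfit(x, v, 1)[0]
print(round(1 / slope, 1))  # ≈ 10, the input expansion age
```

The paper's negative result corresponds to fitted slopes consistent with zero along all axes, i.e. no coherent position-velocity gradient in the real subgroups.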
A Transferrable Belief Model Representation for Physical Security of Nuclear Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Gerts
This work analyzed various probabilistic methods such as classic statistics, Bayesian inference, possibilistic theory, and the Dempster-Shafer theory of belief functions for the potential insight offered into the physical security of nuclear materials, as well as broader application to automated decision making in nuclear non-proliferation. A review of the fundamental heuristics and basic limitations of each of these methods suggested that the Dempster-Shafer theory of belief functions may offer significant capability. Further examination of the various interpretations of Dempster-Shafer theory, such as random set, generalized Bayesian, and upper/lower probability, demonstrates some limitations. Compared to the other heuristics, the transferrable belief model (TBM), one of the leading interpretations of Dempster-Shafer theory, can improve the automated detection of violations of physical security using sensors and human judgment. The improvement is shown to give a significant heuristic advantage over other probabilistic options by demonstrating significant successes for several classic gedanken experiments.
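The conjunctive combination at the heart of the transferable belief model can be sketched in a few lines. This is an illustrative example only; the scenario, masses, and function names are invented here, not taken from the report. Unlike Dempster's normalized rule, the TBM keeps the mass assigned to the empty set as an explicit measure of conflict between sources:

```python
from itertools import product

def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive combination of two mass functions.

    Mass functions map frozenset hypotheses to belief mass. In the
    transferable belief model the mass landing on the empty set is
    kept as a measure of conflict rather than renormalized away.
    """
    out = {}
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        out[inter] = out.get(inter, 0.0) + x * y
    return out

# Hypothetical scenario: a sensor and a human judgment report on
# whether material is {secure} or {stolen}.
S, T = frozenset({"secure"}), frozenset({"stolen"})
m1 = {S: 0.8, S | T: 0.2}   # sensor strongly suggests secure
m2 = {T: 0.6, S | T: 0.4}   # human judgment suggests stolen
m = conjunctive_combine(m1, m2)
print(m[frozenset()])        # conflict mass: 0.8 * 0.6 = 0.48
```

The large empty-set mass flags disagreement between the two sources, which is exactly the signal an automated detection scheme would act on.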
NASA Astrophysics Data System (ADS)
Frič, Roman; Papčo, Martin
2017-12-01
Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one (a quantum phenomenon) and, dually, an observable can map a crisp random event to a genuine fuzzy random event (a fuzzy phenomenon). The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.
There is no fitness but fitness, and the lineage is its bearer
2016-01-01
Inclusive fitness has been the cornerstone of social evolution theory for more than a half-century and has matured as a mathematical theory in the past 20 years. Yet surprisingly for a theory so central to an entire field, some of its connections to evolutionary theory more broadly remain contentious or underappreciated. In this paper, we aim to emphasize the connection between inclusive fitness and modern evolutionary theory through the following fact: inclusive fitness is simply classical Darwinian fitness, averaged over social, environmental and demographic states that members of a gene lineage experience. Therefore, inclusive fitness is neither a generalization of classical fitness, nor does it belong exclusively to the individual. Rather, the lineage perspective emphasizes that evolutionary success is determined by the effect of selection on all biological and environmental contexts that a lineage may experience. We argue that this understanding of inclusive fitness based on gene lineages provides the most illuminating and accurate picture and avoids pitfalls in interpretation and empirical applications of inclusive fitness theory. PMID:26729925
First Test of Long-Range Collisional Drag via Plasma Wave Damping
NASA Astrophysics Data System (ADS)
Affolter, Matthew
2017-10-01
In magnetized plasmas, the rate of particle collisions is enhanced over classical predictions when the cyclotron radius rc is less than the Debye length λD. Classical theories describe local velocity scattering collisions with impact parameters ρ
NASA Astrophysics Data System (ADS)
Surana, K. S.; Joy, A. D.; Reddy, J. N.
2017-03-01
This paper presents a non-classical continuum theory in Lagrangian description for solids in which the conservation and balance laws are derived by incorporating both the internal rotations arising from the Jacobian of deformation and the rotations of Cosserat theories at a material point. In particular, in this non-classical continuum theory we have (i) the usual displacements u, (ii) three internal rotations Θ_i about the axes of a triad whose axes are parallel to the x-frame, arising from the Jacobian of deformation (and completely defined by its skew-symmetric part), and (iii) three additional rotations Θ_e about the axes of the same triad located at each material point, taken as three additional degrees of freedom and referred to as Cosserat rotations. This gives rise to u and Θ_e as six degrees of freedom at a material point. The internal rotations Θ_i, often neglected in classical continuum mechanics, exist in all deforming solid continua because they are due to the Jacobian of deformation. When the internal rotations Θ_i are resisted by the deforming matter, a conjugate moment tensor arises that together with Θ_i may result in energy storage and/or dissipation, which must be accounted for in the conservation and balance laws. The Cosserat rotations Θ_e likewise give rise to a conjugate moment tensor which, together with Θ_e, may also result in energy storage and/or dissipation. The main focus of the paper is a consistent derivation of the conservation and balance laws that incorporate this physics, and of the associated constitutive theories for thermoelastic solids. The mathematical model derived here has closure, and the constitutive theories derived using two alternate approaches agree with each other as well as with the condition resulting from the entropy inequality. Material coefficients introduced in the constitutive theories are clearly defined and discussed.
Action and entanglement in gravity and field theory.
Neiman, Yasha
2013-12-27
In nongravitational quantum field theory, the entanglement entropy across a surface depends on the short-distance regularization. Quantum gravity should not require such regularization, and it has been conjectured that the entanglement entropy there is always given by the black hole entropy formula evaluated on the entangling surface. We show that these statements have precise classical counterparts at the level of the action. Specifically, we point out that the action can have a nonadditive imaginary part. In gravity, the latter is fixed by the black hole entropy formula, while in nongravitating theories it is arbitrary. From these classical facts, the entanglement entropy conjecture follows by heuristically applying the relation between actions and wave functions.
Inelastic black hole scattering from charged scalar amplitudes
NASA Astrophysics Data System (ADS)
Luna, Andrés; Nicholson, Isobel; O'Connell, Donal; White, Chris D.
2018-03-01
We explain how the lowest-order classical gravitational radiation produced during the inelastic scattering of two Schwarzschild black holes in General Relativity can be obtained from a tree scattering amplitude in gauge theory coupled to scalar fields. The gauge calculation is related to gravity through the double copy. We remove unwanted scalar forces which can occur in the double copy by introducing a massless scalar in the gauge theory, which is treated as a ghost in the link to gravity. We hope these methods are a step towards a direct application of the double copy at higher orders in classical perturbation theory, with the potential to greatly streamline gravity calculations for phenomenological applications.
1987-09-01
response. An estimate of the buffeting response for the two cases is presented in Figure 4, using the theory of Irwin (Reference 7). Data acquisition was...values were obtained using the log decrement method by exciting the bridge in one mode and observing the decay of the response. Classical theory would...added mass or structural damping level. The addition of inertia to the deck would tend to lower the response according to classical vibration theory
An entropy method for induced drag minimization
NASA Technical Reports Server (NTRS)
Greene, George C.
1989-01-01
A fundamentally new approach to the aircraft minimum induced drag problem is presented. The method, a 'viscous lifting line', is based on the minimum entropy production principle and does not require the planar wake assumption. An approximate, closed form solution is obtained for several wing configurations including a comparison of wing extension, winglets, and in-plane wing sweep, with and without a constraint on wing-root bending moment. Like the classical lifting-line theory, this theory predicts that induced drag is proportional to the square of the lift coefficient and inversely proportional to the wing aspect ratio. Unlike the classical theory, it predicts that induced drag is Reynolds number dependent and that the optimum spanwise circulation distribution is non-elliptic.
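The classical relation the abstract contrasts against, induced drag scaling with lift squared over aspect ratio, can be written as a short sketch; the span-efficiency parametrization and the function name are illustrative conventions, not taken from the paper:

```python
import math

def induced_drag_coefficient(cl, aspect_ratio, e=1.0):
    """Classical lifting-line result: C_Di = C_L^2 / (pi * e * AR).

    e is the span-efficiency factor; e = 1 corresponds to the elliptic
    spanwise circulation distribution that classical theory predicts
    is optimal (the viscous theory above predicts a non-elliptic optimum).
    """
    return cl ** 2 / (math.pi * e * aspect_ratio)

# Doubling the aspect ratio halves the induced drag at fixed lift:
cd1 = induced_drag_coefficient(cl=0.5, aspect_ratio=8.0)
cd2 = induced_drag_coefficient(cl=0.5, aspect_ratio=16.0)
print(round(cd1 / cd2, 6))  # → 2.0
```

Note that, unlike the entropy-based theory, this classical expression carries no Reynolds number dependence at all.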
A novel approach to the theory of homogeneous and heterogeneous nucleation.
Ruckenstein, Eli; Berim, Gersh O; Narsimhan, Ganesan
2015-01-01
A new approach to the theory of nucleation, formulated relatively recently by Ruckenstein, Narsimhan, and Nowakowski (see Refs. [7-16]) and developed further by Ruckenstein and other colleagues, is presented. In contrast to the classical nucleation theory, which is based on calculating the free energy of formation of a cluster of the new phase as a function of its size on the basis of macroscopic thermodynamics, the proposed theory uses the kinetic theory of fluids to calculate the condensation (W(+)) and dissociation (W(-)) rates on and from the surface of the cluster, respectively. The dissociation rate of a monomer from a cluster is evaluated from the average time spent by a surface monomer in the potential well as obtained from the solution of the Fokker-Planck equation in the phase space of position and momentum for the liquid-to-solid transition and the phase space of energy for the vapor-to-liquid transition. The condensation rates are calculated using traditional expressions. The knowledge of those two rates allows one to calculate the size of the critical cluster from the equality W(+)=W(-) as well as the rate of nucleation. The developed microscopic approach allows one to avoid the controversial application of classical thermodynamics to the description of nuclei which contain only a few molecules. The new theory was applied to a number of cases, such as the liquid-to-solid and vapor-to-liquid phase transitions, binary nucleation, heterogeneous nucleation, nucleation on soluble particles and protein folding. The theory predicts higher nucleation rates at high saturation ratios (small critical clusters) than the classical nucleation theory for both liquid-to-solid and vapor-to-liquid transitions. As expected, at low saturation ratios for which the size of the critical cluster is large, the results of the new theory are consistent with those of the classical one.
The present approach was combined with the density functional theory to account for the density profile in the cluster. This approach was also applied to protein folding, viewed as the evolution of a cluster of native residues of spherical shape within a protein molecule, which could explain protein folding/unfolding and their dependence on temperature. Copyright © 2014 Elsevier B.V. All rights reserved.
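For comparison, the classical-thermodynamics benchmark that the kinetic theory departs from at high saturation is the Kelvin critical radius, r* = 2σv/(kT ln S): small at high saturation, large at low saturation. A minimal sketch with illustrative water-like constants (the numerical values and function name are assumptions of this example, not figures from the paper):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def critical_radius(sigma, v_mol, temperature, saturation):
    """Classical (Kelvin) critical radius: r* = 2*sigma*v / (kB*T*ln S).

    sigma: surface tension (N/m); v_mol: molecular volume (m^3);
    saturation: supersaturation ratio S > 1.
    """
    return 2.0 * sigma * v_mol / (KB * temperature * math.log(saturation))

# Illustrative water-like values (assumed for this sketch):
sigma, v_mol, T = 0.072, 3.0e-29, 300.0
r_low = critical_radius(sigma, v_mol, T, saturation=2.0)
r_high = critical_radius(sigma, v_mol, T, saturation=5.0)
# Higher saturation -> smaller critical cluster: the regime of few-molecule
# nuclei where the kinetic theory predicts higher rates than the classical one.
print(r_high < r_low)  # True
```

At low saturation the critical cluster is many molecules across and macroscopic thermodynamics is defensible, which is why the two theories agree there.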
Branes and the Kraft-Procesi transition: classical case
NASA Astrophysics Data System (ADS)
Cabrera, Santiago; Hanany, Amihay
2018-04-01
Moduli spaces of a large set of 3d N=4 effective gauge theories are known to be closures of nilpotent orbits. This set of theories has recently acquired a special status, due to Namikawa's theorem. As a consequence of this theorem, closures of nilpotent orbits are the simplest non-trivial moduli spaces that can be found in three dimensional theories with eight supercharges. In the early 1980s mathematicians Hanspeter Kraft and Claudio Procesi characterized an inclusion relation between nilpotent orbit closures of the same classical Lie algebra. We recently [1] showed a physical realization of their work in terms of the motion of D3-branes on the Type IIB superstring embedding of the effective gauge theories. That analysis is restricted to A-type Lie algebras. The present note expands our previous discussion to the remaining classical cases: orthogonal and symplectic algebras. In order to do so we introduce O3-planes in the superstring description. We also find a brane realization for the mathematical map between two partitions of the same integer number known as collapse. Another result is that basic Kraft-Procesi transitions turn out to be described by the moduli space of orthosymplectic quivers with varying boundary conditions.
ERIC Educational Resources Information Center
Jones, Elizabeth; Reynolds, Gretchen
2011-01-01
Responding to current debates on the place of play in schools, the authors have extensively revised their groundbreaking book. They explain how and why play is a critical part of children's development, as well as the central role adults have to promote it. This classic textbook and popular practitioner resource offers systematic descriptions and…
Nesterenko, Pavel N; Rybalko, Marina A; Paull, Brett
2005-06-01
Significant deviations from classical van Deemter behaviour, indicative of turbulent-flow liquid chromatography, have been recorded for mobile phases of varying viscosity on porous silica monolithic columns at elevated mobile phase flow rates.
On the emergence of classical gravity
NASA Astrophysics Data System (ADS)
Larjo, Klaus
In this thesis I will discuss how certain black holes arise as an effective, thermodynamical description from non-singular microstates in string theory. This provides a possible solution to the information paradox, and strengthens the case for treating black holes as thermodynamical objects. I will characterize the data defining a microstate of a black hole in several settings, and demonstrate that most of the data is unmeasurable for a classical observer. I will further show that the data that is measurable is universal for nearly all microstates, making it impossible for a classical observer to distinguish between microstates, thus giving rise to an effective statistical description for the black hole. In the first half of the thesis I will work with two specific systems: the half-BPS sector of N = 4 super Yang-Mills and the conformal field theory corresponding to the D1/D5 system; in both cases the high degree of symmetry present provides great control over potentially intractable computations. For these systems, I will further specify the conditions a quantum mechanical microstate must satisfy in order to have a classical description in terms of a unique metric, and define a 'metric operator' whose eigenstates correspond to classical geometries. In the second half of the thesis I will consider a much broader setting, general N = 1 superconformal quiver gauge theories and their dual gravity theories, and demonstrate that a similar effective description arises also in this setting.
From Foucault to Freire Through Facebook: Toward an Integrated Theory of mHealth.
Bull, Sheana; Ezeanochie, Nnamdi
2016-08-01
To document the integration of social science theory in literature on mHealth (mobile health) and consider opportunities for integration of classic theory, health communication theory, and social networking to generate a relevant theory for mHealth program design. A secondary review of research syntheses and meta-analyses published between 2005 and 2014 related to mHealth, using the AMSTAR (A Measurement Tool to Assess Systematic Reviews) methodology for assessment of the quality of each review. High-quality articles from those reviews using a randomized controlled design and integrating social science theory in program design, implementation, or evaluation were reviewed. Results: There were 1,749 articles among the 170 reviews with a high AMSTAR score (≥30). Only 13 were published from 2005 to 2014, used a randomized controlled design, and made explicit mention of theory in any aspect of their mHealth program. All 13 included theoretical perspectives focused on psychological and/or psychosocial theories and constructs. Conclusions: There is very limited use of social science theory in mHealth despite demonstrated benefits in doing so. We propose an integrated theory of mHealth that incorporates classic theory, health communication theory, and social networking to guide development and evaluation of mHealth programs. © 2015 Society for Public Health Education.
ERIC Educational Resources Information Center
Schriewer, Jurgen, Ed.
2012-01-01
New theories and theory-based methodological approaches have found their way into Comparative Education--just as into Comparative Social Science more generally--in increasing number in the recent past. The essays of this volume express and critically discuss quite a range of these positions such as, inter alia, the theory of self-organizing social…
Social Comparison: The End of a Theory and the Emergence of a Field
ERIC Educational Resources Information Center
Buunk, Abraham P.; Gibbons, Frederick X.
2007-01-01
The past and current states of research on social comparison are reviewed with regard to a series of major theoretical developments that have occurred in the past 5 decades. These are, in chronological order: (1) classic social comparison theory, (2) fear-affiliation theory, (3) downward comparison theory, (4) social comparison as social…
ERIC Educational Resources Information Center
Gaziano, Cecilie
This paper seeks to integrate some ideas from family systems theory and attachment theory within a theory of public opinion and social movement. Citing the classic "The Authoritarian Personality," the paper states that the first authorities children know, their parents or other caregivers, shape children's attitudes toward all…
Linear Quantum Systems: Non-Classical States and Robust Stability
2016-06-29
quantum linear systems subject to non-classical quantum fields. The major outcomes of this project are (i) derivation of quantum filtering equations for systems with non-classical input states, including single photon states, (ii) determination of how linear...history going back some 50 years, to the birth of modern control theory with Kalman's foundational work on filtering and LQG optimal control
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to Godel's famous theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representation more algorithmic, the field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely, logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first order logical theory.
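The distinction the paper builds on, between negation as failure ("not provable") and the classical negation of Gelfond and Lifschitz ("provably false"), can be illustrated with a toy three-valued knowledge base. The Python encoding below is my own sketch of that distinction, not the authors' formalism:

```python
def naf(kb, atom):
    """Negation as failure: 'not atom' succeeds when atom is not provable."""
    return kb.get(atom) is not True

def classical_neg(kb, atom):
    """Classical (strong) negation: '-atom' succeeds only if atom is proved false."""
    return kb.get(atom) is False

# Knowledge base: True = proved, False = proved false, absent = unknown.
kb = {"bridge_broken": False}  # we have explicit proof the bridge is not broken

print(naf(kb, "train_coming"))            # True: no proof a train is coming
print(classical_neg(kb, "train_coming"))  # False: absence is not proof of falsity
print(classical_neg(kb, "bridge_broken")) # True: explicitly proved false
```

The point is that the two negations diverge exactly on unknown atoms, which is why adding classical negation strictly increases what a logic program can express.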
A Revision of Learning and Teaching = Revision del aprender y del ensenar.
ERIC Educational Resources Information Center
Reggini, Horace C.
1983-01-01
This review of the findings of recent cognitive science research pertaining to learning and teaching focuses on how science and mathematics are being taught, analyzes how the presence of the computer demonstrates a need for radical rethinking of both the theory and the practice of learning, and points out that if educators fail to consider the…
A revised econometric model of the domestic pallet market
Albert T. Schuler; Walter B. Wallin
1983-01-01
The purpose of this revised model is to project estimates of consumption and price of wooden pallets in the short term. This model differs from previous ones developed by Schuler and Wallin (1979 and 1980) in the following respects: The structure of the supply side of the market is more realistically identified (from an economic theory point of view) by including...
Derivation of Einstein-Cartan theory from general relativity
NASA Astrophysics Data System (ADS)
Petti, Richard
2015-04-01
General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.
Towards a General Model of Temporal Discounting
ERIC Educational Resources Information Center
van den Bos, Wouter; McClure, Samuel M.
2013-01-01
Psychological models of temporal discounting have now successfully displaced classical economic theory due to the simple fact that many common behavior patterns, such as impulsivity, were unexplainable with classic models. However, the now dominant hyperbolic model of discounting is itself becoming increasingly strained. Numerous factors have…
The evolution of consciousness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, H.P.
1996-08-16
It is argued that the principles of classical physics are inimical to the development of an adequate science of consciousness. The problem is that insofar as the classical principles are valid consciousness can have no effect on the behavior, and hence on the survival prospects, of the organisms in which it inheres. Thus within the classical framework it is not possible to explain in natural terms the development of consciousness to the high-level form found in human beings. In quantum theory, on the other hand, consciousness can be dynamically efficacious: quantum theory does allow consciousness to influence behavior, and thence to evolve in accordance with the principles of natural selection. However, this evolutionary requirement places important constraints upon the details of the formulation of the quantum dynamical principles.
Classical and quantum production of cornucopions at energies below 1018 GeV
NASA Astrophysics Data System (ADS)
Banks, T.; O'loughlin, M.
1993-01-01
We argue that the paradoxes associated with infinitely degenerate states, which plague relic particle scenarios for the end point of black hole evaporation, may be absent when the relics are horned particles. Most of our arguments are based on simple observations about the classical geometry of extremal dilaton black holes, but at a crucial point we are forced to speculate about classical solutions to string theory in which the infinite coupling singularity of the extremal dilaton solution is shielded by a condensate of massless modes propagating in its infinite horn. We use the nonsingular c=1 solution of (1+1)-dimensional string theory as a crude model for the properties of the condensate. We also present a brief discussion of more general relic scenarios based on large relics of low mass.
Classically and quantum stable emergent universe from conservation laws
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campo, Sergio del; Herrera, Ramón; Guendelman, Eduardo I.
It has been recently pointed out by Mithani-Vilenkin [1-4] that certain emergent universe scenarios which are classically stable are nevertheless semiclassically unstable to collapse. Here, we show that there is a class of emergent universes derived from scale invariant two measures theories with spontaneous symmetry breaking (s.s.b) of the scale invariance, which have classical stability and do not suffer the instability pointed out by Mithani-Vilenkin towards collapse. We find that this stability is due to the presence of a symmetry in the 'emergent phase', which, together with the non-linearities of the theory, does not allow the FLRW scale factor to become smaller than a certain minimum value a_0 in a certain protected region.
Semi-classical Reissner-Nordstrom model for the structure of charged leptons
NASA Technical Reports Server (NTRS)
Rosen, G.
1980-01-01
The lepton self-mass problem is examined within the framework of the quantum theory of electromagnetism and gravity. Consideration is given to the Reissner-Nordstrom solution to the Einstein-Maxwell classical field equations for an electrically charged mass point, and the WKB theory for a semiclassical system with total energy zero is used to obtain an expression for the Einstein-Maxwell action factor. The condition obtained is found to account for the observed mass values of the three charged leptons, and to be in agreement with the correspondence principle.
Redundancy of constraints in the classical and quantum theories of gravitation.
NASA Technical Reports Server (NTRS)
Moncrief, V.
1972-01-01
It is shown that in Dirac's version of the quantum theory of gravitation, the Hamiltonian constraints are greatly redundant. If the Hamiltonian constraint condition is satisfied at one point on the underlying, closed three-dimensional manifold, then it is automatically satisfied at every point, provided only that the momentum constraints are everywhere satisfied. This permits one to replace the usual infinity of Hamiltonian constraints by a single condition which may be taken in the form of an integral over the manifold. Analogous theorems are given for the classical Einstein Hamilton-Jacobi equations.
Quantum gambling based on Nash-equilibrium
NASA Astrophysics Data System (ADS)
Zhang, Pei; Zhou, Xiao-Qi; Wang, Yun-Long; Liu, Bi-Heng; Shadbolt, Pete; Zhang, Yong-Sheng; Gao, Hong; Li, Fu-Li; O'Brien, Jeremy L.
2017-06-01
The problem of establishing a fair bet between a spatially separated gambler and casino can only be solved in the classical regime by relying on a trusted third party. By combining Nash-equilibrium theory with quantum game theory, we show that a secure, remote, two-party game can be played using a quantum gambling machine which has no classical counterpart. Specifically, by modifying the Nash-equilibrium point we can construct games with an arbitrary amount of bias, including a game that is demonstrably fair to both parties. We also report a proof-of-principle experimental demonstration using linear optics.
The energy-momentum tensor(s) in classical gauge theories
Blaschke, Daniel N.; Gieres, François; Reboud, Méril; ...
2016-07-12
We give an introduction to, and review of, the energy-momentum tensors in classical gauge field theories in Minkowski space, and to some extent also in curved space-time. For the canonical energy-momentum tensor of non-Abelian gauge fields and of matter fields coupled to such fields, we present a new and simple improvement procedure based on gauge invariance for constructing a gauge invariant, symmetric energy-momentum tensor. In conclusion, the relationship with the Einstein-Hilbert tensor following from the coupling to a gravitational field is also discussed.
Universal tight binding model for chemical reactions in solution and at surfaces. II. Water.
Lozovoi, A Y; Sheppard, T J; Pashov, D L; Kohanoff, J J; Paxton, A T
2014-07-28
A revised water model intended for use in condensed phase simulations in the framework of the self consistent polarizable ion tight binding theory is constructed. The model is applied to water monomer, dimer, hexamers, ice, and liquid, where it demonstrates good agreement with theoretical results obtained by more accurate methods, such as DFT and CCSD(T), and with experiment. In particular, the temperature dependence of the self diffusion coefficient in liquid water predicted by the model, closely reproduces experimental curves in the temperature interval between 230 K and 350 K. In addition, and in contrast to standard DFT, the model properly orders the relative densities of liquid water and ice. A notable, but inevitable, shortcoming of the model is underestimation of the static dielectric constant by a factor of two. We demonstrate that the description of inter and intramolecular forces embodied in the tight binding approximation in quantum mechanics leads to a number of valuable insights which can be missing from ab initio quantum chemistry and classical force fields. These include a discussion of the origin of the enhanced molecular electric dipole moment in the condensed phases, and a detailed explanation for the increase of coordination number in liquid water as a function of temperature and compared with ice--leading to insights into the anomalous expansion on freezing. The theory holds out the prospect of an understanding of the currently unexplained density maximum of water near the freezing point.
Development of the multiple sclerosis (MS) early mobility impairment questionnaire (EMIQ).
Ziemssen, Tjalf; Phillips, Glenn; Shah, Ruchit; Mathias, Adam; Foley, Catherine; Coon, Cheryl; Sen, Rohini; Lee, Andrew; Agarwal, Sonalee
2016-10-01
The Early Mobility Impairment Questionnaire (EMIQ) was developed to facilitate early identification of mobility impairments in multiple sclerosis (MS) patients. We describe the initial development of the EMIQ with a focus on the psychometric evaluation of the questionnaire using classical and item response theory methods. The initial 20-item EMIQ was constructed by clinical specialists and qualitatively tested among people with MS and physicians via cognitive interviews. Data from an observational study was used to make additional updates to the instrument based on exploratory factor analysis (EFA) and item response theory (IRT) analysis, and psychometric analyses were performed to evaluate the reliability and validity of the final instrument's scores and screening properties (i.e., sensitivity and specificity). Based on qualitative interview analyses, a revised 15-item EMIQ was included in the observational study. EFA, IRT and item-to-item correlation analyses revealed redundant items which were removed leading to the final nine-item EMIQ. The nine-item EMIQ performed well with respect to: test-retest reliability (ICC = 0.858); internal consistency (α = 0.893); convergent validity; and known-groups methods for construct validity. A cut-point of 41 on the 0-to-100 scale resulted in sufficient sensitivity and specificity statistics for viably identifying patients with mobility impairment. The EMIQ is a content valid and psychometrically sound instrument for capturing MS patients' experience with mobility impairments in a clinical practice setting. Additional research is suggested to further confirm the EMIQ's screening properties over time.
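One of the CTT statistics reported for the EMIQ, Cronbach's alpha for internal consistency, is straightforward to compute from item-level scores. A minimal sketch with made-up data follows (the study's α = 0.893 came from its own nine items, not from this example):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    items: list of k item-score lists, each of length n respondents.
    """
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Two perfectly correlated items yield alpha = 1 (illustrative data):
items = [[1, 2, 3, 4], [1, 2, 3, 4]]
print(round(cronbach_alpha(items), 3))  # → 1.0
```

Values near the EMIQ's reported 0.893 indicate that the nine items largely measure a single underlying construct, consistent with the redundancy pruning the authors describe.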
Kettle, Jonathan W L; O'Brien-Simpson, Laurie; Allen, Nicholas B
2008-02-01
First-order theory of mind, as measured by the 'Reading the Mind in the Eyes Test' Revised, is impaired in schizophrenia. However, no study has investigated whether this occurs in first-episode schizophrenia. Also, it is unclear whether such a deficit is specific to schizophrenia, and whether convenience control samples, particularly undergraduate university students, represent valid comparison groups. This study investigated theory of mind ability, measured by the 'Reading the Mind in the Eyes Test' Revised, in a group of first-episode schizophrenia outpatients (n=13) and three control groups: outpatients with non-psychotic major depression (n=14), individuals from the general community (n=16) and from an undergraduate university course (n=27). The schizophrenia group exhibited significant theory of mind impairments compared to both non-psychiatric control groups but not the depression group. Unexpectedly, the depression group was not significantly impaired compared to the community control group, and the university control group exhibited superior theory of mind ability relative to all three groups. The findings indicate theory of mind deficits in first-episode schizophrenia and support the implementation of theory of mind interventions in first-episode schizophrenia treatment programs. Results also indicate that community rather than university control groups represent more valid comparison groups in first-episode schizophrenia research.
Stochastic game theory: for playing games, not just for doing theory.
Goeree, J K; Holt, C A
1999-09-14
Recent theoretical advances have dramatically increased the relevance of game theory for predicting human behavior in interactive situations. By relaxing the classical assumptions of perfect rationality and perfect foresight, we obtain much improved explanations of initial decisions, dynamic patterns of learning and adjustment, and equilibrium steady-state distributions.
On coupling NEC-violating matter to gravity
Chatterjee, Saugata; Parikh, Maulik; van der Schaar, Jan Pieter
2015-03-16
We show that effective theories of matter that classically violate the null energy condition cannot be minimally coupled to Einstein gravity without being inconsistent with both string theory and black hole thermodynamics. We argue however that they could still be either non-minimally coupled or coupled to higher-curvature theories of gravity.
Examining U.S. Irregular Warfare Doctrine
2008-06-01
The classic warfare theories (i.e., those of Sun Tzu, Clausewitz, and so forth) directly apply to this research: their theories on the conduct of warfare, and its close tie to politics, form the basis for examining the content of the manuals.
Modern Psychometric Methodology: Applications of Item Response Theory
ERIC Educational Resources Information Center
Reid, Christine A.; Kolakowsky-Hayner, Stephanie A.; Lewis, Allen N.; Armstrong, Amy J.
2007-01-01
Item response theory (IRT) methodology is introduced as a tool for improving assessment instruments used with people who have disabilities. Need for this approach in rehabilitation is emphasized; differences between IRT and classical test theory are clarified. Concepts essential to understanding IRT are defined, necessary data assumptions are…
Testing the Moral Algebra of Two Kohlbergian Informers
ERIC Educational Resources Information Center
Hommers, Wilfried; Lewand, Martin; Ehrmann, Dominic
2012-01-01
This paper seeks to unify two major theories of moral judgment: Kohlberg's stage theory and Anderson's moral information integration theory. Subjects were told about thoughts of actors in Kohlberg's classic altruistic Heinz dilemma and in a new egoistical dilemma. These actors' thoughts represented Kohlberg's stages I (Personal Risk) and IV…
NASA Astrophysics Data System (ADS)
Kovchegov, Yuri V.; Wu, Bin
2018-03-01
To understand the dynamics of thermalization in heavy ion collisions in the perturbative framework it is essential to first find corrections to the free-streaming classical gluon fields of the McLerran-Venugopalan model. The corrections that lead to deviations from free streaming (and that dominate at late proper time) would provide evidence for the onset of isotropization (and, possibly, thermalization) of the produced medium. To find such corrections we calculate the late-time two-point Green function and the energy-momentum tensor due to a single 2 → 2 scattering process involving two classical fields. To make the calculation tractable we employ the scalar φ^4 theory instead of QCD. We compare our exact diagrammatic results for these quantities to those in kinetic theory and find disagreement between the two. The disagreement is in the dependence on the proper time τ and, for the case of the two-point function, is also in the dependence on the space-time rapidity η: the exact diagrammatic calculation is, in fact, consistent with the free streaming scenario. Kinetic theory predicts a build-up of longitudinal pressure, which, however, is not observed in the exact calculation. We conclude that we find no evidence for the beginning of the transition from the free-streaming classical fields to the kinetic theory description of the produced matter after a single 2 → 2 rescattering.
Affine Isoperimetry and Information Theoretic Inequalities
ERIC Educational Resources Information Center
Lv, Songjun
2012-01-01
There are essential connections between the isoperimetric theory and information theoretic inequalities. In general, the Brunn-Minkowski inequality and the entropy power inequality, as well as the classical isoperimetric inequality and the classical entropy-moment inequality, turn out to be equivalent in some certain sense, respectively. Based on…
Teaching Classic Probability Problems With Modern Digital Tools
ERIC Educational Resources Information Center
Abramovich, Sergei; Nikitin, Yakov Yu.
2017-01-01
This article is written to share teaching ideas about using commonly available computer applications--a spreadsheet, "The Geometer's Sketchpad", and "Wolfram Alpha"--to explore three classic and historically significant problems from the probability theory. These ideas stem from the authors' work with prospective economists,…
Higher spin gauge theory on fuzzy S^4_N
NASA Astrophysics Data System (ADS)
Sperling, Marcus; Steinacker, Harold C.
2018-02-01
We examine in detail the higher spin fields which arise on the basic fuzzy sphere S^4_N in the semi-classical limit. The space of functions can be identified with functions on classical S^4 taking values in a higher spin algebra associated to…
Contemporary Inventional Theory: An Aristotelian Model.
ERIC Educational Resources Information Center
Skopec, Eric W.
Contemporary rhetoricians are concerned with the re-examination of classical doctrines in the hope of finding solutions to current problems. In this study, the author presents a methodological perspective consistent with current interests, by re-examining the assumptions that underlie each classical precept. He outlines an inventional system based…
Determination of angle of light deflection in higher-derivative gravity theories
NASA Astrophysics Data System (ADS)
Xu, Chenmei; Yang, Yisong
2018-03-01
Gravitational light deflection is known as one of three classical tests of general relativity and the angle of deflection may be computed explicitly using approximate or exact solutions describing the gravitational force generated from a point mass. In various generalized gravity theories, however, such explicit determination is often impossible due to the difficulty in obtaining an exact expression for the deflection angle. In this work, we present some highly effective globally convergent iterative methods to determine the angle of semiclassical gravitational deflection in higher- and infinite-derivative formalisms of quantum gravity theories. We also establish the universal properties that the deflection angle always stays below the classical Einstein angle and is a strictly decreasing function of the incident photon energy, in these formalisms.
NASA Astrophysics Data System (ADS)
Bonilla, L. L.; Carretero, M.; Segura, A.
2017-12-01
When quantized, traces of classically chaotic single-particle systems include eigenvalue statistics and scars in eigenfunctions. Since 2001, many theoretical and experimental works have argued that classically chaotic single-electron dynamics influences and controls collective electron transport. For transport in semiconductor superlattices under tilted magnetic and electric fields, these theories rely on a reduction to a one-dimensional self-consistent drift model. A two-dimensional theory based on self-consistent Boltzmann transport does not support that single-electron chaos influences collective transport. This theory agrees with existing experimental evidence of current self-oscillations, predicts spontaneous collective chaos via a period doubling scenario, and could be tested unambiguously by measuring the electric potential inside the superlattice under a tilted magnetic field.
Much Polyphony but Little Harmony: Otto Sackur's Groping for a Quantum Theory of Gases
NASA Astrophysics Data System (ADS)
Badino, Massimiliano; Friedrich, Bretislav
2013-09-01
The endeavor of Otto Sackur (1880-1914) was driven, on the one hand, by his interest in Nernst's heat theorem, statistical mechanics, and the problem of chemical equilibrium and, on the other hand, by his goal to shed light on classical mechanics from the quantum vantage point. Inspired by the interplay between classical physics and quantum theory, Sackur chanced to expound his personal take on the role of the quantum in the changing landscape of physics in the turbulent 1910s. We tell the story of this enthusiastic practitioner of the old quantum theory and early contributor to quantum statistical mechanics, whose scientific ontogenesis provides a telling clue about the phylogeny of his contemporaries.
Quantum stochastic walks on networks for decision-making.
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-03-31
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how the behavioral choice-probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the time-length of the decision-making process turns out also to be a measure of the process' degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.
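The classical baseline the abstract builds on can be sketched concretely: a random walk on a small network whose transitions follow Luce's choice rule converges to a stationary distribution matching the Luce probabilities. The utility weights below are invented for illustration; this is the purely classical ingredient, not the paper's quantum stochastic walk:

```python
# Toy sketch: classical random walk on a 3-alternative "Luce network".
# Transition probabilities are proportional to invented utility weights w,
# so the stationary distribution matches Luce's rule p_i = w_i / sum(w).
w = [3.0, 2.0, 1.0]
n = len(w)

# Row-stochastic transition matrix: from any state, jump to alternative i
# with probability proportional to w_i.
P = [[w[i] / sum(w) for i in range(n)] for _ in range(n)]

def stationary(P, iters=200):
    """Power-iterate a uniform start distribution until it settles."""
    pi = [1.0 / len(P)] * len(P)
    for _ in range(iters):
        pi = [sum(pi[j] * P[j][i] for j in range(len(P)))
              for i in range(len(P))]
    return pi

pi = stationary(P)  # converges to [1/2, 1/3, 1/6] for these weights
```

The quantum stochastic walk of the paper generalizes exactly this construction by adding coherent (unitary) dynamics on top of the classical hopping.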
Consciousness and values in the quantum universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stapp, H.P.
1985-01-01
Application of quantum mechanical description to neurophysiological processes appears to provide for a natural unification of the physical and humanistic sciences. The categories of thought used to represent physical and psychical processes become united, and the mechanical conception of man created by classical physics is replaced by a profoundly different quantum conception. This revised image of man allows human values to be rooted in contemporary science.
ERIC Educational Resources Information Center
Kurtz, Kenneth J.; Levering, Kimery R.; Stanton, Roger D.; Romero, Joshua; Morris, Steven N.
2013-01-01
The findings of Shepard, Hovland, and Jenkins (1961) on the relative ease of learning 6 elemental types of 2-way classifications have been deeply influential 2 times over: 1st, as a rebuke to pure stimulus generalization accounts, and again as the leading benchmark for evaluating formal models of human category learning. The litmus test for models…
Kramer, Sam L; Rodriguez, Benjamin F
2018-07-01
Evidence suggests that the behavior inhibition system (BIS) and fight-flight-freeze system play a role in the individual differences seen in social anxiety disorder; however, findings concerning the role of the behavior approach system (BAS) have been mixed. To date, the role of revised reinforcement sensitivity theory (RST) subsystems underlying social anxiety has been measured with scales designed for the original RST. This study examined how the BIS, BAS, and fight, flight, freeze components of the fight-flight-freeze system uniquely relate to social interaction anxiety and social observation anxiety using both a measure specifically designed for the revised RST and a commonly used original RST measure. Comparison of regression analyses with the Jackson-5 and the commonly used BIS/BAS Scales revealed important differences in the relationships between RST subsystems and social anxiety depending on how RST was assessed. Limitations and future directions for revised RST measurement are discussed.
Extending the capability of GYRE to calculate tidally forced stellar oscillations
NASA Astrophysics Data System (ADS)
Guo, Zhao; Gies, Douglas R.
2016-01-01
Tidally forced oscillations have been observed in many eccentric binary systems, such as KOI-54 and many other 'heartbeat stars'. The tidal response of the star can be calculated by solving revised stellar oscillation equations. The open-source stellar oscillation code GYRE (Townsend & Teitler 2013) can be used to solve the free stellar oscillation equations in both adiabatic and non-adiabatic cases. It uses a novel matrix exponential method which avoids many difficulties of the classical shooting and relaxation methods. The new version also includes the effect of rotation in the traditional approximation. After showing the code flow of GYRE, we revise its subroutines and extend its capability to calculate tidally forced oscillations in both adiabatic and non-adiabatic cases, following the procedure in the CAFein code (Valsecchi et al. 2013). In the end, we compare the tidal eigenfunctions with those calculated from CAFein. More details of the revision and a simple version of the code in MATLAB can be obtained upon request.
Cepheid Period-Luminosity Relation and Kinematics Based on the Revised Hipparcos Catalogue
NASA Astrophysics Data System (ADS)
Zhang, H.; Shen, M.; Zhu, Z.
2011-12-01
The revised Hipparcos catalogue was released by van Leeuwen in 2007. The revised parallaxes of the classical Cepheids yield the zero-point of the period-luminosity relation ρ = -1.37 ± 0.07 in the optical BV bands, which is 0.06 mag fainter than that given by Feast & Catchpole from the old Hipparcos data. Moreover, we discuss the kinematic parameters of the Galaxy based on an axisymmetric model. The Oort constants are A = 17.42 ± 1.17 km s^-1 kpc^-1, B = -12.46 ± 0.86 km s^-1 kpc^-1, and the peculiar motion of the Sun is (12.58 ± 1.09, 14.52 ± 1.06, 8.98 ± 0.98) km s^-1. Using a dynamical model for an assumed elliptical disk, a weak elliptical potential of the disk is found, with eccentricity ε(R_0) = 0.067 ± 0.036 and the direction of the minor axis φ_b = 31.7° ± 14.5°.
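The zero-point calibration described above amounts to converting parallaxes to absolute magnitudes and averaging the offset from a fixed-slope period-luminosity line. A minimal sketch with invented Cepheid data (periods, apparent magnitudes, and parallaxes below are fabricated, chosen only to be roughly consistent with the quoted ρ ≈ -1.37; the slope -2.81 is an assumed, commonly adopted PL slope, not taken from this record):

```python
import math

# Hypothetical sample: (period in days, apparent magnitude,
# parallax in milliarcseconds). All values invented for illustration.
cepheids = [(5.37, 3.95, 3.36), (10.15, 5.21, 1.31), (35.55, 5.09, 0.69)]
SLOPE = -2.81  # assumed PL slope, M = SLOPE * log10(P) + rho

def absolute_mag(m_app, parallax_mas):
    """Distance modulus from parallax: M = m + 5*log10(p/1000 arcsec) + 5."""
    return m_app + 5 * math.log10(parallax_mas / 1000.0) + 5

# Zero-point = mean offset of M from the fixed-slope PL line.
rho = sum(absolute_mag(m, p) - SLOPE * math.log10(P)
          for P, m, p in cepheids) / len(cepheids)
```

Real calibrations weight each star by its parallax uncertainty and correct for Lutz-Kelker-type biases, which this sketch omits.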
Aviation accidents and the theory of the situation
NASA Technical Reports Server (NTRS)
Bolman, L.
1980-01-01
Social-psychological factors affecting the performance of flight crews are examined. In particular, a crew member's perceptual-psychological constructs of the flight situation (theories of the situation) are discussed. The skills and willingness of a flight crew to be alert to possible errors in the theory become critical to their effectiveness and their ability to ensure a safe flight. Several major factors that determine the likelihood that a faulty theory will be detected and revised are identified.
Quantum Theories of Self-Localization
NASA Astrophysics Data System (ADS)
Bernstein, Lisa Joan
In the classical dynamics of coupled oscillator systems, nonlinearity leads to the existence of stable solutions in which energy remains localized for all time. Here the quantum-mechanical counterpart of classical self-localization is investigated in the context of two model systems. For these quantum models, the terms corresponding to classical nonlinearities modify a subset of the stationary quantum states to be particularly suited to the creation of nonstationary wavepackets that localize energy for long times. The first model considered here is the Quantized Discrete Self-Trapping model (QDST), a system of anharmonic oscillators with linear dispersive coupling used to model local modes of vibration in polyatomic molecules. A simple formula is derived for a particular symmetry class of QDST systems which gives an analytic connection between quantum self-localization and classical local modes. This formula is also shown to be useful in the interpretation of the vibrational spectra of some molecules. The second model studied is the Frohlich/Einstein Dimer (FED), a two-site system of anharmonically coupled oscillators based on the Frohlich Hamiltonian and motivated by the theory of Davydov solitons in biological protein. The Born-Oppenheimer perturbation method is used to obtain approximate stationary state wavefunctions with error estimates for the FED at the first excited level. A second approach is used to reduce the first excited level FED eigenvalue problem to a system of ordinary differential equations. A simple theory of low-energy self-localization in the FED is discussed. The quantum theories of self-localization in the intrinsic QDST model and the extrinsic FED model are compared.
Marsalek, Ondrej; Markland, Thomas E
2016-02-07
Path integral molecular dynamics simulations, combined with an ab initio evaluation of interactions using electronic structure theory, incorporate the quantum mechanical nature of both the electrons and nuclei, which are essential to accurately describe systems containing light nuclei. However, path integral simulations have traditionally required a computational cost around two orders of magnitude greater than treating the nuclei classically, making them prohibitively costly for most applications. Here we show that the cost of path integral simulations can be dramatically reduced by extending our ring polymer contraction approach to ab initio molecular dynamics simulations. By using density functional tight binding as a reference system, we show that our ring polymer contraction scheme gives rapid and systematic convergence to the full path integral density functional theory result. We demonstrate the efficiency of this approach in ab initio simulations of liquid water and the reactive protonated and deprotonated water dimer systems. We find that the vast majority of the nuclear quantum effects are accurately captured using contraction to just the ring polymer centroid, which requires the same number of density functional theory calculations as a classical simulation. Combined with a multiple time step scheme using the same reference system, which allows the time step to be increased, this approach is as fast as a typical classical ab initio molecular dynamics simulation and 35× faster than a full path integral calculation, while still exactly including the quantum sampling of nuclei. This development thus offers a route to routinely include nuclear quantum effects in ab initio molecular dynamics simulations at negligible computational cost.
Rett Syndrome: Revised Diagnostic Criteria and Nomenclature
Neul, Jeffrey L.; Kaufmann, Walter E.; Glaze, Daniel G.; Christodoulou, John; Clarke, Angus J.; Bahi-Buisson, Nadia; Leonard, Helen; Bailey, Mark E. S.; Schanen, N. Carolyn; Zappella, Michele; Renieri, Alessandra; Huppke, Peter; Percy, Alan K.
2010-01-01
Objective Rett syndrome (RTT) is a severe neurodevelopmental disease that affects approximately 1 in 10,000 live female births and is often caused by mutations in Methyl-CpG-binding protein 2 (MECP2). Despite distinct clinical features, the accumulation of clinical and molecular information in recent years has generated considerable confusion regarding the diagnosis of RTT. The purpose of this work was to revise and clarify the 2002 consensus criteria for the diagnosis of RTT in anticipation of treatment trials. Method RettSearch members, representing the majority of the international clinical RTT specialists, participated in an iterative process to reach consensus on revised and simplified clinical diagnostic criteria for RTT. Results The clinical criteria required for the diagnosis of classic and atypical RTT were clarified and simplified. Guidelines for the diagnosis and molecular evaluation of specific variant forms of RTT were developed. Interpretation These revised criteria provide clarity regarding the key features required for the diagnosis of RTT and reinforce the concept that RTT is a clinical diagnosis based on distinct clinical criteria, independent of molecular findings. We recommend that these criteria and guidelines be utilized in any proposed clinical research. PMID:21154482
DISTINCTIVE FEATURE THEORY AND NASAL ASSIMILATION IN SPANISH.
ERIC Educational Resources Information Center
HARRIS, JAMES W.
CERTAIN FEATURES IN THE MEXICAN PRONUNCIATION OF NASAL CONSONANTS ARE PRESENTED HERE AND LINGUISTIC GENERALIZATIONS ARE FORMULATED--FIRST IN TERMS OF A CURRENT THEORY OF UNIVERSAL PHONOLOGICAL DISTINCTIVE FEATURES, AND SECOND IN TERMS OF A REVISED DISTINCTIVE FEATURE FRAMEWORK INCORPORATING THE CHANGES PROPOSED BY CHOMSKY AND HALLE IN "THE SOUND…
ERIC Educational Resources Information Center
Walker, Jeffrey
1990-01-01
Revisits the hemisphericity theory of the 1970s and the revised and less familiar accounts that emerged in the 1980s. Argues that neither the older nor the newer psychobiological accounts of mind support the Neoclassical/Romantic claims. Contends that these accounts are more congenial to an Aristotelian theory of mind and rhetoric. (RS)
Assessing the Dependability of Drinking Motives via Generalizability Theory
ERIC Educational Resources Information Center
Arterberry, Brooke J.; Martens, Matthew P.; Cadigan, Jennifer M.; Smith, Ashley E.
2012-01-01
This study assessed the score reliability of the Drinking Motives Questionnaire-Revised (DMQ-R) via generalizability theory. Participants (n = 367 college students) completed the DMQ-R at three time points. Across subscale scores, persons, persons x occasions, and persons x items interactions accounted for meaningful variance. Findings illustrate…
Revisiting competition in a classic model system using formal links between theory and data.
Hart, Simon P; Burgin, Jacqueline R; Marshall, Dustin J
2012-09-01
Formal links between theory and data are a critical goal for ecology. However, while our current understanding of competition provides the foundation for solving many derived ecological problems, this understanding is fractured because competition theory and data are rarely unified. Conclusions from seminal studies in space-limited benthic marine systems, in particular, have been very influential for our general understanding of competition, but rely on traditional empirical methods with limited inferential power and compatibility with theory. Here we explicitly link mathematical theory with experimental field data to provide a more sophisticated understanding of competition in this classic model system. In contrast to predictions from conceptual models, our estimates of competition coefficients show that a dominant space competitor can be equally affected by interspecific competition with a poor competitor (traditionally defined) as it is by intraspecific competition. More generally, the often-invoked competitive hierarchies and intransitivities in this system might be usefully revisited using more sophisticated empirical and analytical approaches.
A quantum theory account of order effects and conjunction fallacies in political judgments.
Yearsley, James M; Trueblood, Jennifer S
2017-09-06
Are our everyday judgments about the world around us normative? Decades of research in the judgment and decision-making literature suggest the answer is no. If people's judgments do not follow normative rules, then what rules if any do they follow? Quantum probability theory is a promising new approach to modeling human behavior that is at odds with normative, classical rules. One key advantage of using quantum theory is that it explains multiple types of judgment errors using the same basic machinery, unifying what have previously been thought of as disparate phenomena. In this article, we test predictions from quantum theory related to the co-occurrence of two classic judgment phenomena, order effects and conjunction fallacies, using judgments about real-world events (related to the U.S. presidential primaries). We also show that our data obeys two a priori and parameter free constraints derived from quantum theory. Further, we examine two factors that moderate the effects, cognitive thinking style (as measured by the Cognitive Reflection Test) and political ideology.
Twenty-Five Centuries of Quantum Physics: From Pythagoras to Us, and from Subjectivism to Realism
NASA Astrophysics Data System (ADS)
Bunge, Mario
Three main theses are proposed. The first is that the idea of a quantum or minimal unit is not peculiar to quantum theory, since it already occurs in the classical theories of elasticity and electrolysis. Second, the peculiarities of the objects described by quantum theory are the following: their basic laws are probabilistic; some of their properties, such as position and energy, are blunt rather than sharp; two particles that were once together continue to be associated even after becoming spatially separated; and the vacuum has physical properties, so that it is a kind of matter. Third, the orthodox or Copenhagen interpretation of the theory is false, and may conveniently be replaced with a realist (though not classicist) interpretation. Heisenberg's inequality, Schrödinger's cat and Zeno's quantum paradox are discussed in the light of the two rival interpretations. It is also shown that the experiments that falsified Bell's inequality do not refute realism but the classicism inherent in hidden variables theories.
Quantum optical effective-medium theory and transformation quantum optics for metamaterials
NASA Astrophysics Data System (ADS)
Wubs, Martijn; Amooghorban, Ehsan; Zhang, Jingjing; Mortensen, N. Asger
2016-09-01
While typically designed to manipulate classical light, metamaterials have many potential applications for quantum optics as well. We argue why a quantum optical effective-medium theory is needed. We present such a theory for layered metamaterials that is valid for light propagation in all spatial directions, thereby generalizing earlier work for one-dimensional propagation. In contrast to classical effective-medium theory there is an additional effective parameter that describes quantum noise. Our results for metamaterials are based on a rather general Lagrangian theory for the quantum electrodynamics of media with both loss and gain. In the second part of this paper, we present a new application of transformation optics whereby local spontaneous-emission rates of quantum emitters can be designed. This follows from an analysis of how electromagnetic Green functions transform under coordinate transformations. Spontaneous-emission rates can be either enhanced or suppressed using invisibility cloaks or gradient index lenses. Furthermore, the anisotropic material profile of the cloak enables the directional control of spontaneous emission.
A Theoretical and Empirical Integration of the Rational-Emotive and Classical Conditioning Theories
ERIC Educational Resources Information Center
Russell, Phillip L.; Brandsma, Jeffrey M.
1974-01-01
Galvanic skin conductance response, respiration rate, and respiration depth values of an experimental and a control group were used to test the hypotheses of Albert Ellis' ABC theory of psychopathology. (EK)
[Taxonomic theory for non-classical systematics].
Pavlinov, I Ia
2012-01-01
The basic principles for constructing a general taxonomic theory for biological systematics are briefly outlined in the context of the non-classical scientific paradigm. The need for such a theory is substantiated, and several key points of its elaboration are presented: its interpretation as a framework concept for the partial taxonomic theories of the various schools of systematics; elaboration of the idea of a cognitive situation comprising three interrelated components (subject, object, and epistemic); its construal as a content-wise interpreted quasi-axiomatics, with strong structuring of its conceptual space, including a demarcation between axioms and inference rules; its construal as a "conceptual pyramid" of concepts at various levels of generality; and the inclusion of a basic model in the definition of the taxonomic system (classification) that regulates its content. Two problems are identified as fundamental: the definition of taxonomic diversity as the subject domain of systematics as a whole, and the definition of the onto-epistemological status of the taxonomic system (classification) in general and of taxa in particular.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anisimov, Petr Mikhaylovich
Classical 1D X-ray Free Electron Laser (X-ray FEL) theory has stood the test of time by guiding FEL design and development prior to any full-scale analysis. Future X-ray FELs and inverse-Compton sources, where photon recoil approaches an electron energy spread value, push the classical theory to its limits of applicability. After substantial efforts by the community to find what those limits are, there is no universally agreed upon quantum approach to design and development of future X-ray sources. We offer a new approach to formulate the quantum theory for 1D X-ray FELs that has an obvious connection to the classical theory, which allows for immediate transfer of knowledge between the two regimes. In conclusion, we exploit this connection in order to draw quantum mechanical conclusions about the quantum nature of electrons and generated radiation in terms of FEL variables.
Delgado, Guilherme Costa
2017-07-01
This paper aims to conduct a conceptual analysis of the relationship between scientific and technical progress and social equality, or the reduction of inequalities. We examine this relationship by drawing on three theoretical perspectives: 1) ethical economics, championed by classical economic thinkers and centered on utilitarian self-interest; 2) mainstream theories of economic development espousing the endogenous link between labor productivity growth and technical progress; and 3) the critique of theories of economic development that emerged in the second half of the twentieth century, including Celso Furtado's critique of the theory of underdevelopment, emphasizing the prevalence of egalitarian tendencies, and ecological economics, which suggests alternative paths to those set by "classical" theories of development. The fundamental antinomy posed by the title of this article, characterized by an intrinsic contradiction between technical progress and social equality, strictly presupposes the ethical economics perspective, dominated by the social relations that constitute the "social order".
Electrical double layers and differential capacitance in molten salts from density functional theory
Frischknecht, Amalie L.; Halligan, Deaglan O.; Parks, Michael L.
2014-08-05
Classical density functional theory (DFT) is used to calculate the structure of the electrical double layer and the differential capacitance of model molten salts. The DFT is shown to give good qualitative agreement with Monte Carlo simulations in the molten salt regime. The DFT is then applied to three common molten salts, KCl, LiCl, and LiKCl, modeled as charged hard spheres near a planar charged surface. The DFT predicts strong layering of the ions near the surface, with the oscillatory density profiles extending to larger distances for larger electrostatic interactions resulting from either lower temperature or lower dielectric constant. In conclusion, overall the differential capacitance is found to be bell-shaped, in agreement with recent theories and simulations for ionic liquids and molten salts, but contrary to the results of the classical Gouy-Chapman theory.
Individual-based modeling of ecological and evolutionary processes
DeAngelis, Donald L.; Mooij, Wolf M.
2005-01-01
Individual-based models (IBMs) allow the explicit inclusion of individual variation in greater detail than do classical differential-equation and difference-equation models. Inclusion of such variation is important for continued progress in ecological and evolutionary theory. We provide a conceptual basis for IBMs by describing five major types of individual variation in IBMs: spatial, ontogenetic, phenotypic, cognitive, and genetic. IBMs are now used in almost all subfields of ecology and evolutionary biology. We map those subfields and look more closely at selected key papers on fish recruitment, forest dynamics, sympatric speciation, metapopulation dynamics, maintenance of diversity, and species conservation. Theorists are currently divided on whether IBMs represent only a practical tool for extending classical theory to more complex situations, or whether individual-based theory represents a radically new research program. We feel that the tension between these two poles of thinking can be a source of creativity in ecology and evolutionary theory.
Classical Mythology. Fourth Edition.
ERIC Educational Resources Information Center
Morford, Mark P. O.; Lenardon, Robert J.
Designed for students with little or no background in classical literature, this book introduces the Greek and Roman myths of creation, myths of the gods, Greek sagas and local legends, and presents contemporary theories about the myths. Drawing on Homer, Hesiod, Pindar, Vergil, and others, the book provides many translations and paraphrases of…
Making the Transition from Classical to Quantum Physics
ERIC Educational Resources Information Center
Dutt, Amit
2011-01-01
This paper reports on the nature of the conceptual understandings developed by Year 12 Victorian Certificate of Education (VCE) physics students as they made the transition from the essentially deterministic notions of classical physics, to interpretations characteristic of quantum theory. The research findings revealed the fact that the…
ERIC Educational Resources Information Center
Featherstone, Richard; Sorrell, Katie L.
2007-01-01
This paper explores whether the field of sociology harbors a dismissive attitude towards religion. Specifically it examines whether introductory sociology textbooks present the classic secularization theory over the more recent religious economies explanation of religious change. The classical secularization thesis suggests that religion is…
"Pathos" Reconsidered from the Perspective of Classical Chinese Rhetorical Theories.
ERIC Educational Resources Information Center
Garrett, Mary M.
1993-01-01
Proposes that cross-cultural rhetorical studies may provide insights into the sources of difficulties with "pathos." Presents an extensive case study that appeals to the emotions in classical Chinese rhetorics. Notes that the presuppositions of these rhetorics highlight the contingent nature of certain fundamental assumptions of many…
Laban Movement Analysis Approach to Classical Ballet Pedagogy
ERIC Educational Resources Information Center
Whittier, Cadence
2006-01-01
As a Certified Laban Movement Analyst and a classically trained ballet dancer, I consistently weave the Laban Movement Analysis/Bartenieff Fundamentals (LMA/BF) theories and philosophies into the ballet class. This integration assists in: (1) Identifying the qualitative movement elements both in the art of ballet and in the students' dancing…
Benedict, Lorin X.; Surh, Michael P.; Stanton, Liam G.; ...
2017-04-10
Here, we use classical molecular dynamics (MD) to study electron-ion temperature equilibration in two-component plasmas in regimes for which the presence of coupled collective modes has been predicted to substantively reduce the equilibration rate. Guided by previous kinetic theory work, we examine hydrogen plasmas at a density of n = 10^26 cm^-3, T_i = 10^5 K, and 10^7 K < T_e < 10^9 K. The nonequilibrium classical MD simulations are performed with interparticle interactions modeled by quantum statistical potentials (QSPs). Our MD results indicate (i) a large effect from time-varying potential energy, which we quantify by appealing to an adiabatic two-temperature equation of state, and (ii) a notable deviation in the energy equilibration rate when compared to calculations from classical Lenard-Balescu theory including the QSPs. In particular, it is shown that the energy equilibration rates from MD are more similar to those of the theory when coupled modes are neglected. We suggest possible reasons for this surprising result and propose directions of further research along these lines.
Hybrid quantum-classical modeling of quantum dot devices
NASA Astrophysics Data System (ADS)
Kantner, Markus; Mittnenzweig, Markus; Koprucki, Thomas
2017-11-01
The design of electrically driven quantum dot devices for quantum optical applications asks for modeling approaches combining classical device physics with quantum mechanics. We connect the well-established fields of semiclassical semiconductor transport theory and the theory of open quantum systems to meet this requirement. By coupling the van Roosbroeck system with a quantum master equation in Lindblad form, we introduce a new hybrid quantum-classical modeling approach, which provides a comprehensive description of quantum dot devices on multiple scales: it enables the calculation of quantum optical figures of merit and the spatially resolved simulation of the current flow in realistic semiconductor device geometries in a unified way. We construct the interface between both theories in such a way that the resulting hybrid system obeys the fundamental axioms of (non)equilibrium thermodynamics. We show that our approach guarantees the conservation of charge, consistency with the thermodynamic equilibrium and the second law of thermodynamics. The feasibility of the approach is demonstrated by numerical simulations of an electrically driven single-photon source based on a single quantum dot in the stationary and transient operation regimes.
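The quantum side of the hybrid model above is a master equation in Lindblad form. As a minimal, hedged illustration (the two-level system, decay rate, and integrator below are generic placeholders, not the paper's quantum dot model), the following sketch integrates dρ/dt = -i[H, ρ] + γ(LρL† - ½{L†L, ρ}) for a single emitter undergoing spontaneous emission:

```python
import numpy as np

def lindblad_rhs(rho, H, L, gamma):
    """Right-hand side of the Lindblad master equation
    d(rho)/dt = -i[H, rho] + gamma*(L rho L^+ - 1/2 {L^+ L, rho})."""
    comm = -1j * (H @ rho - rho @ H)
    LdL = L.conj().T @ L
    diss = gamma * (L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL))
    return comm + diss

# Generic two-level emitter: |0> ground, |1> excited (hbar = 1)
omega, gamma = 1.0, 0.2
H = 0.5 * omega * np.diag([-1.0, 1.0]).astype(complex)
L = np.array([[0, 1], [0, 0]], dtype=complex)    # lowering operator (spontaneous emission)

rho = np.array([[0, 0], [0, 1]], dtype=complex)  # start fully excited
dt, steps = 0.01, 1000
for _ in range(steps):                           # 4th-order Runge-Kutta time stepping
    k1 = lindblad_rhs(rho, H, L, gamma)
    k2 = lindblad_rhs(rho + 0.5 * dt * k1, H, L, gamma)
    k3 = lindblad_rhs(rho + 0.5 * dt * k2, H, L, gamma)
    k4 = lindblad_rhs(rho + dt * k3, H, L, gamma)
    rho = rho + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

print(rho[1, 1].real)  # excited-state population at t = 10, close to exp(-2) ≈ 0.135
```

For the diagonal initial state chosen here, the excited-state population decays as exp(-γt), which gives a quick sanity check on the integrator and on trace preservation.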
Quantum chemistry simulation on quantum computers: theories and experiments.
Lu, Dawei; Xu, Boruo; Xu, Nanyang; Li, Zhaokai; Chen, Hongwei; Peng, Xinhua; Xu, Ruixue; Du, Jiangfeng
2012-07-14
It has been claimed that quantum computers can mimic quantum systems efficiently, with only polynomial resources. Traditionally, such simulations are carried out numerically on classical computers, which are inevitably confronted with exponential growth in the required resources as the size of the quantum system increases. Quantum computers avoid this problem, and thus provide a possible solution for large quantum systems. In this paper, we first discuss the ideas of quantum simulation, the background of quantum simulators, their categories, and the development in both theories and experiments. We then present a brief introduction to quantum chemistry evaluated via classical computers, followed by typical procedures of quantum simulation towards quantum chemistry. Reviewed are not only theoretical proposals but also proof-of-principle experimental implementations, via a small quantum computer, which include the evaluation of the static molecular eigenenergy and the simulation of chemical reaction dynamics. Although the experimental development is still behind the theory, we give prospects and suggestions for future experiments. We anticipate that in the near future quantum simulation will become a powerful tool for quantum chemistry over classical computations.
Quantum to classical transition in quantum field theory
NASA Astrophysics Data System (ADS)
Lombardo, Fernando C.
1998-12-01
We study the quantum to classical transition process in the context of quantum field theory. Extending the influence functional formalism of Feynman and Vernon, we study the decoherence process for self-interacting quantum fields in flat space. We also use this formalism for arbitrary geometries to analyze the quantum to classical transition in quantum gravity. After summarizing the main results known for the quantum Brownian motion, we consider a self-interacting field theory in Minkowski spacetime. We compute a coarse grained effective action by integrating out the field modes with wavelength shorter than a critical value. From this effective action we obtain the evolution equation for the reduced density matrix (master equation). We compute the diffusion coefficients for this equation and analyze the decoherence induced on the long-wavelength modes. We generalize the results to the case of a conformally coupled scalar field in de Sitter spacetime. We show that the decoherence is effective as long as the critical wavelength is taken to be not shorter than the Hubble radius. On the other hand, we study the classical limit for scalar-tensorial models in two dimensions. We consider different couplings between the dilaton and the scalar field. We discuss the Hawking radiation process and, from an exact evaluation of the influence functional, we study the conditions under which decoherence ensures the validity of the semiclassical approximation in cosmological metrics. Finally we consider four-dimensional models with massive scalar fields, arbitrarily coupled to the geometry. We compute the Einstein-Langevin equations in order to study the effect of the fluctuations induced by the quantum fields on the classical geometry.
del Moral, F; Vázquez, J A; Ferrero, J J; Willisch, P; Ramírez, R D; Teijeiro, A; López Medina, A; Andrade, B; Vázquez, J; Salvador, F; Medal, D; Salgado, M; Muñoz, V
2009-09-01
Modern radiotherapy uses complex treatments that necessitate more complex quality assurance procedures. As a continuous medium, GafChromic EBT film offers suitable features for such verification. However, its sensitometric curve is not fully understood in terms of classical theoretical models. In fact, measured optical densities and those predicted by the classical models differ significantly. This difference increases systematically with wider dose ranges. Thus, achieving the accuracy required for intensity-modulated radiotherapy (IMRT) by classical methods is not possible, precluding their use. As a result, experimental parametrizations, such as polynomial fits, are replacing phenomenological expressions in modern investigations. This article focuses on identifying new theoretical ways to describe sensitometric curves and on evaluating the quality of fit for experimental data based on four proposed models. A whole mathematical formalism starting with a geometrical version of the classical theory is used to develop new expressions for the sensitometric curves. General results from percolation theory are also used. A flat-bed-scanner-based method was chosen for the film analysis. Different tests were performed, such as consistency of the numeric results for the proposed model and double examination using data from independent researchers. Results show that the percolation-theory-based model provides the best theoretical explanation for the sensitometric behavior of GafChromic films. The different sizes of active centers or monomer crystals of the film are the basis of this model, allowing acquisition of information about the internal structure of the films. Values for the mean size of the active centers were obtained in accordance with technical specifications. In this model, the dynamics of the interaction between the active centers of GafChromic film and radiation is also characterized by means of its interaction cross-section value.
The percolation model fulfills the accuracy requirements for quality-control procedures when large ranges of doses are used and offers a physical explanation for the film response.
NASA Astrophysics Data System (ADS)
Li, Ziyi
2017-12-01
Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, theoretical results suggest that there should be a “minimum length of observation”, which is about the size of the Planck scale (10^-35 m). Taking this fundamental scale of existence into account, we need to fix a new common form of the Heisenberg uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. We also designed three typical systems, from micro to macro size, to assess the feasibility of our theoretical model and method: a chemical solution, a crystal lattice, and a nuclear fission reactor.
NASA Astrophysics Data System (ADS)
Fukushima, Kimichika; Sato, Hikaru
2018-04-01
Ultraviolet self-interaction energies in field theory sometimes contain meaningful physical quantities. The self-energies in theories such as classical electrodynamics are usually subtracted from the rest mass. For the consistent treatment of energies as sources of curvature in the Einstein field equations, this study includes these subtracted self-energies in the vacuum energy expressed by the constant Lambda (as used, for example, in Lambda-CDM). In this study, the self-energies in electrodynamics and macroscopic classical Einstein field equations are examined using formalisms with an ultraviolet cut-off scheme. One of the cut-off formalisms is the field theory in terms of step-function-type basis functions, developed by the present authors. The other is a continuum theory of a fundamental particle with the same cut-off length. Based on the effectiveness of the continuum theory with the cut-off length shown in the examination, the dominant self-energy is the quadratic term of the Higgs field at the quantum level (classical self-energies are reduced to logarithmic forms by quantum corrections). The cut-off length is then determined to reproduce today's tiny value of Lambda for the vacuum energy. Additionally, a field with nonperiodic vanishing boundary conditions is treated, showing that the field has no zero-point energy.
ERIC Educational Resources Information Center
Pratt, Cornelius B.
1994-01-01
Links ethical theories to the management of the product recall of the Perrier Group of America. Argues for a nonsituational theory-based eclectic approach to ethics in public relations to enable public relations practitioners, as strategic communication managers, to respond effectively to potentially unethical organizational actions. (SR)
Advanced classical thermodynamics
NASA Astrophysics Data System (ADS)
Emanuel, George
The theoretical and mathematical foundations of thermodynamics are presented in an advanced text intended for graduate engineering students. Chapters are devoted to definitions and postulates, the fundamental equation, equilibrium, the application of Jacobian theory to thermodynamics, the Maxwell equations, stability, the theory of real gases, critical-point theory, and chemical thermodynamics. Diagrams, graphs, tables, and sample problems are provided.
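Among the topics listed, the Maxwell equations of thermodynamics lend themselves to a mechanical check. A minimal sketch (using the ideal gas as a stand-in working fluid, which is an illustrative choice, not the text's example) verifies the relation (∂S/∂V)_T = (∂P/∂T)_V by central finite differences:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def pressure(T, V):
    """Ideal-gas pressure P(T, V) = R*T/V (per mole)."""
    return R * T / V

def entropy(T, V):
    """Ideal-gas entropy S(T, V) = R*ln(V) + f(T); the T-only part f(T)
    drops out of the volume derivative taken below."""
    return R * math.log(V)

def d_dT(f, T, V, h=1e-6):
    """Central-difference partial derivative with respect to T at fixed V."""
    return (f(T + h, V) - f(T - h, V)) / (2 * h)

def d_dV(f, T, V, h=1e-6):
    """Central-difference partial derivative with respect to V at fixed T."""
    return (f(T, V + h) - f(T, V - h)) / (2 * h)

# Maxwell relation from the Helmholtz free energy: (dS/dV)_T = (dP/dT)_V
T0, V0 = 300.0, 0.024
lhs = d_dV(entropy, T0, V0)
rhs = d_dT(pressure, T0, V0)
print(lhs, rhs)  # both equal R/V0
```

Because both sides are second derivatives of the same Helmholtz free energy, they must agree for any well-behaved equation of state, not just the ideal gas used here.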
On Multidimensional Item Response Theory: A Coordinate-Free Approach. Research Report. ETS RR-07-30
ERIC Educational Resources Information Center
Antal, Tamás
2007-01-01
A coordinate-free definition of complex-structure multidimensional item response theory (MIRT) for dichotomously scored items is presented. The point of view taken emphasizes the possibilities and subtleties of understanding MIRT as a multidimensional extension of the classical unidimensional item response theory models. The main theorem of the…
An Introduction to Item Response Theory for Health Behavior Researchers
ERIC Educational Resources Information Center
Warne, Russell T.; McKyer, E. J. Lisako; Smith, Matthew L.
2012-01-01
Objective: To introduce item response theory (IRT) to health behavior researchers by contrasting it with classical test theory and providing an example of IRT in health behavior. Method: Demonstrate IRT by fitting the 2PL model to substance-use survey data from the Adolescent Health Risk Behavior questionnaire (n = 1343 adolescents). Results: An…
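The 2PL model named above has a compact closed form: the probability of a correct response is a logistic function of the gap between ability θ and item difficulty b, scaled by the item's discrimination a. A minimal sketch (the parameter values are illustrative, not estimates from the Adolescent Health Risk Behavior data):

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT model: probability that an examinee
    with ability theta answers correctly an item with discrimination a and
    difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee of average ability facing an item of matching difficulty
print(round(p_correct_2pl(theta=0.0, a=1.2, b=0.0), 3))  # 0.5
```

At θ = b the probability is exactly 0.5 regardless of a; the discrimination parameter only controls how steeply the curve rises through that point, which is what distinguishes 2PL from the one-parameter (Rasch) model.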
Item Response Theory: A Basic Concept
ERIC Educational Resources Information Center
Mahmud, Jumailiyah
2017-01-01
With the development in computing technology, item response theory (IRT) develops rapidly, and has become a user friendly application in psychometrics world. Limitation in classical theory is one aspect that encourages the use of IRT. In this study, the basic concept of IRT will be discussed. In addition, it will briefly review the ability…
An Alternative Approach to Identifying a Dimension in Second Language Proficiency.
ERIC Educational Resources Information Center
Griffin, Patrick E.; And Others
Current practice in language testing has not yet integrated classical test theory with assessment of language skills. In addition, language testing needs to be part of theory development. Lack of sound testing procedures can lead to problems in research design and ultimately, inappropriate theory development. The debate over dimensionality of…
A Critical Theory of Adult and Community Education
ERIC Educational Resources Information Center
Brookfield, Stephen
2012-01-01
Critical theory is one of the most influential theoretical frameworks influencing scholarship within the field of adult and community education. This chapter outlines what constitute the chief elements of critical theory using Horkheimer's (1937/1995) classic essay as a touchstone for this analysis. It argues for a set of adult learning tasks that…
A Critical Comparison of Classical and Domain Theory: Some Implications for Character Education
ERIC Educational Resources Information Center
Keefer, Matthew Wilks
2006-01-01
Contemporary approaches to moral education are influenced by the "domain theory" approach to understanding moral development (Turiel, 1983; 1998; Nucci, 2001). Domain theory holds there are distinct conventional, personal and moral domains; each constituting a cognitive "structured-whole" with its own normative source and sphere of influence. One…
Generalized continued fractions and ergodic theory
NASA Astrophysics Data System (ADS)
Pustyl'nikov, L. D.
2003-02-01
In this paper a new theory of generalized continued fractions is constructed and applied to numbers, multidimensional vectors belonging to a real space, and infinite-dimensional vectors with integral coordinates. The theory is based on a concept generalizing the procedure for constructing the classical continued fractions and substantially using ergodic theory. One of the versions of the theory is related to differential equations. In the finite-dimensional case the constructions thus introduced are used to solve problems posed by Weyl in analysis and number theory concerning estimates of trigonometric sums and of the remainder in the distribution law for the fractional parts of the values of a polynomial, and also the problem of characterizing algebraic and transcendental numbers with the use of generalized continued fractions. Infinite-dimensional generalized continued fractions are applied to estimate sums of Legendre symbols and to obtain new results in the classical problem of the distribution of quadratic residues and non-residues modulo a prime. In the course of constructing these continued fractions, an investigation is carried out of the ergodic properties of a class of infinite-dimensional dynamical systems which are also of independent interest.
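For orientation, the classical procedure that the paper generalizes can be sketched in a few lines: repeatedly split off the integer part and invert the fractional remainder (the Gauss map), then fold the partial quotients back up to obtain rational convergents. A minimal illustration:

```python
import math
from fractions import Fraction

def continued_fraction(x, n_terms=10):
    """Classical continued-fraction expansion x = a0 + 1/(a1 + 1/(a2 + ...)),
    built by repeatedly taking the integer part and inverting the remainder
    (the Gauss map)."""
    terms = []
    for _ in range(n_terms):
        a = math.floor(x)
        terms.append(a)
        frac = x - a
        if frac == 0:
            break
        x = 1.0 / frac
    return terms

def convergent(terms):
    """Rational convergent obtained by folding the expansion back up."""
    value = Fraction(terms[-1])
    for a in reversed(terms[:-1]):
        value = a + 1 / value
    return value

# The golden ratio has the all-ones expansion [1; 1, 1, 1, ...],
# whose convergents are ratios of consecutive Fibonacci numbers
phi = (1 + 5 ** 0.5) / 2
print(continued_fraction(phi, 8))   # [1, 1, 1, 1, 1, 1, 1, 1]
print(convergent([1, 1, 1, 1, 1]))  # 8/5
```

The ergodic-theory connection mentioned in the abstract enters through this same Gauss map, whose invariant measure governs the statistics of the partial quotients.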
Molecular dynamics simulations of classical sound absorption in a monatomic gas
NASA Astrophysics Data System (ADS)
Ayub, M.; Zander, A. C.; Huang, D. M.; Cazzolato, B. S.; Howard, C. Q.
2018-05-01
Sound wave propagation in argon gas is simulated using molecular dynamics (MD) in order to determine the attenuation of acoustic energy due to classical (viscous and thermal) losses at high frequencies. In addition, a method is described to estimate attenuation of acoustic energy using the thermodynamic concept of exergy. The results are compared against standing wave theory and the predictions of the theory of continuum mechanics. Acoustic energy losses are studied by evaluating various attenuation parameters and by comparing the changes in behavior at three different frequencies. This study demonstrates acoustic absorption effects in a gas simulated in a thermostatted molecular simulation and quantifies the classical losses in terms of the sound attenuation constant. The approach can be extended to further understanding of acoustic loss mechanisms in the presence of nanoscale porous materials in the simulation domain.
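The continuum-mechanics prediction that MD results of this kind are typically compared against is the Stokes-Kirchhoff classical absorption coefficient, in which the bulk-viscosity term vanishes for a monatomic gas. A hedged sketch (the argon property values below are rough room-temperature textbook numbers, not values taken from the paper):

```python
import math

def classical_attenuation(f, mu, kappa, rho, c, gamma, cp):
    """Stokes-Kirchhoff classical (viscous + thermal) attenuation coefficient
    in Np/m for a plane wave of frequency f:
    alpha = (omega^2 / (2 rho c^3)) * ((4/3) mu + (gamma - 1) kappa / cp).
    The bulk-viscosity contribution is zero for a monatomic gas."""
    omega = 2 * math.pi * f
    return omega**2 / (2 * rho * c**3) * (4.0 / 3.0 * mu + (gamma - 1) * kappa / cp)

# Rough room-temperature properties of argon (illustrative values)
argon = dict(mu=2.23e-5,     # shear viscosity, Pa*s
             kappa=0.0177,   # thermal conductivity, W/(m*K)
             rho=1.66,       # density at 1 atm, kg/m^3
             c=319.0,        # speed of sound, m/s
             gamma=5.0 / 3.0,  # heat-capacity ratio (monatomic)
             cp=520.0)       # isobaric specific heat, J/(kg*K)

print(classical_attenuation(1e9, **argon))  # attenuation at 1 GHz, Np/m
```

The quadratic frequency dependence is why classical losses, negligible at audio frequencies, dominate at the high frequencies accessible to MD simulation.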
Optical rectenna operation: where Maxwell meets Einstein
NASA Astrophysics Data System (ADS)
Joshi, Saumil; Moddel, Garret
2016-07-01
Optical rectennas are antenna-coupled diode rectifiers that receive and convert optical-frequency electromagnetic radiation into DC output. The analysis of rectennas is carried out either classically, using Maxwell’s wave-like approach, or quantum-mechanically, using Einstein’s particle-like approach for electromagnetic radiation. One of the characteristics of classical operation is that multiple photons transfer their energy to individual electrons, whereas in quantum operation each photon transfers its energy to a single electron. We analyze the correspondence between the two approaches by comparing the rectenna response to monochromatic illumination obtained using photon-assisted tunnelling theory with that obtained using classical theory. Applied to broadband rectenna operation, this correspondence provides clues to designing a rectenna solar cell that has the potential to exceed the 44% quantum-limited conversion efficiency. The comparison of operating regimes shows how optical rectenna operation differs from microwave rectenna operation.
Quantum Structure in Cognition and the Foundations of Human Reasoning
NASA Astrophysics Data System (ADS)
Aerts, Diederik; Sozzo, Sandro; Veloz, Tomas
2015-12-01
Traditional cognitive science rests on a foundation of classical logic and probability theory. This foundation has been seriously challenged by several findings in experimental psychology on human decision making. Meanwhile, the formalism of quantum theory has provided an efficient resource for modeling these classically problematical situations. In this paper, we start from our successful quantum-theoretic approach to the modeling of concept combinations to formulate a unifying explanatory hypothesis. In it, human reasoning is the superposition of two processes: a conceptual reasoning, whose nature is the emergence of new conceptuality, and a logical reasoning, founded on an algebraic calculus of the logical type. In most cognitive processes, however, the former reasoning prevails over the latter. In this perspective, the observed deviations from classical logical reasoning should not be interpreted as biases but, rather, as natural expressions of emergence in its deepest form.
Quantum Barro-Gordon game in monetary economics
NASA Astrophysics Data System (ADS)
Samadi, Ali Hussein; Montakhab, Afshin; Marzban, Hussein; Owjimehr, Sakine
2018-01-01
Classical game theory addresses decision problems in a multi-agent environment where one rational agent's decision affects other agents' payoffs. Game theory has widespread applications in the economic, social and biological sciences. In recent years quantum versions of classical games have been proposed and studied. In this paper, we consider a quantum version of the classical Barro-Gordon game, which captures the problem of time inconsistency in monetary economics. Such time inconsistency refers to the temptation of a weak policy maker to implement high inflation when the public expects low inflation. The inconsistency arises when the public punishes the weak policy maker in the next cycle. We first present a quantum version of the Barro-Gordon game. Next, we show that in a particular case of the quantum game, a time-consistent Nash equilibrium can be achieved when the public expects low inflation, thus resolving the game.
The origin of three-cocycles in quantum field theory
NASA Astrophysics Data System (ADS)
Carey, A. L.
1987-08-01
When quantising a classical field theory it is not automatic that a group of symmetries of the classical system is preserved as a symmetry of the quantum system. Apart from the phenomenon of symmetry breaking, it can also happen (as in Faddeev's Gauss law anomaly) that only an extension of the classical group acts as a symmetry group of the quantum system. We show here that, rather than signalling a failure of the associative law as has been suggested in the literature, the occurrence of a non-trivial three-cocycle on the local gauge group is an "anomaly", or obstruction, to the existence of an extension of the local gauge group acting as a symmetry group of the quantum system.
Experimental contextuality in classical light
NASA Astrophysics Data System (ADS)
Li, Tao; Zeng, Qiang; Song, Xinbing; Zhang, Xiangdong
2017-03-01
The Klyachko, Can, Binicioglu, and Shumovsky (KCBS) inequality is an important contextuality inequality in a three-level system, which has been demonstrated experimentally using quantum states. Using the path and polarization degrees of freedom of classical optics fields, we have constructed the classical trit (cetrit) and tested the KCBS inequality and its geometrical form (Wright’s inequality) in this work. The projection measurement has been implemented, and clear violations of the KCBS inequality and its geometrical form have been observed. This means that the contextuality inequality, which is commonly used in tests of the conflict between quantum theory and noncontextual realism, may be used as a quantitative tool in classical optical coherence to describe correlation characteristics of classical fields.
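The quantum-mechanical content of the KCBS test can be reproduced numerically: five dichotomic observables built from a "pentagram" of unit vectors with adjacent pairs orthogonal give a correlation sum of 5 - 4√5 ≈ -3.94 for a suitably prepared qutrit state, below the noncontextual bound of -3. A minimal sketch (this is the standard KCBS configuration, not the optical implementation of the paper):

```python
import numpy as np

# Five KCBS measurement directions: unit vectors in R^3 with every
# adjacent pair orthogonal (the pentagram configuration)
cos2_theta = np.cos(np.pi / 5) / (1 + np.cos(np.pi / 5))
theta = np.arccos(np.sqrt(cos2_theta))
v = np.array([[np.cos(theta),
               np.sin(theta) * np.cos(4 * np.pi * k / 5),
               np.sin(theta) * np.sin(4 * np.pi * k / 5)] for k in range(5)])

psi = np.array([1.0, 0.0, 0.0])  # prepared qutrit state along the symmetry axis

# Dichotomic observables A_k = 2|v_k><v_k| - 1; adjacent A_k commute,
# so the correlations <A_k A_{k+1}> are jointly measurable
A = [2 * np.outer(vk, vk) - np.eye(3) for vk in v]
S = sum(psi @ (A[k] @ A[(k + 1) % 5]) @ psi for k in range(5))

print(S)  # about -3.944 = 5 - 4*sqrt(5), below the noncontextual bound -3
```

Any noncontextual assignment of ±1 values to the five observables bounds the sum at -3, so the value below that bound is the signature of contextuality that the classical-optics experiment simulates.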
Classical and quantum Big Brake cosmology for scalar field and tachyonic models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamenshchik, A. Yu.; Manti, S.
We study a relation between the cosmological singularities in classical and quantum theory, comparing the classical and quantum dynamics in some models possessing the Big Brake singularity - the model based on a scalar field and two models based on a tachyon-pseudo-tachyon field . It is shown that the effect of quantum avoidance is absent for the soft singularities of the Big Brake type while it is present for the Big Bang and Big Crunch singularities. Thus, there is some kind of a classical - quantum correspondence, because soft singularities are traversable in classical cosmology, while the strong Big Bangmore » and Big Crunch singularities are not traversable.« less
Li, Tao; Zhang, Xiong; Zeng, Qiang; Wang, Bo; Zhang, Xiangdong
2018-04-30
The Clauser-Horne-Shimony-Holt (CHSH) inequality and the Klyachko-Can-Binicioglu-Shumovsky (KCBS) inequality present a trade-off under the no-disturbance (ND) principle. Recently, the fundamental monogamy relation between contextuality and nonlocality in quantum theory has been demonstrated experimentally. Here we show that this relation and trade-off can also be simulated in classical optical systems. Using the polarization, path, and orbital angular momentum of a classical optical beam, we have observed the stringent monogamy relation between the two inequalities in a classical optical experiment by implementing the projection measurement. Our results show the prospect of applying concepts recently developed in quantum information science to classical optical systems and optical information processing.
Categorial Compositionality: A Category Theory Explanation for the Systematicity of Human Cognition
Phillips, Steven; Wilson, William H.
2010-01-01
Classical and Connectionist theories of cognitive architecture seek to explain systematicity (i.e., the property of human cognition whereby cognitive capacity comes in groups of related behaviours) as a consequence of syntactically and functionally compositional representations, respectively. However, both theories depend on ad hoc assumptions to exclude specific instances of these forms of compositionality (e.g. grammars, networks) that do not account for systematicity. By analogy with the Ptolemaic (i.e. geocentric) theory of planetary motion, although either theory can be made to be consistent with the data, both nonetheless fail to fully explain it. Category theory, a branch of mathematics, provides an alternative explanation based on the formal concept of adjunction, which relates a pair of structure-preserving maps, called functors. A functor generalizes the notion of a map between representational states to include a map between state transformations (or processes). In a formal sense, systematicity is a necessary consequence of a higher-order theory of cognitive architecture, in contrast to the first-order theories derived from Classicism or Connectionism. Category theory offers a re-conceptualization for cognitive science, analogous to the one that Copernicus provided for astronomy, where representational states are no longer the center of the cognitive universe—replaced by the relationships between the maps that transform them. PMID:20661306
Objectivism, naturalism, and the revision of quantum theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cordero-Lecca, A.
1992-01-01
Because of its remarkable predictive power and general scientific fertility, there is a temptation to take quantum theory (QT) seriously as the best statement science can currently make about the world. But when the consequences of the theory are drawn out, they reveal strange, indeed paradoxical, possibilities of superpositions. QT seems to imply that the entire world, not just the world of microparticles, is considerably less "objective" than common intuition suggests. To what extent, if any, however, is QT at odds with a reasonable conception of scientific objectivity? The author argues that, far from committing us to paradox, quantum physics actually encourages us to think of the world in strongly objectivist, naturalist terms. After examining various influential programs in the field, an approach to the foundational problems of QT which is less "theory-down" than usual is proposed. In particular, the author argues that recent studies of metastable states support a strongly objectivist formulation and revision of QT. The model that thus emerges turns out to be a member of the stochastic family of QT revisions developed by Ghirardi, Rimini, and Weber, Pearle, and Gisin, theories that preserve the level of peaceful coexistence with special relativity that standard QT and its field-theoretic generalizations enjoy. The specific proposal advanced in this monograph focuses on spontaneous transitions to bound energy states. No ad hoc parameters are assumed. The resulting approach seems able to explain successful working-level talk about such topics as quantum jumps, localization by interaction with macrosystems, and standard measurement rules, all without denying the existence of either quantum superpositions or EPR correlations.
Deconstruction of the Maldacena-Núñez compactification
NASA Astrophysics Data System (ADS)
Andrews, R. P.; Dorey, N.
2006-09-01
We demonstrate a classical equivalence between the large-N limit of the higgsed N=1 SUSY U(N) Yang-Mills theory and the Maldacena-Núñez twisted compactification of a six-dimensional gauge theory on a two-sphere. A direct comparison of the actions and spectra of the two theories reveals them to be identical. We also propose a gauge theory limit which should describe the corresponding spherical compactification of little string theory.
The Leadership of Groups in Organizations
1985-07-01
A theory of leadership that focuses specifically on task-performing groups in organizations is proposed. The theory takes a functional approach to leadership, exploring how leaders fulfill functions that are required for group effectiveness... that there are no theories of leadership around. There are theories of managerial leadership, from the classic statements of organization theorists...
ERIC Educational Resources Information Center
Pardee, Ronald L.
Job satisfaction, motivation, and reward systems are included in one area of organizational theory. The strongest influence in this area is motivation because it overlaps into both of the other two components. A review of the classical literature on motivation reveals four major theory areas: (1) Maslow's Hierarchy of Needs; (2) Herzberg's…
Nonrelativistic Conformal Symmetry in 2 + 1 Dimensional Field Theory.
NASA Astrophysics Data System (ADS)
Bergman, Oren
This thesis is devoted to the study of conformal invariance and its breaking in nonrelativistic field theories. It is a well-known feature of relativistic field theory that theories which are conformally invariant at the classical level can acquire a conformal anomaly upon quantization and renormalization. The anomaly appears through the introduction of an arbitrary, but dimensionful, renormalization scale. One does not usually associate the concepts of renormalization and anomaly with nonrelativistic quantum mechanics, but there are a few examples where these concepts are useful. The best-known case is the two-dimensional delta-function potential. In two dimensions the delta-function scales like the kinetic term of the Hamiltonian, and therefore the problem is classically conformally invariant. Another example of classical conformal invariance is the famous Aharonov-Bohm (AB) problem. In that case each partial wave sees a 1/r^2 potential. We use the second quantized formulation of these problems, namely the nonrelativistic field theories, to compute Green's functions and derive the conformal anomaly. In the case of the AB problem we also solve an old puzzle, namely how to reproduce the result of Aharonov and Bohm in perturbation theory. The thesis is organized in the following manner. Chapter 1 is an introduction to nonrelativistic field theory, nonrelativistic conformal invariance, contact interactions, and the AB problem. In Chapter 2 we discuss nonrelativistic scalar field theory, and how its quantization produces the anomaly. Chapter 3 is devoted to the AB problem and the resolution of the perturbation puzzle. In Chapter 4 we generalize the discussion of Chapter 3 to particles carrying nonabelian charges. The structure of the nonabelian theory is much richer and deserves a separate discussion. We also comment on the issues of forward scattering and single-valuedness of wavefunctions, which are important for Chapter 3 as well.
(Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.).
Massively parallel GPU-accelerated minimization of classical density functional theory
NASA Astrophysics Data System (ADS)
Stopper, Daniel; Roth, Roland
2017-08-01
In this paper, we discuss the ability to numerically minimize the grand potential of hard disks in two-dimensional and of hard spheres in three-dimensional space within the framework of classical density functional and fundamental measure theory on modern graphics cards. Our main finding is that a massively parallel minimization leads to an enormous performance gain in comparison to standard sequential minimization schemes. Furthermore, the results indicate that in complex multi-dimensional situations, a heavy parallel minimization of the grand potential seems to be mandatory in order to reach a reasonable balance between accuracy and computational cost.
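The minimization scheme can be illustrated with a toy one-dimensional example. The sketch below is my own illustration, not the authors' GPU code: it performs a damped Picard iteration on the Euler-Lagrange equation of classical DFT for an ideal gas in a harmonic external potential, where the excess functional vanishes and the exact minimizer is the barometric profile ρ(x) ∝ exp(-βV(x)). The point is that each grid point updates independently, which is what makes such schemes embarrassingly parallel on a GPU.

```python
import numpy as np

# Toy classical-DFT minimization: ideal gas in V(x) = x^2 / 2 at beta = 1,
# chemical potential mu = 0, so the exact equilibrium profile is exp(-V).
# The damped Picard update acts on every grid point independently,
# which is the structure a massively parallel (GPU) scheme exploits.
x = np.linspace(-5.0, 5.0, 2001)
V = 0.5 * x**2
beta, mu, alpha = 1.0, 0.0, 0.5   # alpha: mixing (damping) parameter

rho = np.ones_like(x)             # initial guess
for _ in range(200):
    # With a nonzero excess functional (e.g. fundamental measure theory),
    # the right-hand side would also involve weighted densities of rho.
    rho_new = np.exp(beta * (mu - V))
    rho = (1 - alpha) * rho + alpha * rho_new

print(np.max(np.abs(rho - np.exp(-V))))  # close to 0
```

With mixing parameter α = 0.5 the error contracts geometrically, so 200 iterations bring the profile to the exact barometric law to machine precision.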
Correcting quantum errors with entanglement.
Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu
2006-10-20
We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
Magnetic torque on a rotating superconducting sphere
NASA Technical Reports Server (NTRS)
Holdeman, L. B.
1975-01-01
The London theory of superconductivity is used to calculate the torque on a superconducting sphere rotating in a uniform applied magnetic field. The London theory is combined with classical electrodynamics for a calculation of the direct effect of excess charge on a rotating superconducting sphere. Classical electrodynamics, with the assumption of a perfect Meissner effect, is used to calculate the torque on a superconducting sphere rotating in an arbitrary magnetic induction; this macroscopic approach yields results which are correct to first order. Using the same approach, the torque due to a current loop encircling the rotating sphere is calculated.
Counterfactual Definiteness and Bell's Inequality
NASA Astrophysics Data System (ADS)
Hess, Karl; Raedt, Hans De; Michielsen, Kristel
Counterfactual definiteness must be used as at least one of the postulates or axioms that are necessary to derive Bell-type inequalities. It is considered by many to be a postulate that is not only commensurate with classical physics (as for example Einstein's special relativity), but also separates and distinguishes classical physics from quantum mechanics. It is the purpose of this paper to show that Bell's choice of mathematical functions and independent variables implicitly includes counterfactual definiteness and reduces the generality of the physics of Bell-type theories so significantly that no meaningful comparison of these theories with actual Einstein-Podolsky-Rosen experiments can be made.
Towards an emergent model of solitonic particles from non-trivial vacuum structure
NASA Astrophysics Data System (ADS)
Gillard, Adam B.; Gresnigt, Niels G.
2017-12-01
We motivate and introduce what we refer to as the principles of Lie-stability and Hopf-stability, and examine what physical theories satisfying them must look like. Lie-stability is needed on the classical side and Hopf-stability on the quantum side. We implement these two principles, together with Lie-deformations consistent with basic constraints on the classical kinematical variables, to arrive at the form of a theory that identifies standard model fermions with quantum solitonic trefoil-knotted flux tubes emerging from a flux-tube vacuum network. Moreover, twisted unknot flux tubes form natural dark matter candidates.
On simulations of rarefied vapor flows with condensation
NASA Astrophysics Data System (ADS)
Bykov, Nikolay; Gorbachev, Yuriy; Fyodorov, Stanislav
2018-05-01
Results of the direct simulation Monte Carlo of 1D spherical and 2D axisymmetric expansions into vacuum of condensing water vapor are presented. Two models, based on the kinetic approach and on the size-corrected classical nucleation theory, are employed for the simulations. The difference in the obtained results is discussed, and the advantages of the kinetic approach in comparison with the modified classical theory are demonstrated. The impact of clusterization on flow parameters is observed when the volume fraction of clusters in the expansion region exceeds 5%. Comparison of the simulation data with the experimental results demonstrates good agreement.
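The classical nucleation theory used as the comparison baseline predicts a critical cluster radius r* = 2σv_m / (k_B T ln S), below which clusters tend to evaporate and above which they grow. A quick evaluation with rough, water-like parameter values (σ, v_m, T, and S below are illustrative assumptions, not numbers taken from the simulations described above):

```python
import math

# Kelvin/CNT critical radius: r* = 2 * sigma * v_m / (kB * T * ln S).
# The parameter values are rough water-like numbers chosen for
# illustration; they are NOT taken from the simulations above.
kB = 1.380649e-23      # J/K, Boltzmann constant
sigma = 0.072          # N/m, surface tension of water (approx.)
v_m = 3.0e-29          # m^3, volume per water molecule (approx.)
T = 300.0              # K
S = 5.0                # supersaturation ratio

r_star = 2 * sigma * v_m / (kB * T * math.log(S))
print(f"critical radius ~ {r_star * 1e9:.2f} nm")  # sub-nanometre scale
```

The sub-nanometre critical radius at moderate supersaturation is why the size correction to classical nucleation theory matters: critical clusters contain only a handful of molecules, where the bulk surface tension is a questionable approximation.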
Gavrilenko, T V; Es'kov, V M; Khadartsev, A A; Khimikova, O I; Sokolova, A A
2014-01-01
The behavior of the state vector of the human cardio-vascular system in different age groups was investigated using methods of the theory of chaos and self-organization and methods of classical statistics. Observations were made on the indigenous people of the North of the Russian Federation. Using methods of the theory of chaos and self-organization, differences were shown in the parameters of quasi-attractors of the state vector of the cardio-vascular system of these people. A comparison with the results obtained by classical statistics was made.
Boeyens, Jan C.A.; Levendis, Demetrius C.
2012-01-01
Molecular symmetry is intimately connected with the classical concept of three-dimensional molecular structure. In a non-classical theory of wave-like interaction in four-dimensional space-time, both of these concepts and traditional quantum mechanics lose their operational meaning, unless suitably modified. A required reformulation should emphasize the importance of four-dimensional effects like spin and the symmetry effects of space-time curvature that could lead to a fundamentally different understanding of molecular symmetry and structure in terms of elementary number theory. Isolated single molecules have no characteristic shape and macro-biomolecules only develop robust three-dimensional structure in hydrophobic response to aqueous cellular media. PMID:22942753
Determinism, independence, and objectivity are incompatible.
Ionicioiu, Radu; Mann, Robert B; Terno, Daniel R
2015-02-13
Hidden-variable models aim to reproduce the results of quantum theory and to satisfy our classical intuition. Their refutation is usually based on deriving predictions that differ from those of quantum mechanics. Here, instead, we study the mutual compatibility of apparently reasonable classical assumptions. We analyze a version of the delayed-choice experiment which ostensibly combines determinism, independence of hidden variables from the conducted experiments, and wave-particle objectivity (the assertion that quantum systems are, at any moment, either particles or waves, but not both). These three ideas are incompatible with any theory, not only with quantum mechanics.
[A non-classical approach to medical practices: Michel Foucault and Actor-Network Theory].
Bińczyk, E
2001-01-01
The text presents an analysis of medical practices stemming from two sources: Michel Foucault's conception, and the research of Annemarie Mol and John Law, representatives of a trend known as Actor-Network Theory. Both approaches reveal a significant theoretical kinship: they can be successfully consigned to the framework of non-classical sociology of science. I initially refer to the cited conceptions as a version of non-classical sociology of medicine. The identity of non-classical sociology of medicine hinges on the fact that it undermines the possibility of objective definitions of disease, health, and body. These are approached instead as variable social and historical phenomena, co-constituted by medical practices. For both Foucault and Mol, the main object of interest was not medicine as such, but rather the network of medical practices. Mol and Law sketch a new theoretical perspective for the analysis of medical practices: they attempt to go beyond the dichotomous scheme of thinking about the human body as an object of medical research and the subject of private experience. Research on patients suffering from blood-sugar deficiency provides the empirical background for the theses of the Actor-Network Theory representatives. Michel Foucault's conception is extremely critical of medical practices. The French researcher describes the processes of 'medicalising' Western society as the emergence of a new type of power, and he attempts to sensitise the reader to the ethical dimension of these processes.
Music and the Three Appeals of Classical Rhetoric
ERIC Educational Resources Information Center
LeCoat, Gerard G.
1976-01-01
Contends that rhetorical theory of the sixteenth through the eighteenth centuries influenced the theory of the composition of music and offers examples of vocal music which was adapted to the rhetorical appeals of logos, ethos, and pathos. (MH)
The Cultural Context of Career Assessment.
ERIC Educational Resources Information Center
Blustein, David L.; Ellis, Michael V.
2000-01-01
Building on social constructivism, culturally affirming career assessment should take a unificationist perspective, which does not assume the validity of tests across cultural contexts. Generalizability and item response theory are better suited than classical test theory to the unificationist perspective. (SK)