Sample records for set theory principles

  1. Independence of the uniformity principle from Church's thesis in intuitionistic set theory

    NASA Astrophysics Data System (ADS)

    Khakhanyan, V. Kh.

    2013-12-01

    We prove the independence of the strong uniformity principle from Church's thesis with choice in intuitionistic set theory with the axiom of extensionality extended by Markov's principle and the double complement for sets.

  2. Calculating Interaction Energies Using First Principle Theories: Consideration of Basis Set Superposition Error and Fragment Relaxation

    ERIC Educational Resources Information Center

    Bowen, J. Philip; Sorensen, Jennifer B.; Kirschner, Karl N.

    2007-01-01

    The analysis explains the basis set superposition error (BSSE) and the fragment relaxation involved in calculating interaction energies using various first-principles theories. Treating the interacting fragments with correlated methods and increasing the size of the basis set can decrease the BSSE to a great extent.
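
    The counterpoise scheme of Boys and Bernardi is the standard way to estimate the BSSE this record refers to. The sketch below is a minimal Python illustration of that arithmetic; the energy values are invented placeholders (in hartree), not results from the paper.

```python
# Counterpoise (CP) estimate of basis set superposition error (BSSE).
# All energies below are hypothetical placeholder values in hartree.

def counterpoise_interaction_energy(e_ab, e_a_ghost, e_b_ghost, e_a, e_b):
    """CP-corrected interaction energy of a dimer AB.

    e_ab      : energy of the dimer AB in the full dimer basis
    e_a_ghost : energy of fragment A in the dimer basis (ghost atoms on B)
    e_b_ghost : energy of fragment B in the dimer basis (ghost atoms on A)
    e_a, e_b  : fragment energies in their own monomer bases
    """
    e_int_raw = e_ab - (e_a + e_b)             # uncorrected interaction energy
    e_int_cp = e_ab - (e_a_ghost + e_b_ghost)  # CP-corrected interaction energy
    bsse = e_int_cp - e_int_raw                # BSSE estimate
    return e_int_raw, e_int_cp, bsse

raw, cp, bsse = counterpoise_interaction_energy(
    e_ab=-152.1200, e_a_ghost=-76.0590, e_b_ghost=-76.0585,
    e_a=-76.0580, e_b=-76.0575)
print(raw, cp, bsse)
```

    As the abstract notes, the uncorrected value overbinds; enlarging the basis shrinks the gap between the raw and CP-corrected numbers. Fragment relaxation (the energy of deforming each monomer to its dimer geometry) would be a further additive term, omitted here.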

  3. Role of Logic and Mentality as the Basics of Wittgenstein's Picture Theory of Language and Extracting Educational Principles and Methods According to This Theory

    ERIC Educational Resources Information Center

    Heshi, Kamal Nosrati; Nasrabadi, Hassanali Bakhtiyar

    2016-01-01

    The present paper attempts to recognize principles and methods of education based on Wittgenstein's picture theory of language. This qualitative research utilized inferential analytical approach to review the related literature and extracted a set of principles and methods from his theory on picture language. Findings revealed that Wittgenstein…

  4. The Principle of General Tovariance

    NASA Astrophysics Data System (ADS)

    Heunen, C.; Landsman, N. P.; Spitters, B.

    2008-06-01

    We tentatively propose two guiding principles for the construction of theories of physics, which should be satisfied by a possible future theory of quantum gravity. These principles are inspired by those that led Einstein to his theory of general relativity, viz. his principle of general covariance and his equivalence principle, as well as by the two mysterious dogmas of Bohr's interpretation of quantum mechanics, i.e. his doctrine of classical concepts and his principle of complementarity. An appropriate mathematical language for combining these ideas is topos theory, a framework earlier proposed for physics by Isham and collaborators. Our principle of general tovariance states that any mathematical structure appearing in the laws of physics must be definable in an arbitrary topos (with natural numbers object) and must be preserved under so-called geometric morphisms. This principle identifies geometric logic as the mathematical language of physics and restricts the constructions and theorems to those valid in intuitionism: neither Aristotle's principle of the excluded third nor Zermelo's Axiom of Choice may be invoked. Subsequently, our equivalence principle states that any algebra of observables (initially defined in the topos Sets) is empirically equivalent to a commutative one in some other topos.

  5. Computer-based teaching module design: principles derived from learning theories.

    PubMed

    Lau, K H Vincent

    2014-03-01

    The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition, rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research on classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning'), for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repetition); and the detail level (managing text, managing devices). This review examined the literature on the application of learning theories to CAL in order to develop a set of principles that guide CBTM design. Further research will enable educators to take advantage of this unique teaching format as it gains increasing importance in medical education.

  6. Re-Conceptualization of Modified Angoff Standard Setting: Unified Statistical, Measurement, Cognitive, and Social Psychological Theories

    ERIC Educational Resources Information Center

    Iyioke, Ifeoma Chika

    2013-01-01

    This dissertation describes a design for training, in accordance with probability judgment heuristics principles, for the Angoff standard setting method. The new training with instruction, practice, and feedback tailored to the probability judgment heuristics principles was called the Heuristic training and the prevailing Angoff method training…

  7. The Principle of the Fermionic Projector: An Approach for Quantum Gravity?

    NASA Astrophysics Data System (ADS)

    Finster, Felix

    In this short article we introduce the mathematical framework of the principle of the fermionic projector and set up a variational principle in discrete space-time. The underlying physical principles are discussed. We outline the connection to the continuum theory and state recent results. In the last two sections, we speculate on how it might be possible to describe quantum gravity within this framework.

  8. Measuring uncertainty by extracting fuzzy rules using rough sets and extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.; Culas, Donald E.

    1991-01-01

    Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. This paper examines the concepts of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory to solve this problem. The fundamentals of these theories are combined to provide a possibly optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much we believe each rule is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of its fuzzy attributes is studied.
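
    The certain/possible rule split described here comes from the lower and upper approximations of rough set theory. The following is a minimal Python sketch of how those approximations are computed; the patient data and attribute names are made up for illustration.

```python
# Rough-set lower/upper approximations: objects indiscernible under the
# chosen attributes are grouped into blocks; blocks wholly inside the
# target set support "certain" rules, overlapping blocks "possible" rules.

from collections import defaultdict

def approximations(universe, attrs, target):
    """Lower/upper approximation of `target` under the indiscernibility
    relation induced by `attrs` (a function: object -> attribute tuple)."""
    blocks = defaultdict(set)
    for x in universe:
        blocks[attrs(x)].add(x)
    lower, upper = set(), set()
    for block in blocks.values():
        if block <= target:   # block entirely inside target: certain
            lower |= block
        if block & target:    # block overlaps target: possible
            upper |= block
    return lower, upper

# Hypothetical patients described by (temperature, headache); target: flu.
data = {
    "p1": ("high", "yes"),
    "p2": ("high", "yes"),
    "p3": ("normal", "no"),
    "p4": ("normal", "no"),
}
flu = {"p1", "p2", "p3"}
lower, upper = approximations(data, lambda x: data[x], flu)
print(lower, upper)
```

    Here p3 and p4 are indiscernible but disagree on the diagnosis, so they fall into the boundary region: the lower approximation is {p1, p2} and the upper is all four patients, exactly the certain-versus-possible distinction the abstract attaches belief measures to.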

  9. Noncommutative Common Cause Principles in algebraic quantum field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofer-Szabo, Gabor; Vecsernyes, Peter

    2013-04-15

    States in algebraic quantum field theory 'typically' establish correlation between spacelike separated events. Reichenbach's Common Cause Principle, generalized to the quantum field theoretical setting, offers an apt tool to causally account for these superluminal correlations. In the paper we motivate first why commutativity between the common cause and the correlating events should be abandoned in the definition of the common cause. Then we show that the Noncommutative Weak Common Cause Principle holds in algebraic quantum field theory with locally finite degrees of freedom. Namely, for any pair of projections A, B supported in spacelike separated regions V_A and V_B, respectively, there is a local projection C, not necessarily commuting with A and B, such that C is supported within the union of the backward light cones of V_A and V_B, and the set {C, C^⊥} screens off the correlation between A and B.

  10. Criticism of generally accepted fundamentals and methodologies of traffic and transportation theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerner, Boris S.

    It is explained why the set of the fundamental empirical features of traffic breakdown (a transition from free flow to congested traffic) should be the empirical basis for any traffic and transportation theory that can be reliably used for control and optimization in traffic networks. It is shown that generally accepted fundamentals and methodologies of traffic and transportation theory are not consistent with the set of the fundamental empirical features of traffic breakdown at a highway bottleneck. To these fundamentals and methodologies of traffic and transportation theory belong (i) the Lighthill-Whitham-Richards (LWR) theory, (ii) the General Motors (GM) model class (for example, the Herman, Gazis et al. GM model, Gipps's model, Payne's model, Newell's optimal velocity (OV) model, Wiedemann's model, the Bando et al. OV model, Treiber's IDM, and Krauß's model), (iii) the understanding of highway capacity as a particular stochastic value, and (iv) principles for traffic and transportation network optimization and control (for example, Wardrop's user equilibrium (UE) and system optimum (SO) principles). As an alternative to these generally accepted fundamentals and methodologies, we discuss three-phase traffic theory as the basis for traffic flow modeling, and briefly consider the network breakdown minimization (BM) principle for the optimization of traffic and transportation networks with road bottlenecks.

  11. Attachment Theory and Challenging Behaviors: Reconstructing the Nature of Relationships.

    ERIC Educational Resources Information Center

    Watson, Marilyn

    2003-01-01

    Outlines the basic principles of attachment theory and its implications for young children's social and emotional development. Applies attachment theory to children whose behaviors are especially challenging, using examples from a primary classroom in an urban setting. Asserts that sensitive teachers can build collaborative relationships. (SD)

  12. Ethical theory, ethnography, and differences between doctors and nurses in approaches to patient care.

    PubMed Central

    Robertson, D W

    1996-01-01

    OBJECTIVES: To study empirically whether ethical theory (from the mainstream principles-based, virtue-based, and feminist schools) usefully describes the approaches doctors and nurses take in everyday patient care. DESIGN: Ethnographic methods: participant observation and interviews, the transcripts of which were analysed to identify themes in ethical approaches. SETTING: A British old-age psychiatry ward. PARTICIPANTS: The more than 20 doctors and nurses on the ward. RESULTS: Doctors and nurses on the ward differed in their conceptions of the principles of beneficence and respect for patient autonomy. Nurses shared with doctors a commitment to liberal and utilitarian conceptions of these principles, but also placed much greater weight on relationships and character virtues when expressing the same principles. Nurses also emphasised patient autonomy, while doctors were more likely to advocate beneficence, when the two principles conflicted. CONCLUSION: The study indicates that ethical theory can, contrary to the charges of certain critics, be relevant to everyday health care if it (a) attends to social context and (b) is flexible enough to draw on various schools of theory. PMID:8910782

  13. Dynamical basis sets for algebraic variational calculations in quantum-mechanical scattering theory

    NASA Technical Reports Server (NTRS)

    Sun, Yan; Kouri, Donald J.; Truhlar, Donald G.; Schwenke, David W.

    1990-01-01

    New basis sets are proposed for linear algebraic variational calculations of transition amplitudes in quantum-mechanical scattering problems. These basis sets are hybrids of those that yield the Kohn variational principle (KVP) and those that yield the generalized Newton variational principle (GNVP) when substituted in Schlessinger's stationary expression for the T operator. Trial calculations show that efficiencies almost as great as that of the GNVP and much greater than the KVP can be obtained, even for basis sets with the majority of the members independent of energy.

  14. Measuring uncertainty by extracting fuzzy rules using rough sets

    NASA Technical Reports Server (NTRS)

    Worm, Jeffrey A.

    1991-01-01

    Despite the advancements in the computer industry in the past 30 years, there is still one major deficiency. Computers are not designed to handle terms where uncertainty is present. To deal with uncertainty, techniques other than classical logic must be developed. The methods of statistical analysis, the Dempster-Shafer theory, rough set theory, and fuzzy set theory are examined to solve this problem. The fundamentals of these theories are combined to provide a possibly optimal solution. By incorporating principles from these theories, a decision-making process may be simulated by extracting two sets of fuzzy rules: certain rules and possible rules. From these rules, a corresponding measure of how much each rule is believed is constructed. From this, the idea of how much a fuzzy diagnosis is definable in terms of a set of fuzzy attributes is studied.

  15. Motivating Readers: Helping Students Set and Attain Personal Reading Goals

    ERIC Educational Resources Information Center

    Cabral-Márquez, Consuelo

    2015-01-01

    The motivational, cognitive, and performance benefits associated with setting goals are presented in light of goal-setting theory. These theoretical principles provide a framework that teachers can use to guide students in setting and pursuing personal reading goals that are proximal, specific, and compatible with students' reading abilities…

  16. Ethical Challenges in Psychiatric Administration and Leadership.

    PubMed

    Moffic, H Steven; Saeed, Sy Atezaz; Silver, Stuart; Koh, Steve

    2015-09-01

    As with all professional ethical principles, those in psychiatry have to evolve with time and societal changes. The current ethical challenges for psychiatric administration and leadership, especially regarding for-profit managed care, need updated solutions. One solution resides in the development by the American Association of Psychiatric Administrators (AAPA) of the first set of ethical principles designed specifically for psychiatric administrators. These principles build on prior psychological theories of leadership, such as those of Freud, Kernberg, and Kohut. Supplementing these theories are the real-life models of psychiatrist leadership depicted in the memoirs of various psychiatrists. Appreciating these principles, theories, and models may help emerging leaders to better recognize the importance of ethical challenges. A conclusion is that psychiatrists should have the potential to assume more successful leadership positions once again. In such positions, supporting the skills and well-being of all in the organization now seems to be the foremost ethical priority.

  17. Values and principles evident in current health promotion practice.

    PubMed

    Gregg, Jane; O'Hara, Lily

    2007-04-01

    Modern health promotion practice needs to respond to complex health issues that have multiple interrelated determinants. This requires an understanding of the values and principles of health promotion. A literature review was undertaken to explore the values and principles evident in current health promotion theory and practice. A broad range of values and principles are espoused as being integral to modern health promotion theory and practice. Although there are some commonalities across these lists, there is no recognised, authoritative set of values and principles accepted as fundamental and applicable to modern health promotion. There is a continuum of values and principles evident in health promotion practice from those associated with holistic, ecological, salutogenic health promotion to those more in keeping with conventional health promotion. There is a need for a system of values and principles consistent with modern health promotion that enables practitioners to purposefully integrate these values and principles into their understanding of health, as well as their needs assessment, planning, implementation and evaluation practice.

  18. Theories of Modern Management.

    ERIC Educational Resources Information Center

    Knight, W. Hal

    This chapter of "Principles of School Business Management" identifies management theories that provide a fundamental conceptual knowledge base that school business officials can use to understand the school organizational setting and its influences on the day-to-day operation of the educational process. Particular attention is paid to…

  19. Rough set classification based on quantum logic

    NASA Astrophysics Data System (ADS)

    Hassan, Yasser F.

    2017-11-01

    By combining the advantages of quantum computing and soft computing, the paper shows that rough sets can be used with quantum logic in classification and recognition systems. We suggest a new definition of rough set theory as a quantum logic theory. Rough approximations are essential elements of rough set theory; the quantum rough set model for set-valued data constructs set approximations directly from a kind of quantum similarity relation, which is presented here. Theoretical analyses demonstrate that the new quantum rough set model yields a new type of decision rule with less redundancy, which can be used to give accurate classifications using principles of quantum superposition and non-linear quantum relations. To our knowledge, this is the first attempt to define rough sets in a quantum representation rather than in terms of logic or sets. Experiments on data sets have demonstrated that the proposed model is more accurate than traditional rough sets in finding optimal classifications.

  20. Quantum Theory from Observer's Mathematics Point of View

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khots, Dmitriy; Khots, Boris

    2010-05-04

    This work considers the linear (time-dependent) Schrödinger equation, the quantum theory of two-slit interference, wave-particle duality for single photons, and the uncertainty principle in a setting of arithmetic, algebra, and topology provided by Observer's Mathematics, see [1]. Certain theoretical results and communications pertaining to these topics are also provided.

  1. Quantum Physics Principles and Communication in the Acute Healthcare Setting: A Pilot Study.

    PubMed

    Helgeson, Heidi L; Peyerl, Colleen Kraft; Solheim-Witt, Marit

    This pilot study explores whether clinician awareness of quantum physics principles could facilitate open communication between patients and providers. In the spirit of action research, this study was conceptualized with a holistic view of human health, using a mixed method design of grounded theory as an emergent method. Instrumentation includes surveys and a focus group discussion with twelve registered nurses working in an acute care hospital setting. Findings document that the preliminary core phenomenon, energy as information, influences communication in the healthcare environment. Key emergent themes include awareness, language, validation, open communication, strategies, coherence, incoherence and power. Research participants indicate that quantum physics principles provide a language and conceptual framework for improving their awareness of communication and interactions in the healthcare environment. Implications of this pilot study support the feasibility of future research and education on awareness of quantum physics principles in other clinical settings.

  2. First-principles investigation on Rydberg and resonance excitations: A case study of the firefly luciferin anion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noguchi, Yoshifumi, E-mail: y.noguchi@issp.u-tokyo.ac.jp; Hiyama, Miyabi; Akiyama, Hidefumi

    2014-07-28

    The optical properties of an isolated firefly luciferin anion are investigated by using first-principles calculations, employing many-body perturbation theory to take into account the excitonic effect. The calculated photoabsorption spectra are compared with the results obtained using time-dependent density functional theory (TDDFT) employing localized atomic orbital (AO) basis sets, and with a recent experiment in vacuum. The present method reproduces well the line shape at the photon energies corresponding to the Rydberg and resonance excitations, but overestimates the peak positions by about 0.5 eV. However, the TDDFT-calculated positions of some peaks are closer to those of the experiment. We also investigate the basis set dependency in describing the free-electron states above the vacuum level and the excitons involving transitions to the free-electron states, and conclude that AO-only basis sets are inaccurate for free-electron states and the use of a plane-wave basis set is required.

  3. Applied Behavior Analysis in Flying Training Research. Interim Report for Period June 1978-August 1978.

    ERIC Educational Resources Information Center

    Bailey, Jon S.; Hughes, Ronald G.

    Research developments in learning theory over the past fifty years have led to principles of behavior which have been shown in innumerable applied settings to be valuable in analyzing and modifying human behavior. When applied to flying training using simulators, these principles suggest that a significant contribution could be made in improving…

  4. Foregrounding Knowledge in E-Learning Design: An Illustration in a Museum Setting

    ERIC Educational Resources Information Center

    Carvalho, Lucila; Dong, Andy; Maton, Karl

    2015-01-01

    The nature of knowledge, and the various forms knowledge may take, is a neglected aspect of the development of e-learning environments. This paper uses Legitimation Code Theory (LCT) to conceptualise the organising principles of knowledge practices. As we will illustrate, when it comes to the design of e-learning, the organising principles of the…

  5. The Efficiency Challenge: Creating a Transformative Learning Experience in a Principles of Management Course

    ERIC Educational Resources Information Center

    Durant, Rita A.; Carlon, Donna M.; Downs, Alexis

    2017-01-01

    This article describes the results of the "Efficiency Challenge," a 10-week, Principles of Management course activity that uses reflection and goal setting to help students understand the concept of operational efficiency. With transformative learning theory as a lens, we base our report on 4 years' worth of student reflections regarding…

  6. Testing typicality in multiverse cosmology

    NASA Astrophysics Data System (ADS)

    Azhar, Feraz

    2015-05-01

    In extracting predictions from theories that describe a multiverse, we face the difficulty that we must assess probability distributions over possible observations prescribed not just by an underlying theory, but by a theory together with a conditionalization scheme that allows for (anthropic) selection effects. This means we usually need to compare distributions that are consistent with a broad range of possible observations with actual experimental data. One controversial means of making this comparison is by invoking the "principle of mediocrity": that is, the principle that we are typical of the reference class implicit in the conjunction of the theory and the conditionalization scheme. In this paper, we quantitatively assess the principle of mediocrity in a range of cosmological settings, employing "xerographic distributions" to impose a variety of assumptions regarding typicality. We find that for a fixed theory, the assumption that we are typical gives rise to higher likelihoods for our observations. If, however, one allows both the underlying theory and the assumption of typicality to vary, then the assumption of typicality does not always provide the highest likelihoods. Interpreted from a Bayesian perspective, these results support the claim that when one has the freedom to consider different combinations of theories and xerographic distributions (or different "frameworks"), one should favor the framework that has the highest posterior probability; and then from this framework one can infer, in particular, how typical we are. In this way, the invocation of the principle of mediocrity is more questionable than has been recently claimed.
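
    The framework comparison described in this abstract is, at bottom, Bayesian model selection over (theory, typicality-assumption) pairs. The toy Python sketch below uses invented priors and likelihoods purely to illustrate the mechanics; the framework names and numbers are not from the paper.

```python
# Toy Bayesian comparison of "frameworks" (theory + typicality assumption).
# Each framework assigns a likelihood to our observation; Bayes' rule then
# ranks frameworks by posterior probability. All numbers are invented.

frameworks = {
    "theory A, typical":  {"prior": 0.25, "likelihood": 0.60},
    "theory A, atypical": {"prior": 0.25, "likelihood": 0.10},
    "theory B, typical":  {"prior": 0.25, "likelihood": 0.30},
    "theory B, atypical": {"prior": 0.25, "likelihood": 0.40},
}

evidence = sum(f["prior"] * f["likelihood"] for f in frameworks.values())
posteriors = {name: f["prior"] * f["likelihood"] / evidence
              for name, f in frameworks.items()}
best = max(posteriors, key=posteriors.get)
print(best, posteriors[best])
```

    With these particular numbers the favored framework assumes typicality, but note that for theory B the atypical variant outscores the typical one: once both theory and typicality assumption are allowed to vary, typicality need not yield the highest posterior, which is the abstract's point.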

  7. Almost-Quantum Correlations Violate the No-Restriction Hypothesis

    NASA Astrophysics Data System (ADS)

    Sainz, Ana Belén; Guryanova, Yelena; Acín, Antonio; Navascués, Miguel

    2018-05-01

    To identify which principles characterize quantum correlations, it is essential to understand in which sense this set of correlations differs from that of almost-quantum correlations. We solve this problem by invoking the so-called no-restriction hypothesis, an explicit and natural axiom in many reconstructions of quantum theory stating that the set of possible measurements is the dual of the set of states. We prove that, contrary to quantum correlations, no generalized probabilistic theory satisfying the no-restriction hypothesis is able to reproduce the set of almost-quantum correlations. Therefore, any theory whose correlations are exactly, or very close to, the almost-quantum correlations necessarily requires a rule limiting the possible measurements. Our results suggest that the no-restriction hypothesis may play a fundamental role in singling out the set of quantum correlations among other nonsignaling ones.

  8. Almost-Quantum Correlations Violate the No-Restriction Hypothesis.

    PubMed

    Sainz, Ana Belén; Guryanova, Yelena; Acín, Antonio; Navascués, Miguel

    2018-05-18

    To identify which principles characterize quantum correlations, it is essential to understand in which sense this set of correlations differs from that of almost-quantum correlations. We solve this problem by invoking the so-called no-restriction hypothesis, an explicit and natural axiom in many reconstructions of quantum theory stating that the set of possible measurements is the dual of the set of states. We prove that, contrary to quantum correlations, no generalized probabilistic theory satisfying the no-restriction hypothesis is able to reproduce the set of almost-quantum correlations. Therefore, any theory whose correlations are exactly, or very close to, the almost-quantum correlations necessarily requires a rule limiting the possible measurements. Our results suggest that the no-restriction hypothesis may play a fundamental role in singling out the set of quantum correlations among other nonsignaling ones.

  9. Diagnostic Assessment of Reading and Listening in a Second or Foreign Language: Elaborating on Diagnostic Principles

    ERIC Educational Resources Information Center

    Harding, Luke; Alderson, J. Charles; Brunfaut, Tineke

    2015-01-01

    Alderson, Brunfaut and Harding (2014) recently investigated how diagnosis is practised across a range of professions in order to develop a tentative framework for a theory of diagnosis in second or foreign language (SFL) assessment. In articulating this framework, a set of five broad principles were proposed, encompassing the entire enterprise of…

  10. Ethics of respect for nature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, P.W.

    The foundational structure for a life-centered theory of environmental ethics consists of three interrelated components. First is the adopting of a certain ultimate moral attitude toward nature, which the author calls respect for nature. Second is a belief system that constitutes a way of conceiving of the natural world and of our place in it. This belief system underlies and supports the attitude in a way that makes it an appropriate attitude to take toward the Earth's natural ecosystems and their life communities. Third is a system of moral rules and standards for guiding our treatment of those ecosystems and life communities, a set of normative principles which give concrete embodiment or expression to the attitude of respect for nature. The theory set forth and defended here is structurally symmetrical with a theory of human ethics based on the principle of respect for persons and the legal entitlement to be protected. We can begin by placing limits on human population and technology.

  11. In search of a metatheory for cognitive development (or, Piaget is dead and I don't feel so good myself).

    PubMed

    Bjorklund, D F

    1997-02-01

    With the waning of influence of Piaget's theory and the shortcomings of information-processing perspectives of cognitive growth, cognitive developmentalists lack a common set of broad, overarching principles and assumptions--a metatheory--to guide their research. Developmental biology is suggested as metatheory for cognitive development. Although it is important for developmentalists to understand proximal biological causes (e.g., brain development), most important for such a metatheory is an evolutionary perspective. Some basic principles of evolutionary psychology are introduced, and examples of contemporary research and theory consistent with these ideas are provided.

  12. Learning Theory and the Typewriter Teacher

    ERIC Educational Resources Information Center

    Wakin, B. Bertha

    1974-01-01

    Eight basic principles of learning are described and discussed in terms of practical learning strategies for typewriting. Described are goal setting, preassessment, active participation, individual differences, reinforcement, practice, transfer of learning, and evaluation. (SC)

  13. Modification of Schrödinger-Newton equation due to braneworld models with minimal length

    NASA Astrophysics Data System (ADS)

    Bhat, Anha; Dey, Sanjib; Faizal, Mir; Hou, Chenguang; Zhao, Qin

    2017-07-01

    We study the correction of the energy spectrum of a gravitational quantum well due to the combined effect of the braneworld model with infinite extra dimensions and the generalized uncertainty principle. The correction terms arise from a natural deformation of a semiclassical theory of quantum gravity governed by the Schrödinger-Newton equation, based on a minimal-length framework. The twofold correction to the energy yields new values of the spectrum, which are closer to the values obtained in the GRANIT experiment. This raises the possibility that the combined theory of semiclassical quantum gravity and the generalized uncertainty principle may provide an intermediate theory between the semiclassical and the full theory of quantum gravity. We also prepare a schematic experimental set-up which may guide the understanding of these phenomena in the laboratory.
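
    For context, the uncorrected spectrum of the gravitational quantum well follows the standard quantum-bouncer formula E_n = lambda_n (ħ² m g² / 2)^(1/3), where lambda_n are the negated zeros of the Airy function. A quick Python check using standard physical constants (this reproduces only the plain spectrum; the braneworld and minimal-length corrections studied in the paper are not modeled):

```python
# Uncorrected energy levels of a neutron bouncing in Earth's gravity,
# the setting probed by GRANIT-type experiments.

hbar = 1.054571817e-34   # reduced Planck constant, J s
m = 1.67492749804e-27    # neutron mass, kg
g = 9.80665              # standard gravity, m/s^2
airy_zeros = [2.33811, 4.08795, 5.52056]  # -a_n, first three Airy zeros

scale = (hbar**2 * m * g**2 / 2) ** (1.0 / 3.0)       # energy scale, J
levels_peV = [a * scale / 1.602176634e-19 * 1e12      # J -> peV
              for a in airy_zeros]
print(levels_peV)   # first level comes out near 1.4 peV
```

    The ~1.4 peV ground level is the baseline against which the minimal-length and braneworld correction terms of the paper would shift the spectrum.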

  14. Pauli Exclusion Principle

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    A principle of quantum theory, devised in 1925 by Wolfgang Pauli (1900-58), which states that no two fermions may exist in the same quantum state. The quantum state of a particle is defined by a set of numbers that describe quantities such as energy, angular momentum and spin. Fermions are particles such as quarks, protons, neutrons and electrons, that have spin = ½ (in units of h/2π, where h is ...
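    The counting consequence of the exclusion principle can be illustrated with a minimal sketch (not from the record above): since each electron must occupy a unique quantum state (n, l, m_l, m_s), a shell with principal number n holds at most 2n² electrons.

```python
# Minimal illustration of the Pauli exclusion principle as a counting rule:
# each electron occupies a distinct state (n, l, m_l, m_s), so the capacity
# of shell n is 2 * n**2.

def shell_states(n):
    """Enumerate all distinct (n, l, m_l, m_s) states for principal number n."""
    states = []
    for l in range(n):                      # orbital quantum number l = 0 .. n-1
        for m_l in range(-l, l + 1):        # magnetic quantum number m_l = -l .. +l
            for m_s in (-0.5, 0.5):         # spin projection: down / up
                states.append((n, l, m_l, m_s))
    return states

for n in (1, 2, 3):
    states = shell_states(n)
    assert len(states) == len(set(states))  # no two states coincide
    print(f"n={n}: capacity {len(states)}")  # 2, 8, 18
```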

  15. Design Principles for Applied Learning: Bringing Theory and Practice Together in an Online VET Teacher-Education Degree

    ERIC Educational Resources Information Center

    Downing, Jillian J.

    2017-01-01

    This paper reports on a doctoral study that investigated an alternative pedagogical approach in an online VET teacher-education course offered at a mid-sized university in Australia. Students in the course were mature-aged and adding study to their role as in-service VET teachers. Building on previous research, a set of design principles was…

  16. Application of theory to family-centered care: a role for social workers.

    PubMed

    Miller, Gary

    2012-01-01

    Family-centered care is an emerging trend in health care settings today. An explanation, principles, and a definition of family-centered care are offered and discussed. A theoretical framework, Balance Theory of Coordination, which can be utilized by social workers to develop and enhance family-centered care practices, is explained and discussed. Various family-centered care practices are examined within the context of Balance Theory of Coordination as examples.

  17. Differentiation with Stratification: A Principle of Theoretical Physics in the Tradition of the Memory Art

    NASA Astrophysics Data System (ADS)

    Pombo, Claudia

    2015-10-01

    The art of memory started with Aristotle's questions on memory. During its long evolution, it had important contributions from alchemists, was transformed by Ramon Llull and apparently ended with Giordano Bruno, who was considered the best known representative of this art. This tradition did not disappear, but lives in the formulations of our modern scientific theories. From its initial form as a method of keeping information via associations, it became a principle of classification and structuring of knowledge. This principle, which we here name differentiation with stratification, is a structural design behind classical mechanics. Integrating two different traditions of science in one structure, this physical theory became the modern paradigm of science. In this paper, we show that this principle can also be formulated as a set of questions. This is done via an analysis of theories, based on the epistemology of observational realism. A combination of Rudolf Carnap's concept of theory as a system of observational and theoretical languages with a criterion for separating observational languages, based on analytical psychology, shapes this epistemology. The `nuclear' role of the observational laws and the differentiations from this nucleus, reproducing the general cases of phenomena, reveal the memory art's heritage in the theories. Here we argue that this design is also present in special relativity and in quantum mechanics.

  18. First Principles Optical Absorption Spectra of Organic Molecules Adsorbed on Titania Nanoparticles

    NASA Astrophysics Data System (ADS)

    Baishya, Kopinjol; Ogut, Serdar; Mete, Ersen; Gulseren, Oguz; Ellialtioglu, Sinasi

    2012-02-01

    We present results from first principles computations on passivated rutile TiO2 nanoparticles in both free-standing and dye-sensitized configurations to investigate the size dependence of their optical absorption spectra. The computations are performed using time-dependent density functional theory (TDDFT) as well as GW-Bethe-Salpeter-Equation (GWBSE) methods and compared with each other. We interpret the first principles spectra for free-standing TiO2 nanoparticles within the framework of the classical Mie-Gans theory using the bulk dielectric function of TiO2. We investigate the effects of the titania support on the absorption spectra of a particular set of perylene-diimide (PDI) derived dye molecules, namely brominated PDI (Br2C24H8N2O4) and its glycine and aspartine derivatives.

  19. Intelligent virtual reality in the setting of fuzzy sets

    NASA Technical Reports Server (NTRS)

    Dockery, John; Littman, David

    1992-01-01

    The authors have previously introduced the concept of virtual reality worlds governed by artificial intelligence. Creation of an intelligent virtual reality was further proposed as a universal interface for the handicapped. This paper extends consideration of intelligent virtual reality to a context in which fuzzy set principles are explored as a major tool for implementing theory in the domain of applications to the disabled.

  20. Predictive minimum description length principle approach to inferring gene regulatory networks.

    PubMed

    Chaitankar, Vijender; Zhang, Chaoyang; Ghosh, Preetam; Gong, Ping; Perkins, Edward J; Deng, Youping

    2011-01-01

    Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is determining the threshold that defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of model length and data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we propose a new inference algorithm that incorporates mutual information (MI), conditional mutual information (CMI), and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information-theoretic quantities MI and CMI determine the regulatory relationships between genes, and the PMDL principle method attempts to determine the best MI threshold without the need for a user-specified fine-tuning parameter. The performance of the proposed algorithm is evaluated using both synthetic time series data sets and a biological time series data set (Saccharomyces cerevisiae). The results show that the proposed algorithm produced fewer false edges and significantly improved the precision when compared to the existing MDL algorithm.
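    The mutual-information step of such algorithms can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: it estimates MI between discretized expression profiles and keeps edges above a fixed cutoff, whereas the paper's PMDL step chooses the threshold by minimizing description length.

```python
# Hypothetical sketch of the information-theoretic edge-selection step in
# gene regulatory network inference: keep gene pairs whose mutual information
# (in bits) exceeds a threshold. The PMDL threshold selection of the paper is
# replaced here by a fixed cutoff for illustration.

from collections import Counter
from itertools import combinations
from math import log2

def mutual_information(x, y):
    """MI (bits) between two equal-length sequences of discrete symbols."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum(c / n * log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def infer_edges(profiles, threshold=0.3):
    """Return gene pairs whose mutual information exceeds the threshold."""
    return [(g1, g2) for g1, g2 in combinations(profiles, 2)
            if mutual_information(profiles[g1], profiles[g2]) > threshold]

# Toy discretized expression data: g1 and g2 co-vary, g3 is independent.
profiles = {
    "g1": [0, 0, 1, 1, 0, 1, 1, 0],
    "g2": [0, 0, 1, 1, 0, 1, 1, 0],
    "g3": [0, 1, 0, 1, 0, 1, 0, 1],
}
print(infer_edges(profiles))  # only the g1-g2 edge survives the cutoff
```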

  1. Quantum cellular automata and free quantum field theory

    NASA Astrophysics Data System (ADS)

    D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2017-02-01

    In a series of recent papers [1-4] it has been shown how free quantum field theory can be derived without using mechanical primitives (including space-time, special relativity, quantization rules, etc.), but by considering only the easiest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the simple principles of unitarity, homogeneity, locality, and isotropy. This has opened the route to extending the axiomatic information-theoretic derivation of the quantum theory of abstract systems [5, 6] to include quantum field theory. The inherent discrete nature of the informational axiomatization leads to an extension of quantum field theory to a quantum cellular automata theory, where the usual field theory is recovered in a regime where the discrete structure of the automata cannot be probed. A simple heuristic argument sets the scale of discreteness to the Planck scale, and the customary physical regime where discreteness is not visible is the relativistic one of small wavevectors. In this paper we provide a thorough derivation from principles showing that in the most general case the graph of the quantum cellular automaton is the Cayley graph of a finitely presented group, and how for the case corresponding to Euclidean emergent space (where the group reduces to an Abelian one) the automaton leads to Weyl, Dirac and Maxwell field dynamics in the relativistic limit. We conclude with some perspectives towards the more general scenario of non-linear automata for interacting quantum field theory.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossi, Tuomas P., E-mail: tuomas.rossi@alumni.aalto.fi; Sakko, Arto; Puska, Martti J.

    We present an approach for generating local numerical basis sets of improving accuracy for first-principles nanoplasmonics simulations within time-dependent density functional theory. The method is demonstrated for copper, silver, and gold nanoparticles that are of experimental interest but computationally demanding due to the semi-core d-electrons that affect their plasmonic response. The basis sets are constructed by augmenting numerical atomic orbital basis sets by truncated Gaussian-type orbitals generated by the completeness-optimization scheme, which is applied to the photoabsorption spectra of homoatomic metal atom dimers. We obtain basis sets of improving accuracy up to the complete basis set limit and demonstrate that the performance of the basis sets transfers to simulations of larger nanoparticles and nanoalloys as well as to calculations with various exchange-correlation functionals. This work promotes the use of the local basis set approach of controllable accuracy in first-principles nanoplasmonics simulations and beyond.

  3. Study unique artistic Lopburi Province for design brass tea set of Bantahkrayang community

    NASA Astrophysics Data System (ADS)

    Pliansiri, V.; Seviset, S.

    2017-07-01

    The objectives of this study were as follows: 1) to study the production process of the handcrafted brass tea set; and 2) to design and develop the handcrafted brass tea set. The design process began with mutual analytical processes and a conceptual framework for product design, drawing on Quality Function Deployment, the Theory of Inventive Problem Solving, principles of craft design, and principles of reverse engineering. Experts in both industrial product design and brass handicraft products evaluated the brass tea set design and the resulting prototype, and consumer satisfaction was assessed with a sample of consumers who had previously bought the brass tea set of the Bantahkrayang community. The statistics used were percentage, mean (X̄), and standard deviation (S.D.). Consumer satisfaction with the handcrafted brass tea set was at a high level.

  4. Local Telephone Costs and the Design of Rate Structures,

    DTIC Science & Technology

    1981-05-01

    guide the setting of prices for the multi-product regulated firm. Economic effi- ciency can be increased by designing rate structures that incorporate the... basic principles developed from this theory. These principles call for provisionally pricing each of the firm’s outputs at its marginal cost, testing...rule--prices are increased above marginal costs in inverse proportion to the individual price elasticities of demand. This paper applies ratemaking

  5. Dungeons, Gratings, and Black Rooms: A Defense of Double-Anchoring Theory and a Reply to Howe et al. (2007)

    ERIC Educational Resources Information Center

    Bressan, Paola

    2007-01-01

    Replies to comments made by Howe et al. on the current author's original article. The double-anchoring theory of lightness (P. Bressan, 2006b) assumes that any given region belongs to a set of frameworks, created by Gestalt grouping principles, and receives a provisional lightness within each of them; the region's final lightness is a weighted…

  6. A coupled mode formulation by reciprocity and a variational principle

    NASA Technical Reports Server (NTRS)

    Chuang, Shun-Lien

    1987-01-01

    A coupled mode formulation for parallel dielectric waveguides is presented via two methods: a reciprocity theorem and a variational principle. In the first method, a generalized reciprocity relation for two sets of field solutions satisfying Maxwell's equations and the boundary conditions in two different media, respectively, is derived. Based on the generalized reciprocity theorem, the coupled mode equations can then be formulated. The second method using a variational principle is also presented for a general waveguide system which can be lossy. The results of the variational principle can also be shown to be identical to those from the reciprocity theorem. The exact relations governing the 'conventional' and the new coupling coefficients are derived. It is shown analytically that the present formulation satisfies the reciprocity theorem and power conservation exactly, while the conventional theory violates the power conservation and reciprocity theorem by as much as 55 percent and the Hardy-Streifer (1985, 1986) theory by 0.033 percent, for example.
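    As context for the formulations compared above, coupled mode equations for two co-propagating modes are commonly written in the following standard textbook form (the symbols β for propagation constants and κ for coupling coefficients are assumed notation, not taken from the paper):

```latex
\frac{d a(z)}{d z} = i \beta_{a}\, a(z) + i \kappa_{ab}\, b(z),
\qquad
\frac{d b(z)}{d z} = i \beta_{b}\, b(z) + i \kappa_{ba}\, a(z)
```

    The relations between the coupling coefficients κ_ab and κ_ba are what the paper constrains via reciprocity and power conservation.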

  7. A principled and cosmopolitan neuroethics: considerations for international relevance.

    PubMed

    Shook, John R; Giordano, James

    2014-01-03

    Neuroethics applies cognitive neuroscience for prescribing alterations to conceptions of self and society, and for prescriptively judging the ethical applications of neurotechnologies. Plentiful normative premises are available to ground such prescriptivity, however prescriptive neuroethics may remain fragmented by social conventions, cultural ideologies, and ethical theories. Herein we offer that an objectively principled neuroethics for international relevance requires a new meta-ethics: understanding how morality works, and how humans manage and improve morality, as objectively based on the brain and social sciences. This new meta-ethics will simultaneously equip neuroethics for evaluating and revising older cultural ideologies and ethical theories, and direct neuroethics towards scientifically valid views of encultured humans intelligently managing moralities. Bypassing absolutism, cultural essentialisms, and unrealistic ethical philosophies, neuroethics arrives at a small set of principles about proper human flourishing that are more culturally inclusive and cosmopolitan in spirit. This cosmopolitanism in turn suggests augmentations to traditional medical ethics in the form of four principled guidelines for international consideration: empowerment, non-obsolescence, self-creativity, and citizenship.

  8. A principled and cosmopolitan neuroethics: considerations for international relevance

    PubMed Central

    2014-01-01

    Neuroethics applies cognitive neuroscience for prescribing alterations to conceptions of self and society, and for prescriptively judging the ethical applications of neurotechnologies. Plentiful normative premises are available to ground such prescriptivity, however prescriptive neuroethics may remain fragmented by social conventions, cultural ideologies, and ethical theories. Herein we offer that an objectively principled neuroethics for international relevance requires a new meta-ethics: understanding how morality works, and how humans manage and improve morality, as objectively based on the brain and social sciences. This new meta-ethics will simultaneously equip neuroethics for evaluating and revising older cultural ideologies and ethical theories, and direct neuroethics towards scientifically valid views of encultured humans intelligently managing moralities. Bypassing absolutism, cultural essentialisms, and unrealistic ethical philosophies, neuroethics arrives at a small set of principles about proper human flourishing that are more culturally inclusive and cosmopolitan in spirit. This cosmopolitanism in turn suggests augmentations to traditional medical ethics in the form of four principled guidelines for international consideration: empowerment, non-obsolescence, self-creativity, and citizenship. PMID:24387102

  9. Information Theory - The Bridge Connecting Bounded Rational Game Theory and Statistical Physics

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A long-running difficulty with conventional game theory has been how to modify it to accommodate the bounded rationality of all real-world players. A recurring issue in statistical physics is how best to approximate joint probability distributions with decoupled (and therefore far more tractable) distributions. This paper shows that the same information theoretic mathematical structure, known as Product Distribution (PD) theory, addresses both issues. In this, PD theory not only provides a principled formulation of bounded rationality and a set of new types of mean field theory in statistical physics; it also shows that those topics are fundamentally one and the same.

  10. Director of nursing and midwifery leadership: informed through the lens of critical social science.

    PubMed

    Solman, Annette

    2010-05-01

    Highlight the use of critical social science theories, practice development principles and a situational leadership framework within transformational leadership to inform Directors of Nursing and Midwifery (DoNM) practices as leaders. Healthcare is constantly changing, unpredictable, strives for quality service and cost containment, which can result in stress and crisis for healthcare workers. DoNM leadership is critical to supporting and leading staff through these complex times within healthcare. Understanding theories, frameworks and their application to real-world practice can assist in supporting individuals and teams to navigate through the changing healthcare environment. Blending critical social science theories with practice development principles and the situational leadership framework can assist the DoNM to enact transformational leadership to support the development of individuals and teams to meet the complex healthcare needs of patients within the clinical setting. IMPLICATIONS FOR NURSE MANAGEMENT: This article contributes through the practical application of critical social science theories, practice development principles and situational leadership framework within transformational leadership as an approach for enacting DoNM leadership. To further understand and develop in the role of the contemporary DoNM in leadership, these directors are encouraged to publish their work.

  11. Variational theorems for superimposed motions in elasticity, with application to beams

    NASA Technical Reports Server (NTRS)

    Doekmeci, M. C.

    1976-01-01

    Variational theorems are presented for a theory of small motions superimposed on large static deformations and governing equations for prestressed beams on the basis of 3-D theory of elastodynamics. First, the principle of virtual work is modified through Friedrichs's transformation so as to describe the initial stress problem of elastodynamics. Next, the modified principle together with a chosen displacement field is used to derive a set of 1-D macroscopic governing equations of prestressed beams. The resulting equations describe all the types of superimposed motions in elastic beams, and they include all the effects of transverse shear and normal strains, and the rotatory inertia. The instability of the governing equations is discussed briefly.

  12. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  13. Failures of explaining away and screening off in described versus experienced causal learning scenarios.

    PubMed

    Rehder, Bob; Waldmann, Michael R

    2017-02-01

    Causal Bayes nets capture many aspects of causal thinking that set them apart from purely associative reasoning. However, some central properties of this normative theory are routinely violated. In tasks requiring an understanding of explaining away and screening off, subjects often deviate from these principles and manifest the operation of an associative bias that we refer to as the rich-get-richer principle. This research focuses on these two failures, comparing tasks in which causal scenarios are merely described (via verbal statements of the causal relations) versus experienced (via samples of data that manifest the intervariable correlations implied by the causal relations). Our key finding is that we obtained stronger deviations from normative predictions in the described conditions that highlight the instructed causal model compared to those that presented data. This counterintuitive finding indicates that a theory of causal reasoning and learning needs to integrate normative principles with biases people hold about causal relations.

  14. Scale-covariant theory of gravitation and astrophysical applications

    NASA Technical Reports Server (NTRS)

    Canuto, V.; Adams, P. J.; Hsieh, S.-H.; Tsiang, E.

    1977-01-01

    A scale-covariant theory of gravitation is presented which is characterized by a set of equations that are complete only after a choice of the scale function is made. Special attention is given to gauge conditions and units which allow gravitational phenomena to be described in atomic units. The generalized gravitational-field equations are derived by performing a direct scale transformation, by extending Riemannian geometry to Weyl geometry through the introduction of the notion of cotensors, and from a variation principle. Modified conservation laws are provided, a set of dynamical equations is obtained, and astrophysical consequences are considered. The theory is applied to examine certain homogeneous cosmological solutions, perihelion shifts, light deflections, secular variations of planetary orbital elements, stellar structure equations for a star in quasi-static equilibrium, and the past thermal history of earth. The possible relation of the scale-covariant theory to gauge field theories and their predictions of cosmological constants is discussed.

  15. Ground-state densities from the Rayleigh-Ritz variation principle and from density-functional theory.

    PubMed

    Kvaal, Simen; Helgaker, Trygve

    2015-11-14

    The relationship between the densities of ground-state wave functions (i.e., the minimizers of the Rayleigh-Ritz variation principle) and the ground-state densities in density-functional theory (i.e., the minimizers of the Hohenberg-Kohn variation principle) is studied within the framework of convex conjugation, in a generic setting covering molecular systems, solid-state systems, and more. Having introduced admissible density functionals as functionals that produce the exact ground-state energy for a given external potential by minimizing over densities in the Hohenberg-Kohn variation principle, necessary and sufficient conditions on such functionals are established to ensure that the Rayleigh-Ritz ground-state densities and the Hohenberg-Kohn ground-state densities are identical. We apply the results to molecular systems in the Born-Oppenheimer approximation. For any given potential v ∈ L^{3/2}(ℝ³) + L^∞(ℝ³), we establish a one-to-one correspondence between the mixed ground-state densities of the Rayleigh-Ritz variation principle and the mixed ground-state densities of the Hohenberg-Kohn variation principle when the Lieb density-matrix constrained-search universal density functional is taken as the admissible functional. A similar one-to-one correspondence is established between the pure ground-state densities of the Rayleigh-Ritz variation principle and the pure ground-state densities obtained using the Hohenberg-Kohn variation principle with the Levy-Lieb pure-state constrained-search functional. In other words, all physical ground-state densities (pure or mixed) are recovered with these functionals and no false densities (i.e., minimizing densities that are not physical) exist. The importance of topology (i.e., choice of Banach space of densities and potentials) is emphasized and illustrated. The relevance of these results for current-density-functional theory is examined.
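    The two variation principles being related can be summarized in standard form (textbook notation, not reproduced from the paper; F[ρ] denotes a universal density functional such as the constrained-search functionals named above):

```latex
E_{0}(v) = \inf_{\Psi} \,\langle \Psi \,|\, \hat{H}(v) \,|\, \Psi \rangle
\quad \text{(Rayleigh-Ritz)},
\qquad
E_{0}(v) = \inf_{\rho} \left\{ F[\rho] + \int v(\mathbf{r})\, \rho(\mathbf{r})\, d\mathbf{r} \right\}
\quad \text{(Hohenberg-Kohn)}
```

    The paper's question is when the minimizing Ψ of the first problem and the minimizing ρ of the second yield the same densities.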

  16. Using big data to map the network organization of the brain.

    PubMed

    Swain, James E; Sripada, Chandra; Swain, John D

    2014-02-01

    The past few years have shown a major rise in network analysis of "big data" sets in the social sciences, revealing non-obvious patterns of organization and dynamic principles. We speculate that the dependency dimension - individuality versus sociality - might offer important insights into the dynamics of neurons and neuronal ensembles. Connectomic neural analyses, informed by social network theory, may be helpful in understanding underlying fundamental principles of brain organization.

  17. Using big data to map the network organization of the brain

    PubMed Central

    Swain, James E.; Sripada, Chandra; Swain, John D.

    2015-01-01

    The past few years have shown a major rise in network analysis of “big data” sets in the social sciences, revealing non-obvious patterns of organization and dynamic principles. We speculate that the dependency dimension – individuality versus sociality – might offer important insights into the dynamics of neurons and neuronal ensembles. Connectomic neural analyses, informed by social network theory, may be helpful in understanding underlying fundamental principles of brain organization. PMID:24572243

  18. Physics Without Physics. The Power of Information-theoretical Principles

    NASA Astrophysics Data System (ADS)

    D'Ariano, Giacomo Mauro

    2017-01-01

    David Finkelstein was very fond of the new information-theoretic paradigm of physics advocated by John Archibald Wheeler and Richard Feynman. Only recently, however, the paradigm has concretely shown its full power, with the derivation of quantum theory (Chiribella et al., Phys. Rev. A 84:012311, 2011; D'Ariano et al., 2017) and of free quantum field theory (D'Ariano and Perinotti, Phys. Rev. A 90:062106, 2014; Bisio et al., Phys. Rev. A 88:032301, 2013; Bisio et al., Ann. Phys. 354:244, 2015; Bisio et al., Ann. Phys. 368:177, 2016) from informational principles. The paradigm has opened for the first time the possibility of avoiding physical primitives in the axioms of the physical theory, allowing a re-foundation of the whole physics over logically solid grounds. In addition to such methodological value, the new information-theoretic derivation of quantum field theory is particularly interesting for establishing a theoretical framework for quantum gravity, with the idea of obtaining gravity itself as emergent from the quantum information processing, as also suggested by the role played by information in the holographic principle (Susskind, J. Math. Phys. 36:6377, 1995; Bousso, Rev. Mod. Phys. 74:825, 2002). In this paper I review how free quantum field theory is derived without using mechanical primitives, including space-time, special relativity, Hamiltonians, and quantization rules. The theory is simply provided by the simplest quantum algorithm encompassing a countable set of quantum systems whose network of interactions satisfies the three following simple principles: homogeneity, locality, and isotropy. The inherent discrete nature of the informational derivation leads to an extension of quantum field theory in terms of a quantum cellular automata and quantum walks. 
A simple heuristic argument sets the scale to the Planck one, and the currently observed regime where discreteness is not visible is the so-called "relativistic regime" of small wavevectors, which holds for all energies ever tested (and even much larger), where the usual free quantum field theory is perfectly recovered. In the present discrete quantum theory, the Einstein relativity principle can be restated without using space-time, in terms of invariance of the eigenvalue equation of the automaton/walk under change of representations. Distortions of the Poincaré group emerge at the Planck scale, whereas special relativity is perfectly recovered in the relativistic regime. Discreteness, on the other hand, has some advantages over the continuum theory: 1) it contains it as a special regime; 2) it leads to some additional features with a GR flavor: the existence of an upper bound for the particle mass (with physical interpretation as the Planck mass), and a global De Sitter invariance; 3) it provides its own physical standards for space, time, and mass within a purely mathematical adimensional context. The paper ends with the future perspectives of this project, and with an Appendix containing biographic notes about my friendship with David Finkelstein, to whom this paper is dedicated.

  19. Social cognitive theory, metacognition, and simulation learning in nursing education.

    PubMed

    Burke, Helen; Mancuso, Lorraine

    2012-10-01

    Simulation learning encompasses simple, introductory scenarios requiring response to patients' needs during basic hygienic care and during situations demanding complex decision making. Simulation integrates principles of social cognitive theory (SCT) into an interactive approach to learning that encompasses the core principles of intentionality, forethought, self-reactiveness, and self-reflectiveness. Effective simulation requires an environment conducive to learning and introduces activities that foster symbolic coding operations and mastery of new skills; debriefing builds self-efficacy and supports self-regulation of behavior. Tailoring the level of difficulty to students' mastery level supports successful outcomes and motivation to set higher standards. Mindful selection of simulation complexity and structure matches course learning objectives and supports progressive development of metacognition. Theory-based facilitation of simulated learning optimizes efficacy of this learning method to foster maturation of cognitive processes of SCT, metacognition, and self-directedness. Examples of metacognition that are supported through mindful, theory-based implementation of simulation learning are provided. Copyright 2012, SLACK Incorporated.

  20. Making Semantic Waves: A Key to Cumulative Knowledge-Building

    ERIC Educational Resources Information Center

    Maton, Karl

    2013-01-01

    The paper begins by arguing that knowledge-blindness in educational research represents a serious obstacle to understanding knowledge-building. It then offers sociological concepts from Legitimation Code Theory--"semantic gravity" and "semantic density"--that systematically conceptualize one set of organizing principles underlying knowledge…

  1. Protein structure and evolution: are they constrained globally by a principle derived from information theory?

    PubMed

    Hatton, Leslie; Warr, Gregory

    2015-01-01

    That the physicochemical properties of amino acids constrain the structure, function and evolution of proteins is not in doubt. However, principles derived from information theory may also set bounds on the structure (and thus also the evolution) of proteins. Here we analyze the global properties of the full set of proteins in release 13-11 of the SwissProt database, showing by experimental test of predictions from information theory that their collective structure exhibits properties that are consistent with their being guided by a conservation principle. This principle (Conservation of Information) defines the global properties of systems composed of discrete components each of which is in turn assembled from discrete smaller pieces. In the system of proteins, each protein is a component, and each protein is assembled from amino acids. Central to this principle is the inter-relationship of the unique amino acid count and total length of a protein and its implications for both average protein length and occurrence of proteins with specific unique amino acid counts. The unique amino acid count is simply the number of distinct amino acids (including those that are post-translationally modified) that occur in a protein, and is independent of the number of times that the particular amino acid occurs in the sequence. Conservation of Information does not operate at the local level (it is independent of the physicochemical properties of the amino acids) where the influences of natural selection are manifest in the variety of protein structure and function that is well understood. Rather, this analysis implies that Conservation of Information would define the global bounds within which the whole system of proteins is constrained; thus it appears to be acting to constrain evolution at a level different from natural selection, a conclusion that appears counter-intuitive but is supported by the studies described herein.
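As defined in the abstract, the unique amino acid count depends only on which distinct residues appear in a sequence, not on how often each occurs. A minimal sketch of the quantity (the toy sequences are hypothetical, for illustration only):

```python
# Unique amino acid count: the number of distinct residues in a sequence,
# independent of each residue's frequency (per the definition in the abstract).
def unique_aa_count(sequence: str) -> int:
    return len(set(sequence))

# Hypothetical toy sequences.
print(unique_aa_count("MKKLLV"))     # residues {M, K, L, V} -> 4
print(unique_aa_count("ACDEFGHIK"))  # nine distinct residues -> 9
```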

  2. The group-as-a-whole-object relations model of group psychotherapy.

    PubMed

    Rosen, D; Stukenberg, K W; Saeks, S

    2001-01-01

    The authors review the theoretical basis of group psychotherapy performed at The Menninger Clinic and demonstrate how the theory has been put into practice on two different types of inpatient units. The fundamental elements of the theory and practice used can be traced to object relations theory as originally proposed by Melanie Klein. Her work with individuals was directly applied to working with groups by Ezriel and Bion, who focused on interpreting group tension. More modern approaches have reintegrated working with individual concerns while also attending to the group-as-a-whole. Historically, these principles have been applied to long-term group treatment. The authors apply the concepts from the group-as-a-whole literature to short- and medium-length inpatient groups with open membership. They offer clinical examples of the application of these principles in short-term inpatient settings in groups with open membership.

  3. Theory of relativistic radiation reflection from plasmas

    NASA Astrophysics Data System (ADS)

    Gonoskov, Arkady

    2018-01-01

    We consider the reflection of relativistically strong radiation from plasma and identify the physical origin of the electrons' tendency to form a thin sheet, which maintains its localisation throughout its motion. Thereby, we justify the principle of relativistic electronic spring (RES) proposed in [Gonoskov et al., Phys. Rev. E 84, 046403 (2011)]. Using the RES principle, we derive a closed set of differential equations that describe the reflection of radiation with arbitrary variation of polarization and intensity from plasma with an arbitrary density profile for an arbitrary angle of incidence. We confirm with ab initio PIC simulations that the developed theory accurately describes laser-plasma interactions in the regime where the reflection of relativistically strong radiation is accompanied by significant, repeated relocation of plasma electrons. In particular, the theory can be applied for the studies of plasma heating and coherent and incoherent emissions in the RES regime of high-intensity laser-plasma interaction.

  4. Mathematical theory of a relaxed design problem in structural optimization

    NASA Technical Reports Server (NTRS)

    Kikuchi, Noboru; Suzuki, Katsuyuki

    1990-01-01

Various attempts have been made to construct a rigorous mathematical theory of optimization for the size, shape, and topology (i.e. layout) of an elastic structure. If these are represented by a finite number of parametric functions, as Armand described, it is possible to construct an existence theory for the optimum design using a compactness argument in a finite dimensional design space or a closed admissible subset of one. However, if the admissible design set is a subset of a non-reflexive Banach space such as L(sup infinity)(Omega), construction of the existence theory of the optimum design becomes suddenly difficult and requires extending (i.e. generalizing) the design problem to a much wider class of designs that is compatible with the mechanics of structures in the sense of the variational principle. Starting from the study by Cheng and Olhoff, Lurie, Cherkaev, and Fedorov introduced a new concept of convergence of design variables in a generalized sense and constructed the 'G-Closure' theory of an extended (relaxed) optimum design problem. A similar, but largely independent, attempt can also be found in Kohn and Strang, in which the shape and topology optimization problem is relaxed to allow the use of perforated composites rather than restricting it to usual solid structures. An identical idea is also stated in Murat and Tartar using the notion of homogenization theory. That is, by introducing the possibility of micro-scale perforation together with the theory of homogenization, the optimum design problem is relaxed so that its mathematical theory can be constructed. It is also noted that this type of relaxed design problem is perfectly matched to the variational principle in structural mechanics.

  5. Quantum Theory of Jaynes' Principle, Bayes' Theorem, and Information

    NASA Astrophysics Data System (ADS)

    Haken, Hermann

    2014-12-01

After a reminder of Jaynes' maximum entropy principle and of my quantum theoretical extension, I consider two coupled quantum systems A, B and formulate a quantum version of Bayes' theorem. The application of Feynman's disentangling theorem allows me to calculate the conditional density matrix ρ(A|B) if system A is an oscillator (or a set of them) linearly coupled to an arbitrary quantum system B. Expectation values can simply be calculated by means of the normalization factor of ρ(A|B), which is derived.

  6. A First-Principles Analytical Theory for 2D Magnetic Reconnection in Electron and Hall Magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Chacon, L.; Simakov, A. N.; Zocco, A.

    2007-12-01

Although the relevance of two-fluid effects in fast magnetic reconnection is well-known (J. Birn et al., J. Geophys. Res., 106 (A3), 3715 (2001)), a first-principles theory -- akin to Sweet and Parker's in resistive MHD -- has been elusive. Here, we present such a first-principles steady-state analytical theory for electron MHD (L. Chacón, A. N. Simakov, A. Zocco, Phys. Rev. Lett., submitted) and its extension to Hall MHD (A. N. Simakov, L. Chacón, in preparation). The theory discretizes the extended MHD equations at the reconnection site, leading to a set of time-dependent ODEs. Their steady-state analysis, which describes the system at or around the point of maximum reconnection rate, provides predictions for the scaling of relevant quantities with the dissipation coefficients (e.g., resistivity and hyper-resistivity) and other relevant parameters. In particular, we will show that EMHD admits both elongated and open-X-point configurations of the reconnection region, and that the reconnection rate can be shown not to scale explicitly with the dissipation parameters. This result is, to our knowledge, the first analytical confirmation of the possibility of fast magnetic reconnection in EMHD. In Hall MHD, the transition between resistive MHD and EMHD is studied, and scalings with the ion inertial length are obtained.

  7. A first-principles analytical theory for 2D magnetic reconnection in electron and Hall MHD.

    NASA Astrophysics Data System (ADS)

    Zocco, A.; Simakov, A. N.; Chacon, L.

    2007-11-01

While the relevance of two-fluid effects in fast magnetic reconnection is well-known (J. Birn et al., J. Geophys. Res., 106 (A3), pp. 3715-3719 (2001)), a first-principles theory -- akin to Sweet and Parker's in resistive MHD -- has been elusive. Here, we present such a first-principles steady-state theory for electron MHD (L. Chacón, A. N. Simakov, A. Zocco, Phys. Rev. Lett., submitted) and its extension to Hall MHD (A. N. Simakov, L. Chacón, in preparation). The theory discretizes the extended MHD equations at the reconnection site, leading to a set of time-dependent ODEs. Their steady-state analysis provides predictions for the scaling of relevant quantities with the dissipation coefficients (e.g., resistivity and hyper-resistivity) and other relevant parameters. In particular, we will show that EMHD admits both elongated and open-X-point configurations of the reconnection region, and that the reconnection rate Ez can be shown not to scale explicitly with the dissipation parameters. This analytic result confirms earlier computational work on the possibility of fast (dissipation-independent) magnetic reconnection in EMHD. We have extended the EMHD results to Hall MHD, and have found a general scaling law for the reconnection rate (and associated length scales) that bridges the gap between resistive and EMHD.

  8. The Other Women: Radicalizing Feminism.

    ERIC Educational Resources Information Center

    Puigvert, Lidia; Darder, Antonia; Merrill, Barbara; de los Reyes, Eileen; Stromquist, Nelly

    A recent international symposium on radicalizing feminism explored ways of developing a dialogic feminism that emphasizes working in different settings under the common goal of including women who have been invisible in the dominant feminist literature by furthering theories and practices based on the principles of dialogic feminism. The seminar…

  9. [The anthropic principle in biology and radiobiology].

    PubMed

    Akif'ev, A P; Degtiarev, S V

    1999-01-01

In accordance with the anthropic principle of the Universe, the physical constants of the fundamental particles of matter and the laws of their interaction are such that the appearance of man and mind becomes possible and necessary. It is suggested that some biological constants be added to the set of fundamental constants. Using the repair of DNA as an example, it is shown how a cell controls some parameters of the Watson-Crick double helix. It is pointed out that the concept of the anthropic principle of the Universe in its full form, including biological constants, is a key to developing a unified theory of the evolution of the Universe within the limits of scientific creationism.

  10. Several foundational and information theoretic implications of Bell’s theorem

    NASA Astrophysics Data System (ADS)

    Kar, Guruprasad; Banik, Manik

    2016-08-01

In 1935, Albert Einstein and two colleagues, Boris Podolsky and Nathan Rosen (EPR), developed a thought experiment to demonstrate what they felt was a lack of completeness in quantum mechanics (QM). EPR also postulated the existence of a more fundamental theory in which the physical reality of any system would be completely described by the variables/states of that fundamental theory. Such a variable is commonly called a hidden variable, and the theory is called a hidden variable theory (HVT). In 1964, John Bell proposed an empirically verifiable criterion to test for the existence of these HVTs. He derived an inequality which must be satisfied by any theory that fulfills the conditions of locality and reality. He also showed that QM, as it violates this inequality, is incompatible with any local-realistic theory. Later it was shown that Bell's inequality (BI) can be derived from different sets of assumptions, and that it also finds applications in useful information theoretic protocols. In this review, we discuss various foundational as well as information theoretic implications of BI. We also discuss some restricted features of quantum nonlocality and elaborate on the role of the uncertainty principle and the complementarity principle in explaining this feature.

  11. Perfect fluids in the Einstein-Cartan theory

    NASA Technical Reports Server (NTRS)

    Ray, J. R.; Smalley, L. J.

    1982-01-01

    It is pointed out that whereas most of the discussion of the Einstein-Cartan (EC) theory involves the relationship between gravitation and elementary particles, it is possible that the theory, if correct, may be important in certain extreme astrophysical and cosmological problems. The latter would include something like the collapse of a spinning star or an early universe with spin. A set of equations that describe a macroscopic perfect fluid in the EC theory is derived and examined. The equations are derived starting from the fundamental variational principle for a perfect fluid in general relativity. A brief review of the study by Ray (1972) is included, and the results for the EC theory are presented.

  12. Current Concerns in Validity Theory.

    ERIC Educational Resources Information Center

    Kane, Michael

Validity is concerned with the clarification and justification of the intended interpretations and uses of observed scores. It has not been easy to formulate a general methodology or set of principles for validation, but progress has been made, especially as the field has moved from relatively limited criterion-related models to sophisticated…

  13. Using "Relationship Marketing" Theory To Develop a Training Model for Admissions Recruiters.

    ERIC Educational Resources Information Center

    Gyure, James F.; Arnold, Susan G.

    2001-01-01

    Addresses a critical aspect of enrollment management by providing a "conceptual training outline" based on relationship marketing and management principles for admissions recruiters and other appropriate enrollment staff. Provides a set of "Attitude Tools" to suggest how various training methods might benefit from a consistent…

  14. GEOMETRY, TENTATIVE GUIDES.

    ERIC Educational Resources Information Center

    KLIER, KATHERINE M.

    PRESENTED IS A FUSED COURSE IN PLANE, SOLID, AND COORDINATE GEOMETRY. ELEMENTARY SET THEORY, LOGIC, AND THE PRINCIPLE OF SEPARATION PROVIDE UNIFYING THREADS THROUGHOUT THE TEXT. THE TWO CURRICULUM GUIDES HAVE BEEN PREPARED FOR USE WITH TWO DIFFERENT TEXTS. EITHER CURRICULUM GUIDE MAY BE USED DEPENDING UPON THE CHOICE OF THE TEACHER AND THE NEEDS…

  15. What Is Diversity Pedagogy?

    ERIC Educational Resources Information Center

    Sheets, Rosa Hernandez

    2009-01-01

    Diversity Pedagogy Theory (DPT) is a set of principles that point out the natural and inseparable connection between culture and cognition. In other words, to be effective as a teacher, he/she must understand and acknowledge the critical role culture plays in the teaching-learning process. DPT maintains that culturally inclusive teachers (a)…

  16. Generalized global symmetries in states with dynamical defects: The case of the transverse sound in field theory and holography

    NASA Astrophysics Data System (ADS)

    Grozdanov, Sašo; Poovuttikul, Napat

    2018-05-01

In this work, we show how states with conserved numbers of dynamical defects (strings, domain walls, etc.) can be understood as possessing generalized global symmetries even when the microscopic origins of these symmetries are unknown. Using this philosophy, we build an effective theory of a 2+1-dimensional fluid state with two perpendicular sets of immersed elastic line defects. When the number of defects is independently conserved in each set, then the state possesses two one-form symmetries. Normally, such viscoelastic states are described as fluids coupled to Goldstone bosons associated with spontaneous breaking of translational symmetry caused by the underlying microscopic structure, the principal feature of which is a transverse sound mode. At the linear, nondissipative level, we verify that our theory, based entirely on symmetry principles, is equivalent to a viscoelastic theory. We then build a simple holographic dual of such a state containing dynamical gravity and two two-form gauge fields, and use it to study its hydrodynamic and higher-energy spectral properties characterized by nonhydrodynamic, gapped modes. Based on the holographic analysis of transverse two-point functions, we study consistency between low-energy predictions of the bulk theory and the effective boundary theory. Various new features of the holographic dictionary are explained in theories with higher-form symmetries, such as the mixed-boundary-condition modification of the quasinormal mode prescription that depends on the running coupling of the boundary double-trace deformations. Furthermore, we examine details of low- and high-energy parts of the spectrum that depend on temperature, line defect densities and the renormalization group scale.

  17. On a New Theory of the System of Reference

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2003-04-01

    A new theory of the system of reference is suggested. It represents the new point of view which has arisen from the critical analysis of the foundations of physics (in particular, the theory of relativity and quantum mechanics), mathematics, cosmology and philosophy. The main idea following from the analysis is that the correct concept of system of reference represents a key to comprehension of many basic logic errors which are in modern physics. The starting point of the theory is represented by the philosophical (dialectical materialistic) principles, in particular, the gnosiological principle. (The gnosiological principle is briefly formulated as follows. The purpose of a science is to know the laws of the Nature. The law is a form of scientific knowledge of the essence and the phenomenon. The essence is the internal basis of the phenomenon, and the phenomenon is the manifestation of the essence. Human practice is a basis of knowledge and a criterion of truth). These principles lead to the following statements. (1) The reality is the dialectical unity of the opposites: the objective reality and the non-objective (subjective) reality. (2) The system mankind + means of knowledge belongs to the subjective reality and is called system of reference. In this wide sense, the system of reference is the universal informational gnostic basis (i.e. the system consisting of natural objects and processes, of constructed devices and instruments, of sum of human knowledge and skills) created and used by mankind for the purpose of knowledge of the world. (3) The opposites are bounds of each other. Hence, the principle of objectivity of the physical laws is formulated as follows: the objective physical laws must not contain mentions of system of reference (in particular, references to procedure of measurement or of calculation). 
(4) The main informational property of the unitary system "set of researched physical objects + system of reference" is that the system of reference determines (measures, calculates) the parameters of the subsystem "set of researched physical objects" (for example, the coordinates x_M, y_M, z_M of the object M); the parameters characterize the system of reference (for example, the system of coordinates). (5) The main gnostic property of the unitary system "set of researched physical objects + system of reference" is that the system of reference defines (formulates) the physical laws (i.e. creates the theories); the physical laws characterize the system of reference. (6) The parameters which take on values independently of the existence of the researched physical objects characterize the system of reference. For example, the clock C, a part of the system of reference S, determines (but does not measure!) the time t_C; the time t_C characterizes the clock C. If all clocks have been synchronized, the universal time t_S characterizes the system of reference S. (7) A researched physical object M and a clock are mutually independent objects. Hence, the coordinates x_M, y_M, z_M and the time t_S are mutually independent parameters. (8) The informational one-to-one correspondence between the motion of the object M and the physical clock-process in the clock is established (is defined) by man. For example, it has the form dx_M/dt_S ≡ v_x_M. Consequences: (a) information about the world is ordered information because the system of reference S is an ordered and universal system. 
This information is objective if it does not depend on a system of reference; (b) mathematical operations on physical quantities involving the coordinates and the time are allowed by the laws of logic because "set of researched physical objects + system of reference" is a unitary system; (c) the principle of existence and of transformation of coordinates: there are no coordinates and no transformation of coordinates in general; there exist only the coordinates x_M, y_M, z_M and the transformation of the coordinates x_M, y_M, z_M of the object M; (d) the special and general theories of relativity are erroneous theories because their foundations, firstly, do not satisfy the principle of objectivity of the physical laws; secondly, they contradict the principle of transformation of coordinates; and, thirdly, they assume mutual dependence between the researched physical object and a clock (i.e. between coordinates and time); (e) quantum mechanics does not satisfy the principle of objectivity of the physical laws.

  18. Bioethics of life programs: taking seriously moral pluralism in clinical settings.

    PubMed

    Niebroj, Leslaw

    2010-11-04

In the more and more globalized world, the experience of moral pluralism (often related to, or based upon, religious pluralism) has become a common issue whose ethical importance is undeniable. Potential conflicts between patients' and therapeutic teams' moral views, and between the moral beliefs of particular members of the team, are being resolved in the light of bioethical theories, among which principlism remains the mainstream approach to biomedical ethics. The question arises, however, whether this approach, in itself, as being strictly bound to a specific and distinct American philosophical tradition, is to be considered a tool of so-called 'moral imperialism'. Although the architects of principlism, in particular by elaborating the concept of common morality, defend the applicability of their theory to pluralistic settings, it should be emphasized that the idea that some norms and standards of moral character are shared by all morally serious people in every culture has attracted criticism on both empirical and theoretical grounds. This paper aims at reconsidering principlism so that it would be more suitable for resolving moral dilemmas in ethically pluralistic clinical settings. Lakatos' sophisticated methodological falsificationism is used in two different ways: (1) to construct a concept of 'life programs' and (2) to confront the newly elaborated ethical theory with principlism. The reflection is limited to the norms related to the key issue in clinical ethics, i.e., respecting the patient's autonomy. The concepts of common morality and particular moralities are interpreted (in the light of Lakatos' philosophy of science) as the "hard core" and "protective belt" of life programs, respectively. While accepting the diversity of research programs, Lakatos maintains the idea of the objectivity of truth. Analogously, the plurality of life programs does not put into question the objectivity of moral values. 
The plurality of moral norms not only respects the objectivity of the good, but also can be seen as a condition sine qua non of such objectivity in the changing socio-historical context of doctor-patient relationship. The life program approach to bioethics and clinical ethics in particular, can be seen as a form of widening of principlism. This new approach, being non-relativistic, is at the same time sensitive to moral pluralism experienced in everyday medical practice.

  19. Bioethics of life programs: Taking seriously moral pluralism in clinical settings

    PubMed Central

    2010-01-01

Background In the more and more globalized world, the experience of moral pluralism (often related to, or based upon, religious pluralism) has become a common issue whose ethical importance is undeniable. Potential conflicts between patients' and therapeutic teams' moral views, and between the moral beliefs of particular members of the team, are being resolved in the light of bioethical theories, among which principlism remains the mainstream approach to biomedical ethics. The question arises, however, whether this approach, in itself, as being strictly bound to a specific and distinct American philosophical tradition, is to be considered a tool of so-called 'moral imperialism'. Although the architects of principlism, in particular by elaborating the concept of common morality, defend the applicability of their theory to pluralistic settings, it should be emphasized that the idea that some norms and standards of moral character are shared by all morally serious people in every culture has attracted criticism on both empirical and theoretical grounds. Objective This paper aims at reconsidering principlism so that it would be more suitable for resolving moral dilemmas in ethically pluralistic clinical settings. Methods Lakatos' sophisticated methodological falsificationism is used in two different ways: (1) to construct a concept of 'life programs' and (2) to confront the newly elaborated ethical theory with principlism. The reflection is limited to the norms related to the key issue in clinical ethics, i.e., respecting the patient's autonomy. Results The concepts of common morality and particular moralities are interpreted (in the light of Lakatos' philosophy of science) as the 'hard core' and 'protective belt' of life programs, respectively. While accepting the diversity of research programs, Lakatos maintains the idea of the objectivity of truth. Analogously, the plurality of life programs does not put into question the objectivity of moral values. 
The plurality of moral norms not only respects the objectivity of the good, but also can be seen as a condition sine qua non of such objectivity in the changing socio-historical context of doctor-patient relationship. Conclusions The life program approach to bioethics and clinical ethics in particular, can be seen as a form of widening of principlism. This new approach, being non-relativistic, is at the same time sensitive to moral pluralism experienced in everyday medical practice. PMID:21147632

  20. A novel gene network inference algorithm using predictive minimum description length approach.

    PubMed

    Chaitankar, Vijender; Ghosh, Preetam; Perkins, Edward J; Gong, Ping; Deng, Youping; Zhang, Chaoyang

    2010-05-28

Reverse engineering of gene regulatory networks using information theory models has received much attention due to its simplicity, low computational cost, and capability of inferring large networks. One of the major problems with information theory models is to determine the threshold which defines the regulatory relationships between genes. The minimum description length (MDL) principle has been implemented to overcome this problem. The description length of the MDL principle is the sum of model length and data encoding length. A user-specified fine-tuning parameter is used as a control mechanism between model and data encoding, but it is difficult to find the optimal parameter. In this work, we proposed a new inference algorithm which incorporates mutual information (MI), conditional mutual information (CMI) and the predictive minimum description length (PMDL) principle to infer gene regulatory networks from DNA microarray data. In this algorithm, the information theoretic quantities MI and CMI determine the regulatory relationships between genes, and the PMDL principle method attempts to determine the best MI threshold without the need for a user-specified fine-tuning parameter. The performance of the proposed algorithm was evaluated using both synthetic time series data sets and a biological time series data set for the yeast Saccharomyces cerevisiae. The benchmark quantities precision and recall were used as performance measures. The results show that the proposed algorithm produced fewer false edges and significantly improved the precision, as compared to the existing algorithm. For further analysis, the performance of the algorithms was observed over different data sizes. We have proposed a new algorithm that implements the PMDL principle for inferring gene regulatory networks from time series DNA microarray data and that eliminates the need for a fine-tuning parameter. 
The evaluation results obtained from both synthetic and actual biological data sets show that the PMDL principle is effective in determining the MI threshold and the developed algorithm improves precision of gene regulatory network inference. Based on the sensitivity analysis of all tested cases, an optimal CMI threshold value has been identified. Finally it was observed that the performance of the algorithms saturates at a certain threshold of data size.
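The MI step of such a pipeline can be illustrated with a toy sketch: compute pairwise mutual information between discretized expression profiles and keep edges whose MI exceeds a threshold. The gene names, profiles, and fixed threshold below are hypothetical; the paper's PMDL method, which selects the threshold automatically, is not reproduced here.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """MI (in bits) between two equal-length discrete symbol sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        mi += p_ab * math.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return mi

# Hypothetical discretized expression profiles (L = low, H = high).
genes = {
    "g1": "LLHHLLHH",
    "g2": "LLHHLLHH",   # identical to g1 -> high MI
    "g3": "LHLHLHLH",   # statistically unrelated pattern -> MI of 0
}
# Fixed threshold for illustration only (PMDL would choose it from the data).
threshold = 0.5
edges = [(a, b) for a in genes for b in genes
         if a < b and mutual_information(genes[a], genes[b]) > threshold]
print(edges)  # -> [('g1', 'g2')]
```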

  1. Starlings uphold principles of economic rationality for delay and probability of reward.

    PubMed

    Monteiro, Tiago; Vasconcelos, Marco; Kacelnik, Alex

    2013-04-07

    Rationality principles are the bedrock of normative theories of decision-making in biology and microeconomics, but whereas in microeconomics, consistent choice underlies the notion of utility; in biology, the assumption of consistent selective pressures justifies modelling decision mechanisms as if they were designed to maximize fitness. In either case, violations of consistency contradict expectations and attract theoretical interest. Reported violations of rationality in non-humans include intransitivity (i.e. circular preferences) and lack of independence of irrelevant alternatives (changes in relative preference between options when embedded in different choice sets), but the extent to which these observations truly represent breaches of rationality is debatable. We tested both principles with starlings (Sturnus vulgaris), training subjects either with five options differing in food delay (exp. 1) or with six options differing in reward probability (exp. 2), before letting them choose repeatedly one option out of several binary and trinary sets of options. The starlings conformed to economic rationality on both tests, showing strong stochastic transitivity and no violation of the independence principle. These results endorse the rational choice and optimality approaches used in behavioural ecology, and highlight the need for functional and mechanistic enquiring when apparent violations of such principles are observed.
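Strong stochastic transitivity, which the starlings exhibited, can be stated operationally: whenever P(a over b) ≥ 0.5 and P(b over c) ≥ 0.5, then P(a over c) ≥ max of the two. A minimal check over hypothetical choice proportions (the options and numbers below are illustrative, not the paper's data):

```python
from itertools import permutations

def satisfies_strong_stochastic_transitivity(p):
    """p[(a, b)] = proportion of trials on which option a was chosen over b.
    Strong stochastic transitivity: if P(a>b) >= .5 and P(b>c) >= .5,
    then P(a>c) >= max(P(a>b), P(b>c))."""
    options = {a for pair in p for a in pair}
    for a, b, c in permutations(options, 3):
        if p.get((a, b), 0) >= 0.5 and p.get((b, c), 0) >= 0.5:
            if p.get((a, c), 0) < max(p[(a, b)], p[(b, c)]):
                return False
    return True

# Hypothetical choice proportions among three options.
p = {("A", "B"): 0.8, ("B", "A"): 0.2,
     ("B", "C"): 0.7, ("C", "B"): 0.3,
     ("A", "C"): 0.9, ("C", "A"): 0.1}
print(satisfies_strong_stochastic_transitivity(p))  # -> True
```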

  2. Starlings uphold principles of economic rationality for delay and probability of reward

    PubMed Central

    Monteiro, Tiago; Vasconcelos, Marco; Kacelnik, Alex

    2013-01-01

    Rationality principles are the bedrock of normative theories of decision-making in biology and microeconomics, but whereas in microeconomics, consistent choice underlies the notion of utility; in biology, the assumption of consistent selective pressures justifies modelling decision mechanisms as if they were designed to maximize fitness. In either case, violations of consistency contradict expectations and attract theoretical interest. Reported violations of rationality in non-humans include intransitivity (i.e. circular preferences) and lack of independence of irrelevant alternatives (changes in relative preference between options when embedded in different choice sets), but the extent to which these observations truly represent breaches of rationality is debatable. We tested both principles with starlings (Sturnus vulgaris), training subjects either with five options differing in food delay (exp. 1) or with six options differing in reward probability (exp. 2), before letting them choose repeatedly one option out of several binary and trinary sets of options. The starlings conformed to economic rationality on both tests, showing strong stochastic transitivity and no violation of the independence principle. These results endorse the rational choice and optimality approaches used in behavioural ecology, and highlight the need for functional and mechanistic enquiring when apparent violations of such principles are observed. PMID:23390098

  3. Master surgeons' operative teaching philosophies: a qualitative analysis of parallels to learning theory.

    PubMed

    Pernar, Luise I M; Ashley, Stanley W; Smink, Douglas S; Zinner, Michael J; Peyre, Sarah E

    2012-01-01

    Practicing within the Halstedian model of surgical education, academic surgeons serve dual roles as physicians to their patients and educators of their trainees. Despite this significant responsibility, few surgeons receive formal training in educational theory to inform their practice. The goal of this work was to gain an understanding of how master surgeons approach teaching uncommon and highly complex operations and to determine the educational constructs that frame their teaching philosophies and approaches. Individuals included in the study were queried using electronically distributed open-ended, structured surveys. Responses to the surveys were analyzed and grouped using grounded theory and were examined for parallels to concepts of learning theory. The setting was an academic teaching hospital; participants were 22 individuals identified as master surgeons. Twenty-one (95.5%) individuals responded to the survey. Two primary thematic clusters were identified: global approach to teaching (90.5% of respondents) and approach to intraoperative teaching (76.2%). Many of the emergent themes paralleled principles of transfer learning theory outlined in the psychology and education literature. Key elements included: conferring graduated responsibility (57.1%), encouraging development of a mental set (47.6%), fostering or expecting deliberate practice (42.9%), deconstructing complex tasks (38.1%), vertical transfer of information (33.3%), and identifying general principles to structure knowledge (9.5%). Master surgeons employ many of the principles of learning theory when teaching uncommon and highly complex operations. The findings may hold significant implications for faculty development in surgical education.

  4. What is the role of theories in the study of schizophrenia?

    PubMed

    Cannon, Tyrone D

    2009-05-01

    As an epilogue to the themed papers on "Theories of Schizophrenia" in this issue of Schizophrenia Bulletin, this article reviews some basic philosophy of science principles in regard to the role of theories in the evolving state of a natural science discipline. While in early phases inductive and abductive logic are the primary vehicles for organizing observations and developing models, once a critical set of "facts" has been elucidated that can be explained by competing theoretical perspectives, hypothetico-deductive logic provides a more robust and efficient approach to scientific progress. The key principle is to determine where two or more theories predict different observations and then to devise studies that collect critical observations: correlations or experimental outcomes that are predicted differentially by the competing theories. To a large extent, current theories of schizophrenia (eg, focusing on aberrant dopaminergic signaling, neural dysconnectivity, and disrupted neural development) are not (and are not intended by their authors to be) mutually exclusive of each other. Rather, they provide explanations that differ in relative emphases, eg, on distal vs proximal causes and on broad vs narrow behavioral end points. It is therefore possible for all of them to be "right" at least in a general sense. This non-exclusivity is problematic when considered in light of the strong-inference principles characteristic of a mature natural science discipline. The contrast points are likely to be found in constructions that integrate influences across different levels of analysis, as in additive vs interactive models, direct effects vs mediation models, and developmental vs deteriorative models.

  5. Prioritizing health services research: an economic perspective.

    PubMed

    Gandjour, Afschin

    2016-05-01

    Given limited resources, policymakers need to decide how much and in what areas of health services research (HSR) to invest. The purpose of this study is to provide guidance for priority setting of HSR projects based on economic theory. The conceptual analysis starts from the premise that competition in health care is valuable, a position that seems to predominate among Western policymakers. The principle of competition rests on economic theory and, in particular, its branch of welfare economics. Based on economic theory, the role of HSR is to detect and alleviate information asymmetry, negative externalities, and harm caused by competition and inappropriate incentives for competition. A hierarchy of HSR projects is provided; following the ethical principle of harm ('do no harm'), the detection and prevention of harm would receive highest priority among HSR projects. Agreeing that competition is valuable in achieving efficiency and quality of care (and therefore agreeing to the assumptions of economic theory) implies accepting the role of HSR in detecting market failure and the HSR hierarchy as suggested. Disagreement would require an alternative coherent concept of improving efficiency and quality of care.

  6. The differential effects of tangible rewards and praise on intrinsic motivation: A comparison of cognitive evaluation theory and operant theory.

    PubMed

    Carton, J S

    1996-01-01

    Substantial research indicates that tangible rewards, such as money, prizes, and tokens, decrease response rates by undermining intrinsic motivation. In contrast, praise appears to increase response rates by enhancing intrinsic motivation. Based on their interpretation of available evidence, many social-cognitive researchers warn not to use tangible rewards in applied settings and to use praise instead. Furthermore, they suggest that the differential effects of the two types of rewards on intrinsic motivation cannot be explained using principles of operant psychology. Cognitive evaluation theory provides one of the most recent and widely cited social-cognitive explanations for the different effects of the two types of rewards on intrinsic motivation (Deci & Ryan, 1985). However, a review of existing research found little support for the explanations based on this theory and revealed three potential confounding effects: (a) temporal contiguity, (b) the number of reward administrations, and (c) discriminative stimuli associated with reward availability. These three confounding factors provide explanations for the effects of tangible rewards and praise on intrinsic motivation that are consistent with principles of operant psychology.

  7. The differential effects of tangible rewards and praise on intrinsic motivation: A comparison of cognitive evaluation theory and operant theory

    PubMed Central

    Carton, John S.

    1996-01-01

    Substantial research indicates that tangible rewards, such as money, prizes, and tokens, decrease response rates by undermining intrinsic motivation. In contrast, praise appears to increase response rates by enhancing intrinsic motivation. Based on their interpretation of available evidence, many social-cognitive researchers warn not to use tangible rewards in applied settings and to use praise instead. Furthermore, they suggest that the differential effects of the two types of rewards on intrinsic motivation cannot be explained using principles of operant psychology. Cognitive evaluation theory provides one of the most recent and widely cited social-cognitive explanations for the different effects of the two types of rewards on intrinsic motivation (Deci & Ryan, 1985). However, a review of existing research found little support for the explanations based on this theory and revealed three potential confounding effects: (a) temporal contiguity, (b) the number of reward administrations, and (c) discriminative stimuli associated with reward availability. These three confounding factors provide explanations for the effects of tangible rewards and praise on intrinsic motivation that are consistent with principles of operant psychology. PMID:22478261

  8. Resource Theory of Superposition

    NASA Astrophysics Data System (ADS)

    Theurer, T.; Killoran, N.; Egloff, D.; Plenio, M. B.

    2017-12-01

    The superposition principle lies at the heart of many nonclassical properties of quantum mechanics. Motivated by this, we introduce a rigorous resource theory framework for the quantification of superposition of a finite number of linearly independent states. This theory is a generalization of resource theories of coherence. We determine the general structure of operations which do not create superposition, find a fundamental connection to unambiguous state discrimination, and propose several quantitative superposition measures. Using this theory, we show that trace decreasing operations can be completed for free which, when specialized to the theory of coherence, resolves an outstanding open question and is used to address the free probabilistic transformation between pure states. Finally, we prove that linearly independent superposition is a necessary and sufficient condition for the faithful creation of entanglement in discrete settings, establishing a strong structural connection between our theory of superposition and entanglement theory.
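    The resource theory above is defined relative to a fixed finite set of linearly independent states. As a side illustration (not from the paper), linear independence of finite-dimensional state vectors can be checked numerically via the rank of the matrix they form:

```python
import numpy as np

def linearly_independent(states, tol=1e-10):
    """True if the given state vectors (rows) are linearly independent."""
    m = np.array(states, dtype=complex)
    return np.linalg.matrix_rank(m, tol=tol) == len(states)

# Computational basis states are independent; adding their
# equal superposition makes the set linearly dependent.
zero = [1, 0]
one = [0, 1]
plus = [2 ** -0.5, 2 ** -0.5]
print(linearly_independent([zero, one]))        # True
print(linearly_independent([zero, one, plus]))  # False
```

    Three vectors can never be independent in a two-dimensional space, which is why appending the superposition state fails the rank test.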

  9. Implementing accountability for reasonableness--the case of pharmaceutical reimbursement in Sweden.

    PubMed

    Jansson, Sandra

    2007-04-01

    This paper aims to describe the priority-setting procedure for new original pharmaceuticals practiced by the Swedish Pharmaceutical Benefits Board (LFN), to analyse the outcome of the procedure in terms of decisions and the relative importance of ethical principles, and to examine the reactions of stakeholders. All the 'principally important' decisions made by the LFN during its first 33 months of operation were analysed. The study is theoretically anchored in the theory of fair and legitimate priority-setting procedures by Daniels and Sabin, and is based on public documents, media articles, and semi-structured interviews. Only nine cases resulted in a rejection of a subsidy by the LFN and 15 in a limited or conditional subsidy. Total rejections rather than limitations gave rise to actions by stakeholders. Primarily, the principle of cost-effectiveness was used when limiting/conditioning or totally rejecting a subsidy. This study suggests that implementing a priority-setting process that fulfils the conditions of accountability for reasonableness can result in a priority-setting process which is generally perceived as fair and legitimate by the major stakeholders and may increase social learning in terms of accepting the necessity of priority setting in health care. The principle of cost-effectiveness increased in importance when the demand for openness and transparency increased.

  10. Inertial Frames Without the Relativity Principle: Breaking Lorentz Symmetry

    NASA Astrophysics Data System (ADS)

    Baccetti, Valentina; Tate, Kyle; Visser, Matt

    2015-01-01

    We investigate inertial frames in the absence of Lorentz invariance, reconsidering the usual group structure implied by the relativity principle. We abandon the relativity principle, discarding the group structure for the transformations between inertial frames, while requiring these transformations to be at least linear (to preserve homogeneity). In theories with a preferred frame (aether), the set of transformations between inertial frames forms a groupoid/pseudogroup instead of a group, a characteristic essential to evading the von Ignatowsky theorems. In order to understand the dynamics, we also demonstrate that the transformation rules for energy and momentum are in general affine. We finally focus on one specific and compelling model implementing a minimalist violation of Lorentz invariance.

  11. Theory-Driven Intervention for Changing Personality: Expectancy Value Theory, Behavioral Activation, and Conscientiousness

    PubMed Central

    Magidson, Jessica F.; Roberts, Brent; Collado-Rodriguez, Anahi; Lejuez, C.W.

    2013-01-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach, that is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that over time become automatized and manifest in changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this manuscript proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of Expectancy Value Theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically driven, bottom-up approach to changing personality traits. PMID:23106844

  12. Illustrating Caffeine's Pharmacological and Expectancy Effects Utilizing a Balanced Placebo Design.

    ERIC Educational Resources Information Center

    Lotshaw, Sandra C.; And Others

    1996-01-01

    Hypothesizes that pharmacological and expectancy effects may be two principles that govern caffeine consumption in the same way they affect other drug use. Tests this theory through a balanced placebo design on 100 male undergraduate students. Expectancy set and caffeine content appeared equally powerful, and worked additively, to affect…

  13. Socio-Economic and Educational Reforms in Ethiopia (1942-1974): Correspondence and Contradiction.

    ERIC Educational Resources Information Center

    Asayehgn, Desta

    Using the theory of correspondence and contradiction, the author analyzes the interaction between socioeconomic and educational changes in Ethiopia from 1942 to 1974. An introductory section sets forth the principles of correspondence and contradiction, which refer to how the means of economic production determine conditions in the noneconomic…

  14. Graphic Arts: Book One. Orientation, Composition, and Paste-up.

    ERIC Educational Resources Information Center

    Farajollahi, Karim; And Others

    The first of a three-volume set of instructional materials for a graphic arts course, this manual consists of 13 instructional units. Covered in the units are orientation (career overview, shop safety, shop organization, photo-offset theory, legal restrictions, and applying for a job); principles of copy planning (overview of copy planning and…

  15. Student Development in Urban Commuter Colleges.

    ERIC Educational Resources Information Center

    Creamer, Don G.

    A conceptual view of student development and the milieu of an urban commuter college are discussed. Student development is defined as the application of human development theory, principles, and concepts in an educational setting to identify the forms of development in students to which the institution is willing and able to commit its resources.…

  16. Arguments for a Common Set of Principles for Collaborative Inquiry in Evaluation

    ERIC Educational Resources Information Center

    Cousins, J. Bradley; Whitmore, Elizabeth; Shulha, Lyn

    2013-01-01

    In this article, we critique two recent theoretical developments about collaborative inquiry in evaluation--using logic models as a means to understand theory, and efforts to compartmentalize versions of collaborative inquiry into discrete genres--as a basis for considering future direction for the field. We argue that collaborative inquiry in…

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faizal, Mir, E-mail: f2mir@uwaterloo.ca; Majumder, Barun, E-mail: barunbasanta@iitgn.ac.in

    In this paper, we will incorporate the generalized uncertainty principle into field theories with Lifshitz scaling. We will first construct both bosonic and fermionic theories with Lifshitz scaling based on the generalized uncertainty principle. After that we will incorporate the generalized uncertainty principle into a non-abelian gauge theory with Lifshitz scaling. We will observe that even though the action for this theory is non-local, it is invariant under local gauge transformations. We will also perform the stochastic quantization of this Lifshitz fermionic theory based on the generalized uncertainty principle.

  18. Adiabatic quantum computation

    NASA Astrophysics Data System (ADS)

    Albash, Tameem; Lidar, Daniel A.

    2018-01-01

    Adiabatic quantum computing (AQC) started as an approach to solving optimization problems and has evolved into an important universal alternative to the standard circuit model of quantum computing, with deep connections to both classical and quantum complexity theory and condensed matter physics. This review gives an account of the major theoretical developments in the field, while focusing on the closed-system setting. The review is organized around a series of topics that are essential to an understanding of the underlying principles of AQC, its algorithmic accomplishments and limitations, and its scope in the more general setting of computational complexity theory. Several variants are presented of the adiabatic theorem, the cornerstone of AQC, and examples are given of explicit AQC algorithms that exhibit a quantum speedup. An overview of several proofs of the universality of AQC and related Hamiltonian quantum complexity theory is given. Considerable space is devoted to stoquastic AQC, the setting of most AQC work to date, where obstructions to success and their possible resolutions are discussed.
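    The adiabatic theorem at the heart of AQC can be seen in a toy numerical experiment (the Hamiltonians and sweep times below are arbitrary illustrative choices, not from the review): a single qubit swept along H(s) = (1-s)H0 + sH1 from H0 = -X to H1 = -Z ends near the ground state of H1 only when the sweep is slow relative to the minimum spectral gap.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def evolve(T, steps=2000):
    """Evolve the ground state of H0 = -X under the interpolation
    H(s) = -(1-s)X - sZ with s = t/T; return overlap with |0>,
    the ground state of H1 = -Z."""
    psi = np.array([1, 1], dtype=complex) / np.sqrt(2)  # ground state of -X
    dt = T / steps
    for k in range(steps):
        s = (k + 0.5) / steps
        H = -(1 - s) * X - s * Z
        w, v = np.linalg.eigh(H)  # exact propagator for this small step
        psi = v @ (np.exp(-1j * w * dt) * (v.conj().T @ psi))
    return abs(psi[0]) ** 2

print(evolve(T=0.5))   # fast sweep: poor overlap with the target ground state
print(evolve(T=50.0))  # slow (adiabatic) sweep: overlap close to 1
```

    The minimum gap of this interpolation is of order one, so a total time of a few tens of units is already deep in the adiabatic regime; shrinking T toward the sudden limit leaves the state stuck near the initial ground state instead.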

  19. A first-principles examination of the asymmetric induction model in the binap/Rh(I)-catalysed 1,4-addition of phenylboronic acid to cyclic enones by density functional theory calculations.

    PubMed

    Qin, Hua-Li; Chen, Xiao-Qing; Huang, Yi-Zhen; Kantchev, Eric Assen B

    2014-09-26

    First-principles modelling of the diastereomeric transition states in the enantiodiscrimination stage of the catalytic cycle can reveal intimate details about the mechanism of enantioselection. This information can be invaluable for further improvement of the catalytic protocols by rational design. Herein, we present a density functional theory (IEFPCM/PBE0/DGDZVP level of theory) modelling of the carborhodation step for the asymmetric 1,4-arylation of cyclic α,β-unsaturated ketones mediated by a [(binap)Rh(I)] catalyst. The calculations completely support the older, qualitative, pictorial model predicting the sense of the asymmetric induction for both the chelating diphosphane (binap) and the more recent chiral diene (Phbod) ligands, while also permitting quantification of the enantiomeric excess (ee). The effect of dispersion interaction correction and basis sets has been also investigated. Dispersion-corrected functionals and solvation models significantly improve the predicted ee values.

  20. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials.

    PubMed

    Tsyshevsky, Roman V; Sharia, Onise; Kuklja, Maija M

    2016-02-19

    This review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  1. Molecular Theory of Detonation Initiation: Insight from First Principles Modeling of the Decomposition Mechanisms of Organic Nitro Energetic Materials

    DOE PAGES

    Tsyshevsky, Roman; Sharia, Onise; Kuklja, Maija

    2016-02-19

    Our review presents a concept, which assumes that thermal decomposition processes play a major role in defining the sensitivity of organic energetic materials to detonation initiation. As a science and engineering community we are still far away from having a comprehensive molecular detonation initiation theory in a widely agreed upon form. However, recent advances in experimental and theoretical methods allow for a constructive and rigorous approach to design and test the theory or at least some of its fundamental building blocks. In this review, we analyzed a set of select experimental and theoretical articles, which were augmented by our own first principles modeling and simulations, to reveal new trends in energetic materials and to refine known existing correlations between their structures, properties, and functions. Lastly, our consideration is intentionally limited to the processes of thermally stimulated chemical reactions at the earliest stage of decomposition of molecules and materials containing defects.

  2. Middle-aged women's preferred theory-based features in mobile physical activity applications.

    PubMed

    Ehlers, Diane K; Huberty, Jennifer L

    2014-09-01

    The purpose of this study was to describe which theory-based behavioral and technological features middle-aged women prefer to be included in a mobile application designed to help them adopt and maintain regular physical activity (PA). Women aged 30 to 64 years (N = 120) completed an online survey measuring their demographics and mobile PA application preferences. The survey was developed upon behavioral principles of Social Cognitive Theory, recent mobile app research, and technology adoption principles of the Unified Theory of Acceptance and Use of Technology. Frequencies were calculated and content analyses conducted to identify which features women most preferred. Behavioral features that help women self-regulate their PA (PA tracking, goal-setting, progress monitoring) were most preferred. Technological features that enhance perceived effort expectancy and playfulness were most preferred. Many women reported the desire to interact and compete with others through the application. Theory-based PA self-regulation features and theory-based design features that improve perceived effort expectancy and playfulness may be most beneficial in a mobile PA application for middle-aged women. Opportunities to interact with other people and the employment of social, game-like activities may also be attractive. Interdisciplinary engagement of experts in PA behavior change, technology adoption, and software development is needed.

  3. For the Problem of Knowledge of the Universe

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2003-04-01

    A new approach to the problem of knowledge of the Universe is suggested. This approach represents a new point of view arising from a critical analysis of the foundations of physics, cosmology, and philosophy (dialectical materialism). The principal idea following from the analysis is that only the concepts (categories) and principles of dialectics form a basis for a correct theory of the Universe. The foundation of such a theory is represented by the following philosophical principles. (a) The principle of the materiality (objective reality) of Nature: Nature (the Universe) is a system (set) of material objects (particles, bodies, fields). (b) The principle of the existence of a material object: an object exists as objective reality, and movement is a form of the existence of an object. (c) The principle (definition) of the movement of an object: movement is change in general (i.e. the transition of some states into others); movement determines a direction, and direction characterizes movement. (d) The principle of the existence of time: time exists as a parameter of the system of reference. (e) The principle of the unity of opposites. (In particular, this means that opposites are bounds of each other; in other words, if a class is divided into two non-intersecting (opposite) subclasses, then each subclass is a bound of the other.) (f) The system principle: the properties of a system are not a logical consequence or corollary of the properties of the elements of the system. These principles result in the following main statements about Nature (the Universe). The Universe does not contain a subset of non-objects (i.e. an empty subset). Therefore, in accordance with (e), the Universe has no objective bound. This means that size (radius), mass, and energy are not parameters of the Universe (i.e. size, mass, and energy have physical meaning only in the case of limited objects). Consequences:
(1) The generally accepted theories of the evolution of the Universe (for example, the Big Bang theory) are incorrect if they contain the assumption of the existence of a bound of the Universe (i.e. if concepts of a singular state of the Universe, of the expansion or contraction of the Universe, or of the stability or instability of the Universe underlie the theories). (2) The principle of the movement of the Universe: the movement of the Universe is represented by the birth and extermination of objects and, consequently, by change in the structure of the Universe as a system. In particular, this means that there exist copies of objects. (In fact, the principle of the identity of quantum particles of the same kind (for example, electrons) is a form of manifestation (consequence) of the principle of the birth and extermination of objects.) (3) The principle of the movement of the Universe is a form of manifestation of the principle of the development of the Universe (Nature). (4) The principle of the development of the Universe (Nature) is a form of manifestation of the principle of the existence of the Unitary (i.e. of God). (5) The principle of the existence of the Unitary (God): the Unitary (God) exists as the dialectical unity and identity of the opposites, the real (material) and non-real (non-material) aspects of the Unitary. (6) The principle of the existence of the Unitary (God) represents the principal point of a new dialectical gnosiology (i.e. theory of knowledge). (7) The dialectical unity and identity of human practice and of human non-practice (i.e. of human ethics based on comprehension of the existence of God) is the criterion of human truth. (8) The criterion of human truth determines a true way of knowledge of Nature (the Universe). (9) A bound of human knowledge and of human development is determined by the development principle. (10) The existence of the bound of human development means the existence of a bound only between Mankind and non-Mankind (i.e. the Supreme Intellect).
Hence, the end of the evolution of Mankind means the transition, transformation, transmutation, and dissolution of Mankind into the Supreme Intellect. The comprehension of the existence of the Supreme Intellect around us will mark the beginning of contact and dialogue with it.

  4. The foundational principles as psychological lodestars: Theoretical inspiration and empirical direction in rehabilitation psychology.

    PubMed

    Dunn, Dana S; Ehde, Dawn M; Wegener, Stephen T

    2016-02-01

    Historically, the Foundational Principles articulated by Wright (1983) and others guided theory development, research and scholarship, and practice in rehabilitation psychology. In recent decades, these principles have become more implicit and less explicit or expressive in the writings and work of rehabilitation professionals. We believe that the Foundational Principles are essential lodestars for working with people with disabilities that can guide inquiry, practice, and service. To introduce this special issue, this commentary identifies and defines key Foundational Principles, including, for example, Lewin's (1935) person-environment relation, adjustment to disability, the malleability of self-perceptions of bodily states, and the importance of promoting dignity for people with disabilities. We then consider the role the Foundational Principles play in the articles appearing in this special issue. We close by considering some new principles and their potential utility in rehabilitation settings. Readers in rehabilitation psychology and aligned areas (e.g., social-personality psychology, health psychology, rehabilitation therapy, psychiatry, and nursing) are encouraged to consider how the Foundational Principles underlie and can shape their research and practice.

  5. The boundary of a boundary principle in field theories and the issue of austerity of the laws of physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kheyfets, A.; Miller, W.A.

    The boundary of a boundary principle has been suggested by J. A. Wheeler as a realization of the austerity idea in field theories. This principle is described in three basic field theories: electrodynamics, Yang-Mills theory, and general relativity. It is demonstrated that it supplies a unified geometric interpretation of the source current in each of the three theories in terms of a generalized E. Cartan moment of rotation. The extent to which the boundary of a boundary principle represents the austerity principle is discussed. It is concluded that it works in a way analogous to thermodynamic relations, and it is argued that deeper principles might be needed to comprehend the nature of austerity.

  6. The middle-range theory of nursing intellectual capital.

    PubMed

    Covell, Christine L

    2008-07-01

    This paper is a report of the development of the middle-range theory of nursing intellectual capital. Rising healthcare costs and advances in technology have contributed to the need for better understanding of the influence of nurses' knowledge, skills and experience on patient and organizational outcomes. The middle-range nursing intellectual capital theory was developed using the strategies of concept and theory derivation. The principles of research synthesis were used to provide empirical support for the propositions of the theory. The middle-range nursing intellectual capital theory was derived from intellectual capital theory to make it relevant and applicable to a specific aspect of nursing, continuing professional development. It proposes that the nursing knowledge available in healthcare organizations is influenced by variables within the work environment, and influences patient and organizational outcomes. The middle-range nursing intellectual capital theory should be tested in different healthcare systems and in different settings and countries to determine its effectiveness in guiding research.

  7. La Meme Chose: How Mathematics Can Explain the Thinking of Children and the Thinking of Children Can Illuminate Mathematical Philosophy

    NASA Astrophysics Data System (ADS)

    Cable, John

    2014-01-01

    This article offers a new interpretation of Piaget's decanting experiments, employing the mathematical notion of equivalence instead of conservation. Some reference is made to Piaget's theories and to his educational legacy, but the focus is on certain of the experiments. The key to the new analysis is the abstraction principle, which has been formally enunciated in mathematical philosophy but has universal application. It becomes necessary to identify fluid objects (both configured and unconfigured) and configured and unconfigured sets-of-objects. Issues emerge regarding the conflict between philosophic realism and anti-realism, including constructivism. Questions are asked concerning mathematics and mathematical philosophy, particularly over the nature of sets, the wisdom of the axiomatic method and aspects of the abstraction principle itself.

  8. Using script theory to cultivate illness script formation and clinical reasoning in health professions education.

    PubMed

    Lubarsky, Stuart; Dory, Valérie; Audétat, Marie-Claude; Custers, Eugène; Charlin, Bernard

    2015-01-01

    Script theory proposes an explanation for how information is stored in and retrieved from the human mind to influence individuals' interpretation of events in the world. Applied to medicine, script theory focuses on knowledge organization as the foundation of clinical reasoning during patient encounters. According to script theory, medical knowledge is bundled into networks called 'illness scripts' that allow physicians to integrate new incoming information with existing knowledge, recognize patterns and irregularities in symptom complexes, identify similarities and differences between disease states, and make predictions about how diseases are likely to unfold. These knowledge networks become updated and refined through experience and learning. The implications of script theory for medical education are profound. Since clinician-teachers cannot simply transfer their customized collections of illness scripts into the minds of learners, they must create opportunities to help learners develop and fine-tune their own sets of scripts. In this essay, we provide a basic sketch of script theory, outline the role that illness scripts play in guiding reasoning during clinical encounters, and propose strategies for aligning teaching practices in the classroom and the clinical setting with the basic principles of script theory.

  9. A personalist approach to public-health ethics.

    PubMed

    Petrini, Carlo; Gainotti, Sabina

    2008-08-01

    First we give an overview of the historical development of public health. Then we present some public-health deontology codes and some ethical principles. We highlight difficulties in defining ethics for public health, with specific reference to three of them that concern: (i) the adaptability to public health of the classical principles of bioethics; (ii) the duty to respect and safeguard the individual while acting within the community perspective that is typical of public health; and (iii) the application-oriented nature of public health and the general lack of attention towards the ethical implications of collective interventions (compared with research). We then mention some proposals drafted from North American bioethics "principles" and utilitarian, liberal and communitarian views. Drawing from other approaches, personalism is outlined as being the theory that offers a consistent set of values and alternative principles that are relevant for public health.

  10. [Memorandum IV: Theoretical and Normative Grounding of Health Services Research].

    PubMed

    Baumann, W; Farin, E; Menzel-Begemann, A; Meyer, T

    2016-05-01

    With its Memoranda and other initiatives, the German Network for Health Services Research [Deutsches Netzwerk Versorgungsforschung e.V. (DNVF)] has been fostering the methodological quality of health services research studies for years. Compared with the standards of empirical research, however, questions concerning the role and function of theories, theoretical approaches and scientific principles have not been taken up in their own right. The DNVF therefore set up a working group in 2013, commissioned to prepare a memorandum on "theories in health care research". The memorandum now presented primarily challenges scholars in health services research to pay more attention to the theoretical arsenal and the background assumptions in the research process. The foundation in the philosophy of science, the reference to normative principles and the theoretical bases of the research process are addressed. Moreover, the memorandum calls for advancing theorizing in health services research, for strengthening non-empirical approaches, research on basic principles and studies oriented to the normative sciences, and for incorporating these relevant disciplines in health services research. Research structures and the funding of health services research need more open space for theoretical reflection and for self-observation of their own multidisciplinary research processes. © Georg Thieme Verlag KG Stuttgart · New York.

  11. Understanding College and University Organization: Theories for Effective Policy and Practice. Two Volume Set

    ERIC Educational Resources Information Center

    Bess, James L.; Dee, Jay R.

    2007-01-01

    This two-volume work is intended to help readers develop powerful new ways of thinking about organizational principles, and apply them to policy-making and management in colleges and universities. The book is written with two audiences in mind: administrative and faculty leaders in institutions of higher learning, and students (both doctoral and…

  12. Adlerian Adventure-Based Counseling to Enhance Self-Esteem in School Children

    ERIC Educational Resources Information Center

    Wagner, Holly H.; Elliott, Anna

    2014-01-01

    This article provides a rationale for using adventure-based counseling (ABC) principles to promote children's self-esteem through group work within the school setting. The effectiveness of combining Adlerian theory with ABC to promote self-esteem is established. The process that would allow a school counselor to plan, organize, facilitate,…

  13. Intelligent Tutoring Systems

    NASA Astrophysics Data System (ADS)

    Anderson, John R.; Boyle, C. Franklin; Reiser, Brian J.

    1985-04-01

    Cognitive psychology, artificial intelligence, and computer technology have advanced to the point where it is feasible to build computer systems that are as effective as intelligent human tutors. Computer tutors based on a set of pedagogical principles derived from the ACT theory of cognition have been developed for teaching students to do proofs in geometry and to write computer programs in the language LISP.

  14. Intelligent tutoring systems.

    PubMed

    Anderson, J R; Boyle, C F; Reiser, B J

    1985-04-26

    Cognitive psychology, artificial intelligence, and computer technology have advanced to the point where it is feasible to build computer systems that are as effective as intelligent human tutors. Computer tutors based on a set of pedagogical principles derived from the ACT theory of cognition have been developed for teaching students to do proofs in geometry and to write computer programs in the language LISP.

  15. Response to “Comment on ‘Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set’” [J. Chem. Phys. 140, 177103 (2014)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reuter, Matthew G., E-mail: mgreuter@u.northwestern.edu; Harrison, Robert J.

    2014-05-07

    The thesis of Brandbyge's comment [J. Chem. Phys. 140, 177103 (2014)] is that our operator decoupling condition is immaterial to transport theories, and it appeals to discussions of nonorthogonal basis sets in transport calculations in its arguments. We maintain that the operator condition is to be preferred over the usual matrix conditions and subsequently detail problems in the existing approaches. From this operator perspective, we conclude that nonorthogonal projectors cannot be used and that the projectors must be selected to satisfy the operator decoupling condition. Because these conclusions pertain to operators, the choice of basis set is not germane.
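
    The distinction the response turns on hinges on what a "nonorthogonal projector" is. As background (this is a textbook illustration, not code from the paper), the sketch below shows an oblique, i.e. nonorthogonal, projector: it is idempotent, but not symmetric, so the subspace it projects onto is not orthogonal to the subspace it annihilates.

```python
# Minimal numerical illustration of a nonorthogonal (oblique) projector:
# idempotent (P @ P == P) yet not equal to its transpose. The matrix is a
# standard textbook example, not one taken from the paper under discussion.

def mat_mul(A, B):
    """Multiply two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[1.0, 1.0],
     [0.0, 0.0]]

idempotent = mat_mul(P, P) == P                   # True: P is a projector
symmetric = P == [list(col) for col in zip(*P)]   # False: not an orthogonal one

print(idempotent, symmetric)  # True False
```

    An orthogonal projector would satisfy both conditions; the operator decoupling condition discussed above is what rules out the oblique case.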

  16. First principles calculations for interaction of tyrosine with (ZnO)3 cluster

    NASA Astrophysics Data System (ADS)

    Singh, Satvinder; Singh, Gurinder; Kaura, Aman; Tripathi, S. K.

    2018-04-01

    First-principles calculations have been performed to study the interaction of the phenol ring of tyrosine (C6H5OH) with a (ZnO)3 atomic cluster. All the calculations have been performed within the density functional theory (DFT) framework. Structural and electronic properties of (ZnO)3/C6H5OH have been studied. A Gaussian basis set approach has been adopted for the calculations. The most stable, ring-type (ZnO)3 atomic cluster has been modeled, analyzed and used for the calculations. The compatibility of the results with previous studies is presented here.

  17. Future Directions for Dissemination and Implementation Science: Aligning Ecological Theory and Public Health to Close the Research to Practice Gap

    PubMed Central

    Rusch, Dana; Mehta, Tara G.; Lakind, Davielle

    2015-01-01

    Dissemination and implementation science (DI) has evolved as a major research model for children’s mental health in response to a longstanding call to integrate science and practice and bridge the elusive research to practice gap. However, to address the complex and urgent needs of the most vulnerable children and families, future directions for DI require a new alignment of ecological theory and public health to provide effective, sustainable, and accessible mental health services. We present core principles of ecological theory to emphasize how contextual factors impact behavior and allow for the reciprocal impact individuals have on the settings they occupy, and an alignment of these principles with a public health model to ensure that services span the prevention to intervention continuum. We provide exemplars from our ongoing work in urban schools, and a new direction for research to address the mental health needs of immigrant Latino families. Through these examples we illustrate how DI can expand its reach by embedding within natural settings to build on local capacity and indigenous resources, incorporating the local knowledge necessary to more substantively address long-standing mental health disparities. This paradigm shift for DI, away from an over-emphasis on promoting program adoption, calls for fitting interventions within settings that matter most to children’s healthy development, and utilizing and strengthening available community resources. In this way, we can meet the challenge of addressing our nation’s mental health burden by supporting the needs and values of families and communities within their own unique social ecologies. PMID:26155972

  18. Future Directions for Dissemination and Implementation Science: Aligning Ecological Theory and Public Health to Close the Research to Practice Gap.

    PubMed

    Atkins, Marc S; Rusch, Dana; Mehta, Tara G; Lakind, Davielle

    2016-01-01

    Dissemination and implementation science (DI) has evolved as a major research model for children's mental health in response to a long-standing call to integrate science and practice and bridge the elusive research to practice gap. However, to address the complex and urgent needs of the most vulnerable children and families, future directions for DI require a new alignment of ecological theory and public health to provide effective, sustainable, and accessible mental health services. We present core principles of ecological theory to emphasize how contextual factors impact behavior and allow for the reciprocal impact individuals have on the settings they occupy, and an alignment of these principles with a public health model to ensure that services span the prevention to intervention continuum. We provide exemplars from our ongoing work in urban schools and a new direction for research to address the mental health needs of immigrant Latino families. Through these examples we illustrate how DI can expand its reach by embedding within natural settings to build on local capacity and indigenous resources, incorporating the local knowledge necessary to more substantively address long-standing mental health disparities. This paradigm shift for DI, away from an overemphasis on promoting program adoption, calls for fitting interventions within settings that matter most to children's healthy development and for utilizing and strengthening available community resources. In this way, we can meet the challenge of addressing our nation's mental health burden by supporting the needs and values of families and communities within their own unique social ecologies.

  19. Conceptual Models and Theory-Embedded Principles on Effective Schooling.

    ERIC Educational Resources Information Center

    Scheerens, Jaap

    1997-01-01

    Reviews models and theories on effective schooling. Discusses four rationality-based organization theories and a fifth perspective, chaos theory, as applied to organizational functioning. Discusses theory-embedded principles flowing from these theories: proactive structuring, fit, market mechanisms, cybernetics, and self-organization. The…

  20. Abstract principles and concrete cases in intuitive lawmaking.

    PubMed

    Ellman, Ira Mark; Braver, Sanford L; MacCoun, Robert J

    2012-04-01

    Citizens awaiting jury service were asked a series of items, in Likert format, to determine their endorsement of various statements about principles to use in setting child support amounts. These twenty items were derived from extant child support systems, from the past literature and from Ellman and Ellman's (2008) Theory of Child Support. The twenty items were found to coalesce into four factors (principles). There were pervasive gender differences in respondents' endorsement of the principles. More importantly, three of these four principles were systematically reflected, in very rational (if complex) ways, in the respondents' resolutions of the individual child support cases they were asked to decide. Differences among respondents in their endorsement of these three principles accounted for differences in their patterns of child support judgments. It is suggested that the pattern of coherent arbitrariness (Ariely et al., Q J Econ 118(1):73-105, 2003) in those support judgments, noted in an earlier study (Ellman, Braver, & MacCoun, 2009), is thus partially explained: the seeming arbitrariness of respondents' initial support judgments reflects in part their differing views about the basic principles that should decide the cases.

  1. Theory-driven intervention for changing personality: expectancy value theory, behavioral activation, and conscientiousness.

    PubMed

    Magidson, Jessica F; Roberts, Brent W; Collado-Rodriguez, Anahi; Lejuez, C W

    2014-05-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach. That is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that, over time, become automatized and manifest in changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this article proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of expectancy value theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically driven, bottom-up approach to changing personality traits. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  2. Extended Thermodynamics: a Theory of Symmetric Hyperbolic Field Equations

    NASA Astrophysics Data System (ADS)

    Müller, Ingo

    2008-12-01

    Extended thermodynamics is based on a set of equations of balance which are supplemented by local and instantaneous constitutive equations so that the field equations are quasi-linear first order differential equations. If the constitutive functions are subject to the requirements of the entropy principle, one may write them in symmetric hyperbolic form by a suitable choice of fields. The kinetic theory of gases, or the moment theories based on the Boltzmann equation provide an explicit example for extended thermodynamics. The theory proves its usefulness and practicality in the successful treatment of light scattering in rarefied gases. This presentation is based upon the book [1] of which the author of this paper is a co-author. For more details about the motivation and exploitation of the basic principles the interested reader is referred to that reference. It would seem that extended thermodynamics is worthy of the attention of mathematicians. It may offer them a non-trivial field of study concerning hyperbolic equations, if ever they get tired of the Burgers equation. Physicists may prefer to appreciate the success of extended thermodynamics in light scattering and to work on the open problems concerning the modification of the Navier-Stokes-Fourier theory in rarefied gases as predicted by extended thermodynamics of 13, 14, and more moments.

  3. The boundary of a boundary principle in field theories and the issue of austerity of the laws of physics

    NASA Astrophysics Data System (ADS)

    Kheyfets, Arkady; Miller, Warner A.

    1991-11-01

    The boundary of a boundary principle has been suggested by J. A. Wheeler as a realization of the austerity idea in field theories. This principle is described in three basic field theories—electrodynamics, Yang-Mills theory, and general relativity. It is demonstrated that it supplies a unified geometric interpretation of the source current in each of the three theories in terms of a generalized E. Cartan moment of rotation. The extent to which the boundary of a boundary principle represents the austerity principle is discussed. It is concluded that it works in a way analogous to thermodynamic relations and it is argued that deeper principles might be needed to comprehend the nature of austerity.

  4. Theory of Constraints for Services: Past, Present, and Future

    NASA Astrophysics Data System (ADS)

    Ricketts, John A.

    Theory of constraints (TOC) is a thinking process and a set of management applications based on principles that run counter to conventional wisdom. TOC is best known in the manufacturing and distribution sectors, where it originated. Awareness is growing in some service sectors, such as health care, and it has been adopted in some high-tech industries, such as computer software. Until recently, however, TOC was barely known in the Professional, Scientific, and Technical Services (PSTS) sector. Professional services include law, accounting, and consulting; scientific services include research and development; and technical services include the development, operation, and support of various technologies. The main reason TOC took longer to reach PSTS is that it is much harder to apply TOC principles when services are highly customized. Nevertheless, with the management applications described in this chapter, TOC has been successfully adapted for PSTS. Those applications cover management of resources, projects, processes, and finances.

  5. A Variational Reduction and the Existence of a Fully Localised Solitary Wave for the Three-Dimensional Water-Wave Problem with Weak Surface Tension

    NASA Astrophysics Data System (ADS)

    Buffoni, Boris; Groves, Mark D.; Wahlén, Erik

    2017-12-01

    Fully localised solitary waves are travelling-wave solutions of the three-dimensional gravity-capillary water wave problem which decay to zero in every horizontal spatial direction. Their existence has been predicted on the basis of numerical simulations and model equations (in which context they are usually referred to as `lumps'), and a mathematically rigorous existence theory for strong surface tension (Bond number β > 1/3) has recently been given. In this article we present an existence theory for the physically more realistic case 0 < β < 1/3. A classical variational principle for fully localised solitary waves is reduced to a locally equivalent variational principle featuring a perturbation of the functional associated with the Davey-Stewartson equation. A nontrivial critical point of the reduced functional is found by minimising it over its natural constraint set.

  6. A Variational Reduction and the Existence of a Fully Localised Solitary Wave for the Three-Dimensional Water-Wave Problem with Weak Surface Tension

    NASA Astrophysics Data System (ADS)

    Buffoni, Boris; Groves, Mark D.; Wahlén, Erik

    2018-06-01

    Fully localised solitary waves are travelling-wave solutions of the three-dimensional gravity-capillary water wave problem which decay to zero in every horizontal spatial direction. Their existence has been predicted on the basis of numerical simulations and model equations (in which context they are usually referred to as `lumps'), and a mathematically rigorous existence theory for strong surface tension (Bond number β > 1/3) has recently been given. In this article we present an existence theory for the physically more realistic case 0 < β < 1/3. A classical variational principle for fully localised solitary waves is reduced to a locally equivalent variational principle featuring a perturbation of the functional associated with the Davey-Stewartson equation. A nontrivial critical point of the reduced functional is found by minimising it over its natural constraint set.

  7. Analysis of STM images with pure and CO-functionalized tips: A first-principles and experimental study

    NASA Astrophysics Data System (ADS)

    Gustafsson, Alexander; Okabayashi, Norio; Peronio, Angelo; Giessibl, Franz J.; Paulsson, Magnus

    2017-08-01

    We describe a first-principles method to calculate scanning tunneling microscopy (STM) images, and compare the results to well-characterized experiments combining STM with atomic force microscopy (AFM). The theory is based on density functional theory with a localized basis set, where the wave functions in the vacuum gap are computed by propagating the localized-basis wave functions into the gap using a real-space grid. Constant-height STM images are computed using Bardeen's approximation method, including averaging over the reciprocal space. We consider copper adatoms and single CO molecules adsorbed on Cu(111), scanned with a single-atom copper tip with and without CO functionalization. The calculated images agree with state-of-the-art experiments, where the atomic structure of the tip apex is determined by AFM. The comparison further allows for detailed interpretation of the STM images.

  8. "It Is through Others That We Become Ourselves": A Study of Vygotskian Play in Russian and Irish Schools

    ERIC Educational Resources Information Center

    Murphy, Colette; Doherty, Andrea; Kerr, Karen

    2016-01-01

    Fifty years after publishing his seminal work on play and its role in child development, Vygotskian theory is still highly influential in education, and particularly in early years. This paper presents two examples of full integration of Vygotskian principles into schools in two very different settings. Both report improvements in learning and in…

  9. Is Einsteinian no-signalling violated in Bell tests?

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2017-11-01

    Relativistic invariance is a physical law verified in several domains of physics, and the impossibility of faster-than-light influences is not questioned by quantum theory. In quantum electrodynamics, in quantum field theory and in the standard model, relativistic invariance is incorporated by construction. Quantum mechanics predicts strong long-range correlations between the outcomes of spin projection measurements performed in distant laboratories. In spite of these strong correlations, the marginal probability distributions should not depend on what was measured in the other laboratory, a property called, in short, no-signalling. In several experiments performed to test various Bell-type inequalities, an unexplained dependence of the empirical marginal probability distributions on the distant settings was observed. In this paper we demonstrate how a particular identification and selection procedure for paired distant outcomes is the most probable cause of this apparent violation of the no-signalling principle. This unexpected setting dependence therefore does not prove the existence of superluminal influences, and the Einsteinian no-signalling principle has to be tested differently in dedicated experiments. We propose a detailed protocol describing how such experiments should be designed in order to be conclusive. We also explain how magical quantum correlations may be explained in a locally causal way.
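
    The no-signalling condition invoked in this abstract is easy to state operationally: Alice's marginal outcome distribution, obtained by summing the joint distribution over Bob's outcomes, must be the same for every choice of Bob's setting. A minimal sketch, using made-up probability tables (not data from any Bell experiment):

```python
# Illustrative check of no-signalling on joint outcome probabilities
# p(a, b | x, y): the marginal p(a | x) must not depend on the distant
# setting y. The tables below are invented numbers chosen only to
# satisfy the condition.

def marginal_a(p_ab):
    """Marginal distribution of Alice's outcome a from a joint table
    p_ab[a][b] (Alice's setting held fixed)."""
    return [sum(row) for row in p_ab]

# Joint tables with Alice's setting x fixed; Bob's setting y = 0 or 1.
# Outcomes a, b take values 0 or 1.
p_y0 = [[0.4, 0.1],    # a = 0
        [0.1, 0.4]]    # a = 1
p_y1 = [[0.25, 0.25],
        [0.25, 0.25]]

m0 = marginal_a(p_y0)  # Alice's marginal when Bob uses y = 0: (0.5, 0.5)
m1 = marginal_a(p_y1)  # ... and when Bob uses y = 1: also (0.5, 0.5)

# No-signalling holds: the marginal is setting-independent.
no_signalling = all(abs(u - v) < 1e-12 for u, v in zip(m0, m1))
print(no_signalling)  # True
```

    The paper's point is that a biased procedure for pairing distant outcomes can make the *empirical* marginals setting-dependent even when the underlying distributions satisfy this condition.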

  10. The Theory of Value-Based Payment Incentives and Their Application to Health Care.

    PubMed

    Conrad, Douglas A

    2015-12-01

    To present the implications of agency theory in microeconomics, augmented by behavioral economics, for different methods of value-based payment in health care; and to derive a set of future research questions and policy recommendations based on that conceptual analysis. Original literature of agency theory, and secondarily behavioral economics, combined with applied research and empirical evidence on the application of those principles to value-based payment. Conceptual analysis and targeted review of theoretical research and empirical literature relevant to value-based payment in health care. Agency theory, and secondarily behavioral economics, have powerful implications for the design of value-based payment in health care. To achieve improved value (better patient experience, clinical quality, health outcomes, and lower costs of care), high-powered incentives should directly target improved care processes, enhance patient experience, and create achievable benchmarks for improved outcomes. Differing forms of value-based payment (e.g., shared savings and risk, reference pricing, capitation, and bundled payment), coupled with adjunct incentives for quality and efficiency, can be tailored to different market conditions and organizational settings. Payment contracts that are "incentive compatible" (directly encouraging better care and reduced cost, mitigating gaming, and selectively inducing clinically efficient providers to participate) will focus differentially on evidence-based care processes, will right-size and structure incentives to avoid crowding out providers' intrinsic motivation, and will align patient incentives with value. Future research should address the details of putting these and related principles into practice; further, by deploying these insights in payment design, policy makers will improve health care value for patients and purchasers. © Health Research and Educational Trust.

  11. Palliative care, public health and justice: setting priorities in resource poor countries.

    PubMed

    Blinderman, Craig

    2009-12-01

    Many countries have not considered palliative care a public health problem. With limited resources, disease-oriented therapies and prevention measures take priority. In this paper, I intend to describe the moral framework for considering palliative care as a public health priority in resource-poor countries. A distributive theory of justice for health care should consider integrative palliative care as morally required as it contributes to improving normal functioning and preserving opportunities for the individual. For patients requiring terminal care, we are guided less by principles of justice and more by the duty to relieve suffering and society's commitment to protecting the professional's obligation to uphold principles of beneficence, compassion and non-abandonment. A fair deliberation process is necessary to allow these strong moral commitments to serve as reasons when setting priorities in resource poor countries.

  12. New robotics: design principles for intelligent systems.

    PubMed

    Pfeifer, Rolf; Iida, Fumiya; Bongard, Josh

    2005-01-01

    New robotics is an approach to robotics that, in contrast to traditional robotics, employs ideas and principles from biology. While in the traditional approach there are generally accepted methods (e.g., from control theory), designing agents in the new robotics approach is still largely considered an art. In recent years, we have been developing a set of heuristics, or design principles, that on the one hand capture theoretical insights about intelligent (adaptive) behavior, and on the other provide guidance in actually designing and building systems. In this article we provide an overview of all the principles but focus on the principles of ecological balance, which concerns the relation between environment, morphology, materials, and control, and sensory-motor coordination, which concerns self-generated sensory stimulation as the agent interacts with the environment and which is a key to the development of high-level intelligence. As we argue, artificial evolution together with morphogenesis is not only "nice to have" but is in fact a necessary tool for designing embodied agents.

  13. Structures, energetics, vibrational spectra of NH4+ (H2O)(n=4,6) clusters: Ab initio calculations and first principles molecular dynamics simulations.

    PubMed

    Karthikeyan, S; Singh, Jiten N; Park, Mina; Kumar, Rajesh; Kim, Kwang S

    2008-06-28

    Important structural isomers of NH4+(H2O)n=4,6 have been studied by using density functional theory, Møller-Plesset second order perturbation theory, and coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)]. The zero-point energy (ZPE) correction to the complete basis set limit of the CCSD(T) binding energies and free energies is necessary to identify the low energy structures for NH4+(H2O)n=4,6 because otherwise wrong structures could be assigned for the most probable structures. For NH4+(H2O)6, the cage-type structure, which is more stable than the previously reported open structure before the ZPE correction, turns out to be less stable after the ZPE correction. In first principles Car-Parrinello molecular dynamics simulations around 100 K, the combined power spectrum of the three lowest energy isomers of NH4+(H2O)4 and the two lowest energy isomers of NH4+(H2O)6 explains each experimental IR spectrum.

  14. Structures, energetics, vibrational spectra of NH4+(H2O)n=4,6 clusters: Ab initio calculations and first principles molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    Karthikeyan, S.; Singh, Jiten N.; Park, Mina; Kumar, Rajesh; Kim, Kwang S.

    2008-06-01

    Important structural isomers of NH4+(H2O)n=4,6 have been studied by using density functional theory, Møller-Plesset second order perturbation theory, and coupled-cluster theory with single, double, and perturbative triple excitations [CCSD(T)]. The zero-point energy (ZPE) correction to the complete basis set limit of the CCSD(T) binding energies and free energies is necessary to identify the low energy structures for NH4+(H2O)n=4,6 because otherwise wrong structures could be assigned for the most probable structures. For NH4+(H2O)6, the cage-type structure, which is more stable than the previously reported open structure before the ZPE correction, turns out to be less stable after the ZPE correction. In first principles Car-Parrinello molecular dynamics simulations around 100 K, the combined power spectrum of three lowest energy isomers of NH4+(H2O)4 and two lowest energy isomers of NH4+(H2O)6 explains each experimental IR spectrum.
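
    The bookkeeping behind the ZPE correction described above is simple arithmetic, and a toy example shows how it can reorder isomers. All numbers below are hypothetical placeholders (in kcal/mol), not the paper's CCSD(T) values:

```python
# Sketch of zero-point-energy-corrected binding energies. The electronic
# binding energy De is the energy of the separated fragments minus that
# of the cluster; the ZPE correction subtracts the gain in vibrational
# zero-point energy on cluster formation, giving D0 = De - dZPE.

def zpe_corrected_binding(e_cluster, e_fragments, zpe_cluster, zpe_fragments):
    """Return D0 = De - dZPE for one cluster (all energies in kcal/mol)."""
    de = sum(e_fragments) - e_cluster
    dzpe = zpe_cluster - sum(zpe_fragments)
    return de - dzpe

# Hypothetical isomers A and B of the same composition: A binds more
# strongly before the correction, but its stiffer H-bond network carries
# a larger ZPE, so the stability ordering flips after the correction --
# the scenario the abstract describes for the NH4+(H2O)6 cage.
d0_A = zpe_corrected_binding(-120.0, [-30.0, -21.0], 15.0, [6.0, 5.0])
d0_B = zpe_corrected_binding(-119.5, [-30.0, -21.0], 13.0, [6.0, 5.0])

print(d0_A, d0_B)  # 65.0 66.5 -- B is now the more strongly bound isomer
```

    Before the correction, A's De of 69.0 exceeds B's 68.5; after it, B wins, which is why the abstract stresses that skipping the ZPE correction can assign the wrong most-probable structure.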

  15. Implementation and benchmark of a long-range corrected functional in the density functional based tight-binding method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutsker, V.; Niehaus, T. A., E-mail: thomas.niehaus@physik.uni-regensburg.de; Aradi, B.

    2015-11-14

Bridging the gap between first principles methods and empirical schemes, the density functional based tight-binding method (DFTB) has become a versatile tool in predictive atomistic simulations over the past years. One of the major restrictions of this method is the limitation to local or gradient corrected exchange-correlation functionals. This excludes the important class of hybrid or long-range corrected functionals, which are advantageous in thermochemistry, as well as in the computation of vibrational, photoelectron, and optical spectra. The present work provides a detailed account of the implementation of DFTB for a long-range corrected functional in generalized Kohn-Sham theory. We apply the method to a set of organic molecules and compare ionization potentials and electron affinities with the original DFTB method and higher level theory. The new scheme cures the significant overpolarization in electric fields found for local DFTB, which parallels the functional dependence in first principles density functional theory (DFT). At the same time, the computational savings with respect to full DFT calculations are not compromised as evidenced by numerical benchmark data.

  16. Status of the Vibrational Theory of Olfaction

    NASA Astrophysics Data System (ADS)

    Hoehn, Ross D.; Nichols, David E.; Neven, Hartmut; Kais, Sabre

    2018-03-01

The vibrational theory of olfaction is an attempt to describe a possible mechanism for olfaction which is explanatory and provides researchers with a set of principles permitting predictions about structure-odor relations. Similar theories have appeared several times throughout olfactory science; the theory was recently returned to prominence by Luca Turin, who suggested that inelastic electron tunneling is the method by which vibrations are detected by the olfactory receptors within the nose. This work is intended to convey to the reader an up-to-date account of the vibrational theory of olfaction, covering both the historical iterations and the present iteration. The text gives a chronological account of both theoretical and experimental studies on the topic, providing context, comments and background where needed.

  17. Is abstinence education theory based? The underlying logic of abstinence education programs in Texas.

    PubMed

    Goodson, Patricia; Pruitt, B E; Suther, Sandy; Wilson, Kelly; Buhi, Eric

    2006-04-01

    Authors examined the logic (or the implicit theory) underlying 16 abstinence-only-until-marriage programs in Texas (50% of all programs funded under the federal welfare reform legislation during 2001 and 2002). Defined as a set of propositions regarding the relationship between program activities and their intended outcomes, program staff's implicit theories were summarized and compared to (a) data from studies on adolescent sexual behavior, (b) a theory-based model of youth abstinent behavior, and (c) preliminary findings from the national evaluation of Title V programs. Authors interviewed 62 program directors and instructors and employed selected principles of grounded theory to analyze interview data. Findings indicated that abstinence education staff could clearly articulate the logic guiding program activity choices. Comparisons between interview data and a theory-based model of adolescent sexual behavior revealed striking similarities. Implications of these findings for conceptualizing and evaluating abstinence-only-until-marriage (or similar) programs are examined.

  18. Consistency of a counterexample to Naimark's problem

    PubMed Central

    Akemann, Charles; Weaver, Nik

    2004-01-01

We construct a C*-algebra that has only one irreducible representation up to unitary equivalence but is not isomorphic to the algebra of compact operators on any Hilbert space. This answers an old question of Naimark. Our construction uses a combinatorial statement called the diamond principle, which is known to be consistent with but not provable from the standard axioms of set theory (assuming that these axioms are consistent). We prove that the statement “there exists a counterexample to Naimark's problem which is generated by ℵ₁ elements” is undecidable in standard set theory. PMID:15131270

  19. Lorenz, Gödel and Penrose: new perspectives on determinism and causality in fundamental physics

    NASA Astrophysics Data System (ADS)

    Palmer, T. N.

    2014-07-01

Although Ed Lorenz is best known for his pioneering work on chaotic unpredictability, the key discovery at the core of the meteorologist's work is the link between space-time calculus and state-space fractal geometry. Indeed, properties of Lorenz's fractal invariant set relate space-time calculus to deep areas of mathematics such as Gödel's Incompleteness Theorem. Could such properties also provide new perspectives on deep unsolved issues in fundamental physics? Recent developments in cosmology motivate what is referred to as the 'cosmological invariant set postulate': that the universe U can be considered a deterministic dynamical system evolving on a causal measure-zero fractal invariant set I_U in its state space. Symbolic representations of I_U are constructed explicitly based on permutation representations of quaternions. The resulting 'invariant set theory' provides some new perspectives on determinism and causality in fundamental physics. For example, while the cosmological invariant set appears to have a rich enough structure to allow a description of (quantum) probability, its measure-zero character ensures it is sparse enough to prevent invariant set theory being constrained by the Bell inequality (consistent with a partial violation of the so-called measurement independence postulate). The primacy of geometry as embodied in the proposed theory extends the principles underpinning general relativity. As a result, the physical basis for contemporary programmes which apply standard field quantisation to some putative gravitational lagrangian is questioned. Consistent with Penrose's suggestion of a deterministic but non-computable theory of fundamental physics, an alternative 'gravitational theory of the quantum' is proposed based on the geometry of I_U, with new perspectives on the problem of black-hole information loss and potential observational consequences for the dark universe.

  20. TRIZ theory in NEA photocathode preparation system

    NASA Astrophysics Data System (ADS)

    Qiao, Jianliang; Huang, Dayong; Li, Xiangjiang; Gao, Youtang

    2016-09-01

Solutions to the engineering problems were developed according to the innovation principles of TRIZ theory. The ultra-high-vacuum test and evaluation system for the preparation of negative electron affinity (NEA) photocathodes is characterized by a complex structure and powerful functions. The segmentation, advance-function, curved-surface, dynamic-characteristics and nesting principles adopted in the design of this system for cathode preparation were analyzed. The applications of TRIZ's physical-contradiction and substance-field analysis methods in the cathode-preparation ultra-high-vacuum test and evaluation system were discussed.

  1. Toward a theory of organisms: Three founding principles in search of a useful integration

    PubMed Central

    SOTO, ANA M.; LONGO, GIUSEPPE; MIQUEL, PAUL-ANTOINE; MONTEVIL, MAËL; MOSSIO, MATTEO; PERRET, NICOLE; POCHEVILLE, ARNAUD; SONNENSCHEIN, CARLOS

    2016-01-01

    Organisms, be they uni- or multi-cellular, are agents capable of creating their own norms; they are continuously harmonizing their ability to create novelty and stability, that is, they combine plasticity with robustness. Here we articulate the three principles for a theory of organisms proposed in this issue, namely: the default state of proliferation with variation and motility, the principle of variation and the principle of organization. These principles profoundly change both biological observables and their determination with respect to the theoretical framework of physical theories. This radical change opens up the possibility of anchoring mathematical modeling in biologically proper principles. PMID:27498204

  2. [Research Progress of Multi-Model Medical Image Fusion at Feature Level].

    PubMed

    Zhang, Junjie; Zhou, Tao; Lu, Huiling; Wang, Huiqun

    2016-04-01

Medical image fusion realizes the combined advantages of functional images and anatomical images. This article discusses the research progress of multi-model medical image fusion at the feature level. We first describe the principle of medical image fusion at the feature level. We then analyze and summarize the applications of fuzzy sets, rough sets, D-S evidence theory, artificial neural networks, principal component analysis and other fusion methods in medical image fusion. Lastly, we indicate present problems and future research directions for multi-model medical image fusion.
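Of the fusion methods surveyed above, principal component analysis is the simplest to illustrate: the first principal component of the two co-registered images gives the weights for a weighted-average fusion. A minimal sketch with random arrays standing in for a functional and an anatomical slice (not the article's method in detail):

```python
# PCA-based feature-level fusion sketch: derive fusion weights from the
# first principal component of the stacked pixel features.
import numpy as np

rng = np.random.default_rng(0)
functional = rng.random((64, 64))   # stand-in for a functional image
anatomical = rng.random((64, 64))   # stand-in for a co-registered anatomical image

# Each pixel contributes a 2-component feature vector (one per modality).
X = np.column_stack([functional.ravel(), anatomical.ravel()])
Xc = X - X.mean(axis=0)

# The first right singular vector of the centered data is the first
# principal direction; its (normalized) components serve as fusion weights.
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
w = np.abs(vt[0])
w = w / w.sum()

fused = w[0] * functional + w[1] * anatomical
print(fused.shape)  # (64, 64)
```

The fused image keeps the spatial dimensions of the inputs while weighting each modality by how much pixel-level variance it explains.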

  3. Fuzzy compromise: An effective way to solve hierarchical design problems

    NASA Technical Reports Server (NTRS)

    Allen, J. K.; Krishnamachari, R. S.; Masetta, J.; Pearce, D.; Rigby, D.; Mistree, F.

    1990-01-01

    In this paper, we present a method for modeling design problems using a compromise decision support problem (DSP) incorporating the principles embodied in fuzzy set theory. Specifically, the fuzzy compromise decision support problem is used to study hierarchical design problems. This approach has the advantage that although the system modeled has an element of uncertainty associated with it, the solution obtained is crisp and precise. The efficacy of incorporating fuzzy sets into the solution process is discussed in the context of results obtained for a portal frame.
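The "fuzzy goals, crisp solution" idea in the abstract can be sketched with the classic max-min (Bellman-Zadeh) formulation: each design goal gets a membership function in [0, 1], and the crisp compromise maximizes the smallest membership. The goals and membership shapes below are hypothetical, not taken from the paper:

```python
# Max-min fuzzy compromise sketch: two conflicting hypothetical goals
# (light weight vs. high stiffness) over one design variable x.
import numpy as np

def mu_weight(x):
    # Prefer lighter designs: fully satisfied at x <= 2, not at all at x >= 8.
    return np.clip((8.0 - x) / 6.0, 0.0, 1.0)

def mu_stiffness(x):
    # Prefer stiffer (heavier) designs: none at x <= 1, full at x >= 7.
    return np.clip((x - 1.0) / 6.0, 0.0, 1.0)

x = np.linspace(0.0, 10.0, 1001)
overall = np.minimum(mu_weight(x), mu_stiffness(x))

# Although both goals are fuzzy, the maximizer is a crisp, precise value.
x_star = x[np.argmax(overall)]
print(round(float(x_star), 2))
```

The crisp optimum falls where the two membership functions cross, mirroring the abstract's point that an uncertain model can still yield a precise solution.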

  4. Nonrelativistic para-Lorentzian mechanics

    NASA Astrophysics Data System (ADS)

    Vargas, J. G.

    1981-04-01

    After reviewing the foundations of special relativity and the room left for rival theories, a set of nonrelativistic para-Lorentzian transformations is derived uniquely, based on (a) a weaker first principle, (b) the requirement that the transformations sought do not give rise to the clock “paradox” (in a refined version), and (c) the compliance of the transformations with the classical experiments of Michelson-Morley, Kennedy-Thorndike, and Ives-Stilwell. The corresponding dynamics is developed. Most of the experimental support of special relativity is reconsidered in the light of the new theory. It is concluded that the relativity of simultaneity has so far not been tested.

  5. Facilitation of learning: part 1.

    PubMed

Warburton, Tyler; Houghton, Trish; Barry, Debbie

    2016-04-06

    This article, the fourth in a series of 11, discusses the context for the facilitation of learning. It outlines the main principles and theories for understanding the process of learning, including examples which link these concepts to practice. The practical aspects of using these theories in a practice setting will be discussed in the fifth article of this series. Together, these two articles will provide mentors and practice teachers with knowledge of the learning process, which will enable them to meet the second domain of the Nursing and Midwifery Council's Standards to Support Learning and Assessment in Practice on facilitation of learning.

  6. Aids to Computer-Based Multimedia Learning.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Moreno, Roxana

    2002-01-01

    Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)

  7. Principles of General Systems Theory: Some Implications for Higher Education Administration

    ERIC Educational Resources Information Center

    Gilliland, Martha W.; Gilliland, J. Richard

    1978-01-01

    Three principles of general systems theory are presented and systems theory is distinguished from systems analysis. The principles state that all systems tend to become more disorderly, that they must be diverse in order to be stable, and that only those maximizing their resource utilization for doing useful work will survive. (Author/LBH)

  8. Minimization principles for the coupled problem of Darcy-Biot-type fluid transport in porous media linked to phase field modeling of fracture

    NASA Astrophysics Data System (ADS)

    Miehe, Christian; Mauthe, Steffen; Teichtmeister, Stephan

    2015-09-01

This work develops new minimization and saddle point principles for the coupled problem of Darcy-Biot-type fluid transport in porous media at fracture. It shows that the quasi-static problem of elastically deforming, fluid-saturated porous media is related to a minimization principle for the evolution problem. This two-field principle determines the rate of deformation and the fluid mass flux vector. It provides a canonically compact model structure, where the stress equilibrium and the inverse Darcy's law appear as the Euler equations of a variational statement. A Legendre transformation of the dissipation potential relates the minimization principle to a characteristic three-field saddle point principle, whose Euler equations determine the evolutions of deformation and fluid content as well as Darcy's law. A further geometric assumption results in modified variational principles for a simplified theory, where the fluid content is linked to the volumetric deformation. The existence of these variational principles underlines inherent symmetries of Darcy-Biot theories of porous media. This can be exploited in the numerical implementation by the construction of time- and space-discrete variational principles, which fully determine the update problems of typical time stepping schemes. Here, the proposed minimization principle for the coupled problem is advantageous with regard to a new unconstrained stable finite element design, while space discretizations of the saddle point principles are constrained by the LBB condition. The variational principles developed provide the most fundamental approach to the discretization of nonlinear fluid-structure interactions, showing symmetric systems in algebraic update procedures. They also provide an excellent starting point for extensions towards more complex problems. This is demonstrated by developing a minimization principle for a phase field description of fracture in fluid-saturated porous media. It is designed for an incorporation of alternative crack driving forces, such as a convenient criterion in terms of the effective stress. The proposed setting provides a modeling framework for the analysis of complex problems such as hydraulic fracture. This is demonstrated by a spectrum of model simulations.

  9. Study of the Conservation of Mechanical Energy in the Motion of a Pendulum Using a Smartphone

    ERIC Educational Resources Information Center

    Pierratos, Theodoros; Polatoglou, Hariton M.

    2018-01-01

    A common method that scientists use to validate a theory is to utilize known principles and laws to produce results on specific settings, which can be assessed using the appropriate experimental methods and apparatuses. Smartphones have various sensors built-in and could be used for measuring and logging data in physics experiments. In this work,…

  10. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework

    PubMed Central

    Brand, Sarah L.; Fleming, Lora E.; Wyatt, Katrina M.

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change. PMID:26380358

  11. Tailoring Healthy Workplace Interventions to Local Healthcare Settings: A Complexity Theory-Informed Workplace of Well-Being Framework.

    PubMed

    Brand, Sarah L; Fleming, Lora E; Wyatt, Katrina M

    2015-01-01

    Many healthy workplace interventions have been developed for healthcare settings to address the consistently low scores of healthcare professionals on assessments of mental and physical well-being. Complex healthcare settings present challenges for the scale-up and spread of successful interventions from one setting to another. Despite general agreement regarding the importance of the local setting in affecting intervention success across different settings, there is no consensus on what it is about a local setting that needs to be taken into account to design healthy workplace interventions appropriate for different local settings. Complexity theory principles were used to understand a workplace as a complex adaptive system and to create a framework of eight domains (system characteristics) that affect the emergence of system-level behaviour. This Workplace of Well-being (WoW) framework is responsive and adaptive to local settings and allows a shared understanding of the enablers and barriers to behaviour change by capturing local information for each of the eight domains. We use the results of applying the WoW framework to one workplace, a UK National Health Service ward, to describe the utility of this approach in informing design of setting-appropriate healthy workplace interventions that create workplaces conducive to healthy behaviour change.

  12. The Psychology of Close Relationships: Fourteen Core Principles.

    PubMed

    Finkel, Eli J; Simpson, Jeffry A; Eastwick, Paul W

    2017-01-03

    Relationship science is a theory-rich discipline, but there have been no attempts to articulate the broader themes or principles that cut across the theories themselves. We have sought to fill that void by reviewing the psychological literature on close relationships, particularly romantic relationships, to extract its core principles. This review reveals 14 principles, which collectively address four central questions: (a) What is a relationship? (b) How do relationships operate? (c) What tendencies do people bring to their relationships? (d) How does the context affect relationships? The 14 principles paint a cohesive and unified picture of romantic relationships that reflects a strong and maturing discipline. However, the principles afford few of the sorts of conflicting predictions that can be especially helpful in fostering novel theory development. We conclude that relationship science is likely to benefit from simultaneous pushes toward both greater integration across theories (to reduce redundancy) and greater emphasis on the circumstances under which existing (or not-yet-developed) principles conflict with one another.

  13. Quantifying inhomogeneity in fractal sets

    NASA Astrophysics Data System (ADS)

    Fraser, Jonathan M.; Todd, Mike

    2018-04-01

    An inhomogeneous fractal set is one which exhibits different scaling behaviour at different points. The Assouad dimension of a set is a quantity which finds the ‘most difficult location and scale’ at which to cover the set and its difference from box dimension can be thought of as a first-level overall measure of how inhomogeneous the set is. For the next level of analysis, we develop a quantitative theory of inhomogeneity by considering the measure of the set of points around which the set exhibits a given level of inhomogeneity at a certain scale. For a set of examples, a family of -invariant subsets of the 2-torus, we show that this quantity satisfies a large deviations principle. We compare members of this family, demonstrating how the rate function gives us a deeper understanding of their inhomogeneity.
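For reference, the Assouad dimension used above can be stated as follows (this is the standard definition, not reproduced from the paper): writing $N_r(E)$ for the smallest number of open balls of radius $r$ needed to cover $E$,

```latex
\dim_{\mathrm{A}} F \;=\; \inf\Bigl\{\,\alpha \ge 0 \;:\; \exists\, C>0 \text{ such that, for all } 0<r<R,\;
\sup_{x\in F} N_{r}\bigl(B(x,R)\cap F\bigr) \;\le\; C\Bigl(\tfrac{R}{r}\Bigr)^{\alpha} \Bigr\}
```

Because the supremum ranges over all locations $x$ and all scale pairs $r<R$, the Assouad dimension captures the "most difficult location and scale" described in the abstract, and it always dominates the box dimension.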

  14. A restricted proof that the weak equivalence principle implies the Einstein equivalence principle

    NASA Technical Reports Server (NTRS)

    Lightman, A. P.; Lee, D. L.

    1973-01-01

Schiff has conjectured that the weak equivalence principle (WEP) implies the Einstein equivalence principle (EEP). A proof is presented of Schiff's conjecture, restricted to: (1) test bodies made of electromagnetically interacting point particles, that fall from rest in a static, spherically symmetric gravitational field; (2) theories of gravity within a certain broad class - a class that includes almost all complete relativistic theories that have been found in the literature, but with each theory truncated to contain only point particles plus electromagnetic and gravitational fields. The proof shows that every nonmetric theory in the class (every theory that violates EEP) must violate WEP. A formula is derived for the magnitude of the violation. It is shown that WEP is a powerful theoretical and experimental tool for constraining the manner in which gravity couples to electromagnetism in gravitation theories.

  15. Critique of pure free energy principle. Comment on "Answering Schrödinger's question: A free-energy formulation" by Maxwell James Désormeau Ramstead et al.

    NASA Astrophysics Data System (ADS)

    Tozzi, Arturo; Peters, James F.

    2018-03-01

The paper by Ramstead et al. [1] [in this issue] reminds us of the efforts of eminent scientists such as Whitehead and Gödel. After producing influential manuscripts, they turned to more philosophical issues, understanding the need for a larger formalization of their bounteous scientific results [2,3]. In a similar way, the successful free-energy principle has been generalized in order to encompass not only the brain activity of the original formulation but also the whole spectrum of life [1]. The final result is of prominent importance because, in touch with Quine's naturalized epistemology [4] and Badiou's account of set theory [5], it provides philosophical significance to otherwise purely scientific matters. The free-energy principle becomes a novel paradigm that attempts to explain general physical/biological mechanisms in the light of a novel scientific ontology, the "variational neuroethology". The latter, seemingly grounded in a recursive multilevel reductionistic/emergentistic approach à la Bechtel [6], also has its roots in a rationalistic top-down approach that, starting from general mathematical/physical concepts (von Helmholtz's free energy), formulates experimentally testable (and falsifiable) theories.

  16. Microeconomics, Private Security, and the Significance to Operational Planning

    DTIC Science & Technology

    2013-05-23

This study examines the microeconomic principles of supply-demand theory by looking at specific conceptual variables to determine their relationship to and influence on the industry, to include existing conceptual understandings of the industry, the microeconomic principles of supply-demand theory, and the theoretic framework.

  17. Increasing Bellevue School District's elementary teachers' capacity for teaching inquiry-based science: Using ideas from contemporary learning theory to inform professional development

    NASA Astrophysics Data System (ADS)

    Maury, Tracy Anne

    This Capstone project examined how leaders in the Bellevue School District can increase elementary teachers' capacity for teaching inquiry-based science through the use of professional learning activities that are grounded in ideas from human learning theory. A framework for professional development was constructed and from that framework, a set of professional learning activities were developed as a means to support teacher learning while project participants piloted new curriculum called the Isopod Habitat Challenge. Teachers in the project increased their understanding of the learning theory principles of preconceptions and metacognition. Teachers did not increase their understanding of the principle of learning with understanding, although they did articulate the significance of engaging children in student-led inquiry cycles. Data from the curriculum revision and professional development project coupled with ideas from learning theory, cognition and policy implementation, and learning community literatures suggest Bellevue's leaders can encourage peer-to-peer interaction, link professional development to teachers' daily practice, and capitalize on technology as ways to increase elementary teachers' capacity for teaching inquiry-based science. These lessons also have significance for supporting teacher learning and efficacy in other subject areas and at other levels in the system.

  18. Ab initio molecular simulations with numeric atom-centered orbitals

    NASA Astrophysics Data System (ADS)

    Blum, Volker; Gehrke, Ralf; Hanke, Felix; Havu, Paula; Havu, Ville; Ren, Xinguo; Reuter, Karsten; Scheffler, Matthias

    2009-11-01

    We describe a complete set of algorithms for ab initio molecular simulations based on numerically tabulated atom-centered orbitals (NAOs) to capture a wide range of molecular and materials properties from quantum-mechanical first principles. The full algorithmic framework described here is embodied in the Fritz Haber Institute "ab initio molecular simulations" (FHI-aims) computer program package. Its comprehensive description should be relevant to any other first-principles implementation based on NAOs. The focus here is on density-functional theory (DFT) in the local and semilocal (generalized gradient) approximations, but an extension to hybrid functionals, Hartree-Fock theory, and MP2/GW electron self-energies for total energies and excited states is possible within the same underlying algorithms. An all-electron/full-potential treatment that is both computationally efficient and accurate is achieved for periodic and cluster geometries on equal footing, including relaxation and ab initio molecular dynamics. We demonstrate the construction of transferable, hierarchical basis sets, allowing the calculation to range from qualitative tight-binding like accuracy to meV-level total energy convergence with the basis set. Since all basis functions are strictly localized, the otherwise computationally dominant grid-based operations scale as O(N) with system size N. Together with a scalar-relativistic treatment, the basis sets provide access to all elements from light to heavy. Both low-communication parallelization of all real-space grid based algorithms and a ScaLapack-based, customized handling of the linear algebra for all matrix operations are possible, guaranteeing efficient scaling (CPU time and memory) up to massively parallel computer systems with thousands of CPUs.

  19. Distributed communication: Implications of cultural-historical activity theory (CHAT) for communication disorders.

    PubMed

    Hengst, Julie A

    2015-01-01

    This article proposes distributed communication as a promising theoretical framework for building supportive environments for child language development. Distributed communication is grounded in an emerging intersection of cultural-historical activity theory (CHAT) and theories of communicative practices that argue for integrating accounts of language, cognition and culture. The article first defines and illustrates through selected research articles, three key principles of distributed communication: (a) language and all communicative resources are inextricably embedded in activity; (b) successful communication depends on common ground built up through short- and long-term histories of participation in activities; and (c) language cannot act alone, but is always orchestrated with other communicative resources. It then illustrates how these principles are fully integrated in everyday interactions by drawing from my research on Cindy Magic, a verbal make-believe game played by a father and his two daughters. Overall, the research presented here points to the remarkably complex communicative environments and sophisticated forms of distributed communication children routinely engage in as they interact with peer and adult communication partners in everyday settings. The article concludes by considering implications of these theories for, and examples of, distributed communication relevant to clinical intervention. Readers will learn about (1) distributed communication as a conceptual tool grounded in an emerging intersection of cultural-historical activity theory and theories of communicative practices and (2) how to apply distributed communication to the study of child language development and to interventions for children with communication disorders. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. The architecture and dynamics of developing mind: experiential structuralism as a frame for unifying cognitive developmental theories.

    PubMed

    Demetriou, A; Efklides, A; Platsidou, M

    1993-01-01

    This Monograph presents a theory of cognitive development. The theory argues that the mind develops across three fronts. The first refers to a general processing system that defines the general potentials of mind to develop cognitive strategies and skills. The second refers to a hypercognitive system that governs self-understanding and self-regulation. The third involves a set of specialized structural systems (SSSs) that are responsible for the representation and processing of different reality domains. There are specific forces that are responsible for this organization of mind. These are expressed in the Monograph in terms of a set of five organizational principles. The developmental course of the major systems is outlined. Developmental change is ascribed by the theory to the interaction between the various systems. Different types of development require different change mechanisms. Five studies are presented that provide empirical support for these postulates. Study 1 demonstrated the organizational power of principles and SSSs. Study 2 showed that the SSSs constrain the effect of learning. Study 3 established that the hypercognitive system does function as the interface between tasks and SSS-specific processes or between SSSs and general cognitive functions such as attention and memory. Study 4 investigated the relations between one of the components of the processing system, storage, and two different SSSs expressed via two different symbolic systems, namely, the numeric and the imaginal. Finally, Study 5 examined the interaction between the components of the processing system and the relations between each of these components and one SSS, namely, the quantitative-relational SSS. The theoretical implications of these studies with regard to general issues, such as the nature of representation, the causation of cognitive change, and individual differences in cognitive development, are discussed in the concluding chapter.

  1. A Variational Principle for Reconstruction of Elastic Deformations in Shear Deformable Plates and Shells

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Spangler, Jan L.

    2003-01-01

A variational principle is formulated for the inverse problem of full-field reconstruction of three-dimensional plate/shell deformations from experimentally measured surface strains. The formulation is based upon the minimization of a least squares functional that uses the complete set of strain measures consistent with linear, first-order shear-deformation theory. The formulation, which accounts for transverse shear deformation, is applicable for the analysis of thin and moderately thick plate and shell structures. The main benefit of the variational principle is that it is well suited for C(sup 0)-continuous displacement finite element discretizations, thus enabling the development of robust algorithms for application to complex civil and aeronautical structures. The methodology is especially aimed at the next generation of aerospace vehicles for use in real-time structural health monitoring systems.
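The least-squares idea at the heart of the principle can be sketched in one dimension. The toy below is entirely our construction, not the paper's shear-deformation shell formulation: it assumes a simple beam relation eps = -c·w''(x) between surface strain and deflection curvature, and recovers the deflection field from strain samples by minimizing a squared-error functional; the function name and parameters are illustrative.

```python
import numpy as np

def reconstruct_deflection(x, eps_measured, c=0.01, degree=4):
    """Fit polynomial coefficients a_k of w(x) = sum_k a_k x^k so that
    the predicted strains -c*w''(x) match the measured ones in the
    least-squares sense (a 1-D stand-in for the paper's functional)."""
    # Column k of the design matrix is -c * d^2(x^k)/dx^2 = -c*k*(k-1)*x^(k-2).
    A = np.column_stack([-c * k * (k - 1) * x**(k - 2) if k >= 2
                         else np.zeros_like(x)
                         for k in range(degree + 1)])
    coeffs, *_ = np.linalg.lstsq(A, eps_measured, rcond=None)
    return np.polynomial.polynomial.Polynomial(coeffs)

# Synthetic check: true deflection w(x) = x^2 has constant curvature 2,
# so the "measured" strains are the constant -c*2.
x = np.linspace(0.0, 1.0, 50)
eps = -0.01 * 2.0 * np.ones_like(x)
w_fit = reconstruct_deflection(x, eps)
```

The rigid-body columns (k = 0, 1) carry no strain information, so `lstsq` returns the minimum-norm solution, setting those coefficients to zero.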

  2. Introduction to the Neutrosophic Quantum Theory

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2014-10-01

Neutrosophic Quantum Theory (NQT) is the study of the principle that certain physical quantities can assume neutrosophic values, instead of discrete values as in quantum theory. These quantities are thus neutrosophically quantized. A neutrosophic value (neutrosophic amount) is expressed by a set (mostly an interval) that approximates (or includes) a discrete value. An oscillator can lose or gain energy by some neutrosophic amount (meaning neither continuously nor discretely, but as a series of integral sets: S, 2S, 3S, ..., where S is a set). In the most general form, one has an ensemble of sets of sets, i.e. R1S1, R2S2, R3S3, ..., where all Rn and Sn are sets that may vary as functions of time and of other parameters. Several such sets may be equal, may be reduced to points, or may be empty. {The multiplication of two sets A and B is classically defined as AB = {ab, a ∈ A and b ∈ B}, and similarly a number n times a set A is defined as nA = {na, a ∈ A}.} The unit of neutrosophic energy is Hν, where H is a set (in particular an interval) that includes the Planck constant h, and ν is the frequency. Therefore, an oscillator could change its energy by a neutrosophic number of quanta: Hν, 2Hν, 3Hν, etc. For example, when H is an interval [h1, h2], with 0 <= h1 <= h2, that contains the Planck constant h, one has [h1ν, h2ν], [2h1ν, 2h2ν], [3h1ν, 3h2ν], ..., as the series of intervals of energy change of the oscillator. The most general form of the units of neutrosophic energy is Hnνn, where all Hn and νn are sets that, as above, may vary as functions of time and of other oscillator and environment parameters. Neutrosophic quantum theory combines classical mechanics and quantum mechanics.
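The interval arithmetic the abstract defines (nA = {na, a ∈ A}, which for an interval [a1, a2] and n >= 0 is just [n·a1, n·a2]) can be made concrete in a few lines. The numbers below are illustrative, not from the paper.

```python
# Interval H chosen to contain the Planck constant h ~ 6.626e-34 J*s.
H = (6.5e-34, 6.7e-34)
nu = 5.0e14   # an illustrative frequency in Hz

def scale_interval(n, interval):
    """n * [lo, hi] = [n*lo, n*hi] for n >= 0, per the abstract's definition."""
    lo, hi = interval
    return (n * lo, n * hi)

# Series of neutrosophic energy-change intervals H*nu, 2*H*nu, 3*H*nu, ...
energy_levels = [scale_interval(n * nu, H) for n in range(1, 4)]
```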

  3. Modeling Mode Choice Behavior Incorporating Household and Individual Sociodemographics and Travel Attributes Based on Rough Sets Theory

    PubMed Central

    Chen, Xuewu; Wei, Ming; Wu, Jingxian; Hou, Xianyao

    2014-01-01

Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. Alternatively, mode choice modeling can be regarded as a pattern recognition problem reflected from the explanatory variables of determining the choices between alternatives. The paper applies the knowledge discovery technique of rough sets theory to model travel mode choices incorporating household and individual sociodemographics and travel information, and to identify the significance of each attribute. The study uses the detailed travel diary survey data of Changxing County, which contains information on both household and individual travel behaviors, for model estimation and evaluation. The knowledge is presented in the form of easily understood IF-THEN statements or rules which reveal how each attribute influences mode choice behavior. These rules are then used to predict travel mode choices from information held about previously unseen individuals, and the classification performance is assessed. The rough sets model shows high robustness and good predictive ability. The most significant condition attributes identified to determine travel mode choices are gender, distance, household annual income, and occupation. Comparative evaluation with the MNL model also proves that the rough sets model gives superior prediction accuracy and coverage on travel mode choice modeling. PMID:25431585
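The core rough-set construction behind such IF-THEN rules can be sketched with toy data (illustrative records, not the Changxing survey): objects indiscernible on the chosen condition attributes are grouped, and a decision class is bracketed by its lower approximation (certain rules) and upper approximation (possible rules).

```python
records = [
    # (gender, distance, income) -> chosen mode; toy data only
    (("M", "short", "low"),  "walk"),
    (("M", "short", "low"),  "walk"),
    (("F", "long",  "high"), "car"),
    (("F", "long",  "low"),  "bus"),
    (("F", "long",  "low"),  "car"),   # conflicts with the bus record
]

def approximations(records, target_mode):
    """Rough-set lower/upper approximation of one decision class."""
    blocks = {}                        # indiscernibility classes
    for cond, mode in records:
        blocks.setdefault(cond, set()).add(mode)
    lower = {c for c, modes in blocks.items() if modes == {target_mode}}
    upper = {c for c, modes in blocks.items() if target_mode in modes}
    return lower, upper

lower, upper = approximations(records, "car")
# Certain rule:  IF ("F","long","high") THEN car
# Boundary rule: ("F","long","low") possibly -> car
```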

  4. On the Essence of Space

    NASA Astrophysics Data System (ADS)

    Kalanov, Temur Z.

    2003-04-01

A new theory of space is suggested. It represents a new point of view that has arisen from a critical analysis of the foundations of physics (in particular, the theory of relativity and quantum mechanics), mathematics, cosmology, and philosophy. The main idea following from the analysis is that the concept of movement is a key to understanding the essence of space. The starting point of the theory is the following philosophical (dialectical-materialistic) principles. (a) The principle of the materiality (objective reality) of Nature: Nature (the Universe) is a system (a set) of material objects (particles, bodies, fields); each object has properties and features, and these properties and features are inseparable characteristics of the material object and belong only to it. (b) The principle of the existence of a material object: an object exists as objective reality, and movement is a form of existence of the object. (c) The principle (definition) of the movement of an object: movement is change (i.e., transition of some states into others) in general; movement determines a direction, and direction characterizes movement. (d) The principle of the existence of time: time exists as a parameter of the system of reference. These principles lead to the following statements expressing the essence of space. (1) There is no space in general; space exists only as a form of existence of the properties and features of an object. This means that space is a set of the measures of the object (a measure is the philosophical category denoting the unity of the qualitative and quantitative determinacy of the object). In other words, the space of an object is the set of the states of the object. (2) The states of the object are manifested only in a system of reference.
The main informational property of the unitary system "researched physical object + system of reference" is that the system of reference determines (measures, calculates) the parameters of the subsystem "researched physical object" (for example, the coordinates of the object M); the parameters characterize the system of reference (for example, the coordinate system S). (3) Each parameter of the object is a measure of it. The total number of mutually independent parameters of the object is called the dimension of the space of the object. (4) The set of numerical values (i.e., the range, the spectrum) of each parameter is a subspace of the object. (The coordinate space, the momentum space, and the energy space are examples of subspaces of the object.) (5) The set of parameters of the object is divided into two non-intersecting (opposite) classes: the class of internal parameters and the class of non-internal (i.e., external) parameters. The class of external parameters is divided into two non-intersecting (opposite) subclasses: the subclass of absolute parameters (characterizing the form and sizes of the object) and the subclass of non-absolute (relative) parameters (characterizing the position and coordinates of the object). (6) The set of external parameters forms the external space of the object, called the geometrical space of the object. (7) Since a macroscopic object has three mutually independent sizes, the dimension of its external absolute space is three. Consequently, the dimension of its external relative space is also three. Thus, the total dimension of the external space of the macroscopic object is six. (8) In the general case, the external absolute space (i.e., the form and sizes) and the external relative space (i.e., the position and coordinates) of an object are mutually dependent because of the influence of a medium. The geometrical space of such an object is called a non-Euclidean space.
If the external absolute space and the external relative space of some object are mutually independent, then the external relative space of that object is a homogeneous and isotropic geometrical space, called the Euclidean space of the object. Consequences: (i) the question of the true geometry of the Universe is ill-posed; (ii) the theory of relativity has no physical meaning.

  5. Twenty cultural and learning principles to guide the development of pharmacy curriculum in Pacific Island countries.

    PubMed

    Brown, Andrew N; McCormack, Coralie

    2014-01-01

    A lack of education capacity to support the development of medical supply management competency is a major issue affecting Pacific Islands countries (PICs). Limited human resources and underdeveloped medicines supply management competency are two significant impediments to reaching the health-related Millennium Development Goals in many countries in this rural and remote region. Two recent review publications have provided relevant background documenting factors affecting learning and teaching. These articles have presented available information regarding competency and training requirements for health personnel involved in essential medicine supply management in the region. This background research has provided a platform from which tangible principles can be developed to aid educators and professionals in PICs in the development and delivery of appropriate pharmacy curriculum. Specifically the aim of the present article is to identify culturally meaningful learning and teaching principles to guide the development and delivery of pharmaceutical curriculum in PICs. Subsequently, this information will be applied to develop and trial new pedagogical approaches to the training of health personnel involved in essential medicines supply management, to improve medicine availability for patients in their own environment. This article forms part of a wider research project involving the United Nations Population Fund Suva subregional office, the University of Canberra, Ministry of Health officials and health personnel within identified PICs. Two previous reviews, investigating Pacific culture, learning approaches, and training requirements affecting pharmaceutical personnel, were synthesised into a set of principles that could be applied to the development of pharmaceutical curriculum. These principles were validated through focus groups of health personnel using action research methods. 
An initial set of 16 principles was developed from the synthesis of the two reviews. These principles were reviewed by two focus groups held in Fiji and the Solomon Islands to produce a set of 20 validated principles. These validated principles can be grouped under the headings of learning theory, structure and design, and learning and teaching methods. The 20 principles outlined in this article will be used to develop and trial culturally relevant training approaches for the development of medicine management competencies for various cadres of health personnel in PICs. These principles provide a practical framework for educators and health professionals to apply to health-based education and training in the Pacific, with potential application to other rural and remote environments.

  6. Do violations of the axioms of expected utility theory threaten decision analysis?

    PubMed

    Nease, R F

    1996-01-01

    Research demonstrates that people violate the independence principle of expected utility theory, raising the question of whether expected utility theory is normative for medical decision making. The author provides three arguments that violations of the independence principle are less problematic than they might first appear. First, the independence principle follows from other more fundamental axioms whose appeal may be more readily apparent than that of the independence principle. Second, the axioms need not be descriptive to be normative, and they need not be attractive to all decision makers for expected utility theory to be useful for some. Finally, by providing a metaphor of decision analysis as a conversation between the actual decision maker and a model decision maker, the author argues that expected utility theory need not be purely normative for decision analysis to be useful. In short, violations of the independence principle do not necessarily represent direct violations of the axioms of expected utility theory; behavioral violations of the axioms of expected utility theory do not necessarily imply that decision analysis is not normative; and full normativeness is not necessary for decision analysis to generate valuable insights.
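The independence principle the article discusses can be checked numerically for a concrete case. The lotteries and utilities below are toy values of our own choosing, not from the article: if lottery A is preferred to B, mixing each with a common lottery C at probability p must preserve the preference under expected utility.

```python
def expected_utility(lottery, u):
    """Expected utility of a lottery {outcome: probability}."""
    return sum(p * u[outcome] for outcome, p in lottery.items())

def mix(p, la, lc):
    """Compound lottery p*la + (1-p)*lc over the union of outcomes."""
    outcomes = set(la) | set(lc)
    return {o: p * la.get(o, 0.0) + (1 - p) * lc.get(o, 0.0)
            for o in outcomes}

u = {"cure": 1.0, "status quo": 0.5, "death": 0.0}   # toy utilities
A = {"cure": 0.6, "death": 0.4}                      # risky treatment
B = {"status quo": 1.0}                              # sure thing
C = {"death": 1.0}                                   # common mixing lottery

prefers_A = expected_utility(A, u) > expected_utility(B, u)
mixed_pref = (expected_utility(mix(0.3, A, C), u)
              > expected_utility(mix(0.3, B, C), u))
# Independence under EU: the two preference checks must agree.
```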

  7. Deontological foundations for medical ethics?

    PubMed

    Gillon, R

    1985-05-04

    Gillon outlines the principles of the deontological, or duty-based, group of moral theories in one of a series of British Medical Journal articles on the philosophical foundations of medical ethics. He differentiates between monistic theories, such as Immanuel Kant's, which rely on a single moral principle, and pluralistic theories, such as that of W.D. Ross, which rely on several principles that potentially could conflict. He summarizes the contributions of Kant and Ross to the development of deontological thought, then concludes his essay with brief paragraphs on other deontological approaches to the resolution of conflicting moral principles.

  8. A Content Analysis of Instructional Design and Web Design Books: Implications for Inclusion of Web Design in Instructional Design Textbooks

    ERIC Educational Resources Information Center

    Obilade, Titilola T.; Burton, John K.

    2015-01-01

    This textual content analysis set out to determine the extent to which the theories, principles, and guidelines in 4 standard books of instructional design and technology were also addressed in 4 popular books on web design. The standard books on instructional design and the popular books on web design were chosen by experts in the fields. The…

  9. China Report RED FLAG No 14, 6 JULY 1986

    DTIC Science & Technology

    1986-08-28

sciences, literature, art, science, technology, and morality, and must set new demands on them so as to meet the needs of carrying out reform... necessary patience and enthusiasm for the new atmosphere emerging in current studies in literary theory and in literature and art science. He is not... extremely important principles of Marxist aesthetics? Finally, a general question is: Is reform necessary in our literature and art science? How

  10. Bit Threads and Holographic Entanglement

    NASA Astrophysics Data System (ADS)

    Freedman, Michael; Headrick, Matthew

    2017-05-01

The Ryu-Takayanagi (RT) formula relates the entanglement entropy of a region in a holographic theory to the area of a corresponding bulk minimal surface. Using the max flow-min cut principle, a theorem from network theory, we rewrite the RT formula in a way that does not make reference to the minimal surface. Instead, we invoke the notion of a "flow", defined as a divergenceless norm-bounded vector field, or equivalently a set of Planck-thickness "bit threads". The entanglement entropy of a boundary region is given by the maximum flux out of it of any flow, or equivalently the maximum number of bit threads that can emanate from it. The threads thus represent entanglement between points on the boundary, and naturally implement the holographic principle. As we explain, this new picture clarifies several conceptual puzzles surrounding the RT formula. We give flow-based proofs of strong subadditivity and related properties; unlike the ones based on minimal surfaces, these proofs correspond in a transparent manner to the properties' information-theoretic meanings. We also briefly discuss certain technical advantages that the flows offer over minimal surfaces. In a mathematical appendix, we review the max flow-min cut theorem on networks and on Riemannian manifolds, and prove in the network case that the set of max flows varies Lipschitz continuously in the network parameters.
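The max flow-min cut duality the paper invokes can be demonstrated on a discrete toy network (a plain graph, not the Riemannian-manifold setting of the paper): the maximum "thread" flux out of the source equals the capacity of the minimal cut separating source from sink. The sketch below is a standard Edmonds-Karp implementation, with a network chosen so the min cut is visibly the edges leaving the source.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on a dict-of-dicts capacity graph."""
    # Residual capacities, including zero-capacity reverse edges.
    res = {u: dict(nbrs) for u, nbrs in cap.items()}
    for u in cap:
        for v in cap[u]:
            res.setdefault(v, {}).setdefault(u, 0)
    total = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return total          # no augmenting path: flow is maximal
        # Recover the path, push the bottleneck capacity along it.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
        total += bottleneck

# Min cut here is the pair of edges out of s, with capacity 3 + 2 = 5.
cap = {"s": {"a": 3, "b": 2}, "a": {"t": 2, "b": 1}, "b": {"t": 3}, "t": {}}
```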

  11. Solving problems by interrogating sets of knowledge systems: Toward a theory of multiple knowledge systems

    NASA Technical Reports Server (NTRS)

    Dekorvin, Andre

    1989-01-01

The main purpose is to develop a theory of multiple knowledge systems. A knowledge system could be a sensor or an expert system, but it must specialize in one feature. The starting point is an exhaustive list of possible answers to some query (such as "what object is it?"). By collecting different feature values it should, in principle, be possible to answer the query, or at least narrow down the list. Since a sensor, or for that matter an expert system, does not in most cases yield a precise value for a feature, uncertainty must be built into the model. A formal mechanism is also needed to put the information together. The researchers chose the Dempster-Shafer approach to handle these problems. They introduce the concept of a state of recognition and point out that there is a relation between receiving updates and defining a set-valued Markov chain. Deciding on the value of the next set-valued variable can also be phrased in terms of classical decision-making theory, such as minimizing the maximum regret. Other related problems are examined.
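The combination mechanism the abstract refers to is Dempster's rule, which can be sketched directly. The frame of discernment and the mass values below are toy illustrations (two "knowledge systems", each reporting on one feature of an object), not from the report.

```python
def combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements
    are frozensets, renormalizing by the conflict mass."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb       # mass assigned to the empty set
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}

theta = frozenset({"cup", "bowl", "plate"})   # exhaustive answer list
m_color = {frozenset({"cup", "bowl"}): 0.8, theta: 0.2}   # color sensor
m_shape = {frozenset({"cup"}): 0.6, theta: 0.4}           # shape sensor

m = combine(m_color, m_shape)   # narrowed-down state of recognition
```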

  12. Comment on “Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set” [J. Chem. Phys. 139, 114104 (2013)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandbyge, Mads, E-mail: mads.brandbyge@nanotech.dtu.dk

    2014-05-07

In a recent paper, Reuter and Harrison [J. Chem. Phys. 139, 114104 (2013)] question the widely used mean-field electron transport theories, which employ nonorthogonal localized basis sets. They claim these can violate an “implicit decoupling assumption,” leading to wrong results for the current, different from what would be obtained by using an orthogonal basis and dividing surfaces defined in real space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Green's function method is independent of whether or not the chosen basis set is nonorthogonal, and that the current for a given basis set is consistent with divisions in real space. The ambiguity known from charge population analysis for nonorthogonal bases does not carry over to calculations of charge flux.

  13. No Quantum Realization of Extremal No-Signaling Boxes

    NASA Astrophysics Data System (ADS)

    Ramanathan, Ravishankar; Tuziemski, Jan; Horodecki, Michał; Horodecki, Paweł

    2016-07-01

    The study of quantum correlations is important for fundamental reasons as well as for quantum communication and information processing tasks. On the one hand, it is of tremendous interest to derive the correlations produced by measurements on separated composite quantum systems from within the set of all correlations obeying the no-signaling principle of relativity, by means of information-theoretic principles. On the other hand, an important ongoing research program concerns the formulation of device-independent cryptographic protocols based on quantum nonlocal correlations for the generation of secure keys, and the amplification and expansion of random bits against general no-signaling adversaries. In both these research programs, a fundamental question arises: Can any measurements on quantum states realize the correlations present in pure extremal no-signaling boxes? Here, we answer this question in full generality showing that no nontrivial (not local realistic) extremal boxes of general no-signaling theories can be realized in quantum theory. We then explore some important consequences of this fact.
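The best-known extremal no-signaling box is the PR box, P(a,b|x,y) = 1/2 whenever a XOR b = x AND y and 0 otherwise. The sketch below (our illustration of the standard construction, not code from the paper) verifies that it is no-signaling yet reaches the algebraic CHSH maximum of 4, beyond the quantum (Tsirelson) bound 2√2 ≈ 2.83, which is consistent with the paper's result that such extremal boxes admit no quantum realization.

```python
from itertools import product

def pr_box(a, b, x, y):
    """PR-box conditional probability P(a, b | x, y)."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def chsh(box):
    """CHSH value: sum over settings of (-1)^(x*y) * E(x, y)."""
    total = 0.0
    for x, y in product((0, 1), repeat=2):
        E = sum((-1) ** (a ^ b) * box(a, b, x, y)
                for a, b in product((0, 1), repeat=2))
        total += ((-1) ** (x * y)) * E
    return total

def alice_marginal(box, a, x, y):
    """P(a | x), computed with Bob's setting y; no-signaling means
    the result must not depend on y."""
    return sum(box(a, b, x, y) for b in (0, 1))
```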

  14. Toward a Principled Sampling Theory for Quasi-Orders

    PubMed Central

    Ünlü, Ali; Schrepp, Martin

    2016-01-01

Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer-level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner-level inductive algorithm to correct the extensions that violate the transitivity property. The inner-level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even on up to 50 items, the new algorithms create close to representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement to existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
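The naive baseline the paper improves on can be sketched directly (this is not the paper's doubly inductive algorithm): draw a random reflexive relation and force transitivity by taking the transitive closure. The abstract's point is precisely that this kind of correction step biases the sample away from uniformity; the sketch only shows that the output is a valid quasi-order.

```python
import random

def random_quasi_order(n, p=0.2, seed=0):
    """Naive biased sampler: random reflexive relation on n items,
    repaired into a quasi-order via transitive closure."""
    rng = random.Random(seed)
    rel = {(i, i) for i in range(n)}                 # reflexivity
    rel |= {(i, j) for i in range(n) for j in range(n)
            if i != j and rng.random() < p}
    changed = True                                   # closure: repeat until stable
    while changed:
        changed = False
        for i, j in list(rel):
            for j2, k in list(rel):
                if j == j2 and (i, k) not in rel:
                    rel.add((i, k))
                    changed = True
    return rel

def is_quasi_order(rel, n):
    reflexive = all((i, i) in rel for i in range(n))
    transitive = all((i, k) in rel
                     for i, j in rel for j2, k in rel if j == j2)
    return reflexive and transitive

q = random_quasi_order(6)
```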

  16. The free-energy principle: a unified brain theory?

    PubMed

    Friston, Karl

    2010-02-01

    A free-energy principle has been proposed recently that accounts for action, perception and learning. This Review looks at some key brain theories in the biological (for example, neural Darwinism) and physical (for example, information theory and optimal control theory) sciences from the free-energy perspective. Crucially, one key theme runs through each of these theories - optimization. Furthermore, if we look closely at what is optimized, the same quantity keeps emerging, namely value (expected reward, expected utility) or its complement, surprise (prediction error, expected cost). This is the quantity that is optimized under the free-energy principle, which suggests that several global brain theories might be unified within a free-energy framework.
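The common thread the review identifies, optimization of surprise (prediction error), can be caricatured in a few lines. This toy is our sketch, not Friston's formalism: "perception" is just gradient descent of an internal estimate on squared prediction error against noisy sensory samples.

```python
def perceive(samples, lr=0.1, steps=200):
    """Drive an internal estimate mu toward the sensory data by
    repeatedly descending the squared prediction error (a crude
    stand-in for free-energy minimization)."""
    mu = 0.0                       # internal estimate (prior belief)
    for _ in range(steps):
        for s in samples:
            error = s - mu         # prediction error ("surprise" proxy)
            mu += lr * error       # update that reduces the error
    return mu

samples = [2.9, 3.1, 3.0, 3.2, 2.8]   # noisy observations around 3.0
mu = perceive(samples)
```

The estimate settles near the sample mean, i.e. the value that minimizes average squared prediction error.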

  17. Inelastic transport theory from first principles: Methodology and application to nanoscale devices

    NASA Astrophysics Data System (ADS)

    Frederiksen, Thomas; Paulsson, Magnus; Brandbyge, Mads; Jauho, Antti-Pekka

    2007-05-01

    We describe a first-principles method for calculating electronic structure, vibrational modes and frequencies, electron-phonon couplings, and inelastic electron transport properties of an atomic-scale device bridging two metallic contacts under nonequilibrium conditions. The method extends the density-functional codes SIESTA and TRANSIESTA that use atomic basis sets. The inelastic conductance characteristics are calculated using the nonequilibrium Green’s function formalism, and the electron-phonon interaction is addressed with perturbation theory up to the level of the self-consistent Born approximation. While these calculations often are computationally demanding, we show how they can be approximated by a simple and efficient lowest order expansion. Our method also addresses effects of energy dissipation and local heating of the junction via detailed calculations of the power flow. We demonstrate the developed procedures by considering inelastic transport through atomic gold wires of various lengths, thereby extending the results presented in Frederiksen [Phys. Rev. Lett. 93, 256601 (2004)]. To illustrate that the method applies more generally to molecular devices, we also calculate the inelastic current through different hydrocarbon molecules between gold electrodes. Both for the wires and the molecules our theory is in quantitative agreement with experiments, and characterizes the system-specific mode selectivity and local heating.

  18. Toward a new methodological paradigm for testing theories of health behavior and health behavior change.

    PubMed

    Noar, Seth M; Mehrotra, Purnima

    2011-03-01

    Traditional theory testing commonly applies cross-sectional (and occasionally longitudinal) survey research to test health behavior theory. Since such correlational research cannot demonstrate causality, a number of researchers have called for the increased use of experimental methods for theory testing. We introduce the multi-methodological theory-testing (MMTT) framework for testing health behavior theory. The MMTT framework introduces a set of principles that broaden the perspective of how we view evidence for health behavior theory. It suggests that while correlational survey research designs represent one method of testing theory, the weaknesses of this approach demand that complementary approaches be applied. Such approaches include randomized lab and field experiments, mediation analysis of theory-based interventions, and meta-analysis. These alternative approaches to theory testing can demonstrate causality in a much more robust way than is possible with correlational survey research methods. Such approaches should thus be increasingly applied in order to more completely and rigorously test health behavior theory. Greater application of research derived from the MMTT may lead researchers to refine and modify theory and ultimately make theory more valuable to practitioners. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  19. Site specific interaction between ZnO nanoparticles and tyrosine: A density functional theory study

    NASA Astrophysics Data System (ADS)

    Singh, Satvinder; Singh, Janpreet; Singh, Baljinder; Singh, Gurinder; Kaura, Aman; Tripathi, S. K.

    2018-05-01

First-principles calculations have been performed on a ZnO/tyrosine atomic complex to study the site-specific interaction of tyrosine and ZnO nanoparticles. Calculated results show that the -COOH group present in tyrosine is energetically more favorable than the -NH2 group. The interactions show ionic bonding between ZnO and tyrosine. All calculations have been performed within the Density Functional Theory (DFT) framework. Structural and electronic properties of the (ZnO)3/tyrosine complex have been studied. A Gaussian basis-set approach has been adopted for the calculations. A most stable ring-type (ZnO)3 atomic cluster has been modeled, analyzed, and used for the calculations.

  20. A critique of principlism.

    PubMed

    Clouser, K D; Gert, B

    1990-04-01

    The authors use the term "principlism" to refer to the practice of using "principles" to replace both moral theory and particular moral rules and ideals in dealing with the moral problems that arise in medical practice. The authors argue that these "principles" do not function as claimed, and that their use is misleading both practically and theoretically. The "principles" are in fact not guides to action, but rather they are merely names for a collection of sometimes superficially related matters for consideration when dealing with a moral problem. The "principles" lack any systematic relationship to each other, and they often conflict with each other. These conflicts are unresolvable, since there is no unified moral theory from which they are all derived. For comparison the authors sketch the advantages of using a unified moral theory.

  1. Using extant literature in a grounded theory study: a personal account.

    PubMed

    Yarwood-Ross, Lee; Jack, Kirsten

    2015-03-01

    To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.

  2. No interpretation without representation: the role of domain-specific representations and inferences in the Wason selection task.

    PubMed

    Fiddick, L; Cosmides, L; Tooby, J

    2000-10-16

    The Wason selection task is a tool used to study reasoning about conditional rules. Performance on this task changes systematically when one varies its content, and these content effects have been used to argue that the human cognitive architecture contains a number of domain-specific representation and inference systems, such as social contract algorithms and hazard management systems. Recently, however, Sperber, Cara & Girotto (Sperber, D., Cara, F., & Girotto, V. (1995). Relevance theory explains the selection task. Cognition, 57, 31-95) have proposed that relevance theory can explain performance on the selection task - including all content effects - without invoking inference systems that are content-specialized. Herein, we show that relevance theory alone cannot explain a variety of content effects - effects that were predicted in advance and are parsimoniously explained by theories that invoke domain-specific algorithms for representing and making inferences about (i) social contracts and (ii) reducing risk in hazardous situations. Moreover, although Sperber et al. (1995) were able to use relevance theory to produce some new content effects in other domains, they conducted no experiments involving social exchanges or precautions, and so were unable to determine which - content-specialized algorithms or relevance effects - dominate reasoning when the two conflict. When experiments, reported herein, are constructed so that the different theories predict divergent outcomes, the results support the predictions of social contract theory and hazard management theory, indicating that these inference systems override content-general relevance factors. The fact that social contract and hazard management algorithms provide better explanations for performance in their respective domains does not mean that the content-general logical procedures posited by relevance theory do not exist, or that relevance effects never occur. 
It does mean, however, that one needs a principled way of explaining which effects will dominate when a set of inputs activates more than one reasoning system. We propose the principle of pre-emptive specificity - that the human cognitive architecture should be designed so that more specialized inference systems pre-empt more general ones whenever the stimuli centrally fit the input conditions of the more specialized system. This principle follows from evolutionary and computational considerations that are common to both relevance theory and the ecological rationality approach.

  3. Symmetry as Bias: Rediscovering Special Relativity

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.

    1992-01-01

    This paper describes a rational reconstruction of Einstein's discovery of special relativity, validated through an implementation: the Erlanger program. Einstein's discovery of special relativity revolutionized both the content of physics and the research strategy used by theoretical physicists. This research strategy entails a mutual bootstrapping process between a hypothesis space for biases, defined through different postulated symmetries of the universe, and a hypothesis space for physical theories. The invariance principle mutually constrains these two spaces. It enables detecting when an evolving physical theory becomes inconsistent with its bias, and also when the biases for theories describing different phenomena are inconsistent. Structural properties of the invariance principle facilitate generating a new bias when an inconsistency is detected. After a new bias is generated, this principle facilitates reformulating the old, inconsistent theory by treating the latter as a limiting approximation. The structural properties of the invariance principle can be suitably generalized to other types of biases to enable primal-dual learning.

  4. Not just autonomy--the principles of American biomedical ethics.

    PubMed Central

    Holm, S

    1995-01-01

    The Principles of Biomedical Ethics by Tom L Beauchamp and James F Childress which is now in its fourth edition has had a great influence on the development of bioethics through its exposition of a theory based on the four principles: respect for autonomy; non-maleficence; beneficence, and justice (1). The theory is developed as a common-morality theory, and the present paper attempts to show how this approach, starting from American common-morality, leads to an underdevelopment of beneficence and justice, and that the methods offered for specification and balancing of principles are inadequate. PMID:8778456

  5. Viewing Eye Movements During Reading through the Lens of Chaos Theory: How Reading Is Like the Weather

    ERIC Educational Resources Information Center

    Paulson, Eric J.

    2005-01-01

    This theoretical article examines reading processes using chaos theory as an analogy. Three principles of chaos theory are identified and discussed, then related to reading processes as revealed through eye movement research. Used as an analogy, the chaos theory principle of sensitive dependence contributes to understanding the difficulty in…

  6. Design Principles for the Atomic and Electronic Structure of Halide Perovskite Photovoltaic Materials: Insights from Computation.

    PubMed

    Berger, Robert F

    2018-02-09

    In the current decade, perovskite solar cell research has emerged as a remarkably active, promising, and rapidly developing field. Alongside breakthroughs in synthesis and device engineering, halide perovskite photovoltaic materials have been the subject of predictive and explanatory computational work. In this Minireview, we focus on a subset of this computation: density functional theory (DFT)-based work highlighting the ways in which the electronic structure and band gap of this class of materials can be tuned via changes in atomic structure. We distill this body of computational literature into a set of underlying design principles for the band gap engineering of these materials, and rationalize these principles from the viewpoint of band-edge orbital character. We hope that this perspective provides guidance and insight toward the rational design and continued improvement of perovskite photovoltaics. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Foundations for a theory of gravitation theories

    NASA Technical Reports Server (NTRS)

    Thorne, K. S.; Lee, D. L.; Lightman, A. P.

    1972-01-01

    A foundation is laid for future analyses of gravitation theories. This foundation is applicable to any theory formulated in terms of geometric objects defined on a 4-dimensional spacetime manifold. The foundation consists of (1) a glossary of fundamental concepts; (2) a theorem that delineates the overlap between Lagrangian-based theories and metric theories; (3) a conjecture (due to Schiff) that the Weak Equivalence Principle implies the Einstein Equivalence Principle; and (4) a plausibility argument supporting this conjecture for the special case of relativistic, Lagrangian-based theories.

  8. Endeavoring to Contextualize Curricula Within an EBP Framework: A Grounded Theory Study.

    PubMed

    Malik, Gulzar; McKenna, Lisa; Griffiths, Debra

    2018-01-01

    Adopting evidence-based practice (EBP) principles in undergraduate education can facilitate nursing students' appreciation of EBP. Using grounded theory method, this study aimed to explore processes used by nurse academics while integrating EBP concepts in undergraduate nursing curricula across Australian universities. Twenty-three nurse academics were interviewed and nine were observed during teaching of undergraduate students. In addition, 20 unit/subject guides were analyzed using a grounded theory approach to data analysis. The theory "On a path to success: Endeavoring to contextualize curricula within an EBP framework" reflects academics' endeavors toward linking EBP concepts to practice, aiming to contextualize curricula in a manner that engages students within an EBP framework. However, academics' journeys were influenced by several contextual factors which require strategies to accomplish their endeavors. In conclusion, initiatives to minimize barriers, faculty development, and provision of resources across educational and clinical settings are fundamental to achieving undergraduate curricula underpinned by EBP concepts.

  9. On the Pontryagin maximum principle for systems with delays. Economic applications

    NASA Astrophysics Data System (ADS)

    Kim, A. V.; Kormyshev, V. M.; Kwon, O. B.; Mukhametshin, E. R.

    2017-11-01

    The Pontryagin maximum principle [6] is the keystone of finite-dimensional optimal control theory [1, 2, 5]. Since the maximum principle was first formulated, it has been important to extend it to various classes of dynamical systems. In the paper we consider some aspects of the application of i-smooth analysis [3, 4] to the theory of the Pontryagin maximum principle [6] for systems with delays; the results obtained can be applied in elaborating optimal program controls for economic models with delays.
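    The finite-dimensional maximum principle the abstract builds on can be illustrated in its simplest discrete-time linear-quadratic form: a scalar regulator solved by backward recursion on the value coefficient (the discrete Riccati equation, the discrete analogue of the costate equation). This is a minimal sketch with illustrative system parameters; it does not touch the paper's delay systems or i-smooth analysis.

```python
# Minimal sketch: discrete-time scalar LQR via backward Riccati recursion.
# System x_{k+1} = a*x_k + b*u_k, cost sum of q*x^2 + r*u^2.
# The numbers below are illustrative, not taken from the paper.

def lqr_gains(a, b, q, r, horizon):
    """Backward recursion on the value coefficient P; returns gains K_0..K_{N-1}."""
    p = q  # terminal value coefficient P_N = q
    gains = []
    for _ in range(horizon):
        k_gain = a * b * p / (r + b * b * p)   # optimal feedback gain
        p = q + a * a * p - a * b * p * k_gain  # Riccati update
        gains.append(k_gain)
    gains.reverse()  # recursion ran backward in time
    return gains

a, b, q, r = 1.1, 1.0, 1.0, 1.0  # slightly unstable open-loop system
gains = lqr_gains(a, b, q, r, horizon=30)

# Simulate the closed loop under the optimal policy u_k = -K_k * x_k.
x = 5.0
for k_gain in gains:
    u = -k_gain * x
    x = a * x + b * u
print(x)  # state is driven close to zero
```

The open-loop dynamics diverge (a > 1), yet the gains computed by the backward recursion stabilize the state, which is the basic content of the maximum principle in this setting.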

  10. The method of 'principlism': a critique of the critique.

    PubMed

    Lustig, B A

    1992-10-01

    Several scholars have recently criticized the dominant emphasis upon mid-level principles in bioethics best exemplified by Beauchamp and Childress's Principles of Biomedical Ethics. In Part I of this essay, I assess the fairness and cogency of three broad criticisms raised against 'principlism' as an approach: (1) that principlism, as an exercise in applied ethics, is insufficiently attentive to the dialectical relations between ethical theory and moral practice; (2) that principlism fails to offer a systematic account of the principles of non-maleficence, beneficence, respect for autonomy, and justice; and (3) that principlism, as a version of moral pluralism, is fatally flawed by its theoretical agnosticism. While acknowledging that Beauchamp and Childress's reliance upon Ross's version of intuitionism is problematic, I conclude that the critics of principlism have failed to make a compelling case against its theoretical or practical adequacy as an ethical approach. In Part II, I assess the moral theory developed by Bernard Gert in Morality: A New Justification of the Moral Rules, because Gert has recommended his approach as a systematic alternative to principlism. I judge Gert's theory to be seriously incomplete and, in contrast to principlism, unable to generate coherent conclusions about cases of active euthanasia and paternalism.

  11. Scaling Principles for Understanding and Exploiting Adhesion

    NASA Astrophysics Data System (ADS)

    Crosby, Alfred

    A grand challenge in the science of adhesion is the development of a general design paradigm for adhesive materials that can sustain large forces across an interface yet be detached with minimal force upon command. Essential to this challenge is the generality of achieving this performance under a wide set of external conditions and across an extensive range of forces. Nature has provided some guidance through various examples, e.g. geckos, for how to meet this challenge; however, a single solution is not evident upon initial investigation. To help provide insight into nature's ability to scale reversible adhesion and adapt to different external constraints, we have developed a general scaling theory that describes the force capacity of an adhesive interface in the context of biological locomotion. We have demonstrated that this scaling theory can be used to understand the relative performance of a wide range of organisms, including numerous gecko species and insects, as well as an extensive library of synthetic adhesive materials. We will present the development and testing of this scaling theory, and how this understanding has helped guide the development of new composite materials for high capacity adhesives. We will also demonstrate how this scaling theory has led to the development of new strategies for transfer printing and adhesive applications in manufacturing processes. Overall, the developed scaling principles provide a framework for guiding the design of adhesives.

  12. What is general relativity?

    NASA Astrophysics Data System (ADS)

    Coley, Alan A.; Wiltshire, David L.

    2017-05-01

    General relativity is a set of physical and geometric principles, which lead to a set of (Einstein) field equations that determine the gravitational field and to the geodesic equations that describe light propagation and the motion of particles on the background. But open questions remain, including: what is the scale on which matter and geometry are dynamically coupled in the Einstein equations? Are the field equations valid on small and large scales? What is the largest scale on which matter can be coarse grained while following a geodesic of a solution to Einstein’s equations? We address these questions. If the field equations are causal evolution equations, whose average on cosmological scales is not an exact solution of the Einstein equations, then some simplifying physical principle is required to explain the statistical homogeneity of the late epoch Universe. Such a principle may have its origin in the dynamical coupling between matter and geometry at the quantum level in the early Universe. This possibility is hinted at by diverse approaches to quantum gravity which find a dynamical reduction to two effective dimensions at high energies on one hand, and by cosmological observations which are beginning to strongly restrict the class of viable inflationary phenomenologies on the other. We suggest that the foundational principles of general relativity will play a central role in reformulating the theory of spacetime structure to meet the challenges of cosmology in the 21st century.

  13. Conventional Principles in Science: On the foundations and development of the relativized a priori

    NASA Astrophysics Data System (ADS)

    Ivanova, Milena; Farr, Matt

    2015-11-01

    The present volume consists of a collection of papers originally presented at the conference Conventional Principles in Science, held at the University of Bristol, August 2011, which featured contributions on the history and contemporary development of the notion of 'relativized a priori' principles in science, from Henri Poincaré's conventionalism to Michael Friedman's contemporary defence of the relativized a priori. In Science and Hypothesis, Poincaré assessed the problematic epistemic status of Euclidean geometry and Newton's laws of motion, famously arguing that each has the status of 'convention' in that their justification is neither analytic nor empirical in nature. In The Theory of Relativity and A Priori Knowledge, Hans Reichenbach, in light of the general theory of relativity, proposed an updated notion of the Kantian synthetic a priori to account for the dynamic inter-theoretic status of geometry and other non-empirical physical principles. Reichenbach noted that one may reject the 'necessarily true' aspect of the synthetic a priori whilst preserving the feature of being constitutive of the object of knowledge. Such constitutive principles are theory-relative, as illustrated by the privileged role of non-Euclidean geometry in general relativity theory. This idea of relativized a priori principles in spacetime physics has been analysed and developed at great length in the modern literature in the work of Michael Friedman, in particular the roles played by the light postulate and the equivalence principle - in special and general relativity respectively - in defining the central terms of their respective theories and connecting the abstract mathematical formalism of the theories with their empirical content. 
The papers in this volume guide the reader through the historical development of conventional and constitutive principles in science, from the foundational work of Poincaré, Reichenbach and others, to contemporary issues and applications of the relativized a priori concerning the notion of measurement, physical possibility, and the interpretation of scientific theories.

  14. The geometric approach to sets of ordinary differential equations and Hamiltonian dynamics

    NASA Technical Reports Server (NTRS)

    Estabrook, F. B.; Wahlquist, H. D.

    1975-01-01

    The calculus of differential forms is used to discuss the local integration theory of a general set of autonomous first order ordinary differential equations. Geometrically, such a set is a vector field V in the space of dependent variables. Integration consists of seeking associated geometric structures invariant along V: scalar fields, forms, vectors, and integrals over subspaces. It is shown that to any field V can be associated a Hamiltonian structure of forms if, when dealing with an odd number of dependent variables, an arbitrary equation of constraint is also added. Families of integral invariants are an immediate consequence. Poisson brackets are isomorphic to Lie products of associated CT-generating vector fields. Hamilton's variational principle follows from the fact that the maximal regular integral manifolds of a closed set of forms must include the characteristics of the set.

  15. Using the Music Industry To Teach Economic Principles.

    ERIC Educational Resources Information Center

    Stamm, K. Brad

    The key purpose of this paper is to provide economics and business professors, particularly those teaching principles courses, with concrete examples of economic theory applied to the music industry. A second objective is to further the interest in economic theory among business majors and expose non-majors to economic principles via real world…

  16. An Analysis and Evaluation of the Montessori Theory of Inner Discipline.

    ERIC Educational Resources Information Center

    Burns, Sister Alicia

    The principles of the Montessori theory of inner discipline are discussed and evaluated through examination of the writings of and about Maria Montessori. The principles are also discussed in relation to available empirical and descriptive research concerning discipline. The principles of inner discipline may be summarized as follows: The child is…

  17. Theory and experiment in gravitational physics

    NASA Technical Reports Server (NTRS)

    Will, C. M.

    1981-01-01

    New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experience. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.

  18. Theory and experiment in gravitational physics

    NASA Astrophysics Data System (ADS)

    Will, C. M.

    New technological advances have made it feasible to conduct measurements with precision levels which are suitable for experimental tests of the theory of general relativity. This book has been designed to fill a new need for a complete treatment of techniques for analyzing gravitation theory and experience. The Einstein equivalence principle and the foundations of gravitation theory are considered, taking into account the Dicke framework, basic criteria for the viability of a gravitation theory, experimental tests of the Einstein equivalence principle, Schiff's conjecture, and a model theory devised by Lightman and Lee (1973). Gravitation as a geometric phenomenon is considered along with the parametrized post-Newtonian formalism, the classical tests, tests of the strong equivalence principle, gravitational radiation as a tool for testing relativistic gravity, the binary pulsar, and cosmological tests.

  19. Concept-Based Learning in Clinical Experiences: Bringing Theory to Clinical Education for Deep Learning.

    PubMed

    Nielsen, Ann

    2016-07-01

    Concept-based learning is used increasingly in nursing education to support the organization, transfer, and retention of knowledge. Concept-based learning activities (CBLAs) have been used in clinical education to explore key aspects of the patient situation and principles of nursing care, without responsibility for total patient care. The nature of best practices in teaching and the resultant learning are not well understood. The purpose of this multiple-case study research was to explore and describe concept-based learning in the context of clinical education in inpatient settings. Four clinical groups (each a case) were observed while they used CBLAs in the clinical setting. Major findings include that concept-based learning fosters deep learning, connection of theory with practice, and clinical judgment. Strategies used to support learning, major teaching-learning foci, and preconditions for concept-based teaching and learning will be described. Concept-based learning is promising to support integration of theory with practice and clinical judgment through application experiences with patients. [J Nurs Educ. 2016;55(7):365-371.]. Copyright 2016, SLACK Incorporated.

  20. An Alternative to the Problematic Macro-Micro Structure of Introductory Economics.

    ERIC Educational Resources Information Center

    Tinari, Frank D.

    The paper explains an alternative structure to teaching micro and macroeconomic theory and describes the characteristics that make it an effective framework for introductory and principles courses. The teaching of economics principles typically proceeds by separating macroeconomic theory and microeconomic theory. But the use of the macro-micro…

  1. Making Decisions about an Educational Game, Simulation or Workshop: A 'Game Theory' Perspective.

    ERIC Educational Resources Information Center

    Cryer, Patricia

    1988-01-01

    Uses game theory to help practitioners make decisions about educational games, simulations, or workshops whose outcomes depend to some extent on chance. Highlights include principles for making decisions involving risk; elementary laws of probability; utility theory; and principles for making decisions involving uncertainty. (eight references)…
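    The decision principles the abstract lists (decisions involving risk, elementary probability, utility theory) reduce, in their simplest form, to comparing expected utilities across options. The sketch below is illustrative only; the options, probabilities, and utility values are invented and are not drawn from Cryer's paper.

```python
# Minimal expected-utility comparison for a decision under risk.
# All numbers are hypothetical, for illustration of the principle only.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs summing to probability 1."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
    return sum(p * u for p, u in outcomes)

# A facilitator weighs two session formats whose success depends
# partly on chance (e.g., group dynamics on the day).
options = {
    "simulation": [(0.6, 80), (0.4, 30)],  # engaging if it works, flat if not
    "lecture":    [(1.0, 50)],             # predictable middle outcome
}

best = max(options, key=lambda name: expected_utility(options[name]))
print(best, expected_utility(options[best]))  # the simulation wins, 60 vs 50
```

Decisions involving uncertainty (unknown probabilities) require different criteria, such as maximin or minimax regret, which the paper also covers.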

  2. Using Gestalt Theory to Teach Document Design and Graphics.

    ERIC Educational Resources Information Center

    Moore, Patrick; Fitz, Chad

    1993-01-01

    Presents a brief overview of Gestalt theory. Discusses and illustrates six key principles of Gestalt psychology as they apply to document design and graphics. Presents exercise that students may use to improve their understanding of the principles and develop their document design skills. Distinguishes between Gestalt theory and rhetoric. (RS)

  3. Statistical mechanical theory for steady state systems. VI. Variational principles

    NASA Astrophysics Data System (ADS)

    Attard, Phil

    2006-12-01

    Several variational principles that have been proposed for nonequilibrium systems are analyzed. These include the principle of minimum rate of entropy production due to Prigogine [Introduction to Thermodynamics of Irreversible Processes (Interscience, New York, 1967)], the principle of maximum rate of entropy production, which is common on the internet and in the natural sciences, two principles of minimum dissipation due to Onsager [Phys. Rev. 37, 405 (1931)] and to Onsager and Machlup [Phys. Rev. 91, 1505 (1953)], and the principle of maximum second entropy due to Attard [J. Chem. Phys. 122, 154101 (2005); Phys. Chem. Chem. Phys. 8, 3585 (2006)]. The approaches of Onsager and Attard are argued to be the only viable theories. These two are related, although their physical interpretation and mathematical approximations differ. A numerical comparison with computer simulation results indicates that Attard's expression is the only accurate theory. The implications for the Langevin and other stochastic differential equations are discussed.

  4. On international cost-sharing of pharmaceutical R&D.

    PubMed

    Barros, Pedro Pita; Martinez-Giralt, Xavier

    2008-12-01

    Ramsey pricing has been proposed in the pharmaceutical industry as a principle for price discriminating among markets while allowing recovery of the (fixed) R&D cost. However, such analyses neglect the presence of insurance and the fund-raising costs that underlie most drug reimbursement. By incorporating these elements, we aim to provide some building blocks toward an economic theory incorporating Ramsey pricing and insurance coverage. We show how coinsurance affects the optimal prices that pay for the R&D investment. We also show that, under certain conditions, governments have no strategic incentive to set coinsurance rates so as to shift the financial burden of R&D. This has important implications for the application of Ramsey pricing principles to pharmaceutical products across countries.
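    The Ramsey pricing rule underlying the abstract is the inverse-elasticity rule: each market's markup share is set inversely proportional to its price elasticity of demand, (p - mc)/p = k/eps. The sketch below applies that rule with invented numbers; the marginal cost, elasticities, and scale factor k are illustrative and not taken from Barros and Martinez-Giralt, whose contribution (coinsurance) is not modeled here.

```python
# Hedged sketch of the Ramsey inverse-elasticity pricing rule:
# markup share (p - mc)/p = k / eps, solved for the price p.
# All parameter values are hypothetical.

def ramsey_price(marginal_cost, elasticity, k):
    """Solve (p - mc)/p = k/eps for p."""
    markup = k / elasticity
    assert 0 <= markup < 1, "markup share must lie in [0, 1)"
    return marginal_cost / (1 - markup)

mc = 2.0  # marginal production cost of the drug
k = 0.5   # scale chosen so total markups recover the fixed R&D cost

# A less price-sensitive market bears a higher markup.
p_inelastic = ramsey_price(mc, elasticity=1.25, k=k)  # markup share 0.4
p_elastic = ramsey_price(mc, elasticity=2.5, k=k)     # markup share 0.2
print(p_inelastic, p_elastic)
```

The cross-country implication is visible already in this toy form: markets with inelastic demand pay more toward the common R&D cost, which is what motivates the paper's question about governments strategically setting coinsurance rates.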

  5. Intersections of Critical Systems Thinking and Community Based Participatory Research: A Learning Organization Example with the Autistic Community

    PubMed Central

    Raymaker, Dora M

    2016-01-01

    Critical systems thinking (CST) and community based participatory research (CBPR) are distinct approaches to inquiry which share a primary commitment to holism and human emancipation, as well as common grounding in critical theory and emancipatory and pragmatic philosophy. This paper explores their intersections and complements on a historical, philosophical, and theoretical level, and then proposes a hybrid approach achieved by applying CBPR's principles and considerations for operationalizing emancipatory practice to traditional systems thinking frameworks and practices. This hybrid approach is illustrated in practice with examples drawn from the implementation of the learning organization model in an action research setting with the Autistic community. Our experience of being able to actively attend to, and continuously equalize, power relations within an organizational framework that otherwise has great potential for reinforcing power inequity suggests CBPR's principles and considerations for operationalizing emancipatory practice could be useful in CST settings, and CST's vocabulary, methods, and clarity around systems thinking concepts could be valuable to CBPR practitioners. PMID:27833398

  6. Intersections of Critical Systems Thinking and Community Based Participatory Research: A Learning Organization Example with the Autistic Community.

    PubMed

    Raymaker, Dora M

    2016-10-01

    Critical systems thinking (CST) and community based participatory research (CBPR) are distinct approaches to inquiry which share a primary commitment to holism and human emancipation, as well as common grounding in critical theory and emancipatory and pragmatic philosophy. This paper explores their intersections and complements on a historical, philosophical, and theoretical level, and then proposes a hybrid approach achieved by applying CBPR's principles and considerations for operationalizing emancipatory practice to traditional systems thinking frameworks and practices. This hybrid approach is illustrated in practice with examples drawn from the implementation of the learning organization model in an action research setting with the Autistic community. Our experience of being able to actively attend to, and continuously equalize, power relations within an organizational framework that otherwise has great potential for reinforcing power inequity suggests CBPR's principles and considerations for operationalizing emancipatory practice could be useful in CST settings, and CST's vocabulary, methods, and clarity around systems thinking concepts could be valuable to CBPR practitioners.

  7. Application and Utility of Psychodynamic Principles in Forensic Assessment.

    PubMed

    Simopoulos, Eugene F; Cohen, Bruce

    2015-12-01

    Effective practice of forensic psychiatry is dependent on a clinical recognition and understanding of core psychodynamic principles and theory. Practice guidelines, rooted in the ethics-based imperative to strive for honesty and objectivity, demand that practitioners remain vigilant to the development of bias and appreciate interpersonal dynamics that may be re-enacted in the forensic setting. Although it is not feasible to maintain complete impartiality, especially when confronted with the nature of certain offenses, knowledge of both conscious and unconscious responses can bolster the intellectual integrity of the clinical assessment. The identification of defense mechanisms within both the evaluator and evaluee and attention to transference and countertransference are essential for an accurate conceptualization of an offender's psychological functioning, vulnerabilities, and risk of reoffense. In this article, we review psychodynamic concepts and their potential impact in the forensic setting and underscore interventions that may aid in the elucidation and management of these processes. © 2015 American Academy of Psychiatry and the Law.

  8. The Use of Reverse Auction Within the U.S. Army

    DTIC Science & Technology

    2016-12-01

    by conducting a literature review on auction theory and the economic principles surrounding open markets and competition. Books, magazine articles...economic principles within auction theory examine buyer and seller motivation. B. AUCTION THEORY Auction theory explains how market participants...that leverage the power of fluid market conditions through a dynamic pricing environment. This project examines the use of RAs within the Army

  9. Providing Nutritional Care in the Office Practice: Teams, Tools, and Techniques.

    PubMed

    Kushner, Robert F

    2016-11-01

    Provision of dietary counseling in the office setting is enhanced by using team-based care and electronic tools. Effective provider-patient communication is essential for fostering behavior change: the key component of lifestyle medicine. The principles of communication and behavior change are skill-based and grounded in scientific theories and models. Motivational interviewing and shared decision making, a collaborative process in which patients and their providers reach agreement about a health decision, are important processes in counseling. The stages of change, self-determination, health belief model, social cognitive model, theory of planned behavior, and cognitive behavioral therapy are used in the counseling process. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Reconceptualization of the Diffusion Process: An Application of Selected Principles from Modern Systems Theory.

    ERIC Educational Resources Information Center

    Silver, Wayne

    A description of the communication behaviors in high innovation societies depends on the application of selected principles from modern systems theory. The first is the principle of equifinality, which explains the activities of open systems. If the researcher views society as an open system, he frees himself from the client approach since society…

  11. The heuristic-analytic theory of reasoning: extension and evaluation.

    PubMed

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  12. Initial conditions of inhomogeneous universe and the cosmological constant problem

    NASA Astrophysics Data System (ADS)

    Totani, Tomonori

    2016-06-01

    Deriving the Einstein field equations (EFE) with matter fluid from the action principle is not straightforward, because mass conservation must be added as an additional constraint to make rest-frame mass density variable in reaction to metric variation. This can be avoided by introducing the constraint δ(√-g) = 0 on metric variations δg_μν, and then the cosmological constant Λ emerges as an integration constant. This is a removal of one of the four constraints on initial conditions forced by EFE at the birth of the universe, and it may imply that EFE are unnecessarily restrictive about initial conditions. I then adopt a principle that the theory of gravity should be able to solve time evolution starting from arbitrary inhomogeneous initial conditions about spacetime and matter. The equations of gravitational fields satisfying this principle are obtained, by setting four auxiliary constraints on δg_μν to extract six degrees of freedom for gravity. The cost of achieving this is a loss of general covariance, but these equations constitute a consistent theory if they hold in the special coordinate systems that can be uniquely specified with respect to the initial space-like hypersurface when the universe was born. This theory predicts that gravity is described by EFE with non-zero Λ in a homogeneous patch of the universe created by inflation, but Λ changes continuously across different patches. Then both the smallness and coincidence problems of the cosmological constant are solved by the anthropic argument. This is just a result of inhomogeneous initial conditions, not requiring any change of the fundamental physical laws in different patches.
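    The way Λ arises as an integration constant under the volume-preserving constraint can be sketched with the standard trace-free (unimodular) argument; this is textbook reasoning consistent with the abstract, not the paper's own derivation. Restricting to variations with δ(√-g) = 0 yields only the trace-free part of the field equations:

    ```latex
    \begin{aligned}
    &R_{\mu\nu} - \tfrac{1}{4} R\, g_{\mu\nu}
        = 8\pi G \left( T_{\mu\nu} - \tfrac{1}{4} T\, g_{\mu\nu} \right), \\
    &\text{taking } \nabla^{\mu} \text{ of both sides, with the Bianchi identity and } \nabla^{\mu} T_{\mu\nu} = 0: \quad
        \partial_{\nu}\!\left( R + 8\pi G\, T \right) = 0, \\
    &\text{so } R + 8\pi G\, T = 4\Lambda \text{ for some constant } \Lambda, \text{ which gives back} \quad
        R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda\, g_{\mu\nu} = 8\pi G\, T_{\mu\nu}.
    \end{aligned}
    ```

    Λ is here fixed by initial conditions rather than by the action, which is the sense in which it "emerges as an integration constant."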

  13. Evolution of cooperative strategies from first principles.

    PubMed

    Burtsev, Mikhail; Turchin, Peter

    2006-04-20

    One of the greatest challenges in the modern biological and social sciences is to understand the evolution of cooperative behaviour. General outlines of the answer to this puzzle are currently emerging as a result of developments in the theories of kin selection, reciprocity, multilevel selection and cultural group selection. The main conceptual tool used in probing the logical coherence of proposed explanations has been game theory, including both analytical models and agent-based simulations. The game-theoretic approach yields clear-cut results but assumes, as a rule, a simple structure of payoffs and a small set of possible strategies. Here we propose a more stringent test of the theory by developing a computer model with a considerably extended spectrum of possible strategies. In our model, agents are endowed with a limited set of receptors, a set of elementary actions and a neural net in between. Behavioural strategies are not predetermined; instead, the process of evolution constructs and reconstructs them from elementary actions. Two new strategies of cooperative attack and defence emerge in simulations, as well as the well-known dove, hawk and bourgeois strategies. Our results indicate that cooperative strategies can evolve even under such minimalist assumptions, provided that agents are capable of perceiving heritable external markers of other agents.
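    As a toy illustration of the modelling style described (a sketch under assumed details, not the authors' actual simulation: the receptor count, action names, and mutation scheme below are hypothetical), an agent can map receptor inputs through a one-layer net to elementary actions, with strategies arising only through mutated reproduction:

    ```python
    import random

    class Agent:
        """Toy evolving agent: receptors -> one-layer net -> elementary action."""

        ACTIONS = ["rest", "move", "attack", "defend"]  # hypothetical action set

        def __init__(self, n_receptors=4, weights=None):
            self.n_receptors = n_receptors
            # One weight vector per action; behaviour is not predetermined.
            self.weights = weights if weights is not None else [
                [random.uniform(-1.0, 1.0) for _ in range(n_receptors)]
                for _ in self.ACTIONS
            ]

        def act(self, inputs):
            # Choose the action with the largest weighted input sum.
            scores = [sum(w * x for w, x in zip(ws, inputs))
                      for ws in self.weights]
            return self.ACTIONS[scores.index(max(scores))]

        def offspring(self, mutation_scale=0.1):
            # Heritable strategy with small Gaussian mutations: evolution,
            # not design, constructs strategies from elementary actions.
            mutated = [[w + random.gauss(0.0, mutation_scale) for w in ws]
                       for ws in self.weights]
            return Agent(self.n_receptors, mutated)
    ```

    A selection loop that scores agents on game payoffs and breeds the highest scorers would complete the sketch; in the paper's setting, composite behaviours such as cooperative attack and defence emerge from exactly this kind of unconstrained strategy space.
    
    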

  14. Marcus Theory: Thermodynamics CAN Control the Kinetics of Electron Transfer Reactions

    ERIC Educational Resources Information Center

    Silverstein, Todd P.

    2012-01-01

    Although it is generally true that thermodynamics do not influence kinetics, this is NOT the case for electron transfer reactions in solution. Marcus Theory explains why this is so, using straightforward physical chemical principles such as transition state theory, Arrhenius' Law, and the Franck-Condon Principle. Here the background and…
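    For orientation, the central Marcus result the abstract alludes to is the standard rate expression (quoted here as background, not taken from the article itself), in which the driving force ΔG° and the reorganization energy λ jointly set the activation barrier:

    ```latex
    k_{\mathrm{ET}} \;=\; A \exp\!\left[ -\frac{\left( \Delta G^{\circ} + \lambda \right)^{2}}{4 \lambda k_{B} T} \right]
    ```

    The rate is maximal when ΔG° = -λ; making the reaction still more exergonic slows it down again (the Marcus inverted region), which is precisely how thermodynamics controls kinetics in electron transfer.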

  15. Principled Practical Knowledge: Not a Bridge but a Ladder

    ERIC Educational Resources Information Center

    Bereiter, Carl

    2014-01-01

    The much-lamented gap between theory and practice in education cannot be filled by practical knowledge alone or by explanatory knowledge alone. Principled practical knowledge (PPK) is a type of knowledge that has characteristics of both practical know-how and scientific theory. Like basic scientific theory, PPK meets standards of explanatory…

  16. Relativistic tests with lunar laser ranging

    NASA Astrophysics Data System (ADS)

    Hofmann, F.; Müller, J.

    2018-02-01

    This paper presents the recent version of the lunar laser ranging (LLR) analysis model at the Institut für Erdmessung (IfE), Leibniz Universität Hannover and highlights a few tests of Einstein’s theory of gravitation using LLR data. Investigations related to a possible temporal variation of the gravitational constant, the equivalence principle, the PPN parameters β and γ as well as the geodetic precession were carried out. The LLR analysis model was updated by gravitational effects of the Sun and planets with the Moon as extended body. The higher-order gravitational interaction between Earth and Moon as well as effects of the solid Earth tides on the lunar motion were refined. The basis for the modeled lunar rotation is now a 2-layer core/mantle model according to the DE430 ephemeris. The validity of Einstein’s theory was studied using this updated analysis model and an LLR data set from 1970 to January 2015. Within the estimated accuracies, no deviations from Einstein’s theory are detected. A relative temporal variation of the gravitational constant is estimated as Ġ/G_0 = (7.1 ± 7.6) × 10⁻¹⁴ yr⁻¹, the test of the equivalence principle gives Δ(m_g/m_i)_EM = (−3 ± 5) × 10⁻¹⁴ and the Nordtvedt parameter …

  17. Introductory Remarks

    NASA Astrophysics Data System (ADS)

    Gavroglu, Kostas

    Practitioners of many (sub)disciplines in the sciences are, at times, confronted with an apparent bliss which often turns into a nightmare: they are stuck with too good and too fertile a theory. 'Normal' science is surely a rewarding practice, but for that very reason it may, at times, also become boring. Theories or theoretical schemata may make successful predictions, may clarify 'mechanisms', may show the way to further developments, and may be amenable to non-controversial approximations. If one is really lucky, they may even, at least in principle, be able to answer all questions. There have been, especially in the history of physics, many such theories. Laplacian physics, ether physics and superstrings have historically defined the frameworks for such utopias where everything could be answerable, at least in principle. But one is truly at a loss when one is confronted with this in principle. In principle but not in practice? In principle but never? Confronted with the deadlocks that are implicit in such utopias, scientists started to collectively display a Procrustean psychopathology. They would prepare the beds and, yet, the theories would manage to trick the tricksters: almost all theories appeared to fit any Procrustean bed. They were short and tall and normal at the same time.

  18. The Effects of Embedded Generative Learning Strategies and Collaboration on Knowledge Acquisition in a Cognitive Flexibility-Based Computer Learning Environment

    DTIC Science & Technology

    1998-08-07

    cognitive flexibility theory and generative learning theory, which focus primarily on the individual student’s cognitive development, collaborative... develop "Handling Transfusion Hazards," a computer program based upon cognitive flexibility theory principles. The Program: Handling Transfusion Hazards... computer program was developed according to cognitive flexibility theory principles. A generative version was then developed by embedding

  19. Spin and gravitation

    NASA Technical Reports Server (NTRS)

    Ray, J. R.

    1982-01-01

    The fundamental variational principle for a perfect fluid in general relativity is extended so that it applies to the metric-torsion Einstein-Cartan theory. Field equations for a perfect fluid in the Einstein-Cartan theory are deduced. In addition, the equations of motion for a fluid with intrinsic spin in general relativity are deduced from a special relativistic variational principle. The theory is a direct extension of the theory of nonspinning fluids in special relativity.

  20. Foundations of quantum gravity: The role of principles grounded in empirical reality

    NASA Astrophysics Data System (ADS)

    Holman, Marc

    2014-05-01

    When attempting to assess the strengths and weaknesses of various principles in their potential role of guiding the formulation of a theory of quantum gravity, it is crucial to distinguish between principles which are strongly supported by empirical data - either directly or indirectly - and principles which instead (merely) rely heavily on theoretical arguments for their justification. Principles in the latter category are not necessarily invalid, but their a priori foundational significance should be regarded with due caution. These remarks are illustrated in terms of the current standard models of cosmology and particle physics, as well as their respective underlying theories, i.e., essentially general relativity and quantum (field) theory. For instance, it is clear that both standard models are severely constrained by symmetry principles: an effective homogeneity and isotropy of the known universe on the largest scales in the case of cosmology and an underlying exact gauge symmetry of nuclear and electromagnetic interactions in the case of particle physics. However, in sharp contrast to the cosmological situation, where the relevant symmetry structure is more or less established directly on observational grounds, all known, nontrivial arguments for the "gauge principle" are purely theoretical (and far less conclusive than usually advocated). Similar remarks apply to the larger theoretical structures represented by general relativity and quantum (field) theory, where - actual or potential - empirical principles, such as the (Einstein) equivalence principle or EPR-type nonlocality, should be clearly differentiated from theoretical ones, such as general covariance or renormalizability. 
It is argued that if history is to be of any guidance, the best chance to obtain the key structural features of a putative quantum gravity theory is by deducing them, in some form, from the appropriate empirical principles (analogous to the manner in which, say, the idea that gravitation is a curved spacetime phenomenon is arguably implied by the equivalence principle). Theoretical principles may still be useful however in formulating a concrete theory (analogous to the manner in which, say, a suitable form of general covariance can still act as a sieve for separating theories of gravity from one another). It is subsequently argued that the appropriate empirical principles for deducing the key structural features of quantum gravity should at least include (i) quantum nonlocality, (ii) irreducible indeterminacy (or, essentially equivalently, given (i), relativistic causality), (iii) the thermodynamic arrow of time, (iv) homogeneity and isotropy of the observable universe on the largest scales. In each case, it is explained - when appropriate - how the principle in question could be implemented mathematically in a theory of quantum gravity, why it is considered to be of fundamental significance and also why contemporary accounts of it are insufficient. For instance, the high degree of uniformity observed in the Cosmic Microwave Background is usually regarded as theoretically problematic because of the existence of particle horizons, whereas the currently popular attempts to resolve this situation in terms of inflationary models are, for a number of reasons, less than satisfactory. However, rather than trying to account for the required empirical features dynamically, an arguably much more fruitful approach consists in attempting to account for these features directly, in the form of a lawlike initial condition within a theory of quantum gravity.

  1. Dynamical quantum phase transitions: a review

    NASA Astrophysics Data System (ADS)

    Heyl, Markus

    2018-05-01

    Quantum theory provides an extensive framework for the description of the equilibrium properties of quantum matter. Yet experiments in quantum simulators have now opened up a route towards the generation of quantum states beyond this equilibrium paradigm. While these states promise to show properties not constrained by equilibrium principles, such as the equal a priori probability of the microcanonical ensemble, identifying the general properties of nonequilibrium quantum dynamics remains a major challenge, especially in view of the lack of conventional concepts such as free energies. The theory of dynamical quantum phase transitions attempts to identify such general principles by lifting the concept of phase transitions to coherent quantum real-time evolution. This review provides a pedagogical introduction to this field. Starting from the general setting of nonequilibrium dynamics in closed quantum many-body systems, we give the definition of dynamical quantum phase transitions as phase transitions in time with physical quantities becoming nonanalytic at critical times. We summarize the achieved theoretical advances as well as the first experimental observations, and furthermore provide an outlook to major open questions as well as future directions of research.

  2. Dynamical quantum phase transitions: a review.

    PubMed

    Heyl, Markus

    2018-05-01

    Quantum theory provides an extensive framework for the description of the equilibrium properties of quantum matter. Yet experiments in quantum simulators have now opened up a route towards the generation of quantum states beyond this equilibrium paradigm. While these states promise to show properties not constrained by equilibrium principles, such as the equal a priori probability of the microcanonical ensemble, identifying the general properties of nonequilibrium quantum dynamics remains a major challenge, especially in view of the lack of conventional concepts such as free energies. The theory of dynamical quantum phase transitions attempts to identify such general principles by lifting the concept of phase transitions to coherent quantum real-time evolution. This review provides a pedagogical introduction to this field. Starting from the general setting of nonequilibrium dynamics in closed quantum many-body systems, we give the definition of dynamical quantum phase transitions as phase transitions in time with physical quantities becoming nonanalytic at critical times. We summarize the achieved theoretical advances as well as the first experimental observations, and furthermore provide an outlook to major open questions as well as future directions of research.

  3. Neural principles of memory and a neural theory of analogical insight

    NASA Astrophysics Data System (ADS)

    Lawson, David I.; Lawson, Anton E.

    1993-12-01

    Grossberg's principles of neural modeling are reviewed and extended to provide a neural level theory to explain how analogies greatly increase the rate of learning and can, in fact, make learning and retention possible. In terms of memory, the key point is that the mind is able to recognize and recall when it is able to match sensory input from new objects, events, or situations with past memory records of similar objects, events, or situations. When a match occurs, an adaptive resonance is set up in which the synaptic strengths of neurons are increased; thus a long term record of the new input is formed in memory. Systems of neurons called outstars and instars are presumably the underlying units that enable this to occur. Analogies can greatly facilitate learning and retention because they activate the outstars (i.e., the cells that are sampling the to-be-learned pattern) and cause the neural activity to grow exponentially by forming feedback loops. This increased activity ensures the boost in synaptic strengths of neurons, thus causing storage and retention in long-term memory (i.e., learning).
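    The exponential growth described above can be caricatured in a few lines (an illustrative sketch, not Grossberg's actual outstar equations): when presynaptic and postsynaptic activity coincide in a feedback loop, each increment to synaptic strength is proportional to the current strength, so strength grows geometrically.

    ```python
    def reinforce(strength, pre, post, rate=0.5, steps=3):
        """Hebbian-style sketch: strength grows only when presynaptic (pre)
        and postsynaptic (post) activity coincide; the feedback loop makes
        each increment proportional to the current strength."""
        history = [strength]
        for _ in range(steps):
            strength += rate * pre * post * strength  # growth ∝ current strength
            history.append(strength)
        return history
    ```

    With pre = post = 1 and rate = 0.5 the strength multiplies by 1.5 each step; with either activity absent it stays flat, mimicking storage only when input matches an active memory record.
    
    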

  4. Beable-guided quantum theories: Generalizing quantum probability laws

    NASA Astrophysics Data System (ADS)

    Kent, Adrian

    2013-02-01

    Beable-guided quantum theories (BGQT) are generalizations of quantum theory, inspired by Bell's concept of beables. They modify the quantum probabilities for some specified set of fundamental events, histories, or other elements of quasiclassical reality by probability laws that depend on the realized configuration of beables. For example, they may define an additional probability weight factor for a beable configuration, independent of the quantum dynamics. Beable-guided quantum theories can be fitted to observational data to provide foils against which to compare explanations based on standard quantum theory. For example, a BGQT could, in principle, characterize the effects attributed to dark energy or dark matter, or any other deviation from the predictions of standard quantum dynamics, without introducing extra fields or a cosmological constant. The complexity of the beable-guided theory would then parametrize how far we are from a standard quantum explanation. Less conservatively, we give reasons for taking suitably simple beable-guided quantum theories as serious phenomenological theories in their own right. Among these are the possibility that cosmological models defined by BGQT might in fact fit the empirical data better than any standard quantum explanation, and the fact that BGQT suggest potentially interesting nonstandard ways of coupling quantum matter to gravity.

  5. Supporting shared decision-making for older people with multiple health and social care needs: a protocol for a realist synthesis to inform integrated care models

    PubMed Central

    Bunn, Frances; Goodman, Claire; Manthorpe, Jill; Durand, Marie-Anne; Hodkinson, Isabel; Rait, Greta; Millac, Paul; Davies, Sue L; Russell, Bridget; Wilson, Patricia

    2017-01-01

    Introduction Including the patient or user perspective is a central organising principle of integrated care. Moreover, there is increasing recognition of the importance of strengthening relationships among patients, carers and practitioners, particularly for individuals receiving substantial health and care support, such as those with long-term or multiple conditions. The overall aims of this synthesis are to provide a context-relevant understanding of how models to facilitate shared decision-making (SDM) might work for older people with multiple health and care needs, and how they might be applied to integrated care models. Methods and analysis The synthesis draws on the principles of realist inquiry, to explain how, in what contexts and for whom, interventions that aim to strengthen SDM among older patients, carers and practitioners are effective. We will use an iterative, stakeholder-driven, three-phase approach. Phase 1: development of programme theory/theories that will be tested through a first scoping of the literature and consultation with key stakeholder groups; phase 2: systematic searches of the evidence to test and develop the theories identified in phase 1; phase 3: validation of programme theory/theories with a purposive sample of participants from phase 1. The synthesis will draw on prevailing theories such as candidacy, self-efficacy, personalisation and coproduction. Ethics and dissemination Ethics approval for the stakeholder interviews was obtained from the University of Hertfordshire ECDA (Ethics Committee with Delegated Authority), reference number HSK/SF/UH/02387. The propositions arising from this review will be used to develop recommendations about how to tailor SDM interventions to older people with complex health and social care needs in an integrated care setting. PMID:28174225

  6. Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model.

    PubMed

    Reyna, Valerie F; Brainerd, Charles J

    2011-09-01

    From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals (reasoning biases that emerge with development) have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles apply to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects (shifts in risk preference when the same decisions are phrased in terms of gains versus losses) emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making, prospect theory, can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes.
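    For readers unfamiliar with the model being derived, the standard prospect-theory value function (with Tversky and Kahneman's commonly cited parameter estimates, quoted here as general background rather than taken from this article) already encodes the gain/loss asymmetry behind framing effects:

    ```python
    def value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Prospect-theory value function: concave over gains,
        convex and loss-averse (steeper by factor lam) over losses."""
        if x >= 0:
            return x ** alpha
        return -lam * ((-x) ** beta)
    ```

    Because value(-100) is larger in magnitude than value(100), the same outcome phrased as a loss looms larger than when phrased as a gain, which is the framing asymmetry the abstract describes.
    
    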

  7. Cognitive Theory of Multimedia Learning, Instructional Design Principles, and Students with Learning Disabilities in Computer-Based and Online Learning Environments

    ERIC Educational Resources Information Center

    Greer, Diana L.; Crutchfield, Stephen A.; Woods, Kari L.

    2013-01-01

    Struggling learners and students with Learning Disabilities often exhibit unique cognitive processing and working memory characteristics that may not align with instructional design principles developed with typically developing learners. This paper explains the Cognitive Theory of Multimedia Learning and underlying Cognitive Load Theory, and…

  8. Lean and leadership practices: development of an initial realist program theory.

    PubMed

    Goodridge, Donna; Westhorp, Gill; Rotter, Thomas; Dobson, Roy; Bath, Brenna

    2015-09-07

    Lean as a management system has been increasingly adopted in health care settings in an effort to enhance quality, capacity and safety, while simultaneously containing or reducing costs. The Ministry of Health in the province of Saskatchewan, Canada has made a multi-million dollar investment in Lean initiatives to create "better health, better value, better care, and better teams", affording a unique opportunity to advance our understanding of the way in which Lean philosophy, principles and tools work in health care. In order to address the questions, "What changes in leadership practices are associated with the implementation of Lean?" and "When leadership practices change, how do the changed practices contribute to subsequent outcomes?", we used a qualitative, multi-stage approach to work towards developing an initial realist program theory. We describe the implications of realist assumptions for evaluation of this Lean initiative. Formal theories including Normalization Process Theory, theories of double-loop and organizational learning, and the Theory of Cognitive Dissonance help to understand this initial rough program theory. Data collection included: key informant consultation; a stakeholder workshop; documentary review; 26 audiotaped and transcribed interviews with health region personnel; and team discussions. A set of seven initial hypotheses regarding the manner in which Lean changes leadership practices were developed from our data. We hypothesized that Lean, as implemented in this particular setting, changes leadership practices in the following ways. 
    Lean: a) aligns the aims and objectives of health regions; b) authorizes attention and resources to quality improvement and change management; c) provides an integrated set of tools for particular tasks; d) changes leaders' attitudes or beliefs about appropriate leadership and management styles and behaviors; e) demands increased levels of expertise, accountability and commitment from leaders; f) measures and uses data effectively to identify actual and relevant local problems and the root causes of those problems; and g) creates or supports a 'learning organization' culture. This study has generated initial hypotheses and realist program theory that can form the basis for future evaluation of Lean initiatives. Developing leadership capacity and culture is theorized to be a necessary precursor to other systemic and observable changes arising from Lean initiatives.

  9. Universal behavior of generalized causal set d’Alembertians in curved spacetime

    NASA Astrophysics Data System (ADS)

    Belenchia, Alessio

    2016-07-01

    Causal set non-local wave operators allow both for the definition of an action for causal set theory and the study of deviations from local physics that may have interesting phenomenological consequences. It was previously shown that, in all dimensions, the (unique) minimal discrete operators give averaged continuum non-local operators that reduce to □ − R/2 in the local limit. Recently, dropping the constraint of minimality, it was shown that there exist infinitely many discrete operators satisfying basic physical requirements and with the right local limit in flat spacetime. In this work, we consider this entire class of generalized causal set d’Alembertians in curved spacetimes and extend to them the result about the universality of the −R/2 factor. Finally, we comment on the relation of this result to the Einstein equivalence principle.
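    The universality result can be stated compactly; this is a paraphrase of the local limit described above, with ρ denoting the sprinkling density and B̄_ρ the sprinkling-averaged discrete operator (notation assumed here, not necessarily the paper's):

    ```latex
    \lim_{\rho \to \infty} \bar{B}_{\rho}\, \phi(x)
        \;=\; \left( \Box - \frac{R(x)}{2} \right) \phi(x)
    ```

    The point of the work is that this same −R(x)/2 curvature term appears for the whole generalized class of operators, not only the minimal ones.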

  10. In search of principles for a Theory of Organisms

    PubMed Central

    Longo, Giuseppe; Montévil, Maël; Sonnenschein, Carlos; Soto, Ana M

    2017-01-01

    Lacking an operational theory to explain the organization and behaviour of matter in unicellular and multicellular organisms hinders progress in biology. Such a theory should address life cycles from ontogenesis to death. This theory would complement the theory of evolution that addresses phylogenesis, and would posit theoretical extensions to accepted physical principles and default states in order to grasp the living state of matter and define proper biological observables. Thus, we favour adopting the default state implicit in Darwin’s theory, namely, cell proliferation with variation plus motility, and a framing principle, namely, life phenomena manifest themselves as non-identical iterations of morphogenetic processes. From this perspective, organisms become a consequence of the inherent variability generated by proliferation, motility and self-organization. Morphogenesis would then be the result of the default state plus physical constraints, like gravity, and those present in living organisms, like muscular tension. PMID:26648040

  11. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  12. Real-time gray-scale photolithography for fabrication of continuous microstructure

    NASA Astrophysics Data System (ADS)

    Peng, Qinjun; Guo, Yongkang; Liu, Shijie; Cui, Zheng

    2002-10-01

    A novel real-time gray-scale photolithography technique for the fabrication of continuous microstructures that uses an LCD panel as a real-time gray-scale mask is presented. The design principle of the technique is explained, and computer simulation results based on partially coherent imaging theory are given for the patterning of a microlens array and a zigzag grating. An experiment is set up, and a microlens array and a zigzag grating on panchromatic silver halide sensitized gelatin with trypsinase etching are obtained.

  13. Use of Process Improvement Tools in Radiology.

    PubMed

    Rawson, James V; Kannan, Amogha; Furman, Melissa

    2016-01-01

    Process improvement techniques are common in manufacturing and industry. Over the past few decades these principles have been slowly introduced in select health care settings. This article reviews the Plan, Do, Study, and Act cycle, Six Sigma, the System of Profound Knowledge, Lean, and the theory of constraints. Specific process improvement tools in health care and radiology are presented in the order the radiologist is likely to encounter them in an improvement project. Copyright © 2015 Mosby, Inc. All rights reserved.

  14. The Sagnac effect and its interpretation by Paul Langevin

    NASA Astrophysics Data System (ADS)

    Pascoli, Gianni

    2017-11-01

    The French physicist Georges Sagnac is nowadays frequently cited by the engineers who work on devices such as ring-laser gyroscopes. These systems operate on the principle of the Sagnac effect. It is less known that Sagnac was a strong opponent of the theory of special relativity proposed by Albert Einstein. He set up his experiment to prove the existence of the aether, which Einsteinian relativity had discarded. An accurate explanation of the phenomenon was provided by Paul Langevin in 1921.
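    The Sagnac effect itself admits compact standard formulas (quoted here for orientation, not from the article): for two counter-propagating beams around a loop of enclosed area A rotating at angular rate Ω, the arrival-time and phase differences are

    ```latex
    \Delta t = \frac{4 A \Omega}{c^{2}},
    \qquad
    \Delta \phi = \frac{2\pi c\, \Delta t}{\lambda} = \frac{8\pi A \Omega}{\lambda c}
    ```

    where λ is the light's wavelength; ring-laser gyroscopes read the rotation rate directly from this phase (or beat-frequency) difference.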

  15. Fair reckoning: a qualitative investigation of responses to an economic health resource allocation survey

    PubMed Central

    Giacomini, Mita; Hurley, Jeremiah; DeJean, Deirdre

    2012-01-01

    Abstract Objective  To investigate how participants in an economic resource allocation survey construct notions of fairness. Design  Qualitative interview study guided by interpretive grounded theory methods. Setting and participants  Qualitative interviews were conducted with volunteer university‐ (n = 39) and community‐based (n = 7) economic survey participants. Intervention or main variables studied  We explored how participants constructed meanings to guide or explain fair survey choices, focusing on rationales, imagery and additional desired information not provided in the survey scenarios. Main outcome measures  Data were transcribed and coded into qualitative categories. Analysis iterated with data collection through three waves of interviews. Results  Participants compared the survey dilemmas to domains outside the health system. Most compared them with other micro‐level, inter‐personal sharing tasks. Participants raised several fairness‐relevant factors beyond need or capacity to benefit. These included age, weight, poverty, access to other options and personal responsibility for illness; illness duration, curability or seriousness; life expectancy; possibilities for sharing; awareness of others' needs; and ability to explain allocations to those affected. They also articulated a fairness principle little considered by equity theories: that everybody must get something and nobody should get nothing. Discussion and conclusions  Lay criteria for judging fairness are myriad. Simple scenarios may be used to investigate lay commitments to abstract principles. Although principles are the focus of analysis and inference, participants may solve simplified dilemmas by imputing extraneous features to the problem or applying unanticipated principles. These possibilities should be taken into account in the design of resource allocation surveys eliciting the views of the public. PMID:22390183

  16. Principles of Surgical Treatment in the Midface Trauma - Theory and Practice

    PubMed Central

    VRINCEANU, Daniela; BANICA, Bogdan

    2014-01-01

    Introduction: Facial trauma is a common injury in the urban setting. Many studies have been published on the epidemiology and treatment of facial fractures, but few of them were conducted in an emergency hospital such as ours. The purpose of this study was to present theory and practice in the surgical treatment of midface trauma. Materials and method: We present a retrospective study and a case series report with our personal experience in the diagnosis and treatment of middle floor facial trauma. Craniofacial trauma in the context of polytrauma requires a screening assessment of the patient's condition to prioritize lesions and frequently calls for a multidisciplinary approach: neurosurgeon, ENT surgeon, maxillo-facial surgeon, ophthalmologist, plastic surgeon and so on. Axial and coronal CT are mandatory, and three-dimensional CT reconstruction can be extremely useful. The surgical indication in middle floor facial trauma is given by functional and aesthetic deficits. Results: We present the surgical principles we use in the treatment of the fractured nose, fractures of the maxilla, fractures of the zygomatic arch with or without zygoma body fractures, and fractures of the floor of the orbit. Discussions: The surgical technique was dictated by coexisting lesions of the neuro- and viscerocranium, by the complexity of the fracture, by functional or aesthetic deficits and by our surgical experience. Conclusions: The main principles in middle face trauma are an accurate and complete evaluation of lesions and a mixed surgical team of maxillofacial surgeon and neurosurgeon. PMID:25705306

  17. Unification Principle and a Geometric Field Theory

    NASA Astrophysics Data System (ADS)

    Wanas, Mamdouh I.; Osman, Samah N.; El-Kholy, Reham I.

    2015-08-01

    In the context of the geometrization philosophy, a covariant field theory is constructed. The theory satisfies the unification principle. The field equations of the theory are constructed from a general differential identity in the geometry used. The Lagrangian scalar used in the formalism is neither a curvature scalar nor a torsion scalar, but an alloy of both, the W-scalar. The physical contents of the theory are explored using several methods. The analysis shows that the theory is capable of dealing with gravity, electromagnetism and material distribution with possible mutual interactions. The theory is shown to cover the domain of general relativity under certain conditions.

  18. Designing the Electronic Classroom: Applying Learning Theory and Ergonomic Design Principles.

    ERIC Educational Resources Information Center

    Emmons, Mark; Wilkinson, Frances C.

    2001-01-01

    Applies learning theory and ergonomic principles to the design of effective learning environments for library instruction. Discusses features of electronic classroom ergonomics, including the ergonomics of physical space, environmental factors, and workstations; and includes classroom layouts. (Author/LRW)

  19. Dispersion correction derived from first principles for density functional theory and Hartree-Fock theory.

    PubMed

    Guidez, Emilie B; Gordon, Mark S

    2015-03-12

    The modeling of dispersion interactions in density functional theory (DFT) is commonly performed using an energy correction that involves empirically fitted parameters for all atom pairs of the system investigated. In this study, the first-principles-derived dispersion energy from the effective fragment potential (EFP) method is implemented for the density functional theory (DFT-D(EFP)) and Hartree-Fock (HF-D(EFP)) energies. Overall, DFT-D(EFP) performs similarly to the semiempirical DFT-D corrections for the test cases investigated in this work. HF-D(EFP) tends to underestimate binding energies and overestimate intermolecular equilibrium distances, relative to coupled cluster theory, most likely due to incomplete accounting for electron correlation. Overall, this first-principles dispersion correction yields results that are in good agreement with coupled-cluster calculations at a low computational cost.
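
    The semiempirical DFT-D corrections that DFT-D(EFP) is benchmarked against sum damped pairwise -C6/R^6 terms. A hedged sketch of that generic form follows (Grimme-style damped pair sum; the C6 coefficients and van der Waals radii are placeholder values, not fitted or EFP-derived data):

```python
import math

# Hedged sketch of a semiempirical pairwise dispersion correction of the
# kind (Grimme-style DFT-D) that DFT-D(EFP) is compared against. The C6
# coefficients and van der Waals radii below are placeholders, not fitted data.

def damping(r, r_vdw, d=20.0):
    """Fermi-type damping that switches dispersion off at short range."""
    return 1.0 / (1.0 + math.exp(-d * (r / r_vdw - 1.0)))

def e_disp(pairs, s6=1.0):
    """Damped -s6 * C6 / R^6 summed over atom pairs (atomic units)."""
    return -s6 * sum(damping(r, r_vdw) * c6 / r**6 for r, c6, r_vdw in pairs)

# Two illustrative atom pairs: (distance, C6, summed vdW radii), all in au.
pairs = [(7.0, 40.0, 6.0), (9.0, 25.0, 6.5)]
print(e_disp(pairs))  # a small negative (attractive) correction
```

    The EFP-based correction described in the abstract keeps this pairwise picture but derives the dispersion energy from first principles rather than from empirically fitted per-atom-pair parameters.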

  20. Theory of protective empowering for balancing patient safety and choices.

    PubMed

    Chiovitti, Rosalina F

    2011-01-01

    Registered nurses in psychiatric-mental health nursing continuously balance the ethical principles of duty to do good (beneficence) and no harm (non-maleficence) with the duty to respect patient choices (autonomy). However, the problem of nurses' level of control versus patients' choices remains a challenge. The aim of this article is to discuss how nurses accomplish their simultaneous responsibility for balancing patient safety (beneficence and non-maleficence) with patient choices (autonomy) through the theory of protective empowering. This is done by reflecting on interview excerpts about caring from 17 registered nurses taking part in a grounded theory study conducted in three acute urban psychiatric hospital settings in Canada. The interplay between the protective and empowering dimensions of the theory of protective empowering was found to correspond with international, national, and local nursing codes of ethics and standards. The overall core process of protective empowering, and its associated reflective questions, is offered as a new lens for balancing patient safety with choices.

  1. Magnetic exchange couplings from constrained density functional theory: an efficient approach utilizing analytic derivatives.

    PubMed

    Phillips, Jordan J; Peralta, Juan E

    2011-11-14

    We introduce a method for evaluating magnetic exchange couplings based on the constrained density functional theory (C-DFT) approach of Rudra, Wu, and Van Voorhis [J. Chem. Phys. 124, 024103 (2006)]. Our method shares the same physical principles as C-DFT but makes use of the fact that the electronic energy changes quadratically and bilinearly with respect to the constraints in the range of interest. This allows us to use coupled perturbed Kohn-Sham spin density functional theory to determine approximately the corrections to the energy of the different spin configurations and construct a priori the relevant energy-landscapes obtained by constrained spin density functional theory. We assess this methodology in a set of binuclear transition-metal complexes and show that it reproduces very closely the results of C-DFT. This demonstrates a proof-of-concept for this method as a potential tool for studying a number of other molecular phenomena. Additionally, routes to improving upon the limitations of this method are discussed. © 2011 American Institute of Physics

  2. The commerce and crossover of resources: resource conservation in the service of resilience.

    PubMed

    Chen, Shoshi; Westman, Mina; Hobfoll, Stevan E

    2015-04-01

    Conservation of resources (COR) theory was originally introduced as a framework for understanding and predicting the consequences of major and traumatic stress, but following the work of Hobfoll and Shirom (1993), COR theory has been adopted to understanding and predicting work-related stress and both the stress and resilience that occur within work settings and work culture. COR theory underscores the critical role of resource possession, lack, loss and gain and depicts personal, social and material resources co-travelling in resource caravans, rather than piecemeal. We briefly review the principles of COR theory and integrate it in the crossover model, which provides a key mechanism for multi-person exchange of emotions, experiences and resources. Understanding the impact of resource reservoirs, resource passageways and crossover provides a framework for research and intervention promoting resilience to employees as well as to organizations. It emphasizes that the creation and maintenance of resource caravan passageways promote resource gain climates through resource crossover processes. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Refined Zigzag Theory for Homogeneous, Laminated Composite, and Sandwich Plates: A Homogeneous Limit Methodology for Zigzag Function Selection

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; DiSciuva, Marco; Gherlone, Marco

    2010-01-01

    The Refined Zigzag Theory (RZT) for homogeneous, laminated composite, and sandwich plates is presented from a multi-scale formalism, starting with the in-plane displacement field expressed as a superposition of coarse and fine contributions. The coarse kinematic field is that of first-order shear-deformation theory, whereas the fine kinematic field has a piecewise-linear zigzag distribution through the thickness. The condition of limiting homogeneity of transverse-shear properties is proposed and yields four distinct sets of zigzag functions. By examining elastostatic solutions for highly heterogeneous sandwich plates, the best-performing zigzag functions are identified. The RZT predictive capabilities to model homogeneous and highly heterogeneous sandwich plates are critically assessed, demonstrating its superior efficiency, accuracy, and wide range of applicability. The present theory, which is derived from the virtual work principle, is well suited for developing computationally efficient C0-continuous finite elements, and is thus appropriate for the analysis and design of high-performance load-bearing aerospace structures.

  4. An Applied Ecological Framework for Evaluating Infrastructure to Promote Walking and Cycling: The iConnect Study

    PubMed Central

    Bull, Fiona; Powell, Jane; Cooper, Ashley R.; Brand, Christian; Mutrie, Nanette; Preston, John; Rutter, Harry

    2011-01-01

    Improving infrastructure for walking and cycling is increasingly recommended as a means to promote physical activity, prevent obesity, and reduce traffic congestion and carbon emissions. However, limited evidence from intervention studies exists to support this approach. Drawing on classic epidemiological methods, psychological and ecological models of behavior change, and the principles of realistic evaluation, we have developed an applied ecological framework by which current theories about the behavioral effects of environmental change may be tested in heterogeneous and complex intervention settings. Our framework guides study design and analysis by specifying the most important data to be collected and relations to be tested to confirm or refute specific hypotheses and thereby refine the underlying theories. PMID:21233429

  5. Identifying X-consumers using causal recipes: "whales" and "jumbo shrimps" casino gamblers.

    PubMed

    Woodside, Arch G; Zhang, Mann

    2012-03-01

    X-consumers are the extremely frequent (top 2-3%) users who typically consume 25% of a product category. This article shows how to use fuzzy-set qualitative comparative analysis (QCA) to provide "causal recipes" sufficient for profiling X-consumers accurately. The study extends Dik Twedt's "heavy-half" product users for building theory and strategies to nurture or control X-behavior. The study here applies QCA to offer configurations that are sufficient for identifying "whales" and "jumbo shrimps" among X-casino gamblers. The findings support the principle that not all X-consumers are alike. The theory and method are applicable for identifying the degree of consistency and coverage of alternative X-consumers among users of all product-service categories and brands.
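
    The two fit measures named in the abstract, consistency and coverage, have standard fuzzy-set definitions. A minimal sketch follows (the condition names and membership scores are invented for illustration, not taken from the study's data):

```python
# Hedged sketch of fuzzy-set QCA's two fit measures, consistency and
# coverage, for a candidate causal recipe. All membership scores below
# are invented for illustration; they are not data from the study.

def intersect(*conditions):
    """A recipe is the fuzzy AND (element-wise minimum) of its conditions."""
    return [min(vals) for vals in zip(*conditions)]

def consistency(recipe, outcome):
    """How nearly recipe membership is a fuzzy subset of outcome membership."""
    return sum(min(x, y) for x, y in zip(recipe, outcome)) / sum(recipe)

def coverage(recipe, outcome):
    """How much of the outcome the recipe accounts for."""
    return sum(min(x, y) for x, y in zip(recipe, outcome)) / sum(outcome)

# Hypothetical conditions for four gamblers (memberships in [0, 1]).
high_frequency = [0.9, 0.8, 0.3, 0.7]
high_stakes    = [0.8, 0.9, 0.2, 0.4]
is_whale       = [0.9, 0.8, 0.1, 0.5]   # outcome: X-gambler ("whale")

recipe = intersect(high_frequency, high_stakes)
print(round(consistency(recipe, is_whale), 3))  # → 0.955
print(round(coverage(recipe, is_whale), 3))     # → 0.913
```

    Consistency near 1 indicates the recipe is (almost) a fuzzy subset of the outcome, i.e. the recipe is sufficient for it; coverage reports how much of the outcome that recipe accounts for.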

  6. What is behind small deviations of quantum mechanics theory from experiments? Observer's mathematics point of view

    NASA Astrophysics Data System (ADS)

    Khots, Boris; Khots, Dmitriy

    2014-12-01

    Certain results that have been predicted by Quantum Mechanics (QM) theory are not always supported by experiments. This defines a deep crisis in contemporary physics and, in particular, quantum mechanics. We believe that the mathematical apparatus employed within today's physics is a possible reason. In particular, we consider the concept of infinity that exists in today's mathematics as the root cause of this problem. We have created Observer's Mathematics, which offers an alternative to contemporary mathematics. This paper is an attempt to show how Observer's Mathematics may explain some of the contradictions in QM theory results. We consider Hamiltonian mechanics, the Newton equation, the Schrödinger equation, two-slit interference, wave-particle duality for single photons, the uncertainty principle, and the Dirac equation for a free electron in a setting of arithmetic, algebra, and topology provided by Observer's Mathematics (see www.mathrelativity.com). Certain results and communications pertaining to the solution of these problems are provided.

  7. Advances in cognitive theory and therapy: the generic cognitive model.

    PubMed

    Beck, Aaron T; Haigh, Emily A P

    2014-01-01

    For over 50 years, Beck's cognitive model has provided an evidence-based way to conceptualize and treat psychological disorders. The generic cognitive model represents a set of common principles that can be applied across the spectrum of psychological disorders. The updated theoretical model provides a framework for addressing significant questions regarding the phenomenology of disorders not explained in previous iterations of the original model. New additions to the theory include continuity of adaptive and maladaptive function, dual information processing, energizing of schemas, and attentional focus. The model includes a theory of modes, an organization of schemas relevant to expectancies, self-evaluations, rules, and memories. A description of the new theoretical model is followed by a presentation of the corresponding applied model, which provides a template for conceptualizing a specific disorder and formulating a case. The focus on beliefs differentiates disorders and provides a target for treatment. A variety of interventions are described.

  8. Conformal manifolds: ODEs from OPEs

    NASA Astrophysics Data System (ADS)

    Behan, Connor

    2018-03-01

    The existence of an exactly marginal deformation in a conformal field theory is very special, but it is not well understood how this is reflected in the allowed dimensions and OPE coefficients of local operators. To shed light on this question, we compute perturbative corrections to several observables in an abstract CFT, starting with the beta function. This yields a sum rule that the theory must obey in order to be part of a conformal manifold. The set of constraints relating CFT data at different values of the coupling can in principle be written as a dynamical system that allows one to flow arbitrarily far. We begin the analysis of it by finding a simple form for the differential equations when the spacetime and theory space are both one-dimensional. A useful feature we can immediately observe is that our system makes it very difficult for level crossing to occur.

  9. Utilizing tenets of inoculation theory to develop and evaluate a preventive alcohol education intervention.

    PubMed

    Duryea, E J

    1983-04-01

    With the advent of the Surgeon General's Report, Healthy People, a renewed interest in and concern for the health-risky practices of the school aged has emerged. Moreover, because the mortality rate for the 15 to 24 year age group continues to increase while the mortality rates for every other age group continue to decline, prevention-based interventions have become a school health education imperative. The experimental, prevention-based alcohol education program reported here describes one such intervention directed at 9th grade students. The program was grounded in the principles of Inoculation Theory and evaluated using a Solomon Four-Group Design. Results indicate that the formulation of preventive alcohol education programs utilizing Inoculation Theory in a school setting is both feasible and productive in achieving designated objectives. Longitudinal assessment of the subjects with regard to their alcohol-related behavior is continuing throughout their high school careers.

  10. Modeling Mixed Groups of Humans and Robots with Reflexive Game Theory

    NASA Astrophysics Data System (ADS)

    Tarasenko, Sergey

    The Reflexive Game Theory is based on decision-making principles similar to the ones used by humans. This theory considers groups of subjects and allows one to predict which action from the set each subject in the group will choose. It is possible to influence a subject's decision so that he will make a particular choice. The purpose of this study is to illustrate how robots can deter humans from risky actions. To determine the risky actions, Asimov's Three Laws of Robotics are employed. By fusing the RGT's power to convince humans on the mental level with the safety of Asimov's Laws, we illustrate how robots in mixed groups of humans and robots can influence human subjects in order to deter them from risky actions. We suggest that this fusion has the potential to produce robots that behave and look like humans and are equipped with human-like decision-making algorithms.

  11. Holographic definition of points and distances

    NASA Astrophysics Data System (ADS)

    Czech, Bartłomiej; Lamprou, Lampros

    2014-11-01

    We discuss the way in which field theory quantities assemble the spatial geometry of three-dimensional anti-de Sitter space (AdS3). The field theory ingredients are the entanglement entropies of boundary intervals. A point in AdS3 corresponds to a collection of boundary intervals which is selected by a variational principle we discuss. Coordinates in AdS3 are integration constants of the resulting equation of motion. We propose a distance function for this collection of points, which obeys the triangle inequality as a consequence of the strong subadditivity of entropy. Our construction correctly reproduces the static slice of AdS3 and the Ryu-Takayanagi relation between geodesics and entanglement entropies. We discuss how these results extend to quotients of AdS3 —the conical defect and the BTZ geometries. In these cases, the set of entanglement entropies must be supplemented by other field theory quantities, which can carry the information about lengths of nonminimal geodesics.

  12. From needs to goals and representations: Foundations for a unified theory of motivation, personality, and development.

    PubMed

    Dweck, Carol S

    2017-11-01

    Drawing on both classic and current approaches, I propose a theory that integrates motivation, personality, and development within one framework, using a common set of principles and mechanisms. The theory begins by specifying basic needs and by suggesting how, as people pursue need-fulfilling goals, they build mental representations of their experiences (beliefs, representations of emotions, and representations of action tendencies). I then show how these needs, goals, and representations can serve as the basis of both motivation and personality, and can help to integrate disparate views of personality. The article builds on this framework to provide a new perspective on development, particularly on the forces that propel development and the roles of nature and nurture. I argue throughout that the focus on representations provides an important entry point for change and growth. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  13. Hierarchical structure of biological systems

    PubMed Central

    Alcocer-Cuarón, Carlos; Rivera, Ana L; Castaño, Victor M

    2014-01-01

    A general theory of biological systems, based on a few fundamental propositions, allows a generalization of both the Wiener and the von Bertalanffy approaches to theoretical biology. Here, a biological system is defined as a set of self-organized, differentiated elements that interact pair-wise through various networks and media, isolated from other sets by boundaries. Their relation to other systems can be described as a closed loop in a steady state, which leads to a hierarchical structure and functioning of the biological system. Our thermodynamical approach of hierarchical character can be applied to biological systems of varying sizes through some general principles, based on the exchange of energy, information and/or mass from and within the systems. PMID:24145961

  14. Hierarchical structure of biological systems: a bioengineering approach.

    PubMed

    Alcocer-Cuarón, Carlos; Rivera, Ana L; Castaño, Victor M

    2014-01-01

    A general theory of biological systems, based on a few fundamental propositions, allows a generalization of both the Wiener and the von Bertalanffy approaches to theoretical biology. Here, a biological system is defined as a set of self-organized, differentiated elements that interact pair-wise through various networks and media, isolated from other sets by boundaries. Their relation to other systems can be described as a closed loop in a steady state, which leads to a hierarchical structure and functioning of the biological system. Our thermodynamical approach of hierarchical character can be applied to biological systems of varying sizes through some general principles, based on the exchange of energy, information and/or mass from and within the systems.

  15. Entropy bound of local quantum field theory with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Wan; Lee, Hyung Won; Myung, Yun Soo

    2009-03-01

    We study the entropy bound for local quantum field theory (LQFT) with a generalized uncertainty principle. The generalized uncertainty principle naturally provides a UV cutoff to the LQFT as a gravity effect. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound A^{3/4} rather than A, where A is the boundary area.
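
    The UV cutoff mentioned in the abstract follows from the minimal-length feature of the GUP. A sketch in one common one-parameter convention (the coefficients and conventions used by the authors may differ):

```latex
% A common one-parameter form of the generalized uncertainty principle,
% with \beta a dimensionless constant and l_p the Planck length:
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}\left(1 + \beta\,\frac{l_p^2}{\hbar^2}\,(\Delta p)^2\right)
% Minimizing the right-hand side over \Delta p gives a minimal resolvable
% length \Delta x_{\min} = \sqrt{\beta}\, l_p, which serves as the natural
% UV cutoff of the local quantum field theory.
```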

  16. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

    Generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a "minimum length of observation", which is about the Planck scale (10^-35 m). Taking this basic scale of existence into account, we need to fix a new common form of Heisenberg's uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. Especially at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics theories, but the present theory of the femtosecond laser is still established on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. We also designed three typical systems, from micro to macro size, to estimate the feasibility of our theoretical model and method, respectively in the chemical solution, crystal lattice and nuclear fission reactor conditions.

  17. A Principle of Intentionality.

    PubMed

    Turner, Charles K

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett's model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone.

  18. A Principle of Intentionality

    PubMed Central

    Turner, Charles K.

    2017-01-01

    The mainstream theories and models of the physical sciences, including neuroscience, are all consistent with the principle of causality. Wholly causal explanations make sense of how things go, but are inherently value-neutral, providing no objective basis for true beliefs being better than false beliefs, nor for it being better to intend wisely than foolishly. Dennett (1987) makes a related point in calling the brain a syntactic (procedure-based) engine. He says that you cannot get to a semantic (meaning-based) engine from there. He suggests that folk psychology revolves around an intentional stance that is independent of the causal theories of the brain, and accounts for constructs such as meanings, agency, true belief, and wise desire. Dennett proposes that the intentional stance is so powerful that it can be developed into a valid intentional theory. This article expands Dennett’s model into a principle of intentionality that revolves around the construct of objective wisdom. This principle provides a structure that can account for all mental processes, and for the scientific understanding of objective value. It is suggested that science can develop a far more complete worldview with a combination of the principles of causality and intentionality than would be possible with scientific theories that are consistent with the principle of causality alone. PMID:28223954

  19. Communication Theory.

    ERIC Educational Resources Information Center

    Penland, Patrick R.

    Three papers are presented which delineate the foundation of theory and principles which underlie the research and instructional approach to communications at the Graduate School of Library and Information Science, University of Pittsburgh. Cybernetic principles provide the integration, and validation is based in part on a situation-producing…

  20. Initial conditions of inhomogeneous universe and the cosmological constant problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Totani, Tomonori, E-mail: totani@astron.s.u-tokyo.ac.jp

    Deriving the Einstein field equations (EFE) with matter fluid from the action principle is not straightforward, because mass conservation must be added as an additional constraint to make rest-frame mass density variable in reaction to metric variation. This can be avoided by introducing a constraint δ(√−g) = 0 to metric variations δg^{μν}, and then the cosmological constant Λ emerges as an integration constant. This is a removal of one of the four constraints on initial conditions forced by EFE at the birth of the universe, and it may imply that EFE are unnecessarily restrictive about initial conditions. I then adopt a principle that the theory of gravity should be able to solve time evolution starting from arbitrary inhomogeneous initial conditions about spacetime and matter. The equations of gravitational fields satisfying this principle are obtained by setting four auxiliary constraints on δg^{μν} to extract six degrees of freedom for gravity. The cost of achieving this is a loss of general covariance, but these equations constitute a consistent theory if they hold in the special coordinate systems that can be uniquely specified with respect to the initial space-like hypersurface when the universe was born. This theory predicts that gravity is described by EFE with non-zero Λ in a homogeneous patch of the universe created by inflation, but Λ changes continuously across different patches. Then both the smallness and coincidence problems of the cosmological constant are solved by the anthropic argument. This is just a result of inhomogeneous initial conditions, not requiring any change of the fundamental physical laws in different patches.

  1. How education for sustainable development is implemented in Germany: Looking through the lens of educational governance theory

    NASA Astrophysics Data System (ADS)

    Bormann, Inka; Nikel, Jutta

    2017-12-01

    The United Nations (UN) Decade of Education for Sustainable Development (ESD) aimed to integrate the principles, values and practices of sustainable development into all aspects of education and learning around the world. The authors of this article address the implementation process of ESD in Germany during the UN Decade (2005-2014). By undertaking a meta-analysis of the findings of four related sub-studies they carried out during a three-year project funded by the German Federal Ministry of Education and Research, the authors contribute to the understanding of the process of transferring the concept of ESD within a multi-level education system. They investigated this process at two levels - the federal state (a sub-national entity in Germany) and the communal level. Drawing on educational governance theory, the authors unveil principles, norms, rules and procedures in the coordination of action within constellations of heterogeneous actors contributing to the implementation of ESD in their social entities. The outcome of the meta-analysis describes an emerging governance regime in ESD, taking into consideration the following features impacting the coordination of action being carried out by the actors involved: (1) the understanding of the normative concept of ESD as content of negotiation; (2) the perceived opportunity for actors to gain and increase appreciation within the field of ESD as an incentive for and driver of engagement; and (3) the dynamic quality of the set-up, rules and principles of the coordination of action, which renders these subject to situative changes. In the final part of the paper, the findings are discussed from the perspectives of the theory of transfer and the current empirical basis of ESD policy and governance.

  2. Building a functional multiple intelligences theory to advance educational neuroscience.

    PubMed

    Cerruti, Carlo

    2013-01-01

    A key goal of educational neuroscience is to conduct constrained experimental research that is theory-driven and yet also clearly related to educators' complex set of questions and concerns. However, the fields of education, cognitive psychology, and neuroscience use different levels of description to characterize human ability. An important advance in research in educational neuroscience would be the identification of a cognitive and neurocognitive framework at a level of description relatively intuitive to educators. I argue that the theory of multiple intelligences (MI; Gardner, 1983), a conception of the mind that motivated a past generation of teachers, may provide such an opportunity. I criticize MI for doing little to clarify for teachers a core misunderstanding, specifically that MI was only an anatomical map of the mind but not a functional theory that detailed how the mind actually processes information. In an attempt to build a "functional MI" theory, I integrate into MI basic principles of cognitive and neural functioning, namely interregional neural facilitation and inhibition. In so doing I hope to forge a path toward constrained experimental research that bears upon teachers' concerns about teaching and learning.

  3. A New Principle in Physics: The Principle of "Finiteness", and Some Consequences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abraham Sternlieb

    2010-06-25

    In this paper I propose a new principle in physics: the principle of "finiteness". It stems from the definition of physics as a science that deals (among other things) with measurable dimensional physical quantities. Since measurement results, including their errors, are always finite, the principle of finiteness postulates that the mathematical formulation of "legitimate" laws of physics should prevent exactly zero or infinite solutions. Some consequences of the principle of finiteness are discussed, in general, and then more specifically in the fields of special relativity, quantum mechanics, and quantum gravity. The consequences are derived independently of any other theory or principle in physics. I propose "finiteness" as a postulate (like the constancy of the speed of light in vacuum, "c"), as opposed to a notion whose validity has to be corroborated by, or derived theoretically or experimentally from other facts, theories, or principles.

  4. Exactly soluble local bosonic cocycle models, statistical transmutation, and simplest time-reversal symmetric topological orders in 3+1 dimensions

    NASA Astrophysics Data System (ADS)

    Wen, Xiao-Gang

    2017-05-01

    We propose a generic construction of exactly soluble local bosonic models that realize various topological orders with gappable boundaries. In particular, we construct an exactly soluble bosonic model that realizes a (3+1)-dimensional [(3+1)D] Z2-gauge theory with emergent fermionic Kramers doublet. We show that the emergence of such a fermion will cause the nucleation of certain topological excitations in space-time without pin+ structure. The exactly soluble model also leads to a statistical transmutation in (3+1)D. In addition, we construct exactly soluble bosonic models that realize 2 types of time-reversal symmetry-enriched Z2 topological orders in 2+1 dimensions, and 20 types of simplest time-reversal symmetry-enriched topological (SET) orders which have only one nontrivial pointlike and stringlike topological excitation. Many physical properties of those topological states are calculated using the exactly soluble models. We find that some time-reversal SET orders have pointlike excitations that carry Kramers doublet, a fractionalized time-reversal symmetry. We also find that some Z2 SET orders have stringlike excitations that carry anomalous (non-on-site) Z2 symmetry, which can be viewed as a fractionalization of Z2 symmetry on strings. Our construction is based on cochains and cocycles in algebraic topology, which is very versatile. In principle, it can also realize emergent topological field theory beyond the twisted gauge theory.
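    The cochain/cocycle machinery the construction rests on can be made concrete. For a finite group G (here Z2), a U(1)-valued 2-cocycle is a function ω: G × G → U(1) satisfying the standard cocycle condition from group cohomology (this is the textbook definition, not the specific higher cochain data used in the record's (3+1)D models):

```latex
% 2-cocycle condition (standard group-cohomology definition)
\omega(g,h)\,\omega(gh,k) \;=\; \omega(g,hk)\,\omega(h,k),
\qquad g,h,k \in G,
```

    with two cocycles identified when they differ by a coboundary, ω(g,h) → ω(g,h)\,β(g)β(h)/β(gh). Constructions of the kind described above assign such algebraic data to simplices of a space-time triangulation.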

  5. Beyond the Virtues-Principles Debate.

    ERIC Educational Resources Information Center

    Keat, Marilyn S.

    1992-01-01

    Indicates basic ontological assumptions in the virtues-principles debate in moral philosophy, noting Aristotle's and Kant's fundamental ideas about morality and considering a hermeneutic synthesis of theories. The article discusses what acceptance of the synthesis might mean in the theory and practice of moral pedagogy, offering examples of…

  6. An approach to children's smoking behavior using social cognitive learning theory.

    PubMed

    Bektas, Murat; Ozturk, Candan; Armstrong, Merry

    2010-01-01

    This review article discusses the theoretical principles of social cognitive learning theory and children's risk-taking behavior of cigarette smoking, along with preventive initiatives. Social cognitive learning theorists examine the behavior of initiating and sustained smoking using a social systems approach. The authors discuss the reciprocal determinism aspect of the theory as applied to the importance of individual factors, and environment and behavioral interactions that influence smoking behavior. Included is the concept of vicarious capability, which suggests that smoking behavior is determined in response to and interaction with feedback provided by the environment. The principle of self-regulatory capability asserts that people have control over their own behavior and thus that behavior change is possible. The principle of self-efficacy proposes that a high level of self-efficacy may decrease an individual's likelihood of attempting to smoke or continuing to smoke. Examples of initiatives to be undertaken in order to prevent smoking in accordance with social cognitive learning theory are presented at the end of each principle.

  7. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed Central

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals. PMID:29180977

  8. Adopting Basic Principles of the United Nations Academic Impact Initiative (UNAI): Can Cultural Differences Be Predicted from Value Orientations and Globalization?

    PubMed

    Nechtelberger, Andrea; Renner, Walter; Nechtelberger, Martin; Supeková, Soňa Chovanová; Hadjimarkou, Maria; Offurum, Chino; Ramalingam, Panchalan; Senft, Birgit; Redfern, Kylie

    2017-01-01

    The United Nations Academic Impact (UNAI) Initiative has set forth 10 Basic Principles for higher education. In the present study, a 10-item self-report questionnaire measuring personal endorsement of these principles was tested with university and post-graduate students from Austria, China, Cyprus, India, Nigeria, and Slovakia (total N = 976, N = 627 female, mean age 24.7 years, s = 5.7). Starting from the assumptions of Moral Foundations Theory (MFT), we expected that personal attitudes toward the UNAI Basic Principles would be predicted by endorsement of various moral foundations as suggested by MFT and by the individual's degree of globalization. Whereas for the Austrian, Cypriot, and Nigerian sub-samples this assumption was largely confirmed, for the Chinese, Indian, and Slovak sub-samples only small amounts of the variance could be explained by regression models. All six sub-samples differed substantially with regard to their overall questionnaire responses: by five discriminant functions 83.6% of participants were classified correctly. We conclude that implementation of UNAI principles should adhere closely to the cultural requirements of the respective society and, where necessary, should be accompanied by thorough informational campaigns about UN educational goals.

  9. Are Smart People Less Racist? Verbal Ability, Anti-Black Prejudice, and the Principle-Policy Paradox

    PubMed Central

    Wodtke, Geoffrey T.

    2016-01-01

    It is commonly hypothesized that higher cognitive abilities promote racial tolerance and a greater commitment to racial equality, but an alternative theoretical framework contends that higher cognitive abilities merely enable members of a dominant racial group to articulate a more refined legitimizing ideology for racial inequality. According to this perspective, ideological refinement occurs in response to shifting patterns of racial conflict and is characterized by rejection of overt prejudice, superficial support for racial equality in principle, and opposition to policies that challenge the dominant group's status. This study estimates the impact of verbal ability on a comprehensive set of racial attitudes, including anti-black prejudice, views about black-white equality in principle, and racial policy support. It also investigates cohort differences in the effects of verbal ability on these attitudes. Results suggest that high-ability whites are less likely than low-ability whites to report prejudicial attitudes and more likely to support racial equality in principle. Despite these liberalizing effects, high-ability whites are no more likely to support a variety of remedial policies for racial inequality. Results also suggest that the ostensibly liberalizing effects of verbal ability on anti-black prejudice and views about racial equality in principle emerged slowly over time, consistent with ideological refinement theory. PMID:27134315

  10. Are Smart People Less Racist? Verbal Ability, Anti-Black Prejudice, and the Principle-Policy Paradox.

    PubMed

    Wodtke, Geoffrey T

    2016-01-08

    It is commonly hypothesized that higher cognitive abilities promote racial tolerance and a greater commitment to racial equality, but an alternative theoretical framework contends that higher cognitive abilities merely enable members of a dominant racial group to articulate a more refined legitimizing ideology for racial inequality. According to this perspective, ideological refinement occurs in response to shifting patterns of racial conflict and is characterized by rejection of overt prejudice, superficial support for racial equality in principle, and opposition to policies that challenge the dominant group's status. This study estimates the impact of verbal ability on a comprehensive set of racial attitudes, including anti-black prejudice, views about black-white equality in principle, and racial policy support. It also investigates cohort differences in the effects of verbal ability on these attitudes. Results suggest that high-ability whites are less likely than low-ability whites to report prejudicial attitudes and more likely to support racial equality in principle. Despite these liberalizing effects, high-ability whites are no more likely to support a variety of remedial policies for racial inequality. Results also suggest that the ostensibly liberalizing effects of verbal ability on anti-black prejudice and views about racial equality in principle emerged slowly over time, consistent with ideological refinement theory.

  11. Energy, Metaphysics, and Space: Ernst Mach's Interpretation of Energy Conservation as the Principle of Causality

    NASA Astrophysics Data System (ADS)

    Guzzardi, Luca

    2014-06-01

    This paper discusses Ernst Mach's interpretation of the principle of energy conservation (EC) in the context of the development of energy concepts and ideas about causality in nineteenth-century physics and theory of science. In doing this, it focuses on the close relationship between causality, energy conservation and space in Mach's antireductionist view of science. Mach expounds his thesis about EC in his first historical-epistemological essay, Die Geschichte und die Wurzel des Satzes von der Erhaltung der Arbeit (1872): far from being a new principle, it has been used since the early beginnings of mechanics, independently of other principles; in fact, EC is a pre-mechanical principle which is generally applied in investigating nature: it is, indeed, nothing but a form of the principle of causality. The paper focuses on the scientific-historical premises and philosophical underpinnings of Mach's thesis, beginning with the classic debate on the validity and limits of the notion of cause by Hume, Kant, and Helmholtz. Such reference also implies a discussion of the relationship between causality on the one hand and space and time on the other. This connection plays a major role for Mach, and in the final paragraphs its importance is argued in order to understand his antireductionist perspective, i.e. the rejection of any attempt to give an ultimate explanation of the world via reduction of nature to one fundamental set of phenomena.

  12. Set-Theoretic Analysis of Ethical Systems for Off-Planet Future Engagement with Living Organisms

    NASA Astrophysics Data System (ADS)

    Helman, Daniel S.

    2016-10-01

    Living organisms are a conundrum. Their origin and provenance are open questions. An operational definition for their detection has been settled upon for practical reasons, i.e. in order to plan mission goals. The spirit of such undertakings is typically noble, and yet the question clearly arises of how humanity will engage with other living organisms. Prudence demands a pre-contact appraisal of ethical requirements towards other living organisms. To answer this question, an analogy with the number line in mathematics (integers versus the set of real numbers) will be presented to explore the structure of finite versus open-ended hierarchies. In this, the architecture of set theory will be used as a basis to describe the validity of systems hierarchies in general. Note that how numbers populate sets follows distinct rules when the elements of the sets or the sets themselves are unbounded. Principles of axiomatic versus observed conclusions will be emphasized. Results from mathematics will be used to inform analysis and dilemmas in ethical systems.
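    The integers-versus-reals contrast invoked here rests on Cantor's diagonal argument: no enumeration of infinite binary sequences can be exhaustive. A minimal sketch, using finite truncations for illustration (all names are hypothetical, not from the paper):

```python
# Cantor's diagonal argument, illustrated on truncated binary sequences.
# Given any claimed enumeration of 0/1 sequences, the sequence obtained
# by flipping the diagonal entries differs from every enumerated one.

def diagonal_sequence(enumeration, n):
    """Given the first n sequences (each a function index -> 0/1),
    return a length-n prefix that differs from sequence i at position i."""
    return [1 - enumeration[i](i) for i in range(n)]

# A sample "enumeration": sequence i has a 1 exactly where the index is
# divisible by i + 1 (the default argument i=i pins down each lambda).
enum = [lambda j, i=i: 1 if j % (i + 1) == 0 else 0 for i in range(10)]

diag = diagonal_sequence(enum, 10)
# By construction, diag differs from the i-th enumerated sequence at index i.
assert all(diag[i] != enum[i](i) for i in range(10))
```

    The same boundedness distinction (what can and cannot be exhaustively listed) is what the record uses to separate finite from open-ended hierarchies.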

  13. Design of a Laboratory Hall Thruster with Magnetically Shielded Channel Walls, Phase III: Comparison of Theory with Experiment

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Katz, Ira; Hofer, Richard R.; Goebel, Dan M.

    2012-01-01

    A proof-of-principle effort to demonstrate a technique by which erosion of the acceleration channel in Hall thrusters of the magnetic-layer type can be eliminated has been completed. The first principles of the technique, now known as "magnetic shielding," were derived based on the findings of numerical simulations in 2-D axisymmetric geometry. The simulations, in turn, guided the modification of an existing 6-kW laboratory Hall thruster. This magnetically shielded (MS) thruster was then built and tested. Because neither theory nor experiment alone can validate fully the first principles of the technique, the objective of the 2-yr effort was twofold: (1) to demonstrate in the laboratory that the erosion rates can be reduced by more than an order of magnitude, and (2) to demonstrate that the near-wall plasma properties can be altered according to the theoretical predictions. This paper concludes the demonstration of magnetic shielding by reporting on a wide range of comparisons between results from numerical simulations and laboratory diagnostics. Collectively, we find that the comparisons validate the theory. Near the walls of the MS thruster, theory and experiment agree: (1) the plasma potential has been sustained at values near the discharge voltage, and (2) the electron temperature has been lowered by at least 2.5-3 times compared to the unshielded (US) thruster. Also, based on carbon deposition measurements, the erosion rates at the inner and outer walls of the MS thruster are found to be lower by at least 2300 and 1875 times, respectively. Erosion was so low along these walls that the rates were below the resolution of the profilometer. Using a sputtering yield model with an energy threshold of 25 V, the simulations predict a reduction by a factor of 600 at the MS inner wall. At the outer wall, ion energies are computed to be below 25 V, in which case we set the erosion to zero in the simulations. When a 50-V threshold is used, the computed ion energies are below the threshold at both sides of the channel. Uncertainties, sensitivities and differences between theory and experiment are also discussed.
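    The role the energy threshold plays in these predictions can be illustrated with a generic linear-above-threshold yield form. This is a sketch under assumed parameters, not the authors' actual sputtering model:

```python
# Generic threshold sputtering-yield sketch (illustrative only):
# zero yield below the threshold energy, linear growth above it.

def sputter_yield(ion_energy_eV, threshold_eV, slope=1.0e-3):
    if ion_energy_eV < threshold_eV:
        return 0.0  # below threshold: no erosion predicted
    return slope * (ion_energy_eV - threshold_eV)

# With a 25 eV threshold, 20 eV ions (as computed at the outer wall)
# produce zero erosion; raising the threshold to 50 eV puts 40 eV ions
# below threshold on both sides of the channel as well.
assert sputter_yield(20.0, 25.0) == 0.0
assert sputter_yield(40.0, 25.0) > 0.0
assert sputter_yield(40.0, 50.0) == 0.0
```

    The qualitative point mirrors the abstract: once computed ion energies fall below the assumed threshold, the model predicts exactly zero erosion regardless of flux.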

  14. Physics architecture

    NASA Astrophysics Data System (ADS)

    Konopleva, Nelly

    2017-03-01

    Fundamental physical theory axiomatics is closely connected with methods of experimental measurement. The difference between theories using global and local symmetries is explained. It is shown that symmetry group localization leads not only to a change of the relativity principle, but to a fundamental modification of the experimental programs testing physical theory predictions. It is noted that any fundamental physical theory must be consistent with the measurement procedures employed for its testing. These ideas are illustrated by events from my biography connected with the transformation of Yang-Mills theory from an ordinary phenomenological model into a fundamental physical theory based on local symmetry principles, like Einsteinian General Relativity. Baldin's position in this situation is demonstrated.

  15. Enhancing John Rawls's Theory of Justice to Cover Health and Social Determinants of Health

    PubMed Central

    Ekmekci, Perihan Elif; Arda, Berna

    2015-01-01

    The vast improvements in medical technology revealed the crucial role of social determinants of health for the etiology, prevalence and prognosis of diseases. This changed the content of the right to health concept from a demand for health services to a claim of having access to all social determinants of health. Thus, the just allocation of scarce resources of health and social determinants of health became an issue for ethical theories. John Rawls developed a theory of justice. His theory suggests that the principles of justice should be determined by individuals in a hypothetical initial position. In the initial position, individuals agree on principles of justice. Rawls puts forth that the institutions of society should be structured in compliance with these principles to reach a fair social system. Although Rawls did not justify the right to health in his theory, efforts to enlarge the theory to cover the right to health flourished quite fast. In this paper, first the basic components of Rawls's theory are explained. Then the most outstanding approaches to enlarging his theory to cover the right to health are introduced and discussed within the discourse of Rawls's theory of justice. PMID:27340331

  16. Enhancing John Rawls's Theory of Justice to Cover Health and Social Determinants of Health.

    PubMed

    Ekmekci, Perihan Elif; Arda, Berna

    2015-11-01

    The vast improvements in medical technology revealed the crucial role of social determinants of health for the etiology, prevalence and prognosis of diseases. This changed the content of the right to health concept from a demand for health services to a claim of having access to all social determinants of health. Thus, the just allocation of scarce resources of health and social determinants of health became an issue for ethical theories. John Rawls developed a theory of justice. His theory suggests that the principles of justice should be determined by individuals in a hypothetical initial position. In the initial position, individuals agree on principles of justice. Rawls puts forth that the institutions of society should be structured in compliance with these principles to reach a fair social system. Although Rawls did not justify the right to health in his theory, efforts to enlarge the theory to cover the right to health flourished quite fast. In this paper, first the basic components of Rawls's theory are explained. Then the most outstanding approaches to enlarging his theory to cover the right to health are introduced and discussed within the discourse of Rawls's theory of justice.

  17. The European Union's Adequacy Approach to Privacy and International Data Sharing in Health Research.

    PubMed

    Stoddart, Jennifer; Chan, Benny; Joly, Yann

    2016-03-01

    The European Union (EU) approach to data protection consists of assessing the adequacy of the data protection offered by the laws of a particular jurisdiction against a set of principles that includes purpose limitation, transparency, quality, proportionality, security, access, and rectification. The EU's Data Protection Directive sets conditions on the transfer of data to third countries by prohibiting Member States from transferring data to countries whose data protection regimes have been deemed inadequate. In theory, each jurisdiction is evaluated similarly and must be found fully compliant with the EU's data protection principles to be considered adequate. In practice, the inconsistency with which these evaluations are made presents a hurdle to international data-sharing and makes the integration of different data-sharing approaches difficult; in the 20 years since the Directive was first adopted, the laws of only five countries outside of the EU, the European Economic Area, or the European Free Trade Association have been deemed adequate to engage in data transfers without the need for further administrative safeguards. © 2016 American Society of Law, Medicine & Ethics.

  18. Black hole thermodynamics from a variational principle: asymptotically conical backgrounds

    DOE PAGES

    An, Ok Song; Cvetič, Mirjam; Papadimitriou, Ioannis

    2016-03-14

    The variational problem of gravity theories is directly related to black hole thermodynamics. For asymptotically locally AdS backgrounds it is known that holographic renormalization results in a variational principle in terms of equivalence classes of boundary data under the local asymptotic symmetries of the theory, which automatically leads to finite conserved charges satisfying the first law of thermodynamics. We show that this connection holds well beyond asymptotically AdS black holes. In particular, we formulate the variational problem for N = 2 STU supergravity in four dimensions with boundary conditions corresponding to those obeyed by the so-called ‘subtracted geometries’. We show that such boundary conditions can be imposed covariantly in terms of a set of asymptotic second class constraints, and we derive the appropriate boundary terms that render the variational problem well posed in two different duality frames of the STU model. This allows us to define finite conserved charges associated with any asymptotic Killing vector and to demonstrate that these charges satisfy the Smarr formula and the first law of thermodynamics. Moreover, by uplifting the theory to five dimensions and then reducing on a 2-sphere, we provide a precise map between the thermodynamic observables of the subtracted geometries and those of the BTZ black hole. Finally, surface terms play a crucial role in this identification.
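    For reference, the two relations the conserved charges are shown to satisfy take the following familiar form, written here for a four-dimensional black hole with angular momentum J and charge Q (the precise coefficients in the subtracted-geometry setting follow from the paper's boundary terms, not from this generic statement):

```latex
% First law of black hole thermodynamics
dM = T\,dS + \Omega\,dJ + \Phi\,dQ,
% Smarr formula (four-dimensional form)
M = 2TS + 2\Omega J + \Phi Q .
```

    Here T is the Hawking temperature, S the entropy, Ω the horizon angular velocity, and Φ the electric potential at the horizon.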

  19. Black hole thermodynamics from a variational principle: asymptotically conical backgrounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    An, Ok Song; Cvetič, Mirjam; Papadimitriou, Ioannis

    The variational problem of gravity theories is directly related to black hole thermodynamics. For asymptotically locally AdS backgrounds it is known that holographic renormalization results in a variational principle in terms of equivalence classes of boundary data under the local asymptotic symmetries of the theory, which automatically leads to finite conserved charges satisfying the first law of thermodynamics. We show that this connection holds well beyond asymptotically AdS black holes. In particular, we formulate the variational problem for N = 2 STU supergravity in four dimensions with boundary conditions corresponding to those obeyed by the so-called ‘subtracted geometries’. We show that such boundary conditions can be imposed covariantly in terms of a set of asymptotic second class constraints, and we derive the appropriate boundary terms that render the variational problem well posed in two different duality frames of the STU model. This allows us to define finite conserved charges associated with any asymptotic Killing vector and to demonstrate that these charges satisfy the Smarr formula and the first law of thermodynamics. Moreover, by uplifting the theory to five dimensions and then reducing on a 2-sphere, we provide a precise map between the thermodynamic observables of the subtracted geometries and those of the BTZ black hole. Finally, surface terms play a crucial role in this identification.

  20. Just-in-Time Adaptive Interventions (JITAIs) in Mobile Health: Key Components and Design Principles for Ongoing Health Behavior Support.

    PubMed

    Nahum-Shani, Inbal; Smith, Shawna N; Spring, Bonnie J; Collins, Linda M; Witkiewitz, Katie; Tewari, Ambuj; Murphy, Susan A

    2018-05-18

    The just-in-time adaptive intervention (JITAI) is an intervention design aiming to provide the right type/amount of support, at the right time, by adapting to an individual's changing internal and contextual state. The availability of increasingly powerful mobile and sensing technologies underpins the use of JITAIs to support health behavior, as in such a setting an individual's state can change rapidly, unexpectedly, and in his/her natural environment. Despite the increasing use and appeal of JITAIs, a major gap exists between the growing technological capabilities for delivering JITAIs and research on the development and evaluation of these interventions. Many JITAIs have been developed with minimal use of empirical evidence, theory, or accepted treatment guidelines. Here, we take an essential first step towards bridging this gap. Building on health behavior theories and the extant literature on JITAIs, we clarify the scientific motivation for JITAIs, define their fundamental components, and highlight design principles related to these components. Examples of JITAIs from various domains of health behavior research are used for illustration. As we enter a new era of technological capacity for delivering JITAIs, it is critical that researchers develop sophisticated and nuanced health behavior theories capable of guiding the construction of such interventions. Particular attention has to be given to better understanding the implications of providing timely and ecologically sound support for intervention adherence and retention.
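    A JITAI's core component, the decision rule mapping tailoring variables to an intervention option at each decision point, can be sketched as follows. The variables, thresholds, and option names are hypothetical illustrations, not taken from the article:

```python
# Hypothetical JITAI decision rule: at a decision point, inspect the
# current tailoring variables and choose an intervention option (or none).

def decision_rule(state):
    """state: dict of tailoring variables (all names illustrative)."""
    if state.get("driving", False):
        return None  # unsafe moment for interruption: withhold support
    if state.get("self_reported_stress", 0) >= 7:
        return "push_coping_exercise"
    if state.get("steps_last_hour", 0) == 0 and state.get("sedentary_minutes", 0) >= 60:
        return "push_activity_prompt"
    return None  # deliver nothing: avoid over-burdening the user

assert decision_rule({"driving": True, "self_reported_stress": 9}) is None
assert decision_rule({"self_reported_stress": 8}) == "push_coping_exercise"
assert decision_rule({"sedentary_minutes": 90}) == "push_activity_prompt"
```

    The empty-handed default branch reflects a design principle the article stresses: providing support only when it is both needed and receivable, to protect adherence and retention.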

  1. Nonlocal dynamics of dissipative phononic fluids

    NASA Astrophysics Data System (ADS)

    Nemati, Navid; Lee, Yoonkyung E.; Lafarge, Denis; Duclos, Aroune; Fang, Nicholas

    2017-06-01

    We describe the nonlocal effective properties of a two-dimensional dissipative phononic crystal made of periodic arrays of rigid and motionless cylinders embedded in a viscothermal fluid such as air. The description is based on a nonlocal theory of sound propagation in stationary random fluid/rigid media that was proposed by Lafarge and Nemati [Wave Motion 50, 1016 (2013), 10.1016/j.wavemoti.2013.04.007]. This scheme arises from a deep analogy with electromagnetism and a set of physics-based postulates including, particularly, the action-response procedures whereby the effective density and bulk modulus are determined. Here, we revisit this approach and further clarify its foundational physical principles by presenting it in a unified formulation together with the two-scale asymptotic homogenization theory, which is interpreted as the local limit. Strong evidence is provided to show that the validity of the principles and postulates within the nonlocal theory extends to high-frequency bands, well beyond the long-wavelength regime. In particular, we demonstrate that up to the third Brillouin zone, including the Bragg scattering, the complex and dispersive phase velocity of the least-attenuated wave in the phononic crystal which is generated by our nonlocal scheme agrees exactly with that reproduced by a direct approach based on the Bloch theorem and the multiple scattering method. At high frequencies, the effective wave and its associated parameters are analyzed by treating the phononic crystal as a random medium.
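    The "direct approach" used for comparison rests on the Bloch theorem: in a periodic medium, each normal-mode pressure field factorizes into a plane wave times a lattice-periodic function,

```latex
p_{\mathbf{k}}(\mathbf{r}) = e^{\,i\,\mathbf{k}\cdot\mathbf{r}}\, u_{\mathbf{k}}(\mathbf{r}),
\qquad u_{\mathbf{k}}(\mathbf{r} + \mathbf{R}) = u_{\mathbf{k}}(\mathbf{r})
```

    for every lattice vector R. The resulting dispersion relation ω(k) across the Brillouin zones is the benchmark that the nonlocal effective description is shown to reproduce.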

  2. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling.

    PubMed

    Kane, Patrick; Zollman, Kevin J S

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the "hybrid equilibrium," to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith's Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory.
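    Evolutionary comparisons of this kind are typically run with replicator dynamics. A minimal discrete-time sketch with an illustrative two-strategy payoff matrix follows; the payoffs are hypothetical, not the Sir Philip Sidney game itself, whose sender-receiver structure is richer:

```python
# Discrete-time replicator dynamics: each strategy's frequency grows in
# proportion to its payoff relative to the population average
# (payoffs are assumed positive so the update stays well-defined).

def replicator_step(x, payoff):
    n = len(x)
    fitness = [sum(payoff[i][j] * x[j] for j in range(n)) for i in range(n)]
    avg = sum(x[i] * fitness[i] for i in range(n))
    return [x[i] * fitness[i] / avg for i in range(n)]

# Illustrative payoffs: strategy 0 strictly dominates strategy 1.
A = [[2.0, 2.0],
     [1.0, 1.0]]

x = [0.5, 0.5]
for _ in range(100):
    x = replicator_step(x, A)

assert abs(sum(x) - 1.0) < 1e-9   # frequencies remain a distribution
assert x[0] > 0.99                # the dominant strategy takes over
```

    Simulations like the paper's compare which signaling equilibria (handicap versus hybrid) attract trajectories of dynamics of this general type under the game's actual payoffs.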

  3. Chaos and Crisis: Propositions for a General Theory of Crisis Communication.

    ERIC Educational Resources Information Center

    Seeger, Matthew W.

    2002-01-01

    Presents key concepts of chaos theory (CT) as a general framework for describing organizational crisis and crisis communication. Discusses principles of predictability, sensitive dependence on initial conditions, bifurcation as system breakdown, emergent self-organization, and fractals and strange attractors as principles of organization. Explores…

  4. The Human Activity of Evaluation Theorizing.

    ERIC Educational Resources Information Center

    Alkin, Marvin C.; Ellett, Frederick, Jr.

    Theorizing about evaluation should be conceptualized as a human activity governed by certain strategies and principles. The theories advanced by various evaluators have changed over the years, thus illustrating ten principles of evaluation. The starting point for theory development or modification is self-reflection and review of one's own…

  5. Analytic and Computational Perspectives of Multi-Scale Theory for Homogeneous, Laminated Composite, and Sandwich Beams and Plates

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Gherlone, Marco; Versino, Daniele; DiSciuva, Marco

    2012-01-01

    This paper reviews the theoretical foundation and computational mechanics aspects of the recently developed shear-deformation theory, called the Refined Zigzag Theory (RZT). The theory is based on a multi-scale formalism in which an equivalent single-layer plate theory is refined with a robust set of zigzag local layer displacements that are free of the usual deficiencies found in common plate theories with zigzag kinematics. In the RZT, first-order shear-deformation plate theory is used as the equivalent single-layer plate theory, which represents the overall response characteristics. Local piecewise-linear zigzag displacements are used to provide corrections to these overall response characteristics that are associated with the plate heterogeneity and the relative stiffnesses of the layers. The theory does not rely on shear correction factors and is equally accurate for homogeneous, laminated composite, and sandwich beams and plates. Regardless of the number of material layers, the theory maintains only seven kinematic unknowns that describe the membrane, bending, and transverse shear plate-deformation modes. Derived from the virtual work principle, RZT is well-suited for developing computationally efficient, C0-continuous finite elements; formulations of several RZT-based elements are highlighted. The theory and its finite element approximations thus provide a unified and reliable computational platform for the analysis and design of high-performance load-bearing aerospace structures.

  6. Analytic and Computational Perspectives of Multi-Scale Theory for Homogeneous, Laminated Composite, and Sandwich Beams and Plates

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Gherlone, Marco; Versino, Daniele; Di Sciuva, Marco

    2012-01-01

    This paper reviews the theoretical foundation and computational mechanics aspects of the recently developed shear-deformation theory, called the Refined Zigzag Theory (RZT). The theory is based on a multi-scale formalism in which an equivalent single-layer plate theory is refined with a robust set of zigzag local layer displacements that are free of the usual deficiencies found in common plate theories with zigzag kinematics. In the RZT, first-order shear-deformation plate theory is used as the equivalent single-layer plate theory, which represents the overall response characteristics. Local piecewise-linear zigzag displacements are used to provide corrections to these overall response characteristics that are associated with the plate heterogeneity and the relative stiffnesses of the layers. The theory does not rely on shear correction factors and is equally accurate for homogeneous, laminated composite, and sandwich beams and plates. Regardless of the number of material layers, the theory maintains only seven kinematic unknowns that describe the membrane, bending, and transverse shear plate-deformation modes. Derived from the virtual work principle, RZT is well-suited for developing computationally efficient, C0-continuous finite elements; formulations of several RZT-based elements are highlighted. The theory and its finite elements provide a unified and reliable computational platform for the analysis and design of high-performance load-bearing aerospace structures.

  7. Mental health literacy as theory: current challenges and future directions.

    PubMed

    Spiker, Douglas A; Hammer, Joseph H

    2018-02-13

    Mental health literacy (MHL) is one increasingly researched factor thought to influence mental health behaviors. Researchers have argued for expanding the definition of MHL to include additional constructs, but no consensus has yet been reached on what constructs should be included as part of MHL. The purpose of this paper is to (i) elucidate how the expansion of the MHL construct has impeded the growth of MHL research and (ii) through the lens of construct and theory development, highlight how these challenges might be remedied. An inclusive search of the literature was undertaken to identify MHL studies. The principles of construct and theory development guided a critical analysis of MHL. The review of the literature found that MHL violates many principles of what constitutes an acceptable construct definition. To address these concerns, we proposed conceptualizing MHL as a theory and recommended principles of theory development that should be taken into consideration. A theory of MHL can guide future researchers to clearly delineate important constructs and their interrelationships. For practitioners, a theory of MHL can help inform how to improve MHL at both the individual and community level.

  8. Public health ethics theory: review and path to convergence.

    PubMed

    Lee, Lisa M

    2012-01-01

    Public health ethics is a nascent field, emerging over the past decade as an applied field merging concepts of clinical and research ethics. Because the "patient" in public health is the population rather than the individual, existing principles might be weighted differently, or there might be different ethical principles to consider. This paper reviewed the evolution of public health ethics, the use of bioethics as its model, and the proposed frameworks for public health ethics through 2010. Review of 13 major public health ethics frameworks published over the past 15 years yields a wide variety of theoretical approaches, some similar foundational values, and a few similar operating principles. Coming to a consensus on the reach, purpose, and ends of public health is necessary if we are to agree on what ethical underpinnings drive us, what foundational values bring us to these underpinnings, and what operating principles practitioners must implement to make ethical decisions. If public health is distinct enough from clinical medicine to warrant its own set of ethical and philosophical underpinnings, then a decision must be made as to whether a single approach is warranted or we can tolerate a variety of equal but different perspectives. © 2012 American Society of Law, Medicine & Ethics, Inc.

  9. Exploring the bases for a mixed reality stroke rehabilitation system, Part I: A unified approach for representing action, quantitative evaluation, and interactive feedback

    PubMed Central

    2011-01-01

    Background Although principles based in motor learning, rehabilitation, and human-computer interfaces can guide the design of effective interactive systems for rehabilitation, a unified approach that connects these key principles into an integrated design, and can form a methodology that can be generalized to interactive stroke rehabilitation, is presently unavailable. Results This paper integrates phenomenological approaches to interaction and embodied knowledge with rehabilitation practices and theories to achieve the basis for a methodology that can support effective adaptive, interactive rehabilitation. Our resulting methodology provides guidelines for the development of an action representation, quantification of action, and the design of interactive feedback. As Part I of a two-part series, this paper presents key principles of the unified approach. Part II then describes the application of this approach within the implementation of the Adaptive Mixed Reality Rehabilitation (AMRR) system for stroke rehabilitation. Conclusions The accompanying principles for composing novel mixed reality environments for stroke rehabilitation can advance the design and implementation of effective mixed reality systems for the clinical setting, and ultimately be adapted for home-based application. They furthermore can be applied to other rehabilitation needs beyond stroke. PMID:21875441

  10. Advanced access: reducing waiting and delays in primary care.

    PubMed

    Murray, Mark; Berwick, Donald M

    2003-02-26

    Delay of care is a persistent and undesirable feature of current health care systems. Although delay seems to be inevitable and linked to resource limitations, it often is neither. Rather, it is usually the result of unplanned, irrational scheduling and resource allocation. Application of queuing theory and principles of industrial engineering, adapted appropriately to clinical settings, can reduce delay substantially, even in small practices, without requiring additional resources. One model, sometimes referred to as advanced access, has increasingly been shown to reduce waiting times in primary care. The core principle of advanced access is that patients calling to schedule a physician visit are offered an appointment the same day. Advanced access is not sustainable if patient demand for appointments is permanently greater than physician capacity to offer appointments. Six elements of advanced access are important in its application: balancing supply and demand, reducing backlog, reducing the variety of appointment types, developing contingency plans for unusual circumstances, working to adjust demand profiles, and increasing the availability of bottleneck resources. Although these principles are powerful, they are counter to deeply held beliefs and established practices in health care organizations. Adopting these principles requires strong leadership investment and support.
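    The queuing-theory argument can be made concrete with the textbook M/M/1 model (a standard illustration, not a model the authors specify): as average demand approaches capacity, time in the system grows without bound, which is why balancing supply and demand heads the list of elements.

    ```python
    def mm1_time_in_system(arrival_rate, service_rate):
        """Mean time a request spends in an M/M/1 queue (waiting + service):
        W = 1 / (mu - lambda), valid only for a stable queue (lambda < mu)."""
        if arrival_rate >= service_rate:
            raise ValueError("unstable queue: demand must stay below capacity")
        return 1.0 / (service_rate - arrival_rate)

    # Hypothetical clinic receiving 20 appointment requests/day, first at a
    # capacity of 21 slots/day, then with modest slack at 25 slots/day:
    near_capacity = mm1_time_in_system(20, 21)   # 1.0 day in the system
    with_slack = mm1_time_in_system(20, 25)      # 0.2 days in the system
    ```

    The nonlinearity is the point: a small increase in capacity (or reduction in demand) near saturation yields a large reduction in delay, with no other resources added.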

  11. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory.

    PubMed

    Sissay, Adonay; Abanador, Paul; Mauger, François; Gaarde, Mette; Schafer, Kenneth J; Lopata, Kenneth

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first principles can be challenging due to the large spatial extent of the wavefunction, which stresses the accuracy of basis sets, and the intense fields, which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential, which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.
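    The central bookkeeping device here, a complex absorbing potential (CAP), can be illustrated on a toy two-level model (an illustrative sketch with made-up parameters, not the paper's all-electron method): adding -iW to the Hamiltonian makes the propagation non-Hermitian, and the norm lost from Tr[rho] is read off as ionized charge.

    ```python
    import numpy as np

    # Model two-level Hamiltonian (atomic units) with a CAP on the
    # "continuum-like" upper state. H_eff = H - iW is non-Hermitian.
    H = np.array([[0.0, 0.1],
                  [0.1, 1.0]])
    W = np.diag([0.0, 0.05])
    H_eff = H - 1j * W

    # Exact short-time propagator U = exp(-i H_eff dt) via eigendecomposition.
    dt, steps = 0.1, 500
    vals, vecs = np.linalg.eig(H_eff)
    U = vecs @ np.diag(np.exp(-1j * vals * dt)) @ np.linalg.inv(vecs)

    rho = np.array([[1.0, 0.0],
                    [0.0, 0.0]], dtype=complex)   # start in the bound state
    for _ in range(steps):
        rho = U @ rho @ U.conj().T                # rho(t+dt) = U rho U^dagger

    ionized = 1.0 - rho.trace().real              # charge removed by the CAP
    ```

    In the paper's setting the same accounting is done in an atom-centered GTO basis with a time-dependent laser term in H; the toy model only shows why the trace of the density matrix, rather than the wavefunction norm, tracks ionization.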

  12. Angle-dependent strong-field molecular ionization rates with tuned range-separated time-dependent density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sissay, Adonay; Abanador, Paul; Mauger, François

    2016-09-07

    Strong-field ionization and the resulting electronic dynamics are important for a range of processes such as high harmonic generation, photodamage, charge resonance enhanced ionization, and ionization-triggered charge migration. Modeling ionization dynamics in molecular systems from first principles can be challenging due to the large spatial extent of the wavefunction, which stresses the accuracy of basis sets, and the intense fields, which require non-perturbative time-dependent electronic structure methods. In this paper, we develop a time-dependent density functional theory approach which uses a Gaussian-type orbital (GTO) basis set to capture strong-field ionization rates and dynamics in atoms and small molecules. This involves propagating the electronic density matrix in time with a time-dependent laser potential and a spatial non-Hermitian complex absorbing potential, which is projected onto an atom-centered basis set to remove ionized charge from the simulation. For the density functional theory (DFT) functional we use a tuned range-separated functional LC-PBE*, which has the correct asymptotic 1/r form of the potential and a reduced delocalization error compared to traditional DFT functionals. Ionization rates are computed for hydrogen, molecular nitrogen, and iodoacetylene under various field frequencies, intensities, and polarizations (angle-dependent ionization), and the results are shown to quantitatively agree with time-dependent Schrödinger equation and strong-field approximation calculations. This tuned DFT with GTO method opens the door to predictive all-electron time-dependent density functional theory simulations of ionization and ionization-triggered dynamics in molecular systems using tuned range-separated hybrid functionals.

  13. Lessons from a broad view of science: a response to Dr Robergs’ article

    PubMed Central

    Pires, Flavio Oliveira

    2018-01-01

    Dr Robergs suggested that the central governor model (CGM) is not a well-worded theory, as it deviated from the tenet of falsification criteria. According to his view of science, exercise research conducted with the intent to prove rather than disprove a theory contributes little to new knowledge and condemns the theory to the label of pseudoscience. However, exercise scientists should be aware of the limitations of falsification criteria. First, the number of potential falsifiers for a given hypothesis is always infinite, so there is no means to ensure an asymmetric comparison between theories. Thus, assuming a competition between the CGM and dichotomised central versus peripheral fatigue theories, scientists guided by the falsification principle would need to know, a priori, all possible falsifiers between these two theories in order to choose the finest one, thereby leading to an oversimplification of the theories. Second, the failure to formulate a refutable hypothesis may be a simple consequence of the lack of instruments to make crucial measurements. Using refutation principles to test the CGM requires technology capable of capturing, online and in real time during exercise, the feedback and feedforward measures integrated in the central nervous system. Consequently, the falsification principle is currently impracticable as a test of the CGM. The falsification principle must be applied with equilibrium, as we should do with the positive induction process; otherwise Popperian philosophy will be incompatible with the actual practice of science. Rather than grounding the scientific debate in a single, biased view of science, researchers in the field of exercise sciences may benefit more from different views of science. PMID:29629188

  14. Precautionary discourse. Thinking through the distinction between the precautionary principle and the precautionary approach in theory and practice.

    PubMed

    Dinneen, Nathan

    2013-01-01

    This paper addresses the distinction, arising from the different ways the European Union and United States have come to adopt precaution regarding various environmental and health-related risks, between the precautionary principle and the precautionary approach in both theory and practice. First, this paper addresses how the precautionary principle has been variously defined, along with an exploration of some of the concepts with which it has been associated. Next, it addresses how the distinction between the precautionary principle and precautionary approach manifested itself within the political realm. Last, it considers the theoretical foundation of the precautionary principle in the philosophy of Hans Jonas, considering whether the principled-pragmatic distinction regarding precaution does or doesn't hold up in Jonas' thought.

  15. Situational theory of leadership.

    PubMed

    Waller, D J; Smith, S R; Warnock, J T

    1989-11-01

    The situational theory of leadership and the LEAD instruments for determining leadership style are explained, and the application of the situational leadership theory to the process of planning for and implementing organizational change is described. Early studies of leadership style identified two basic leadership styles: the task-oriented autocratic style and the relationship-oriented democratic style. Subsequent research found that most leaders exhibited one of four combinations of task and relationship behaviors. The situational leadership theory holds that the difference between the effectiveness and ineffectiveness of the four leadership styles is the appropriateness of the leader's behavior to the particular situation in which it is used. The task maturity of the individual or group being led must also be accounted for; follower readiness is defined in terms of the capacity to set high but attainable goals, willingness or ability to accept responsibility, and possession of the necessary education or experience for a specific task. A person's leadership style, range, and adaptability can be determined from the LEAD-Self and LEAD-Other questionnaires. By applying the principles of the situational leadership theory and adapting their managerial styles to specific tasks and levels of follower maturity, the authors were successful in implementing 24-hour pharmacokinetic dosing services provided by staff pharmacists with little previous experience in clinical services. The situational leadership model enables a leader to identify a task, set goals, determine the task maturity of the individual or group, select an appropriate leadership style, and modify the style as change occurs. Pharmacy managers can use this model when implementing clinical pharmacy services.

  16. A learning theory account of depression.

    PubMed

    Ramnerö, Jonas; Folke, Fredrik; Kanter, Jonathan W

    2015-06-11

    Learning theory provides a foundation for understanding and deriving treatment principles for impacting a spectrum of functional processes relevant to the construct of depression. While behavioral interventions have been commonplace in the cognitive behavioral tradition, most often conceptualized within a cognitive theoretical framework, recent years have seen renewed interest in more purely behavioral models. These modern learning theory accounts of depression focus on the interchange between behavior and the environment, mainly in terms of lack of reinforcement, extinction of instrumental behavior, and excesses of aversive control, and include a conceptualization of relevant cognitive and emotional variables. These positions, drawn from extensive basic and applied research, cohere with biological theories on reduced reward learning and reward responsiveness and views of depression as a heterogeneous, complex set of disorders. Treatment techniques based on learning theory, often labeled Behavioral Activation (BA), focus on activating the individual in directions that increase contact with potential reinforcers, as defined idiographically with the client. BA is considered an empirically well-established treatment that generalizes well across diverse contexts and populations. The learning theory account is discussed in terms of being a parsimonious model and ground for treatments highly suitable for large-scale dissemination. © 2015 Scandinavian Psychological Associations and John Wiley & Sons Ltd.

  17. Many-body calculations of molecular electric polarizabilities in asymptotically complete basis sets

    NASA Astrophysics Data System (ADS)

    Monten, Ruben; Hajgató, Balázs; Deleuze, Michael S.

    2011-10-01

    The static dipole polarizabilities of Ne, CO, N2, F2, HF, H2O, HCN, and C2H2 (acetylene) have been determined close to the Full-CI limit in an asymptotically complete basis set (CBS), according to the principles of a Focal Point Analysis. For this purpose, the results of Finite Field calculations up to the level of Coupled Cluster theory including Single, Double, Triple, Quadruple and perturbative Pentuple excitations [CCSDTQ(P)] were used, in conjunction with suitable extrapolations of energies obtained using augmented and doubly-augmented Dunning correlation-consistent polarized valence basis sets of improving quality. The polarizability characteristics of C2H4 (ethylene) and C2H6 (ethane) have been determined on the same grounds at the CCSDTQ level in the CBS limit. Comparison is made with results obtained using lower levels of electronic correlation, or taking into account the relaxation of the molecular structure due to an adiabatic polarization process. Vibrational corrections to electronic polarizabilities have been empirically estimated according to Born-Oppenheimer Molecular Dynamical simulations employing Density Functional Theory. Confrontation with experiment ultimately indicates relative accuracies of the order of 1 to 2%.
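    Basis-set extrapolations of this kind commonly follow the two-point inverse-cubic form for correlation energies, E(X) = E_CBS + A*X**-3, with X the cardinal number of the correlation-consistent basis; a hedged sketch of that standard formula (the authors' exact extrapolation scheme may differ, and the energies below are made up):

    ```python
    def cbs_two_point(e_x, e_y, x, y):
        """Extrapolate correlation energies to the complete-basis-set limit
        assuming E(X) = E_CBS + A * X**-3 (two-point inverse-cubic formula).
        Solving the pair of equations for E_CBS eliminates A."""
        return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

    # Illustrative (made-up) correlation energies, in hartree, for X=3
    # (triple-zeta) and X=4 (quadruple-zeta):
    e_cbs = cbs_two_point(-0.30, -0.32, 3, 4)
    # e_cbs lies below both finite-basis values, as expected for a
    # monotonically converging correlation energy.
    ```

    A focal-point analysis layers such extrapolations with increments from successively higher excitation levels, so the CBS step above is only one ingredient.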

  18. Annette Bunge: developing the principles in percutaneous absorption using chemical engineering principles.

    PubMed

    Stinchcomb, A L

    2013-01-01

    Annette Bunge and her research group have had the central theme of mathematically modeling the dermal absorption process. Most of the research focus has been on estimating dermal absorption for the purpose of risk assessment, for exposure scenarios in the environment and in the occupational setting. Her work is the basis for the United States Environmental Protection Agency's estimations for dermal absorption from contaminated water. It is also the basis of the dermal absorption estimates used in determining if chemicals should be assigned a 'skin notation' for potential systemic toxicity following occupational skin exposure. The work is truly translational in that it started with mathematical theory, is validated with preclinical and human experiments, and then is used in guidelines to protect human health. Her valued research has also extended into the topical drug bioavailability and bioequivalence assessment field.

  19. Ancient Cosmology, superfine structure of the Universe and Anthropological Principle

    NASA Astrophysics Data System (ADS)

    Arakelyan, Hrant; Vardanyan, Susan

    2015-07-01

    In its spirit and its conception of the Big Bang, modern cosmology is closer to ancient cosmology than to the cosmological paradigm of the nineteenth century. Repeating the speculations of the ancients, while using subtle mathematical methods and relying on steadily accumulating empirical material, modern theory tends toward a quantitative description of nature in which the numerical ratios between physical constants play an increasing role. Detailed analysis of the influence of the numerical values of physical quantities on the physical state of the universe has revealed remarkable relations known as fine and hyperfine tuning. To explain why the observable universe exhibits a particular set of interrelated fundamental parameters, a frankly speculative anthropic principle was proposed, which appeals to the fact that sentient beings exist.

  20. Excitons in molecular crystals from first-principles many-body perturbation theory: Picene versus pentacene

    NASA Astrophysics Data System (ADS)

    Cudazzo, Pierluigi; Gatti, Matteo; Rubio, Angel

    2012-11-01

    By solving the first-principles many-body Bethe-Salpeter equation, we compare the optical properties of two prototypical and technologically relevant organic molecular crystals: picene and pentacene. Although very similar in their structural and electronic properties, picene and pentacene show remarkable differences in their optical spectra. While for pentacene the absorption onset is due to a charge-transfer exciton, in picene it is related to a strongly localized Frenkel exciton. The detailed comparison between the two materials allows us to discuss, on general grounds, how the interplay between the electronic band dispersion and the exchange electron-hole interaction plays a fundamental role in setting the nature of the exciton. It represents a clear example of the relevance of the competition between localization and delocalization in the description of two-particle electronic correlation.

  1. Interactive Distance Education: A Cognitive Load Perspective

    ERIC Educational Resources Information Center

    Kalyuga, Slava

    2012-01-01

    Evidence-based approaches to the design of the next generation of interactive distance education need to take into account established multimedia learning principles. Cognitive load theory is a theory that has significantly contributed to the development of such principles. It has applied our knowledge of major features and processing limitations…

  2. Teaching Literacy: From Theory to Practice.

    ERIC Educational Resources Information Center

    Kerr, Hugo

    This book examines the basic principles of the cognitive psychology of literacy and explains how insights gained from that theory can inform and improve reading, spelling, and writing instruction aimed at adults. The following are among the topics discussed in the book's eight chapters: the principles of cognitive psychology (the nervous system,…

  3. Bridging Levels of Analysis: Learning, Information Theory, and the Lexicon

    ERIC Educational Resources Information Center

    Dye, Melody

    2017-01-01

    While information theory is typically considered in the context of modern computing and engineering, its core mathematical principles provide a potentially useful lens through which to consider human language. Like the artificial communication systems such principles were invented to describe, natural languages involve a sender and receiver, a…

  4. Applying Learning Principles and Theories to Vocational Education.

    ERIC Educational Resources Information Center

    Loomis, Linda Jacobsen; Prickett, Charlotte

    This monograph is intended for use by vocational teachers, state supervisors, administrators, and curriculum developers as a resource guide in the development of sound curriculum for vocational programs in Arizona. It examines learning theories and principles and, where applicable, applies these ideas to vocational education. Chapter 1 introduces…

  5. Students' Understanding of Acids/Bases in Organic Chemistry Contexts

    ERIC Educational Resources Information Center

    Cartrette, David P.; Mayo, Provi M.

    2011-01-01

    Understanding key foundational principles is vital to learning chemistry across different contexts. One such foundational principle is the acid/base behavior of molecules. In the general chemistry sequence, the Bronsted-Lowry theory is stressed, because it lends itself well to studying equilibrium and kinetics. However, the Lewis theory of…

  6. Two-dimensional models as testing ground for principles and concepts of local quantum physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schroer, Bert

    In the past, two-dimensional models of QFT have served as theoretical laboratories for testing new concepts under mathematically controllable conditions. In more recent times, low-dimensional models (e.g., chiral models, factorizing models) have often been treated by special recipes in a way which sometimes led to a loss of the unity of QFT. In the present work, I try to counteract this apartheid tendency by reviewing past results within the setting of the general principles of QFT. To this I add two new ideas: (1) a modular interpretation of the chiral model Diff(S)-covariance with a close connection to the recently formulated local covariance principle for QFT in curved spacetime and (2) a derivation of the chiral model temperature duality from a suitable operator formulation of the angular Wick rotation (in analogy to the Nelson-Symanzik duality in the Osterwalder-Schrader setting) for rational chiral theories. The SL(2, Z) modular Verlinde relation is a special case of this thermal duality, and (within the family of rational models) the matrix S appearing in the thermal duality relation becomes identified with the statistics character matrix S. The relevant angular 'Euclideanization' is done in the setting of the Tomita-Takesaki modular formalism of operator algebras. I find it appropriate to dedicate this work to the memory of J.A. Swieca, with whom I shared an interest in two-dimensional models as a testing ground for QFT for more than one decade. This is a significantly extended version of an 'Encyclopedia of Mathematical Physics' contribution, hep-th/0502125.

  7. Two-dimensional models as testing ground for principles and concepts of local quantum physics

    NASA Astrophysics Data System (ADS)

    Schroer, Bert

    2006-02-01

    In the past, two-dimensional models of QFT have served as theoretical laboratories for testing new concepts under mathematically controllable conditions. In more recent times, low-dimensional models (e.g., chiral models, factorizing models) have often been treated by special recipes in a way which sometimes led to a loss of the unity of QFT. In the present work, I try to counteract this apartheid tendency by reviewing past results within the setting of the general principles of QFT. To this I add two new ideas: (1) a modular interpretation of the chiral model Diff(S)-covariance with a close connection to the recently formulated local covariance principle for QFT in curved spacetime and (2) a derivation of the chiral model temperature duality from a suitable operator formulation of the angular Wick rotation (in analogy to the Nelson-Symanzik duality in the Osterwalder-Schrader setting) for rational chiral theories. The SL(2, Z) modular Verlinde relation is a special case of this thermal duality, and (within the family of rational models) the matrix S appearing in the thermal duality relation becomes identified with the statistics character matrix S. The relevant angular "Euclideanization" is done in the setting of the Tomita-Takesaki modular formalism of operator algebras. I find it appropriate to dedicate this work to the memory of J.A. Swieca, with whom I shared an interest in two-dimensional models as a testing ground for QFT for more than one decade. This is a significantly extended version of an "Encyclopedia of Mathematical Physics" contribution, hep-th/0502125.

  8. Supporting culturally and linguistically diverse children with speech, language and communication needs: Overarching principles, individual approaches.

    PubMed

    Verdon, Sarah; McLeod, Sharynne; Wong, Sandie

    2015-01-01

    Speech-language pathologists (SLPs) are working with an increasing number of families from culturally and linguistically diverse backgrounds as the world's population continues to become more internationally mobile. The heterogeneity of these diverse populations makes it impossible to identify and document a one-size-fits-all strategy for working with culturally and linguistically diverse families. This paper explores approaches to practice by SLPs identified as specialising in multilingual and multicultural practice in culturally and linguistically diverse contexts from around the world. Data were obtained from ethnographic observation of 14 sites in 5 countries on 4 continents. The sites included hospital settings, university clinics, school-based settings, private practices and Indigenous community-based services. There were 652 individual artefacts collected from the sites, which included interview transcripts, photographs, videos, narrative reflections, and informal and formal field notes. The data were analysed using Cultural-Historical Activity Theory (Engeström, 1987). From the analysis, six overarching Principles of Culturally Competent Practice (PCCP) were identified. These were: (1) identification of culturally appropriate and mutually motivating therapy goals, (2) knowledge of languages and culture, (3) use of culturally appropriate resources, (4) consideration of the cultural, social and political context, (5) consultation with families and communities, and (6) collaboration between professionals. These overarching principles align with the six position statements developed by the International Expert Panel on Multilingual Children's Speech (2012), which aim to enhance the cultural competence of speech pathologists and their practice. The international examples provided in the current study demonstrate the individualised ways that these overarching principles are enacted in a range of different organisational, social, cultural and political contexts. Tensions experienced in enacting the principles are also discussed. This paper emphasises the potential for individual SLPs to enhance their practice by adopting these overarching principles to support individual children and families in diverse contexts around the world. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Vibrational signatures in the THz spectrum of 1,3-DNB: A first-principles and experimental study

    NASA Astrophysics Data System (ADS)

    Ahmed, Towfiq; Azad, Abul K.; Chellappa, Raja; Higginbotham-Duque, Amanda; Dattelbaum, Dana M.; Zhu, Jian-Xin; Moore, David; Graf, Matthias J.

    2016-05-01

    Understanding the fundamental processes of light-matter interaction is important for detection of explosives and other energetic materials, which are active in the infrared and terahertz (THz) region. We report a comprehensive study on electronic and vibrational lattice properties of structurally similar 1,3-dinitrobenzene (1,3-DNB) crystals through first-principles electronic structure calculations and THz spectroscopy measurements on polycrystalline samples. Starting from reported x-ray crystal structures, we use density-functional theory (DFT) with periodic boundary conditions to optimize the structures and perform linear response calculations of the vibrational properties at zero phonon momentum. The theoretically identified normal modes agree qualitatively with those obtained experimentally in a frequency range up to 2.5 THz and quantitatively at much higher frequencies. The latter frequencies are set by intra-molecular forces. Our results suggest that van der Waals dispersion forces need to be included to improve the agreement between theory and experiment in the THz region, which is dominated by intermolecular modes and sensitive to details in the DFT calculation. An improved comparison is needed to assess and distinguish between intra- and intermolecular vibrational modes characteristic of energetic materials.

  10. Novice to Expert Practice via Postprofessional Athletic Training Education: A Grounded Theory

    PubMed Central

    Neibert, Peter J

    2009-01-01

    Objective: To discover the theoretic constructs that confirm, disconfirm, or extend the principles and their applications appropriate for National Athletic Trainers' Association (NATA)–accredited postprofessional athletic training education programs. Design: Interviews at the 2003 NATA Annual Meeting & Clinical Symposia. Setting: Qualitative study using grounded theory procedures. Patients and Other Participants: Thirteen interviews were conducted with postprofessional graduates. Participants were purposefully selected based on theoretic sampling and availability. Data Collection and Analysis: The transcribed interviews were analyzed using open coding, axial coding, and selective coding procedures. Member checks, reflective journaling, and triangulation were used to ensure trustworthiness. Results: The participants' comments confirmed and extended the current principles of postprofessional athletic training education programs and offered additional suggestions for more effective practical applications. Conclusions: The emergence of this central category of novice to expert practice is a paramount finding. The tightly woven fabric of the 10 processes, when interlaced with one another, provides a strong tapestry supporting novice to expert practice via postprofessional athletic training education. The emergence of this theoretic position pushes postprofessional graduate athletic training education forward to the future for further investigation into the theoretic constructs of novice to expert practice. PMID:19593420

  11. Birth of an abstraction: a dynamical systems account of the discovery of an elsewhere principle in a category learning task.

    PubMed

    Tabor, Whitney; Cho, Pyeong W; Dankowicz, Harry

    2013-01-01

    Human participants and recurrent ("connectionist") neural networks were both trained on a categorization system abstractly similar to natural language systems involving irregular ("strong") classes and a default class. Both the humans and the networks exhibited staged learning and a generalization pattern reminiscent of the Elsewhere Condition (Kiparsky, 1973). Previous connectionist accounts of related phenomena have often been vague about the nature of the networks' encoding systems. We analyzed our network using dynamical systems theory, revealing topological and geometric properties that can be directly compared with the mechanisms of non-connectionist, rule-based accounts. The results reveal that the networks "contain" structures related to mechanisms posited by rule-based models, partly vindicating the insights of these models. On the other hand, they support the one mechanism (OM), as opposed to the more than one mechanism (MOM), view of symbolic abstraction by showing how the appearance of MOM behavior can arise emergently from one underlying set of principles. The key new contribution of this study is to show that dynamical systems theory can allow us to explicitly characterize the relationship between the two perspectives in implemented models. © 2013 Cognitive Science Society, Inc.

  12. Design of a universal two-layered neural network derived from the PLI theory

    NASA Astrophysics Data System (ADS)

    Hu, Chia-Lun J.

    2004-05-01

    The if-and-only-if (IFF) condition for a set of M analog-to-digital vector-mapping relations to be learnable by a one-layered feed-forward neural network (OLNN) is that all the input analog vectors dichotomized by the i-th output bit must be positively linearly independent, or PLI. If they are not PLI, then the OLNN simply cannot learn, no matter what learning rule is employed, because the solution of the connection matrix does not exist mathematically. However, in this case one can still design a parallel-cascaded, two-layered perceptron (PCTLP) to achieve this general mapping goal. The design principle of this "universal" neural network is derived from the major mathematical properties of the PLI theory - changing the output bits of the dependent relations existing among the dichotomized input vectors to make the PLD relations PLI. Then, with a vector concatenation technique, the required mapping can still be learned by this PCTLP system with very high efficiency. This paper reports in detail the mathematical derivation of the general design principle and the design procedures of the PCTLP neural network system, which is then verified in general by a practical numerical example.
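    The learnability condition above can be illustrated with a toy stand-in: a single perceptron output bit can only realise linearly separable dichotomies of its inputs, the classic special case of the failure mode the abstract describes. The sketch below is illustrative only; it does not implement the paper's PLI test or the PCTLP construction, and all names are ours.

    ```python
    # Hedged illustration (not the paper's PLI construction): a single
    # perceptron output bit can only realise linearly separable dichotomies.
    # AND is separable and is learned; XOR, the classic non-separable
    # dichotomy, never converges, no matter how long training runs.

    def train_perceptron(samples, epochs=100):
        """Return True if the perceptron reaches zero errors on the samples."""
        w = [0.0, 0.0]
        b = 0.0
        for _ in range(epochs):
            errors = 0
            for x, t in samples:
                y = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
                if y != t:
                    errors += 1
                    delta = t - y                 # classic perceptron update
                    w[0] += delta * x[0]
                    w[1] += delta * x[1]
                    b += delta
            if errors == 0:
                return True                       # dichotomy learned exactly
        return False                              # no separating solution found

    AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
    print(train_perceptron(AND), train_perceptron(XOR))  # → True False
    ```

    The XOR failure is the one-layer analogue of the "solution of the connection matrix does not exist" situation; the paper's remedy is a second, cascaded layer rather than a different learning rule.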

  13. Uncertainty, imprecision, and the precautionary principle in climate change assessment.

    PubMed

    Borsuk, M E; Tomassini, L

    2005-01-01

    Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
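    The minimum-upper-expected-cost criterion described above can be sketched with a toy example. The actions, costs, and the finite "class" of probability measures below are invented for illustration; the paper's classes are richer, but the decision rule is the same: compute each action's upper expectation over the set of measures and choose the action that minimises it.

    ```python
    # Hedged sketch of decision making with imprecise probabilities.
    # All numbers are made up for illustration.

    def expectation(probs, costs):
        """Expected cost of one action under one probability measure."""
        return sum(p * c for p, c in zip(probs, costs))

    # Three hypothetical emissions policies; costs under three climate outcomes
    # (low / medium / high damage).
    costs_by_action = {
        "high_emissions": [1.0, 5.0, 20.0],
        "medium_emissions": [2.0, 4.0, 10.0],
        "low_emissions": [4.0, 4.5, 6.0],
    }

    # A finite set of plausible probability measures over the outcomes,
    # standing in for the classes of measures discussed in the paper.
    credal_set = [
        [0.6, 0.3, 0.1],
        [0.4, 0.4, 0.2],
        [0.2, 0.4, 0.4],
    ]

    # Upper expected cost = worst-case expectation over the credal set;
    # the precautionary criterion picks the action minimising this bound.
    upper = {a: max(expectation(p, c) for p in credal_set)
             for a, c in costs_by_action.items()}
    best = min(upper, key=upper.get)
    print(best, round(upper[best], 2))  # → low_emissions 5.0
    ```

    Note how the ranking can differ from ordinary expected cost under any single measure: the precautionary criterion penalises actions whose cost is high under *some* admissible measure, which is why wider classes generally push the decision toward lower emissions.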

  14. Implementing large-scale workforce change: learning from 55 pilot sites of allied health workforce redesign in Queensland, Australia

    PubMed Central

    2013-01-01

    Background Increasingly, health workforces are undergoing high-level ‘re-engineering’ to help them better meet the needs of the population, workforce and service delivery. Queensland Health implemented a large scale 5-year workforce redesign program across more than 13 health-care disciplines. This study synthesized the findings from this program to identify and codify mechanisms associated with successful workforce redesign to help inform other large workforce projects. Methods This study used Inductive Logic Reasoning (ILR), a process that uses logic models as the primary functional tool to develop theories of change, which are subsequently validated through proposition testing. Initial theories of change were developed from a systematic review of the literature and synthesized using a logic model. These theories of change were then developed into propositions and subsequently tested empirically against documentary, interview, and survey data from 55 projects in the workforce redesign program. Results Three overarching principles were identified that optimized successful workforce redesign: (1) drivers for change need to be close to practice; (2) contexts need to be supportive both at the local levels and legislatively; and (3) mechanisms should include appropriate engagement, resources to facilitate change management, governance, and support structures. Attendance to these factors was uniformly associated with success of individual projects. Conclusions ILR is a transparent and reproducible method for developing and testing theories of workforce change. Despite the heterogeneity of projects, professions, and approaches used, a consistent set of overarching principles underpinned success of workforce change interventions. These concepts have been operationalized into a workforce change checklist. PMID:24330616

  15. Dual Processes in Decision Making and Developmental Neuroscience: A Fuzzy-Trace Model

    PubMed Central

    Reyna, Valerie F.; Brainerd, Charles J.

    2011-01-01

    From Piaget to the present, traditional and dual-process theories have predicted improvement in reasoning from childhood to adulthood, and improvement has been observed. However, developmental reversals—that reasoning biases emerge with development—have also been observed in a growing list of paradigms. We explain how fuzzy-trace theory predicts both improvement and developmental reversals in reasoning and decision making. Drawing on research on logical and quantitative reasoning, as well as on risky decision making in the laboratory and in life, we illustrate how the same small set of theoretical principles applies to typical neurodevelopment, encompassing childhood, adolescence, and adulthood, and to neurological conditions such as autism and Alzheimer's disease. For example, framing effects—that risk preferences shift when the same decisions are phrased in terms of gains versus losses—emerge in early adolescence as gist-based intuition develops. In autistic individuals, who rely less on gist-based intuition and more on verbatim-based analysis, framing biases are attenuated (i.e., they outperform typically developing control subjects). In adults, simple manipulations based on fuzzy-trace theory can make framing effects appear and disappear depending on whether gist-based intuition or verbatim-based analysis is induced. These theoretical principles are summarized and integrated in a new mathematical model that specifies how dual modes of reasoning combine to produce predictable variability in performance. In particular, we show how the most popular and extensively studied model of decision making—prospect theory—can be derived from fuzzy-trace theory by combining analytical (verbatim-based) and intuitive (gist-based) processes. PMID:22096268

  16. Effects of Modality and Redundancy Principles on the Learning and Attitude of a Computer-Based Music Theory Lesson among Jordanian Primary Pupils

    ERIC Educational Resources Information Center

    Aldalalah, Osamah Ahmad; Fong, Soon Fook

    2010-01-01

    The purpose of this study was to investigate the effects of modality and redundancy principles on the attitude and learning of music theory among primary pupils of different aptitudes in Jordan. The lesson of music theory was developed in three different modes, audio and image (AI), text with image (TI) and audio with image and text (AIT). The…

  17. James Clerk Maxwell, a precursor of system identification and control science

    NASA Astrophysics Data System (ADS)

    Bittanti, Sergio

    2015-12-01

    One hundred and fifty years ago James Clerk Maxwell published his celebrated paper 'A Dynamical Theory of the Electromagnetic Field', where the interaction between electricity and magnetism eventually found an explanation. However, Maxwell was also a precursor of model identification and control ideas. Indeed, with the paper 'On Governors' of 1869, he introduced the concept of a feedback control system; and moreover, with his essay on Saturn's rings of 1856 he set out the basic principle of system identification. This paper is a tutorial exposition that aims to highlight these latter aspects of Maxwell's work.

  18. Verification of the quantum dimension effects in electrical conductivity with different topology of laser-induced thin-film structures

    NASA Astrophysics Data System (ADS)

    Arakelian, S.; Kucherik, A.; Kutrovskaya, S.; Osipov, A.; Istratov, A.; Skryabin, I.

    2018-01-01

    A clear physical model for the verification of quantum states in nanocluster structures with jump/tunneling electroconductivity is studied in both theory and experiment. The emphasis is on low-dimensional structures in which structural phase transitions occur and a tendency towards strongly enhanced electroconductivity is observed. The results give us an opportunity to establish a basis for new physical principles for creating functional elements for optoelectronics and photonics in a hybrid set-up (optics + electrophysics) using the nanocluster technology approach.

  19. The QTP family of consistent functionals and potentials in Kohn-Sham density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Yifan; Bartlett, Rodney J., E-mail: bartlett@qtp.ufl.edu

    This manuscript presents the second consistent density functional in the QTP (Quantum Theory Project) family, that is, CAM-QTP(01). It is a new range-separated exchange-correlation functional in which the non-local exchange contribution is 100% at large separation. It follows the same basic principle of this family, that the Kohn-Sham eigenvalues of the occupied orbitals approximately equal the vertical ionization energies, which is not fulfilled by most traditional density functional methods. This new CAM-QTP(01) functional significantly improves the accuracy of vertical excitation energies, especially for the Rydberg states in the test set. It also reproduces many other properties such as geometries, reaction barrier heights, and atomization energies.

  20. Social competence intervention for youth with Asperger Syndrome and high-functioning autism: an initial investigation.

    PubMed

    Stichter, Janine P; Herzog, Melissa J; Visovsky, Karen; Schmidt, Carla; Randolph, Jena; Schultz, Tia; Gage, Nicholas

    2010-09-01

    Individuals with high functioning autism (HFA) or Asperger Syndrome (AS) exhibit difficulties in the knowledge or correct performance of social skills. This subgroup's social difficulties appear to be associated with deficits in three social cognition processes: theory of mind, emotion recognition and executive functioning. The current study outlines the development and initial administration of the group-based Social Competence Intervention (SCI), which targeted these deficits using cognitive behavioral principles. Across 27 students age 11-14 with a HFA/AS diagnosis, results indicated significant improvement on parent reports of social skills and executive functioning. Participants evidenced significant growth on direct assessments measuring facial expression recognition, theory of mind and problem solving. SCI appears promising, however, larger samples and application in naturalistic settings are warranted.

  1. Demonstrating the unit hydrograph and flow routing processes involving active student participation - a university lecture experiment

    NASA Astrophysics Data System (ADS)

    Schulz, Karsten; Burgholzer, Reinhard; Klotz, Daniel; Wesemann, Johannes; Herrnegger, Mathew

    2018-05-01

    The unit hydrograph (UH) has been one of the most widely employed hydrological modelling techniques to predict the rainfall-runoff behaviour of hydrological catchments, and is still used to this day. Its concept is based on the idea that a unit of effective precipitation per time unit (e.g. mm h⁻¹) will always lead to a specific catchment response in runoff. Given its relevance, the UH is an important topic that is addressed in most (engineering) hydrology courses at all academic levels. While the principles of the UH seem to be simple and easy to understand, teaching experiences in the past suggest strong difficulties in students' perception of UH theory and application. In order to facilitate a deeper understanding of the theory and application of the UH for students, we developed a simple and cheap lecture theatre experiment which involved active student participation. The seating of the students in the lecture theatre represented the hydrological catchment in its size and form. A set of plastic balls, each prepared with a piece of magnetic strip so that it could be tacked to any white/black board, represented unit amounts of effective precipitation. The balls are evenly distributed over the lecture theatre and routed by some given rules down the catchment to the catchment outlet, where the resulting hydrograph is monitored and illustrated on the black/white board. The experiment allowed an illustration of the underlying principles of the UH, including stationarity, linearity, and superposition of the generated runoff and subsequent routing. In addition, some variations of the experimental setup extended the UH concept to demonstrate the impact of elevation, different runoff regimes, and non-uniform precipitation events on the resulting hydrograph.
In summary, our own experience in the classroom, a first set of student exams, as well as student feedback and formal evaluation suggest that the integration of such an experiment deepened the learning experience by active participation. The experiment also initialized a more experienced based discussion of the theory and assumptions behind the UH. Finally, the experiment was a welcome break within a 3 h lecture setting, and great fun to prepare and run.
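    The UH principles the experiment demonstrates, linearity and superposition of scaled, time-shifted unit responses, amount to a discrete convolution of effective rainfall with the UH ordinates. A minimal sketch, with UH ordinates and rainfall pulses invented for illustration:

    ```python
    # Minimal sketch of the UH principles the lecture experiment illustrates.
    # The UH ordinates and rainfall pulses are made up, not from the paper.

    def convolve_uh(effective_rain, uh):
        """Discrete convolution of effective rainfall with the unit hydrograph."""
        n = len(effective_rain) + len(uh) - 1
        q = [0.0] * n
        for i, p in enumerate(effective_rain):
            for j, u in enumerate(uh):
                q[i + j] += p * u   # superposition of scaled, shifted UHs
        return q

    uh = [0.1, 0.3, 0.4, 0.2]   # runoff response to 1 unit of rain in 1 step
    rain = [2.0, 1.0]           # two consecutive effective-rainfall pulses

    q = convolve_uh(rain, uh)
    # Linearity: doubling the rain doubles every runoff ordinate.
    q2 = convolve_uh([2 * r for r in rain], uh)
    assert all(abs(b - 2 * a) < 1e-12 for a, b in zip(q, q2))
    print([round(v, 2) for v in q])  # → [0.2, 0.7, 1.1, 0.8, 0.2]
    ```

    Because the UH ordinates sum to one, total runoff equals total effective rainfall (here 3.0), which is exactly the mass balance the plastic-ball experiment makes tangible.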

  2. Boltzmann, Darwin and Directionality theory

    NASA Astrophysics Data System (ADS)

    Demetrius, Lloyd A.

    2013-09-01

    Boltzmann’s statistical thermodynamics is a mathematical theory which relates the macroscopic properties of aggregates of interacting molecules with the laws of their interaction. The theory is based on the concept of thermodynamic entropy, a statistical measure of the extent to which energy is spread throughout macroscopic matter. Macroscopic evolution of material aggregates is quantitatively explained in terms of the principle: Thermodynamic entropy increases as the composition of the aggregate changes under molecular collision. Darwin’s theory of evolution is a qualitative theory of the origin of species and the adaptation of populations to their environment. A central concept in the theory is fitness, a qualitative measure of the capacity of an organism to contribute to the ancestry of future generations. Macroscopic evolution of populations of living organisms can be qualitatively explained in terms of a neo-Darwinian principle: Fitness increases as the composition of the population changes under variation and natural selection. Directionality theory is a quantitative model of the Darwinian argument of evolution by variation and selection. This mathematical theory is based on the concept of evolutionary entropy, a statistical measure which describes the rate at which an organism appropriates energy from the environment and reinvests this energy into survivorship and reproduction. According to directionality theory, microevolutionary dynamics, that is, evolution by mutation and natural selection, can be quantitatively explained in terms of a directionality principle: Evolutionary entropy increases when the resources are diverse and of constant abundance, but decreases when the resource is singular and of variable abundance.
    This report reviews the analytical and empirical support for directionality theory, and invokes the microevolutionary dynamics of variation and selection to delineate the principles which govern the macroevolutionary dynamics of speciation and extinction. We also elucidate the relation between thermodynamic entropy, which pertains to the extent of energy spreading and sharing within inanimate matter, and evolutionary entropy, which refers to the rate of energy appropriation from the environment and allocation within living systems. We show that the entropic principle of thermodynamics is the limit as R→0, M→∞ (where R denotes the resource production rate and M denotes population size) of the entropic principle of evolution. We exploit this relation between the thermodynamic and evolutionary tenets to propose a physico-chemical model of the transition from inanimate matter, which is under thermodynamic selection, to living systems, which are subject to evolutionary selection. Life history variation and the evolution of senescence, the evolutionary dynamics of speciation and extinction, evolutionary trends in body size, the origin of sporadic forms of cancer and neurological diseases, and the evolution of cooperation are important recent applications of directionality theory. These applications, which draw from the medical sciences and sociobiology, appeal to methods which lie outside the formalism described in this report.
A companion review, Demetrius and Gundlach (submitted for publication), gives an account of these applications. An important aspect of this report pertains to the connection between statistical mechanics and evolutionary theory and its implications for understanding the processes which underlie the emergence of living systems from inanimate matter, a problem which has recently attracted considerable attention (Morowitz, 1992; Eigen, 1992; Dyson, 2000; Pross, 2012). The connection between the two disciplines can be addressed by appealing to certain extremal principles which are considered the mainstay of the respective theories. The extremal principle in statistical mechanics can be stated as follows:

  3. An Evolutionary Comparison of the Handicap Principle and Hybrid Equilibrium Theories of Signaling

    PubMed Central

    Kane, Patrick; Zollman, Kevin J. S.

    2015-01-01

    The handicap principle has come under significant challenge both from empirical studies and from theoretical work. As a result, a number of alternative explanations for honest signaling have been proposed. This paper compares the evolutionary plausibility of one such alternative, the “hybrid equilibrium,” to the handicap principle. We utilize computer simulations to compare these two theories as they are instantiated in Maynard Smith’s Sir Philip Sidney game. We conclude that, when both types of communication are possible, evolution is unlikely to lead to handicap signaling and is far more likely to result in the partially honest signaling predicted by hybrid equilibrium theory. PMID:26348617

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohapi, N.; Hees, A.; Larena, J., E-mail: n.mohapi@gmail.com, E-mail: a.hees@ru.ac.za, E-mail: j.larena@ru.ac.za

    The Einstein Equivalence Principle is a fundamental principle of the theory of General Relativity. While this principle has been thoroughly tested with standard matter, the question of its validity in the Dark sector remains open. In this paper, we consider a general tensor-scalar theory that allows us to test the equivalence principle in the Dark sector by introducing two different conformal couplings to standard matter and to Dark matter. We constrain these couplings by considering galactic observations of strong lensing and of velocity dispersion. Our analysis shows that, in the case of a violation of the Einstein Equivalence Principle, the data favour violations through coupling strengths that are of opposite signs for ordinary and Dark matter. At the same time, our analysis does not show any significant deviations from General Relativity.

  5. Many-Worlds Interpretation of Quantum Theory and Mesoscopic Anthropic Principle

    NASA Astrophysics Data System (ADS)

    Kamenshchik, A. Yu.; Teryaev, O. V.

    2008-10-01

    We suggest combining the Anthropic Principle with the Many-Worlds Interpretation of Quantum Theory. By recognizing the multiplicity of worlds, this combination provides an opportunity to explain some important events which are assumed to be extremely improbable. The Mesoscopic Anthropic Principle suggested here aims to explain the appearance of such events, which are necessary for the emergence of Life and Mind. It is complementary to the Cosmological Anthropic Principle, which explains the fine tuning of fundamental constants. We briefly discuss various possible applications of the Mesoscopic Anthropic Principle, including solar eclipses and the assembly of complex molecules. Besides, we address the problem of Time's Arrow in the framework of the Many-Worlds Interpretation. We suggest a recipe for disentangling quantities defined by fundamental physical laws from those defined by anthropic selection.

  6. Research and exploration of product innovative design for function

    NASA Astrophysics Data System (ADS)

    Wang, Donglin; Wei, Zihui; Wang, Youjiang; Tan, Runhua

    2009-07-01

    Product innovation presupposes the realization of a new function, and the realization of the new function requires resolving a contradiction. A new process model of product innovative design is proposed based on Axiomatic Design (AD) theory and Functional Structure Analysis (FSA), with the Principle of Solving Contradiction embedded. In this model, AD theory guides the FSA and determines the contradiction to be resolved for the principle solution. To provide powerful support for innovative design tools at the principle-solution stage, the Principle of Solving Contradiction is embedded in the model so as to strengthen the innovation of the principle solution. As a case study, an innovative design of a button-battery separator-paper punching machine has been achieved with the application of the proposed model.

  7. Finite-temperature Gutzwiller approximation from the time-dependent variational principle

    NASA Astrophysics Data System (ADS)

    Lanatà, Nicola; Deng, Xiaoyu; Kotliar, Gabriel

    2015-08-01

    We develop an extension of the Gutzwiller approximation to finite temperatures based on the Dirac-Frenkel variational principle. Our method does not rely on any entropy inequality, and is substantially more accurate than the approaches proposed in previous works. We apply our theory to the single-band Hubbard model at different fillings, and show that our results compare quantitatively well with dynamical mean field theory in the metallic phase. We discuss potential applications of our technique within the framework of first-principles calculations.

  8. Gestalt theory: implications for radiology education.

    PubMed

    Koontz, Nicholas A; Gunderman, Richard B

    2008-05-01

    The Gestalt theory of modern psychology is grounded in the ideas that holistic rather than atomistic approaches are necessary to understand the mind, and that the mental whole is greater than the sum of its component parts. Although the Gestalt school fell out of favor due to its descriptive rather than explanatory nature, it permanently changed our understanding of perception. For the radiologist, such fundamental Gestalt concepts as figure-ground relationships and a variety of "grouping principles" (the laws of closure, proximity, similarity, common region, continuity, and symmetry) are ubiquitous in daily work, not to mention in art and personal life. By considering the applications of these principles and the stereotypical ways in which humans perceive visual stimuli, a radiology learner may incur fewer errors of diagnosis. This article serves to introduce several important principles of Gestalt theory, identify examples of these principles in widely recognizable fine art, and highlight their implications for radiology education.

  9. Metaphoric identity mapping: facilitating goal setting and engagement in rehabilitation after traumatic brain injury.

    PubMed

    Ylvisaker, Mark; McPherson, Kathryn; Kayes, Nicola; Pellett, Ellen

    2008-01-01

    Difficulty re-establishing an organised and compelling sense of personal identity has increasingly been identified as a critical theme in outcome studies of individuals with severe traumatic brain injury (TBI) and a serious obstacle to active engagement in rehabilitation. There exists little empirical support for approaches to identity reconstruction that address common impairments associated with TBI. Similarly, there is as yet little empirical support for theoretically sound approaches to promoting engagement in goal setting for this population. This article has two purposes. First, theory and procedures associated with metaphoric identity mapping are discussed in relation to goal setting in TBI rehabilitation. Second, the results of a qualitative pilot study are presented. The study explored metaphoric identity mapping as a facilitator of personally meaningful goal setting with five individuals with significant disability many years after their injury. Drawing on principles of grounded theory, the investigators extracted data from semi-structured interviews with clients and clinicians, from focus groups with the clinicians, and from observation of client-clinician interaction. Analysis of the data yielded five general themes concerning the use of this approach: All clients and clinicians found identity mapping to be an acceptable process and also useful for deriving meaningful rehabilitation goals. Both clients and clinicians saw client-centred goals as important. Cognitive impairments posed obstacles to this goal-setting intervention and mandated creative compensations. And finally, identity-related goal setting appeared to require a "mind shift" for some clinicians and demanded clinical skills not uniformly distributed among rehabilitation professionals.

  10. Patterns of interactions at grade 5 classroom in learning the topic of statistics viewed from cognitive load theory

    NASA Astrophysics Data System (ADS)

    Setianingsih, R.

    2018-01-01

    The nature of the interactions that occur among teacher, students, learning sources, and the learning environment creates different settings to enhance learning. Any setting created by a teacher is affected by three types of cognitive load: intrinsic cognitive load, extraneous cognitive load, and germane cognitive load. This study is qualitative in nature and aims to analyse the patterns of interaction constituted in mathematics instruction by taking into account cognitive load theory. The subjects of this study are 21 fifth-grade students who learn mathematics in small groups and whole-class interactive lessons. The data were collected through classroom observations which were videotaped, while field notes were also taken. The data analysis revealed that students engaged in productive interaction and inquiry while they were learning mathematics in small groups or in the whole-class setting, with a different type of cognitive load dominantly affecting the learning process in each setting. During mathematics learning in the whole-class setting, the most frequently found interaction patterns were discussing and comparing solutions based on self-developed models, followed by expressing opinions. This is consistent with the principles of mathematics learning, which give students wide opportunities to construct mathematical knowledge through individual learning, learning in small groups, and learning in whole-class settings. It means that by participating in interactive learning, the students are habitually engaged in productive interactions and high-level mathematical thinking.

  11. Astrology and Astronomy: From Conjunction to Opposition

    NASA Astrophysics Data System (ADS)

    Kunth, D.

    2009-02-01

    A theory is not some hunch, or half-baked idea that you come up with while taking a shower, or being under the influence of something or other. A theory, as scientists understand the meaning of the word, is a scientifically tested principle or body of principles that incorporates and explains a significant body of evidence.

  12. Appropriation from a Script Theory of Guidance Perspective: A Response to Pierre Tchounikine

    ERIC Educational Resources Information Center

    Stegmann, Karsten; Kollar, Ingo; Weinberger, Armin; Fischer, Frank

    2016-01-01

    In a recent paper, Pierre Tchounikine has suggested advancing the Script Theory of Guidance (SToG) by addressing the question of how learners appropriate collaboration scripts presented to them in learning environments. Tchounikine's main criticism addresses SToG's "internal script configuration principle." This principle states that in…

  13. Artificial Instruction. A Method for Relating Learning Theory to Instructional Design.

    ERIC Educational Resources Information Center

    Ohlsson, Stellan

    Prior research on learning has been linked to instruction by the derivation of general principles of instructional design from learning theories. However, such design principles are often difficult to apply to particular instructional issues. A new method for relating research on learning to instructional design is proposed: Different ways of…

  14. It's Theories All the Way Down: A Response to Scientific Research in Education

    ERIC Educational Resources Information Center

    Gee, James Paul

    2005-01-01

    This article considers the six principles that the National Research Council's report Scientific Research in Education claims define an enterprise as scientific. I argue that these principles are relatively vacuous generalities because one cannot determine anything about any of them from outside specific theories of specific domains (and domains…

  15. Experimental investigation of the no-signalling principle in parity-time symmetric theory using an open quantum system

    NASA Astrophysics Data System (ADS)

    Tang, Jian-Shun; Wang, Yi-Tao; Yu, Shang; He, De-Yong; Xu, Jin-Shi; Liu, Bi-Heng; Chen, Geng; Sun, Yong-Nan; Sun, Kai; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can

    2016-10-01

    The experimental progress achieved in parity-time (PT) symmetry in classical optics is the most important accomplishment in the past decade and stimulates many new applications, such as unidirectional light transport and single-mode lasers. However, in the quantum regime, some controversial effects are proposed for PT-symmetric theory, for example, the potential violation of the no-signalling principle. It is therefore important to understand whether PT-symmetric theory is consistent with well-established principles. Here, we experimentally study this no-signalling problem related to the PT-symmetric theory using two space-like separated entangled photons, with one of them passing through a post-selected quantum gate, which effectively simulates a PT-symmetric evolution. Our results suggest that the superluminal information transmission can be simulated when the successfully PT-symmetrically evolved subspace is solely considered. However, considering this subspace is only a part of the full Hermitian system, additional information regarding whether the PT-symmetric evolution is successful is necessary, which transmits to the receiver at maximally light speed, maintaining the no-signalling principle.

  17. Aeroacoustic and aerodynamic applications of the theory of nonequilibrium thermodynamics

    NASA Technical Reports Server (NTRS)

    Horne, W. Clifton; Smith, Charles A.; Karamcheti, Krishnamurty

    1991-01-01

    Recent developments in the field of nonequilibrium thermodynamics associated with viscous flows are examined and related to developments in the understanding of specific phenomena in aerodynamics and aeroacoustics. A key element of the nonequilibrium theory is the principle of minimum entropy production rate for steady dissipative processes near equilibrium, and variational calculus is used to apply this principle to several examples of viscous flow. A review of nonequilibrium thermodynamics and its role in fluid motion is presented. Several formulations are presented of the local entropy production rate and the local energy dissipation rate, two quantities that are of central importance to the theory. These expressions and the principle of minimum entropy production rate for steady viscous flows are used to identify parallel-wall channel flow and irrotational flow as having minimally dissipative velocity distributions. Features of irrotational, steady, viscous flow near an airfoil, such as the effect of trailing-edge radius on circulation, are also found to be compatible with the minimum principle. Finally, the minimum principle is used to interpret the stability of infinitesimal and finite-amplitude disturbances in an initially laminar, parallel shear flow, with results that are consistent with experiment and linearized hydrodynamic stability theory. These results suggest that a thermodynamic approach may be useful in unifying the understanding of many diverse phenomena in aerodynamics and aeroacoustics.

  18. Restoring the Pauli principle in the random phase approximation ground state

    NASA Astrophysics Data System (ADS)

    Kosov, D. S.

    2017-12-01

    The random phase approximation (RPA) ground state contains electronic configurations in which two (or more) identical electrons occupy the same molecular spin-orbital, violating the Pauli exclusion principle. This overcounting of electronic configurations arises from the quasiboson approximation in the treatment of electron-hole pair operators. We describe a method to restore the Pauli principle in the RPA wavefunction. The proposed theory is illustrated by calculations of molecular dipole moments and electronic kinetic energies. The Hartree-Fock-based RPA, corrected for the Pauli principle, gives results of accuracy comparable to Møller-Plesset second-order perturbation theory and the coupled-cluster singles and doubles method.

  19. The Possibility of a New Metaphysics for Quantum Mechanics from Meinong's Theory of Objects

    NASA Astrophysics Data System (ADS)

    Graffigna, Matías

    According to de Ronde, it was Bohr's interpretation of Quantum Mechanics (QM) that closed off the possibility of understanding physical reality beyond the realm of the actual, so establishing the Orthodox Line of Research. In this sense, it is not the task of any physical theory to look beyond the language and metaphysics supposed by classical physics in order to account for what QM describes. If one wishes to maintain a realist position (though not a naïve one) regarding physical theories, one then seems to be trapped by an array of concepts that do not allow one to understand the main principles involved in the most successful physical theory thus far, mainly: the quantum postulate, the principle of indetermination and the superposition principle. If de Ronde is right in proposing that QM can only be completed as a physical theory by the introduction of `new concepts' that admit as real a domain beyond actuality, then a new ontology is needed that goes beyond Aristotelian and Newtonian actualism. It was already in the early 20th century that the misunderstood philosopher Alexius von Meinong proposed a Theory of Objects that admits a domain of being beyond existence-actuality. A member of the so-called `School of Brentano', Meinong oriented his concerns to providing an ontology of everything that can be thought of, and at the same time an intentionality theory of how objects are thought of. I wish to argue that in Meinong's theory of objects we find the rudiments of the ontology and the intentionality theory we need to account for QM's basic principles: mainly, the possibility of predicating properties of non-entities, or in other words, the possibility of objectively describing a domain of what is that is different from the domain of actual existence.

  20. Experiences from Participants in Large-Scale Group Practice of the Maharishi Transcendental Meditation and TM-Sidhi Programs and Parallel Principles of Quantum Theory, Astrophysics, Quantum Cosmology, and String Theory: Interdisciplinary Qualitative Correspondences

    NASA Astrophysics Data System (ADS)

    Svenson, Eric Johan

    Participants on the Invincible America Assembly in Fairfield, Iowa, and neighboring Maharishi Vedic City, Iowa, practicing the Maharishi Transcendental Meditation (TM) and TM-Sidhi programs in large groups, submitted written experiences that they had had during, and in some cases shortly after, their daily practice of the TM and TM-Sidhi programs. Participants were instructed to include in their written experiences only what they observed and to leave out interpretation and analysis. These experiences were then read by the author and compared with principles and phenomena of modern physics, particularly with quantum theory, astrophysics, quantum cosmology, and string theory, as well as with defining characteristics of higher states of consciousness as described by Maharishi Vedic Science. In all cases, particular principles or phenomena of physics and qualities of higher states of consciousness appeared qualitatively quite similar to the content of the given experience. These experiences are presented in an Appendix, in which the corresponding principles and phenomena of physics are also presented. These physics "commentaries" on the experiences were written largely in layman's terms, without equations, and, in nearly every case, with clear reference to the corresponding sections of the experiences to which a given principle appears to relate. An abundance of similarities was apparent between the subjective experiences during meditation and principles of modern physics. A theoretic framework for understanding these rich similarities may begin with Maharishi's theory of higher states of consciousness provided herein. We conclude that the consistency and richness of detail found in these abundant similarities warrant the further pursuit and development of such a framework.

  1. First-Principles pH Theory

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Hyun; Zhang, S. B.

    2006-03-01

    Despite being one of the most important macroscopic measures, with a history that predates quantum mechanics, the concept of pH has rarely been mentioned in microscopic theories, nor has it been incorporated computationally into first-principles theories of aqueous solutions. Here, we formulate a theory for the pH dependence of the solution formation energy by introducing the proton chemical potential as the microscopic counterpart of pH in atomistic solution models. Within the theory, general acid-base chemistry can be cast in a simple pictorial representation. We adopt density-functional molecular dynamics to demonstrate the usefulness of the method by studying a number of solution systems, including water, small solute molecules such as NH3 and HCOOH, and more complex amino acids with several functional groups. For pure water, we calculated the auto-ionization constant to be 13.2, with 95% accuracy. For other solutes, the calculated dissociation constants, i.e., the so-called pKa, are also in reasonable agreement with experiment. Our first-principles pH theory can be readily applied to broad solution-chemistry problems such as redox reactions.
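
    The link between a computed deprotonation free energy and a dissociation constant can be illustrated with the standard thermodynamic relation pKa = ΔG / (RT ln 10). The sketch below is generic textbook thermodynamics, not the authors' code; the water ΔG value is an illustrative number chosen to reproduce the well-known pKw ≈ 14.

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

def pka_from_free_energy(delta_g_kj_per_mol: float, temperature_k: float = 298.15) -> float:
    """Convert a deprotonation free energy (kJ/mol) into a pKa via
    pKa = dG / (RT ln 10). Standard thermodynamics, not the paper's method."""
    return (delta_g_kj_per_mol * 1000.0) / (R * temperature_k * math.log(10))

# Illustrative example: a free energy of ~79.9 kJ/mol corresponds to pKw ~ 14.0,
# the experimental auto-ionization constant of water at 25 C.
print(round(pka_from_free_energy(79.9), 1))  # -> 14.0
```

    Comparing such a computed value against the experimental one (as the abstract does for pure water, 13.2 vs. ~14) is the usual accuracy check for first-principles pKa schemes.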

  2. Setting conservation priorities.

    PubMed

    Wilson, Kerrie A; Carwardine, Josie; Possingham, Hugh P

    2009-04-01

    A generic framework for setting conservation priorities based on the principles of classic decision theory is provided. This framework encapsulates the key elements of any problem, including the objective, the constraints, and knowledge of the system. Within the context of this framework the broad array of approaches for setting conservation priorities are reviewed. While some approaches prioritize assets or locations for conservation investment, it is concluded here that prioritization is incomplete without consideration of the conservation actions required to conserve the assets at particular locations. The challenges associated with prioritizing investments through time in the face of threats (and also spatially and temporally heterogeneous costs) can be aided by proper problem definition. Using the authors' general framework for setting conservation priorities, multiple criteria can be rationally integrated and where, how, and when to invest conservation resources can be scheduled. Trade-offs are unavoidable in priority setting when there are multiple considerations, and budgets are almost always finite. The authors discuss how trade-offs, risks, uncertainty, feedbacks, and learning can be explicitly evaluated within their generic framework for setting conservation priorities. Finally, they suggest ways that current priority-setting approaches may be improved.
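
    Budget-constrained prioritisation of conservation actions, as framed above, can be sketched with a simple benefit-per-cost greedy rule. This is a generic decision-theoretic illustration, not the authors' framework; the action names, costs, and benefits are hypothetical.

```python
def prioritise(actions, budget):
    """Rank candidate conservation actions by benefit per unit cost and fund
    them greedily until the budget is exhausted. A generic illustration of
    cost-effectiveness prioritisation; inputs are hypothetical."""
    ranked = sorted(actions, key=lambda a: a["benefit"] / a["cost"], reverse=True)
    funded, spent = [], 0.0
    for a in ranked:
        if spent + a["cost"] <= budget:
            funded.append(a["name"])
            spent += a["cost"]
    return funded, spent

actions = [
    {"name": "restore wetland",   "cost": 40.0, "benefit": 120.0},
    {"name": "control invasives", "cost": 10.0, "benefit": 50.0},
    {"name": "buy reserve",       "cost": 80.0, "benefit": 90.0},
]
funded, spent = prioritise(actions, budget=60.0)
print(funded, spent)  # the two most cost-effective actions fit the budget
```

    Real priority-setting, as the abstract stresses, must also handle threats, heterogeneous costs through time, and trade-offs among multiple objectives, which a single-ratio greedy rule does not capture.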

  3. Chemical Principles Exemplified

    ERIC Educational Resources Information Center

    Plumb, Robert C.

    1970-01-01

    This is the first of a new series of brief anecdotes about materials and phenomena which exemplify chemical principles. Examples include (1) the sea-lab experiment illustrating principles of the kinetic theory of gases, (2) snow-making machines illustrating principles of thermodynamics in gas expansions and phase changes, and (3) sunglasses that…

  4. Development of an ICT in IBSE course for science teachers: A design-based research

    NASA Astrophysics Data System (ADS)

    Tran, Trinh-Ba

    2018-01-01

    Integration of ICT tools for measuring with sensors, analyzing video, and modelling into Inquiry-Based Science Education (IBSE) is a globally recognized need. The challenge for teachers is how to turn manipulation of equipment and software into manipulation of ideas. We have developed a short ICT in IBSE course to prepare and support science teachers to teach inquiry-based activities with ICT tools. Within the framework of design-based research, we first defined the pedagogical principles from the literature, developed core materials for teacher learning, explored boundary conditions of the training in different countries, and elaborated set-ups of the course for the Dutch, Slovak, and Vietnamese contexts. Next, we taught and evaluated three iterative cycles of the Dutch course set-ups for pre-service science teachers from four teacher-education institutes nationwide. In each cycle, data on the teacher learning were collected via observations, questionnaires, interviews, and documents. These data were then analyzed for the questions about faithful implementation and effectiveness of the course. Following the same approach, we taught and evaluated two cycles of the Slovak course set-ups for in-service science teachers in the context of the national accreditation programme for teacher professional development. In addition, we investigated the applicability of the final Dutch course set-up in the context of the physics-education master programme in Vietnam, with adaptations geared to educational and cultural differences. Through the iterations of implementation, evaluation, and revision, the course objectives were eventually achieved to a certain extent; the pedagogical principles and core materials proved to be effective and applicable in different contexts. We started this research and design project with the pedagogical principles and concluded it with these principles (i.e. complete theory-practice cycle, depth first, distributed learning, and ownership of learning) as the core of the basic design of the ICT in IBSE course. These principles can be considered independent, validated educational products, which teacher educators can "buy into" and use for broader aims than "ICT in IBSE" integration alone. Pedagogical principles establish the theoretical model underlying the course design, provide guidelines and structure to the (re)design, implementation, evaluation, and optimization process, and help to communicate the design-based research to others. The role of pedagogical principles in design-based research is indeed essential. Moreover, we incorporated a robustness test and a generalizability/transferability test as a further step in our design-based research and achieved successful outcomes with this step. Consequently, we strongly recommend testing the design product in routine implementation conditions and in considerably different contexts (e.g. different programmes or even countries) as part of design-based research.

  5. The harm principle as a mid-level principle?: three problems from the context of infectious disease control.

    PubMed

    Krom, André

    2011-10-01

    Effective infectious disease control may require states to restrict the liberty of individuals. Since preventing harm to others is almost universally accepted as a legitimate (prima facie) reason for restricting the liberty of individuals, it seems plausible to employ a mid-level harm principle in infectious disease control. Moral practices like infectious disease control support, or even require, a certain level of theory-modesty. However, employing a mid-level harm principle in infectious disease control faces at least three problems. First, it is unclear what we gain by attaining convergence on a specific formulation of the harm principle. Likely candidates for convergence, such as a harm principle aimed at preventing harmful conduct, supplemented by considerations of effectiveness and of always choosing the least intrusive means, still leave ample room for normative disagreement. Second, while mid-level principles are sometimes put forward in response to the problem of normative theories attaching different weight to moral principles, employing a mid-level harm principle leaves completely open what weight to attach to it in application. Third, there appears to be a trade-off between attaining convergence and finding a formulation of the harm principle that can justify liberty-restrictions in all situations of contagion, including interventions that are commonly allowed. These are not reasons to abandon mid-level theorizing altogether, but there is no reason to be too theory-modest in applied ethics. Morally justifying, for example, whether a liberty-restriction in infectious disease control is proportional to the aim of harm prevention quickly requires moving beyond the mid-level harm principle. © 2011 Blackwell Publishing Ltd.

  6. Supporting shared decision-making for older people with multiple health and social care needs: a protocol for a realist synthesis to inform integrated care models.

    PubMed

    Bunn, Frances; Goodman, Claire; Manthorpe, Jill; Durand, Marie-Anne; Hodkinson, Isabel; Rait, Greta; Millac, Paul; Davies, Sue L; Russell, Bridget; Wilson, Patricia

    2017-02-07

    Including the patient or user perspective is a central organising principle of integrated care. Moreover, there is increasing recognition of the importance of strengthening relationships among patients, carers and practitioners, particularly for individuals receiving substantial health and care support, such as those with long-term or multiple conditions. The overall aims of this synthesis are to provide a context-relevant understanding of how models to facilitate shared decision-making (SDM) might work for older people with multiple health and care needs, and how they might be applied to integrated care models. The synthesis draws on the principles of realist inquiry, to explain how, in what contexts and for whom, interventions that aim to strengthen SDM among older patients, carers and practitioners are effective. We will use an iterative, stakeholder-driven, three-phase approach. Phase 1: development of programme theory/theories that will be tested through a first scoping of the literature and consultation with key stakeholder groups; phase 2: systematic searches of the evidence to test and develop the theories identified in phase 1; phase 3: validation of programme theory/theories with a purposive sample of participants from phase 1. The synthesis will draw on prevailing theories such as candidacy, self-efficacy, personalisation and coproduction. Ethics approval for the stakeholder interviews was obtained from the University of Hertfordshire ECDA (Ethics Committee with Delegated Authority), reference number HSK/SF/UH/02387. The propositions arising from this review will be used to develop recommendations about how to tailor SDM interventions to older people with complex health and social care needs in an integrated care setting. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  7. Principles and Practices Fostering Inclusive Excellence: Lessons from the Howard Hughes Medical Institute’s Capstone Institutions

    PubMed Central

    DiBartolo, Patricia Marten; Gregg-Jolly, Leslie; Gross, Deborah; Manduca, Cathryn A.; Iverson, Ellen; Cooke, David B.; Davis, Gregory K.; Davidson, Cameron; Hertz, Paul E.; Hibbard, Lisa; Ireland, Shubha K.; Mader, Catherine; Pai, Aditi; Raps, Shirley; Siwicki, Kathleen; Swartz, Jim E.

    2016-01-01

    Best-practices pedagogy in science, technology, engineering, and mathematics (STEM) aims for inclusive excellence that fosters student persistence. This paper describes principles of inclusivity across 11 primarily undergraduate institutions designated as Capstone Awardees in Howard Hughes Medical Institute’s (HHMI) 2012 competition. The Capstones represent a range of institutional missions, student profiles, and geographical locations. Each successfully directed activities toward persistence of STEM students, especially those from traditionally underrepresented groups, through a set of common elements: mentoring programs to build community; research experiences to strengthen scientific skill/identity; attention to quantitative skills; and outreach/bridge programs to broaden the student pool. This paper grounds these program elements in learning theory, emphasizing their essential principles with examples of how they were implemented within institutional contexts. We also describe common assessment approaches that in many cases informed programming and created traction for stakeholder buy-in. The lessons learned from our shared experiences in pursuit of inclusive excellence, including the resources housed on our companion website, can inform others’ efforts to increase access to and persistence in STEM in higher education. PMID:27562960

  8. Hamiltonian approach to GR - Part 1: covariant theory of classical gravity

    NASA Astrophysics Data System (ADS)

    Cremaschini, Claudio; Tessarotto, Massimo

    2017-05-01

    A challenging issue in General Relativity concerns the determination of the manifestly covariant continuum Hamiltonian structure underlying the Einstein field equations and the related formulation of the corresponding covariant Hamilton-Jacobi theory. The task is achieved by adopting a synchronous variational principle requiring a distinction between the prescribed deterministic metric tensor ĝ(r) ≡ {ĝ_μν(r)}, the solution of the Einstein field equations which determines the geometry of the background space-time, and suitable variational fields x ≡ {g, π} obeying an appropriate set of continuum Hamilton equations, referred to here as GR-Hamilton equations. It is shown that a prerequisite for reaching such a goal is casting the same equations in evolutionary form by means of a Lagrangian parametrization for a suitably reduced canonical state. As a result, the corresponding Hamilton-Jacobi theory is established in manifestly covariant form. Physical implications of the theory are discussed. These include the investigation of the structural stability of the GR-Hamilton equations with respect to vacuum solutions of the Einstein equations, assuming that wave-like perturbations are governed by the canonical evolution equations.

  9. Solid Rocket Fuel Constitutive Theory and Polymer Cure

    NASA Technical Reports Server (NTRS)

    Ream, Robert

    2006-01-01

    Solid Rocket Fuel is a complex composite material for which no general constitutive theory, based on first principles, has been developed. One of the principles such a relation would depend on is the morphology of the binder. A theory of polymer curing is required to determine this morphology. During work on such a theory an algorithm was developed for counting the number of ways a polymer chain could assemble. The methods used to develop and check this algorithm led to an analytic solution to the problem. This solution is used in a probability distribution function which characterizes the morphology of the polymer.

  10. What is an adequate sample size? Operationalising data saturation for theory-based interview studies.

    PubMed

    Francis, Jill J; Johnston, Marie; Robertson, Clare; Glidewell, Liz; Entwistle, Vikki; Eccles, Martin P; Grimshaw, Jeremy M

    2010-12-01

    In interview studies, sample size is often justified by interviewing participants until 'data saturation' is reached. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and a stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for the other belief categories, nor for studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about undergoing genetic testing. Studywise data saturation was achieved at interview 17. We propose that these principles be specified when reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies.
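
    The two-part stopping rule (an initial analysis sample, then a run of further interviews that add no new ideas) can be sketched as follows. This is a minimal illustration of the rule as stated in the abstract; the per-interview counts of new ideas are hypothetical.

```python
def reached_saturation(new_ideas_per_interview, initial_sample=10, stopping_criterion=3):
    """Apply the proposed stopping rule: always conduct the initial analysis
    sample, then stop once `stopping_criterion` consecutive further interviews
    yield no new ideas. Returns the interview number at which saturation is
    declared, or None if it is never reached. A sketch of the abstract's rule."""
    run = 0
    for i, new_ideas in enumerate(new_ideas_per_interview, start=1):
        if i <= initial_sample:
            continue  # interviews within the initial analysis sample
        run = run + 1 if new_ideas == 0 else 0
        if run >= stopping_criterion:
            return i
    return None

# Hypothetical counts of new beliefs identified in each successive interview:
counts = [9, 6, 5, 4, 3, 2, 2, 1, 1, 1, 2, 0, 0, 0]
print(reached_saturation(counts))  # -> 14 (three empty interviews after #11)
```

    With an initial analysis sample of 10 and a stopping criterion of 3, as in the two demonstration studies, the sample size is thus justified by an explicit, pre-registered rule rather than a post-hoc saturation claim.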

  11. Theories of Matter, Space and Time; Classical theories

    NASA Astrophysics Data System (ADS)

    Evans, N.; King, S. F.

    2017-12-01

    This book and its sequel ('Theories of Matter, Space and Time: Quantum Theories') are taken from third- and fourth-year undergraduate physics courses at Southampton University, UK. The aim of both books is to move beyond the initial courses in classical mechanics, special relativity, electromagnetism, and quantum theory to more sophisticated views of these subjects and their interdependence. The goal is to guide undergraduates through some of the trickier areas of theoretical physics with concise analysis while revealing the key elegance of each subject. The first chapter introduces the principle of least action, an alternative treatment of Newtonian dynamics that provides new understanding of conservation laws; in particular, it shows how the formalism evolved from Fermat's principle of least time in optics. The second chapter introduces special relativity, leading quickly to the need for, and form of, four-vectors. It develops four-vectors for all kinematic variables, generalizes Newton's second law to the relativistic environment, and then returns to the principle of least action for a free relativistic particle. The third chapter reviews the integral and differential forms of Maxwell's equations before massaging them into four-vector form so that the Lorentz-boost properties of electric and magnetic fields are transparent. Again, it then returns to the action principle to formulate minimal substitution for an electrically charged particle.
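
    The principle of least action that opens the book can be stated compactly in its standard textbook form (this is the conventional formulation, not a quotation from the volume):

```latex
S[q] = \int_{t_1}^{t_2} L\bigl(q, \dot{q}, t\bigr)\, dt,
\qquad
\delta S = 0
\;\Longrightarrow\;
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0 .
```

    The stationarity condition yields the Euler-Lagrange equation, and conservation laws then follow from symmetries of the Lagrangian via Noether's theorem, which is the route to the "new understanding of conservation laws" mentioned above.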

  12. Self-Regulation Principles Underlying Risk Perception and Decision Making within the Context of Genomic Testing

    PubMed Central

    Cameron, Linda D.; Biesecker, Barbara Bowles; Peters, Ellen; Taber, Jennifer M.; Klein, William M. P.

    2017-01-01

    Advances in theory and research on self-regulation and decision-making processes have yielded important insights into how cognitive, emotional, and social processes shape risk perceptions and risk-related decisions. We examine how self-regulation theory can be applied to inform our understanding of decision-making processes within the context of genomic testing, a clinical arena in which individuals face complex risk information and potentially life-altering decisions. After presenting key principles of self-regulation, we present a genomic testing case example to illustrate how principles related to risk representations, approach and avoidance motivations, emotion regulation, defensive responses, temporal construals, and capacities such as numeric abilities can shape decisions and psychological responses during the genomic testing process. We conclude with implications for using self-regulation theory to advance science within genomic testing and opportunities for how this research can inform further developments in self-regulation theory. PMID:29225669

  14. Using Participatory Action Research to Develop a Working Model That Enhances Psychiatric Nurses' Professionalism: The Architecture of Stability.

    PubMed

    Salzmann-Erikson, Martin

    2017-11-01

    Ward rules in psychiatric care aim to promote safety for both patients and staff. Simultaneously, ward rules are associated with increased patient violence, leading to neither a safe work environment nor a safe caring environment. Although ward rules are routinely used, few studies have explicitly accounted for their impact. To describe the process of a team development project considering ward rule issues, and to develop a working model to empower staff in their daily in-patient psychiatric nursing practices. The design of this study is explorative and descriptive. Participatory action research methodology was applied to understand ward rules. Data consists of audio-recorded group discussions, observations and field notes, together creating a data set of 556 text pages. More than 100 specific ward rules were identified. In this process, the word rules was relinquished in favor of adopting the term principles, since rules are inconsistent with a caring ideology. A linguistic transition led to the development of a framework embracing the (1) Principle of Safety, (2) Principle of Structure and (3) Principle of Interplay. The principles were linked to normative guidelines and applied ethical theories: deontology, consequentialism and ethics of care. The work model reminded staff about the principles, empowered their professional decision-making, decreased collegial conflicts because of increased acceptance for individual decisions, and, in general, improved well-being at work. Furthermore, the work model also empowered staff to find support for their decisions based on principles that are grounded in the ethics of totality.

  15. Experimenting `learn by doing' and `learn by failing'

    NASA Astrophysics Data System (ADS)

    Pozzi, Rossella; Noè, Carlo; Rossi, Tommaso

    2015-01-01

    According to the literature, experiential learning has in recent years helped engineering students gain a deep understanding of lean philosophy by demonstrating the advantages and disadvantages of some of the key principles of lean manufacturing. On the other hand, the literature shows how some kinds of game-based experiential learning overlook daily difficulties, which play a central role in manufacturing systems. To fill the need for a game that overcomes this lack of vision, an innovative in-field game, named Kart Factory, has been developed. Actual production shifts are simulated, while keeping all the elements peculiar to a real production setting (i.e. complexity, effort, safety). The working environment is a real pedal-car assembly department, the products to be assembled have relevant size and weight (i.e. up to 35 kg approximately), and the provided tools are real production equipment (e.g. keys, screwdrivers, trans-pallets, etc.). To maximise the impact on students, a labour-intensive process characterises the production department. The whole training process is based on three educational principles: the Experience Value Principle, the Error Value Principle, and the Team Value Principle. As 'learn by doing' and 'learn by failing' are favoured, theory follows practice, creating the willingness to 'do' instead of just designing or planning. The gathered data prove the Kart Factory's effectiveness in reaching a good knowledge of lean concepts, notwithstanding the students' initial knowledge level.

  16. An Exploratory Review of Design Principles in Constructivist Gaming Learning Environments

    ERIC Educational Resources Information Center

    Rosario, Roberto A. Munoz; Widmeyer, George R.

    2009-01-01

    Creating a design theory for Constructivist Gaming Learning Environment necessitates, among other things, the establishment of design principles. These principles have the potential to help designers produce games, where users achieve higher levels of learning. This paper focuses on twelve design principles: Probing, Distributed, Multiple Routes,…

  17. Generalized uncertainty principles and quantum field theory

    NASA Astrophysics Data System (ADS)

    Husain, Viqar; Kothawala, Dawood; Seahra, Sanjeev S.

    2013-01-01

    Quantum mechanics with a generalized uncertainty principle arises through a representation of the commutator [x̂, p̂] = i f(p̂). We apply this deformed quantization to free scalar field theory for f± = 1 ± βp². The resulting quantum field theories have a rich fine-scale structure. For small-wavelength modes, the Green’s function for f+ exhibits a remarkable transition from Lorentz to Galilean invariance, whereas for f- such modes effectively do not propagate. For both cases Lorentz invariance is recovered at long wavelengths.
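
    The deformed commutator above implies a generalized uncertainty relation. As a sketch for the f+ case (standard background, in units with ℏ = 1 following the abstract's convention):

```latex
% Generalized uncertainty relation implied by [\hat{x},\hat{p}] = i f(\hat{p}),
% sketched for f_+(p) = 1 + \beta p^2 (units with \hbar = 1):
\Delta x \,\Delta p \;\ge\; \tfrac{1}{2}\,\bigl|\langle f(\hat{p})\rangle\bigr|
  \;\ge\; \tfrac{1}{2}\bigl(1 + \beta\,(\Delta p)^2\bigr),
% so \Delta x \ge \tfrac{1}{2}\bigl(1/\Delta p + \beta\,\Delta p\bigr),
% which is minimized at \Delta p = 1/\sqrt{\beta}, giving a minimal length
\Delta x_{\min} \;=\; \sqrt{\beta}.
```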

  18. Holograms of Flat Space

    NASA Astrophysics Data System (ADS)

    Bagchi, Arjun; Grumiller, Daniel

    2013-07-01

    The holographic principle has a concrete realization in the Anti-de Sitter/Conformal Field Theory (AdS/CFT) correspondence. If this principle is a true fact about quantum gravity then it must also hold beyond AdS/CFT. In this paper, we address specifically holographic field theory duals of gravitational theories in asymptotically flat spacetimes. We present some evidence of our recent conjecture that three-dimensional (3d) conformal Chern-Simons gravity (CSG) with flat space boundary conditions is dual to an extremal CFT.

  19. A Critical Praxis: Narrowing the Gap between Identity, Theory, and Practice

    ERIC Educational Resources Information Center

    Waller, Laurel; Wethers, Kinsey; De Costa, Peter I.

    2017-01-01

    Praxis is the balance of pedagogical theory and practice. This literature review explores praxis from a critical lens in terms of identity for both students and teachers. The authors center their framework on Hawkins and Norton's (2009) five principles for critical language teaching. The first principle relates to the situated nature of the…

  20. Using Relational-Cultural Theory to Conceptualize Couple Interventions in the Treatment of Sex Addiction

    ERIC Educational Resources Information Center

    Vogel, Joanne E.

    2007-01-01

    Sex addictions have become an increasing concern since the growth of the sex industry, sex in advertising, and the ease of Internet access to sex. This article uses the foundational principles of Relational-Cultural Theory (RCT) to conceptualize sexual addiction and its relational impact. Particular attention is paid to the principles of…

  1. Building a functional multiple intelligences theory to advance educational neuroscience

    PubMed Central

    Cerruti, Carlo

    2013-01-01

    A key goal of educational neuroscience is to conduct constrained experimental research that is theory-driven and yet also clearly related to educators’ complex set of questions and concerns. However, the fields of education, cognitive psychology, and neuroscience use different levels of description to characterize human ability. An important advance in research in educational neuroscience would be the identification of a cognitive and neurocognitive framework at a level of description relatively intuitive to educators. I argue that the theory of multiple intelligences (MI; Gardner, 1983), a conception of the mind that motivated a past generation of teachers, may provide such an opportunity. I criticize MI for doing little to clarify for teachers a core misunderstanding, specifically that MI was only an anatomical map of the mind but not a functional theory that detailed how the mind actually processes information. In an attempt to build a “functional MI” theory, I integrate into MI basic principles of cognitive and neural functioning, namely interregional neural facilitation and inhibition. In so doing I hope to forge a path toward constrained experimental research that bears upon teachers’ concerns about teaching and learning. PMID:24391613

  2. Theory of optimal balance predicts and explains the amplitude and decay time of synaptic inhibition

    PubMed Central

    Kim, Jaekyung K.; Fiorillo, Christopher D.

    2017-01-01

    Synaptic inhibition counterbalances excitation, but it is not known what constitutes optimal inhibition. We previously proposed that perfect balance is achieved when the peak of an excitatory postsynaptic potential (EPSP) is exactly at spike threshold, so that the slightest variation in excitation determines whether a spike is generated. Using simulations, we show that the optimal inhibitory postsynaptic conductance (IPSG) increases in amplitude and decay rate as synaptic excitation increases from 1 to 800 Hz. As further proposed by theory, we show that optimal IPSG parameters can be learned through anti-Hebbian rules. Finally, we compare our theoretical optima to published experimental data from 21 types of neurons, in which rates of synaptic excitation and IPSG decay times vary by factors of about 100 (5–600 Hz) and 50 (1–50 ms), respectively. From an infinite range of possible decay times, theory predicted experimental decay times within less than a factor of 2. Across a distinct set of 15 types of neuron recorded in vivo, theory predicted the amplitude of synaptic inhibition within a factor of 1.7. Thus, the theory can explain biophysical quantities from first principles. PMID:28281523
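
    The notion of "perfect balance" (the net PSP peak sitting exactly at spike threshold) can be sketched numerically. The kernels, time constants, and threshold below are illustrative assumptions, not the paper's model; the sketch simply bisects on an inhibitory amplitude until the peak reaches threshold:

```python
import numpy as np

def epsp(t, tau=5.0):
    # Alpha-function excitatory PSP (arbitrary units); a toy stand-in
    return (t / tau) * np.exp(1.0 - t / tau)

def net_peak(g_inh, tau_i=10.0, tau_r=2.0):
    # Peak of the EPSP minus an inhibitory kernel scaled by g_inh
    t = np.linspace(0.0, 50.0, 5001)
    ipsp = np.exp(-t / tau_i) - np.exp(-t / tau_r)   # positive for tau_i > tau_r
    return float((epsp(t) - g_inh * ipsp).max())

def balance_inhibition(threshold=0.8, lo=0.0, hi=5.0, iters=60):
    # net_peak is non-increasing in g_inh, so bisect for the amplitude
    # that places the PSP peak exactly at spike threshold.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if net_peak(mid) > threshold:
            lo = mid     # peak still above threshold: more inhibition needed
        else:
            hi = mid
    return 0.5 * (lo + hi)

g_star = balance_inhibition(0.8)
```

    With these toy kernels the unopposed EPSP peaks at 1.0, so a threshold of 0.8 pins down a unique "optimal" inhibitory amplitude, mirroring the paper's balance criterion.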

  3. Theory of optimal balance predicts and explains the amplitude and decay time of synaptic inhibition

    NASA Astrophysics Data System (ADS)

    Kim, Jaekyung K.; Fiorillo, Christopher D.

    2017-03-01

    Synaptic inhibition counterbalances excitation, but it is not known what constitutes optimal inhibition. We previously proposed that perfect balance is achieved when the peak of an excitatory postsynaptic potential (EPSP) is exactly at spike threshold, so that the slightest variation in excitation determines whether a spike is generated. Using simulations, we show that the optimal inhibitory postsynaptic conductance (IPSG) increases in amplitude and decay rate as synaptic excitation increases from 1 to 800 Hz. As further proposed by theory, we show that optimal IPSG parameters can be learned through anti-Hebbian rules. Finally, we compare our theoretical optima to published experimental data from 21 types of neurons, in which rates of synaptic excitation and IPSG decay times vary by factors of about 100 (5-600 Hz) and 50 (1-50 ms), respectively. From an infinite range of possible decay times, theory predicted experimental decay times within less than a factor of 2. Across a distinct set of 15 types of neuron recorded in vivo, theory predicted the amplitude of synaptic inhibition within a factor of 1.7. Thus, the theory can explain biophysical quantities from first principles.

  4. Effective Mass Theory of 2D Excitons Revisited

    NASA Astrophysics Data System (ADS)

    Gonzalez, Joseph; Oleynik, Ivan

    Two-dimensional (2D) semiconducting materials possess an exceptionally unique set of electronic and excitonic properties due to the combined effects of quantum and dielectric confinement. Reliable determination of exciton binding energies from both first-principles many-body perturbation theory (GW/BSE) and experiment is very challenging due to the enormous computational expense as well as the tremendous technical difficulties in experiment. Very recently, effective mass theories of 2D excitons have been developed as an attractive alternative for inexpensive and accurate evaluation of the exciton binding energies. In this presentation, we evaluate two effective mass theory approaches, by Velizhanin et al. and Olsen et al., in predicting exciton binding energies across a wide range of 2D materials. We specifically analyze the trends related to the varying screening lengths and exciton effective masses. We also extended the effective mass theory of 2D excitons to include effects of electron and hole mass anisotropies (mx ≠ my), the latter showing a substantial influence on exciton binding energies. The recent predictions that exciton binding energies are independent of the exciton effective mass and correlate linearly with the band gap of a specific material are also critically reexamined.
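
    The dielectric-confinement physics referred to above is commonly modeled with the Rytova-Keldysh screened interaction (stated here as general background, with the usual SI prefactor; the abstract itself does not give the formula):

```latex
% Rytova--Keldysh potential in a 2D layer with screening length r_0:
V(r) \;=\; -\,\frac{e^{2}}{8\,\varepsilon_{0}\, r_{0}}
  \left[\,H_{0}\!\left(\frac{r}{r_{0}}\right) - Y_{0}\!\left(\frac{r}{r_{0}}\right)\right],
% where H_0 is the Struve function and Y_0 the Bessel function of the second
% kind; V(r) behaves logarithmically at short range and as -1/r at long range.
```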

  5. Science and technology convergence: with emphasis for nanotechnology-inspired convergence

    NASA Astrophysics Data System (ADS)

    Bainbridge, William S.; Roco, Mihail C.

    2016-07-01

    Convergence offers a new universe of discovery, innovation, and application opportunities through specific theories, principles, and methods to be implemented in research, education, production, and other societal activities. Using a holistic approach with shared goals, convergence seeks to transcend existing human limitations to achieve improved conditions for work, learning, aging, physical, and cognitive wellness. This paper outlines ten key theories that offer complementary perspectives on this complex dynamic. Principles and methods are proposed to facilitate and enhance science and technology convergence. Several convergence success stories in the first part of the 21st century—including nanotechnology and other emerging technologies—are discussed in parallel with case studies focused on the future. The formulation of relevant theories, principles, and methods aims at establishing the convergence science.

  6. A finite state, finite memory minimum principle, part 2. [a discussion of game theory, signaling, stochastic processes, and control theory

    NASA Technical Reports Server (NTRS)

    Sandell, N. R., Jr.; Athans, M.

    1975-01-01

    The development of the theory of the finite-state, finite-memory (FSFM) stochastic control problem is discussed. The sufficiency of the FSFM minimum principle (which is in general only a necessary condition) was investigated. By introducing the notion of a signaling strategy as defined in the literature on games, conditions under which the FSFM minimum principle is sufficient were determined. This result explicitly interconnects the information structure of the FSFM problem with its optimality conditions. The min-H algorithm for the FSFM problem was studied. It is demonstrated that a version of the algorithm always converges to a particular type of local minimum termed a person-by-person extremal.

  7. Application of Consider Covariance to the Extended Kalman Filter

    NASA Technical Reports Server (NTRS)

    Lundberg, John B.

    1996-01-01

    The extended Kalman filter (EKF) is the basis for many applications of filtering theory to real-time problems where estimates of the state of a dynamical system are to be computed based upon some set of observations. The form of the EKF may vary somewhat from one application to another, but the fundamental principles are typically unchanged among these various applications. As is the case in many filtering applications, models of the dynamical system (differential equations describing the state variables) and models of the relationship between the observations and the state variables are created. These models typically employ a set of constants whose values are established by means of theory or experimental procedure. Since the estimates of the state are formed assuming that the models are perfect, any modeling errors will affect the accuracy of the computed estimates. Note that the modeling errors may be errors of commission (errors in terms included in the model) or omission (errors in terms excluded from the model). Consequently, it becomes imperative when evaluating the performance of real-time filters to evaluate the effect of modeling errors on the estimates of the state.
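
    The sensitivity to modeling errors described above can be sketched with a scalar Kalman filter (the linear special case of the EKF). The system, noise levels, and the mismatched constant below are illustrative assumptions only:

```python
import numpy as np

def run_filter(a_model, a_true=0.95, q=0.01, r=0.1, steps=1000, seed=0):
    # Scalar system: x_{k+1} = a_true * x_k + w_k, observed as z_k = x_k + v_k.
    # The filter propagates with a (possibly wrong) model constant a_model,
    # so any error in that constant feeds directly into the state estimate.
    rng = np.random.default_rng(seed)
    x, xhat, p = 1.0, 1.0, 1.0
    sq_err = []
    for _ in range(steps):
        x = a_true * x + rng.normal(0.0, np.sqrt(q))    # true dynamics
        z = x + rng.normal(0.0, np.sqrt(r))             # measurement
        xhat, p = a_model * xhat, a_model**2 * p + q    # predict (model-based)
        k = p / (p + r)                                 # Kalman gain
        xhat, p = xhat + k * (z - xhat), (1.0 - k) * p  # update
        sq_err.append((xhat - x) ** 2)
    return float(np.mean(sq_err))
```

    With these toy settings, a badly mismatched dynamics constant (e.g. a_model = 0.5 against a_true = 0.95) yields a visibly larger mean-squared estimation error than the matched filter, which is the kind of error-of-commission effect the consider-covariance analysis is designed to quantify.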

  8. First principles of Hamiltonian medicine.

    PubMed

    Crespi, Bernard; Foster, Kevin; Úbeda, Francisco

    2014-05-19

    We introduce the field of Hamiltonian medicine, which centres on the roles of genetic relatedness in human health and disease. Hamiltonian medicine represents the application of basic social-evolution theory, for interactions involving kinship, to core issues in medicine such as pathogens, cancer, optimal growth and mental illness. It encompasses three domains, which involve conflict and cooperation between: (i) microbes or cancer cells, within humans, (ii) genes expressed in humans, (iii) human individuals. A set of six core principles, based on these domains and their interfaces, serves to conceptually organize the field, and contextualize illustrative examples. The primary usefulness of Hamiltonian medicine is that, like Darwinian medicine more generally, it provides novel insights into what data will be productive to collect, to address important clinical and public health problems. Our synthesis of this nascent field is intended predominantly for evolutionary and behavioural biologists who aspire to address questions directly relevant to human health and disease.
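
    The social-evolution theory invoked here is grounded in Hamilton's rule (standard background, not stated in the abstract): a gene for a social behavior is favored when

```latex
% Hamilton's rule: relatedness-weighted benefit must exceed cost
r\,b \;-\; c \;>\; 0,
% where r is the genetic relatedness between actor and recipient,
% b the fitness benefit to the recipient, and c the fitness cost to the actor.
```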

  9. The design of supercritical wings by the use of three-dimensional transonic theory

    NASA Technical Reports Server (NTRS)

    Mann, M. J.

    1979-01-01

    A procedure was developed for the design of transonic wings by the iterative use of three-dimensional, inviscid, transonic analysis methods. The procedure was based on simple principles of supersonic flow and provided the designer with a set of guidelines for the systematic alteration of wing profile shapes to achieve some desired pressure distribution. The method was generally applicable to wing design at conditions involving a large region of supercritical flow. To illustrate the method, it was applied to the design of a wing for a supercritical maneuvering fighter that operates at high lift and transonic Mach number. The wing profiles were altered to produce a large region of supercritical flow which was terminated by a weak shock wave. The spanwise variation of drag of this wing and some principles for selecting the streamwise pressure distribution are also discussed.

  10. Principles of dynamical modularity in biological regulatory networks

    PubMed Central

    Deritei, Dávid; Aird, William C.; Ercsey-Ravasz, Mária; Regan, Erzsébet Ravasz

    2016-01-01

    Intractable diseases such as cancer are associated with breakdown in multiple individual functions, which conspire to create unhealthy phenotype-combinations. An important challenge is to decipher how these functions are coordinated in health and disease. We approach this by drawing on dynamical systems theory. We posit that distinct phenotype-combinations are generated by interactions among robust regulatory switches, each in control of a discrete set of phenotypic outcomes. First, we demonstrate the advantage of characterizing multi-switch regulatory systems in terms of their constituent switches by building a multiswitch cell cycle model which points to novel, testable interactions critical for early G2/M commitment to division. Second, we define quantitative measures of dynamical modularity, namely that global cell states are discrete combinations of switch-level phenotypes. Finally, we formulate three general principles that govern the way coupled switches coordinate their function. PMID:26979940
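
    The claim that global cell states are discrete combinations of switch-level phenotypes can be illustrated with a deliberately tiny Boolean toy (not the paper's cell-cycle model; the wiring is invented for illustration):

```python
from itertools import product

def update(state):
    # Two toy Boolean switches: A holds its own state (positive feedback),
    # while B stays on only when A is off (A represses B).
    a, b = state
    return (a, b and not a)

def attractor(state, steps=10):
    # Iterate the synchronous update until the state settles
    for _ in range(steps):
        state = update(state)
    return state

# Long-run global states are discrete combinations of switch phenotypes:
# (A off, B off), (A off, B on), and (A on, B off); (A on, B on) is unstable.
attractors = {attractor(s) for s in product([False, True], repeat=2)}
```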

  11. Learning theories 101: application to everyday teaching and scholarship.

    PubMed

    Kay, Denise; Kibble, Jonathan

    2016-03-01

    Shifts in educational research, in how scholarship in higher education is defined, and in how funding is appropriated suggest that educators within basic science fields can benefit from increased understanding of learning theory and how it applies to classroom practice. This article uses a mock curriculum design scenario as a framework for the introduction of five major learning theories. Foundational constructs and principles from each theory and how they apply to the proposed curriculum designs are described. A summative table that includes basic principles, constructs, and classroom applications as well as the role of the teacher and learner is also provided for each theory. Copyright © 2016 The American Physiological Society.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Liang; Abild-Pedersen, Frank

    On the basis of an extensive set of density functional theory calculations, it is shown that a simple scheme provides a fundamental understanding of variations in the transition state energies and structures of reaction intermediates on transition metal surfaces across the periodic table. The scheme is built on the bond order conservation principle and requires a limited set of input data, still achieving transition state energies as a function of simple descriptors with an error smaller than those of approaches based on linear fits to a set of calculated transition state energies. Here, we have applied this approach together with linear scaling of adsorption energies to obtain the energetics of the NH3 decomposition reaction on a series of stepped fcc(211) transition metal surfaces. Moreover, this information is used to establish a microkinetic model for the formation of N2 and H2, thus providing insight into the components of the reaction that determine the activity.
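
    Descriptor-based schemes of this kind can be caricatured as a Brønsted-Evans-Polanyi-type linear relation. The slope, offset, and adsorption energies below are made-up placeholders, not values from the study:

```python
# Hypothetical BEP-type linear scaling: E_TS = GAMMA * dE + XI, where dE is a
# simple descriptor (e.g. a dissociative adsorption energy) and GAMMA, XI are
# fitted constants. All numbers here are invented for illustration.
GAMMA, XI = 0.87, 1.34   # assumed slope and offset (eV)

def transition_state_energy(d_e):
    return GAMMA * d_e + XI

descriptors = {"Ru": -0.8, "Pt": -0.3, "Cu": 0.4}  # toy descriptor values (eV)
barriers = {metal: transition_state_energy(d) for metal, d in descriptors.items()}
# With a positive slope, more strongly binding metals (more negative
# descriptor) get lower barriers, the usual BEP trend.
```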

  13. What is behind small deviations of quantum mechanics theory from experiments? Observer's mathematics point of view

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khots, Boris, E-mail: bkhots@cccglobal.com; Khots, Dmitriy, E-mail: dkhots@imathconsulting.com

    2014-12-10

    Certain results that have been predicted by Quantum Mechanics (QM) theory are not always supported by experiments. This defines a deep crisis in contemporary physics and, in particular, quantum mechanics. We believe that, in fact, the mathematical apparatus employed within today's physics is a possible reason. In particular, we consider the concept of infinity that exists in today's mathematics as the root cause of this problem. We have created Observer's Mathematics, which offers an alternative to contemporary mathematics. This paper is an attempt to relay how Observer's Mathematics may explain some of the contradictions in QM theory results. We consider Hamiltonian mechanics, the Newton equation, the Schrödinger equation, two-slit interference, wave-particle duality for single photons, the uncertainty principle, and the Dirac equations for a free electron in a setting of arithmetic, algebra, and topology provided by Observer's Mathematics (see www.mathrelativity.com). Certain results and communications pertaining to the solution of these problems are provided.

  14. Higher groupoid bundles, higher spaces, and self-dual tensor field equations

    NASA Astrophysics Data System (ADS)

    Jurčo, Branislav; Sämann, Christian; Wolf, Martin

    2016-08-01

    We develop a description of higher gauge theory with higher groupoids as gauge structure from first principles. This approach captures ordinary gauge theories and gauged sigma models as well as their categorifications on a very general class of (higher) spaces comprising presentable differentiable stacks, as e.g. orbifolds. We start off with a self-contained review on simplicial sets as models of $(\infty,1)$-categories. We then discuss principal bundles in terms of simplicial maps and their homotopies. We explain in detail a differentiation procedure, suggested by Severa, that maps higher groupoids to $L_\infty$-algebroids. Generalising this procedure, we define connections for higher groupoid bundles. As an application, we obtain six-dimensional superconformal field theories via a Penrose-Ward transform of higher groupoid bundles over a twistor space. This construction reduces the search for non-Abelian self-dual tensor field equations in six dimensions to a search for the appropriate (higher) gauge structure. The treatment aims to be accessible to theoretical physicists.

  15. On Trial: the Compatibility of Measurement in the Physical and Social Sciences

    NASA Astrophysics Data System (ADS)

    Cano, S. J.; Vosk, T.; Pendrill, L. R.; Stenner, A. J.

    2016-11-01

    In this paper, we put social measurement on trial: providing two perspectives arguing why measurement in the social and in the physical sciences are incompatible and counter with two perspectives supporting compatibility. For the case ‘against’, we first argue that there is a lack of definition in the social sciences. Thus, while measurement in the physical sciences is supported by empirical evidence, calibrated instruments, and predictive theory that work together to test the quantitative nature of properties, measurement in the social sciences, in the main, rests on a vague, discretionary definition of measurement that places hardly any restrictions on empirical data, does not require calibrated instruments, and rarely articulates predictive theories. The second argument for the case ‘against’ introduces the problem associated with psychometrics, including different approaches, methodologies, criteria for success and failure, and considerations as to what counts as measurement. Making the first case ‘for’, we highlight practical principles for improved social measurement including units, laws, theory, and metrology. The second argument ‘for’ introduces the exemplar of the Lexile Framework for reading that exploits metrological principles and parallels the paths taken by, for example, thermometry. We conclude by proposing a way forward potentially applicable to both physical and social measurement, in which inferences are modelled in terms of a measurement system, where specifically the output of the instrument in response to probing the object (‘entity’) is a performance metric, i.e. how well the set-up performs the assessment.

  16. Ready…, Set, Go!. Comment on "Towards a Computational Comparative Neuroprimatology: Framing the language-ready brain" by Michael A. Arbib

    NASA Astrophysics Data System (ADS)

    Iriki, Atsushi

    2016-03-01

    "Language-READY brain" in the title of this article [1] seems to be the expression that the author prefers to use to illustrate his theoretical framework. The usage of the term "READY" appears to be of extremely deep connotation, for three reasons. Firstly, of course, it needs a "principle": the depth and the width of the computational theory depicted here are as expected from the author's reputation. However, "readiness" implies that it is much more than just "a theory". That is, such a principle is not static; rather, it has dynamic properties, which are ready to gradually flourish once brains are put in adequate conditions to make time progressions, namely evolution and development. So the second major connotation is that this article brings in the perspectives of comparative primatology as a tool to situate the language-realizing human brain among other animal species, primates in particular, on an evolutionary time scale. The third connotation lies in the developmental time scale. The author claims that it is the interaction of the newborn with its caretakers, namely its mother and other family or social members in its ecological conditions, that brings the brain mechanism subserving the language faculty to mature to its final completion. Taken together, this article proposes computational theories and mechanisms of Evo-Devo-Eco interactions for language acquisition in the human brain.

  17. Just-in-Time Adaptive Interventions (JITAIs) in Mobile Health: Key Components and Design Principles for Ongoing Health Behavior Support

    PubMed Central

    Nahum-Shani, Inbal; Smith, Shawna N.; Spring, Bonnie J.; Collins, Linda M.; Witkiewitz, Katie; Tewari, Ambuj; Murphy, Susan A.

    2016-01-01

    Background The just-in-time adaptive intervention (JITAI) is an intervention design aiming to provide the right type/amount of support, at the right time, by adapting to an individual's changing internal and contextual state. The availability of increasingly powerful mobile and sensing technologies underpins the use of JITAIs to support health behavior, as in such a setting an individual's state can change rapidly, unexpectedly, and in his/her natural environment. Purpose Despite the increasing use and appeal of JITAIs, a major gap exists between the growing technological capabilities for delivering JITAIs and research on the development and evaluation of these interventions. Many JITAIs have been developed with minimal use of empirical evidence, theory, or accepted treatment guidelines. Here, we take an essential first step towards bridging this gap. Methods Building on health behavior theories and the extant literature on JITAIs, we clarify the scientific motivation for JITAIs, define their fundamental components, and highlight design principles related to these components. Examples of JITAIs from various domains of health behavior research are used for illustration. Conclusion As we enter a new era of technological capacity for delivering JITAIs, it is critical that researchers develop sophisticated and nuanced health behavior theories capable of guiding the construction of such interventions. Particular attention has to be given to better understanding the implications of providing timely and ecologically sound support for intervention adherence and retention. PMID:27663578
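
    The JITAI components named above (decision points, tailoring variables, intervention options, decision rules) can be sketched as a toy decision rule. The variable names, thresholds, and option labels are invented for illustration and are not from the paper:

```python
def decide(stress_level, is_available):
    # Toy JITAI decision rule, evaluated at each decision point.
    # Tailoring variables: current stress (0-10 scale) and availability.
    if not is_available:
        return None                         # never interrupt an unavailable user
    if stress_level >= 7:
        return "prompt_breathing_exercise"  # high risk: strongest option
    if stress_level >= 4:
        return "send_supportive_message"    # moderate risk: lighter touch
    return None                             # low risk: withhold support
```

    Withholding support (returning None) is itself an intervention option here, reflecting the JITAI concern with avoiding burden and preserving engagement.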

  18. What Communication Theories Can Teach the Designer of Computer-Based Training.

    ERIC Educational Resources Information Center

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hourdequin, Marion, E-mail: Marion.Hourdequin@ColoradoCollege.edu; Department of Philosophy, Colorado College, 14 E. Cache La Poudre St., Colorado Springs, CO 80903; Landres, Peter

    Traditional mechanisms for public participation in environmental impact assessment under U.S. federal law have been criticized as ineffective and unable to resolve conflict. As these mechanisms are modified and new approaches developed, we argue that participation should be designed and evaluated not only on practical grounds of cost-effectiveness and efficiency, but also on ethical grounds based on democratic ideals. In this paper, we review and synthesize modern democratic theory to develop and justify four ethical principles for public participation: equal opportunity to participate, equal access to information, genuine deliberation, and shared commitment. We then explore several tensions that are inherent in applying these ethical principles to public participation in EIA. We next examine traditional NEPA processes and newer collaborative approaches in light of these principles. Finally, we explore the circumstances that argue for more in-depth participatory processes. While improved EIA participatory processes do not guarantee improved outcomes in environmental management, processes informed by these four ethical principles derived from democratic theory may lead to increased public engagement and satisfaction with government agency decisions. - Highlights: • Four ethical principles based on democratic theory for public participation in EIA. • NEPA and collaboration offer different strengths in meeting these principles. • We explore tensions inherent in applying these principles. • Improved participatory processes may improve public acceptance of agency decisions.

  20. Quenched Large Deviations for Simple Random Walks on Percolation Clusters Including Long-Range Correlations

    NASA Astrophysics Data System (ADS)

    Berger, Noam; Mukherjee, Chiranjib; Okamura, Kazuki

    2018-03-01

    We prove a quenched large deviation principle (LDP) for a simple random walk on a supercritical percolation cluster (SRWPC) on Z^d (d ≥ 2). The models under interest include classical Bernoulli bond and site percolation as well as models that exhibit long-range correlations, like the random cluster model, the random interlacement and the vacant set of random interlacements (for d ≥ 3) and the level sets of the Gaussian free field (d ≥ 3). Inspired by the methods developed by Kosygina et al. (Commun Pure Appl Math 59:1489-1521, 2006) for proving quenched LDP for elliptic diffusions with a random drift, and by Yilmaz (Commun Pure Appl Math 62(8):1033-1075, 2009) and Rosenbluth (Quenched large deviations for multidimensional random walks in a random environment: a variational formula. Ph.D. thesis, NYU, arXiv:0804.1444v1) for similar results regarding elliptic random walks in random environment, we take the point of view of the moving particle and prove a large deviation principle for the quenched distribution of the pair empirical measures of the environment Markov chain in the non-elliptic case of SRWPC. Via a contraction principle, this reduces easily to a quenched LDP for the distribution of the mean velocity of the random walk, and both rate functions admit explicit variational formulas. The main difficulty in our setup lies in the inherent non-ellipticity as well as the lack of translation-invariance stemming from conditioning on the fact that the origin belongs to the infinite cluster. We develop a unifying approach for proving quenched large deviations for SRWPC based on exploiting coercivity properties of the relative entropies in the context of convex variational analysis, combined with input from ergodic theory and invoking geometric properties of the supercritical percolation cluster.
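    For readers outside large deviations theory, the two standard notions invoked in the abstract can be stated as follows (textbook definitions, not specific to this paper):

```latex
% Large deviation principle: a family of measures (P_n) on a space X
% satisfies an LDP with rate function I if, for every closed set C and
% every open set O,
\limsup_{n\to\infty} \tfrac{1}{n}\log P_n(C) \le -\inf_{x\in C} I(x),
\qquad
\liminf_{n\to\infty} \tfrac{1}{n}\log P_n(O) \ge -\inf_{x\in O} I(x).

% Contraction principle: if f : X \to Y is continuous, the image measures
% P_n \circ f^{-1} satisfy an LDP with rate function
J(y) = \inf\{\, I(x) : f(x) = y \,\}.
```

    Here the pair empirical measures play the role of X, and the mean velocity is their continuous image f, which is why the velocity LDP follows "easily" once the measure-level LDP is established.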

  2. Water confined in carbon nanotubes: Magnetic response and proton chemical shieldings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, P; Schwegler, E; Galli, G

    2008-11-14

    We study the proton nuclear magnetic resonance (¹H-NMR) of a model system consisting of liquid water in infinite carbon nanotubes (CNT). Chemical shieldings are evaluated from linear response theory, where the electronic structure is derived from density functional theory (DFT) with plane-wave basis sets and periodic boundary conditions. The shieldings are sampled from trajectories generated via first-principles molecular dynamics simulations at ambient conditions, for water confined in (14,0) and (19,0) CNTs with diameters d = 11 Å and 14.9 Å, respectively. We find that confinement within the CNT leads to a large (≈ -23 ppm) upfield shift relative to bulk liquid water. This shift is a consequence of strongly anisotropic magnetic fields induced in the CNT by an applied magnetic field.

  3. Data Mining Technologies Inspired from Visual Principle

    NASA Astrophysics Data System (ADS)

    Xu, Zongben

    In this talk we review recent work by our group on data mining (DM) technologies derived from simulating visual principles. By viewing a DM problem as a cognition problem and treating a data set as an image with a light point located at each datum position, we developed a series of highly efficient algorithms for clustering, classification and regression via mimicking visual principles. In pattern recognition, human eyes seem to possess a singular aptitude to group objects and find important structure in an efficient way. Thus, a DM algorithm simulating the visual system may solve some basic problems in DM research. From this point of view, we proposed a new approach for data clustering by modeling the blurring effect of lateral retinal interconnections based on scale space theory. In this approach, as the data image blurs, smaller light blobs merge into larger ones until the whole image becomes one light blob at a low enough level of resolution. By identifying each blob with a cluster, the blurring process then generates a family of clusterings along the hierarchy. The proposed approach provides unique solutions to many long-standing problems in clustering, such as cluster validity and sensitivity to initialization. We extended this approach to classification and regression problems by employing Weber's law from physiology together with facts about the classification of cell responses. The resulting classification and regression algorithms are proven to be very efficient, and they address the problems of model selection and applicability to huge data sets in DM technologies. We finally applied a similar idea to the difficult parameter-setting problem in support vector machines (SVM). Viewing the parameter-setting problem as a recognition problem of choosing a visual scale at which the global and local structures of a data set can be preserved, and the difference between the two structures maximized in the feature space, we derived a direct parameter-setting formula for the Gaussian SVM. Simulations and applications show that the suggested formula significantly outperforms known model selection methods in terms of efficiency and precision.
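    The blurring-based clustering idea can be sketched in a few lines. This is a toy one-dimensional illustration under my own simplifications (Gaussian blurring of a kernel-density "image", blobs found as local maxima on a grid), not the authors' actual algorithm:

```python
import numpy as np

def scale_space_clusters(data, sigma):
    """Cluster 1-D data by blurring the 'data image' at scale sigma.

    Each datum is a light point; blurring with a Gaussian of width sigma
    gives a smooth intensity field whose local maxima are the 'light
    blobs'. Each datum then joins its nearest blob."""
    grid = np.linspace(data.min() - 1.0, data.max() + 1.0, 2000)
    d = grid[:, None] - data[None, :]
    intensity = np.exp(-d**2 / (2.0 * sigma**2)).sum(axis=1)
    interior = intensity[1:-1]
    peaks = grid[1:-1][(interior > intensity[:-2]) & (interior > intensity[2:])]
    labels = np.abs(data[:, None] - peaks[None, :]).argmin(axis=1)
    return len(peaks), labels

# Two well-separated groups: fine scales see two blobs, coarse scales one.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(5.0, 0.3, 50)])
n_fine, _ = scale_space_clusters(data, sigma=0.5)
n_coarse, _ = scale_space_clusters(data, sigma=10.0)
```

    At σ = 0.5 the two groups appear as separate blobs, while at σ = 10 they merge into one, producing the clustering hierarchy described in the abstract.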

  4. Information Theoretic Characterization of Physical Theories with Projective State Space

    NASA Astrophysics Data System (ADS)

    Zaopo, Marco

    2015-08-01

    Probabilistic theories are a natural framework to investigate the foundations of quantum theory and possible alternative or deeper theories. In a generic probabilistic theory, states of a physical system are represented as vectors of outcome probabilities, and state spaces are convex cones. In this picture, the physics of a given theory is related to the geometric shape of the cone of states. In quantum theory, for instance, the shape of the cone of states corresponds to a projective space over the complex numbers. In this paper we investigate geometric constraints on the state space of a generic theory imposed by the following information-theoretic requirements: every state of a system that is not completely mixed is perfectly distinguishable from some other state in a single-shot measurement; and the information capacity of physical systems is conserved under making mixtures of states. These assumptions guarantee that a generic physical system satisfies a natural principle asserting that the more mixed a state of the system is, the less information can be stored in the system using that state as a logical value. We show that in all theories satisfying the above assumptions, the shape of the cone of states is that of a projective space over a generic field of numbers. Remarkably, these theories constitute generalizations of quantum theory in which the superposition principle holds with coefficients pertaining to a generic field of numbers in place of the complex numbers. If the field of numbers is trivial and contains only one element, we obtain classical theory. This result tells us that the superposition principle is quite common among probabilistic theories, while its absence gives evidence of either classical theory or an implausible theory.

  5. Modeling recombination processes and predicting energy conversion efficiency of dye sensitized solar cells from first principles

    NASA Astrophysics Data System (ADS)

    Ma, Wei; Meng, Sheng

    2014-03-01

    We present a set of algorithms, based solely on first-principles calculations, to accurately calculate key properties of a DSC device, including sunlight harvesting, electron injection, electron-hole recombination, and open-circuit voltage. Two series of D-π-A dyes are adopted as sample dyes. The short-circuit current can be predicted by calculating the dyes' photoabsorption and the electron injection and recombination lifetimes using real-time time-dependent density functional theory (TDDFT) simulations. The open-circuit voltage can be reproduced by calculating the energy difference between the quasi-Fermi level of electrons in the semiconductor and the electrolyte redox potential, taking into account the influence of electron recombination. Based on timescales obtained from real-time TDDFT dynamics for excited states, the estimated power conversion efficiency of the DSC agrees well with experiment, with deviations below 1-2%. Light-harvesting efficiency, incident photon-to-electron conversion efficiency, and the current-voltage characteristics can also be well reproduced. The predicted efficiency can serve as either an ideal limit for optimizing the photovoltaic performance of a given dye, or a virtual device that closely mimics the performance of a real device under different experimental settings.
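    For reference, the device-level relations into which the predicted quantities enter are the standard solar-cell formulas (textbook definitions, not specific to this paper's method):

```latex
% Power conversion efficiency from short-circuit current density J_sc,
% open-circuit voltage V_oc, fill factor FF, and incident power P_in:
\eta = \frac{J_{\mathrm{sc}}\, V_{\mathrm{oc}}\, \mathrm{FF}}{P_{\mathrm{in}}}

% Incident photon-to-electron conversion efficiency as a product of
% light-harvesting, injection, and collection efficiencies:
\mathrm{IPCE}(\lambda) = \mathrm{LHE}(\lambda)\, \phi_{\mathrm{inj}}\, \eta_{\mathrm{coll}}
```

    The first-principles quantities (absorption spectra, injection and recombination lifetimes, quasi-Fermi level) feed these relations, which is how a computed "virtual device" efficiency can be compared with a measured one.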

  6. Theory-driven design of hole-conducting transparent oxides

    NASA Astrophysics Data System (ADS)

    Trimarchi, G.; Peng, H.; Im, J.; Freeman, A. J.; Cloet, V.; Raw, A.; Poeppelmeier, K. R.; Biswas, K.; Lany, S.; Zunger, A.

    2012-02-01

    The design of p-type transparent conducting oxides (TCOs) aims at simultaneously achieving transparency and high hole concentration and hole conductivity in one compound. Such design principles (DPs) define a multi-objective optimization problem that is to be solved by searching a large set of compounds for optimum ones. Here, we screen a large set of ternary compounds, including Ag and Cu oxides and chalcogenides, by calculating via first-principles methods the design properties of each compound, in order to search for optimum p-type TCOs. We first select Ag3VO4 as a case study of the application of ab-initio methods to assess a compound as a candidate p-type TCO. We predict Ag3VO4 (i) to have a hole concentration of 10^14 cm^-3 at room temperature, (ii) to be at the verge of transparency, and (iii) to have a lower hole effective mass than the prototype p-type TCO CuAlO2. We then map the hole effective mass vs. the band gap in the selected compounds and determine those that best meet the DPs by having simultaneously minimum effective mass and a band gap large enough for transparency.
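    One of the screened quantities is easy to illustrate: the hole effective mass is read off from the curvature of the band at its edge, m* = ħ² / (d²E/dk²). The sketch below recovers a known mass from a synthetic parabolic band by finite differences; the band data are invented for the example, not taken from Ag3VO4:

```python
import numpy as np

HBAR = 1.054571817e-34   # reduced Planck constant, J s
M_E = 9.1093837015e-31   # electron mass, kg

def effective_mass(E, k):
    """m* = hbar^2 / (d^2 E / dk^2), with the curvature evaluated at the
    band edge (centre of the k grid); E in joules, k in 1/m."""
    curvature = np.gradient(np.gradient(E, k), k)
    return HBAR**2 / curvature[len(k) // 2]

# Synthetic parabolic band with a known m* = 0.5 m_e (illustrative only).
k = np.linspace(-1e9, 1e9, 201)
E = HBAR**2 * k**2 / (2.0 * 0.5 * M_E)
m_star = effective_mass(E, k)
```

    A flat band (small curvature) gives a large m*; screening compounds for small hole effective mass is screening for strongly curved valence-band edges.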

  7. Ethical principles and concepts in medicine.

    PubMed

    Taylor, Robert M

    2013-01-01

    Clinical ethics is the application of ethical theories, principles, rules, and guidelines to clinical situations in medicine. Therefore, clinical ethics is analogous to clinical medicine in that general principles and concepts must be applied intelligently and thoughtfully to unique clinical circumstances. The three major ethical theories are consequentialism, whereby the consequences of an action determine whether it is ethical; deontology, whereby to be ethical is to do one's duty; and virtue ethics, whereby ethics is a matter of cultivating appropriate virtues. In the real world of medicine, most people find that all three perspectives offer useful insights and are complementary rather than contradictory. The most common approach to clinical ethical analysis is principlism. According to principlism, the medical practitioner must attempt to uphold four important principles: respect for patient autonomy, beneficence, nonmaleficence, and justice. When these principles conflict, resolving the conflict depends on the details of the case. Alternative approaches to medical ethics, including the primacy of beneficence, care-based ethics, feminist ethics, and narrative ethics, help to define the limitations of principlism and provide a broader perspective on medical ethics. © 2013 Elsevier B.V. All rights reserved.

  8. Hypothesis Testing as an Act of Rationality

    NASA Astrophysics Data System (ADS)

    Nearing, Grey

    2017-04-01

    Statistical hypothesis testing is ad hoc in two ways. First, setting probabilistic rejection criteria is, as Neyman (1957) put it, an act of will rather than an act of rationality. Second, physical theories like conservation laws do not inherently admit probabilistic predictions, and so we must use what are called epistemic bridge principles to connect model predictions with the actual methods of hypothesis testing. In practice, these bridge principles are likelihood functions, error functions, or performance metrics. I propose that the reason we are faced with these problems is that we have historically failed to account for a fundamental component of basic logic, namely the portion of logic that explains how epistemic states evolve in the presence of empirical data. This component of Cox's (1946) calculus is called information theory (Knuth, 2005), and adding information theory to our hypothetico-deductive account of science yields straightforward solutions to both of the above problems. It also yields a straightforward method for dealing with Popper's (1963) problem of verisimilitude by facilitating a quantitative approach to measuring process isomorphism. In practice, this involves data assimilation. Finally, information theory allows us to reliably bound measures of epistemic uncertainty, thereby avoiding the problem of Bayesian incoherency under misspecified priors (Grünwald, 2006). I therefore propose solutions to four of the fundamental problems inherent in both hypothetico-deductive and/or Bayesian hypothesis testing. - Neyman (1957) Inductive Behavior as a Basic Concept of Philosophy of Science. - Cox (1946) Probability, Frequency and Reasonable Expectation. - Knuth (2005) Lattice Duality: The Origin of Probability and Entropy. - Grünwald (2006) Bayesian Inconsistency under Misspecification. - Popper (1963) Conjectures and Refutations: The Growth of Scientific Knowledge.
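    As a concrete, much simplified example of an information-theoretic bridge between model and data, one can score a model's predictive distribution by its relative entropy to the observed outcome frequencies; the distributions below are invented for illustration and are not from the abstract:

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p || q) in nats between discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Invented outcome frequencies and two candidate model predictions.
observed = [0.5, 0.3, 0.2]
model_a = [0.5, 0.3, 0.2]  # matches the data: zero information loss
model_b = [0.1, 0.1, 0.8]  # poor fit: large divergence
```

    Here D(observed‖model_a) = 0 while D(observed‖model_b) ≈ 0.86 nats, so the divergence itself ranks the hypotheses without an arbitrary rejection threshold.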

  9. A demonstration of an affinity between pyrite and organic matter in a hydrothermal setting

    PubMed Central

    2011-01-01

    One of the key principles of the iron-sulphur world theory is to bring organic molecules close enough to interact with each other, using the surface of pyrite as a substrate in a hydrothermal setting. The present paper explores the relationship of pyrite and organic matter in a hydrothermal setting from the geological record: in hydrothermal calcite veins from Carboniferous limestones in central Ireland. Here, the organic matter is accumulated as coatings around, and through, pyrite grains. Most of the pyrite grains are euhedral-subhedral crystals, ranging in size from ca 0.1-0.5 mm in diameter, and they are scattered throughout the matrix of the vein calcite. The organic matter was deposited from a hydrothermal fluid at a temperature of at least 200°C, and gives a Raman signature of disordered carbon. This study points to an example from a hydrothermal setting in the geological record, demonstrating that pyrite can have a high potential for the concentration and accumulation of organic materials. PMID:21299877

  10. Testing Einstein's theory of gravity in a millisecond pulsar triple system

    NASA Astrophysics Data System (ADS)

    Archibald, Anne

    2015-04-01

    Einstein's theory of gravity depends on a key postulate, the strong equivalence principle. This principle says, among other things, that all objects fall the same way, even objects with strong self-gravity. Almost every metric theory of gravity other than Einstein's general relativity violates the strong equivalence principle at some level. While the weak equivalence principle, which applies to objects with negligible self-gravity, has been tested in the laboratory, the strong equivalence principle requires astrophysical tests. Lunar laser ranging provides the best current tests by measuring whether the Earth and the Moon fall the same way in the gravitational field of the Sun. These tests are limited by the weak self-gravity of the Earth: the gravitational binding energy (over c^2) over the mass is only 4.6×10^-10. By contrast, for neutron stars this same ratio is expected to be roughly 0.1. Thus the recently discovered system PSR J0337+17, a hierarchical triple consisting of a millisecond pulsar and two white dwarfs, offers the possibility of a test of the strong equivalence principle that is more sensitive by a factor of 20 to 100 than the best existing test. I will describe our observations of this system and our progress towards such a test.
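    The self-gravity figures quoted above are easy to reproduce at the order-of-magnitude level. This sketch uses the uniform-density-sphere formula U/(Mc²) = (3/5)GM/(Rc²); the abstract's 4.6×10^-10 for Earth comes from a realistic density profile, and the neutron-star mass and radius below are typical assumed values, not measurements of PSR J0337+17:

```python
G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8    # speed of light, m/s

def compactness(mass, radius):
    """Gravitational binding energy over c^2, divided by the mass,
    for a uniform-density sphere: (3/5) G M / (R c^2)."""
    return 3.0 * G * mass / (5.0 * radius * C**2)

earth = compactness(5.972e24, 6.371e6)             # ~4e-10 (abstract: 4.6e-10)
neutron_star = compactness(1.4 * 1.989e30, 1.2e4)  # ~0.1 for assumed M, R
```

    The roughly nine-orders-of-magnitude gap between the two numbers is what makes a pulsar in a triple system so much more sensitive a probe than lunar laser ranging.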

  11. A Study of Implicit Theories and Beliefs about Teaching in Elementary School Teachers.

    ERIC Educational Resources Information Center

    Marcelo, Carlos

    In this study, the interactive teaching of two math teachers at elementary schools in Spain was observed. The focus was on describing the principles of practice that guide the teaching activities of the subjects, because it was felt that these principles form the basis of teachers' theories with respect to teaching and students. During a…

  12. Application of Multimedia Design Principles to Visuals Used in Course-Books: An Evaluation Tool

    ERIC Educational Resources Information Center

    Kuzu, Abdullah; Akbulut, Yavuz; Sahin, Mehmet Can

    2007-01-01

    This paper introduces an evaluation tool prepared to examine the quality of visuals in course-books. The tool is based on Mayer's Cognitive Theory of Multimedia Learning (i.e. Generative Theory) and its principles regarding the correct use of illustrations within text. The reason to generate the tool, the development process along with the…

  13. Improving the Quality of Online Discussion: The Effects of Strategies Designed Based on Cognitive Load Theory Principles

    ERIC Educational Resources Information Center

    Darabi, Aubteen; Jin, Li

    2013-01-01

    This article focuses on heavy cognitive load as the reason for the lack of quality associated with conventional online discussion. Using the principles of cognitive load theory, four online discussion strategies were designed specifically aiming at reducing the discussants' cognitive load and thus enhancing the quality of their online discussion.…

  14. Application of Theories, Principles and Methods of Adult Learning for Managers to Improve Workplace Reactions to Learning, Knowledge and Performance

    ERIC Educational Resources Information Center

    Steier, E. Joseph, III

    2010-01-01

    The objective of this dissertation was to explore the concept that knowledge and application of theories, principles and methods of adult learning to teaching may be a core management competency needed for companies to improve employee reaction to learning, knowledge transfer and behavior as well as engagement, retention and profitability.…

  15. Expertise facilitates the transfer of anticipation skill across domains.

    PubMed

    Rosalie, Simon M; Müller, Sean

    2014-02-01

    It is unclear whether perceptual-motor skill transfer is based upon similarity between the learning and transfer domains per identical elements theory, or facilitated by an understanding of underlying principles in accordance with general principle theory. Here, the predictions of identical elements theory, general principle theory, and aspects of a recently proposed model for the transfer of perceptual-motor skill with respect to expertise in the learning and transfer domains are examined. The capabilities of expert karate athletes, near-expert karate athletes, and novices to anticipate and respond to stimulus skills derived from taekwondo and Australian football were investigated in ecologically valid contexts using an in situ temporal occlusion paradigm and complex whole-body perceptual-motor skills. Results indicated that the karate experts and near-experts are as capable of using visual information to anticipate and guide motor skill responses as domain experts and near-experts in the taekwondo transfer domain, but only karate experts could perform like domain experts in the Australian football transfer domain. Findings suggest that transfer of anticipation skill is based upon expertise and an understanding of principles but may be supplemented by similarities that exist between the stimulus and response elements of the learning and transfer domains.

  16. Nondimensional Parameters and Equations for Nonlinear and Bifurcation Analyses of Thin Anisotropic Quasi-Shallow Shells

    NASA Technical Reports Server (NTRS)

    Nemeth, Michael P.

    2010-01-01

    A comprehensive development of nondimensional parameters and equations for nonlinear and bifurcation analyses of quasi-shallow shells, based on the Donnell-Mushtari-Vlasov theory for thin anisotropic shells, is presented. A complete set of field equations for geometrically imperfect shells is presented in terms of general lines-of-curvature coordinates. A systematic nondimensionalization of these equations is developed, several new nondimensional parameters are defined, and a comprehensive stress-function formulation is presented that includes variational principles for equilibrium and compatibility. Bifurcation analysis is applied to the nondimensional nonlinear field equations and a comprehensive set of bifurcation equations is presented. An extensive collection of tables and figures is presented that shows the effects of lamina material properties and stacking sequence on the nondimensional parameters.

  17. Gravity with free initial conditions: A solution to the cosmological constant problem testable by CMB B -mode polarization

    NASA Astrophysics Data System (ADS)

    Totani, Tomonori

    2017-10-01

    In standard general relativity the universe cannot be started with arbitrary initial conditions, because four of the ten components of Einstein's field equations (EFE) are constraints on the initial conditions. In previous work it was proposed to extend the gravity theory to allow free initial conditions, with a motivation to solve the cosmological constant problem. This was done by setting four constraints on metric variations in the action principle, which is reasonable because gravity's physical degrees of freedom are at most six. However, there are two problems with this theory: the three constraints in addition to the unimodular condition were introduced without clear physical meanings, and the flat Minkowski spacetime is unstable against perturbations. Here a new set of gravitational field equations is derived by replacing the three constraints with new ones requiring that geodesic paths remain geodesic against metric variations. The instability problem is then naturally solved. Implications for the cosmological constant Λ are unchanged; the theory converges to EFE with nonzero Λ through inflation, but Λ varies on scales much larger than the present Hubble horizon. Galaxies are then formed only in small-Λ regions, and the cosmological constant problem is solved by the anthropic argument. Because of the increased degrees of freedom in metric dynamics, the theory predicts new non-oscillatory modes of metric anisotropy generated by quantum fluctuations during inflation, and CMB B-mode polarization would be observed differently from the standard predictions of general relativity.

  18. Theories of Levels in Organizational Science.

    ERIC Educational Resources Information Center

    Rousseau, Denise M.

    This paper presents concepts and principles pertinent to the development of cross-level and multilevel theory in organizational science by addressing a number of fundamental theoretical issues. It describes hierarchy theory, systems theory, and mixed-level models of organization developed by organizational scientists. Hierarchy theory derives from…

  19. How to improve medical education website design.

    PubMed

    Sisson, Stephen D; Hill-Briggs, Felicia; Levine, David

    2010-04-21

    The Internet provides a means of disseminating medical education curricula, allowing institutions to share educational resources. Much of what is published online is poorly planned, does not meet learners' needs, or is out of date. Applying principles of curriculum development, adult learning theory and educational website design may result in improved online educational resources. Key steps in developing and implementing an education website include: 1) Follow established principles of curriculum development; 2) Perform a needs assessment and repeat the needs assessment regularly after curriculum implementation; 3) Include in the needs assessment targeted learners, educators, institutions, and society; 4) Use principles of adult learning and behavioral theory when developing content and website function; 5) Design the website and curriculum to demonstrate educational effectiveness at an individual and programmatic level; 6) Include a mechanism for sustaining website operations and updating content over a long period of time. Interactive, online education programs are effective for medical training, but require planning, implementation, and maintenance that follow established principles of curriculum development, adult learning, and behavioral theory.

  20. Theory, Practice, and the "Zone of Proximal Development" in Soviet Psychoeducational Research.

    ERIC Educational Resources Information Center

    Wozniak, R. H.

    1980-01-01

    Philosophical principles provide the context for the Soviets' psychological theory (in particular, the "zone of proximal development" concept); this theory then shapes psychoeducational practice. (GDC)

  1. First principle investigation of structural and optical properties of cubic titanium dioxide

    NASA Astrophysics Data System (ADS)

    Dash, Debashish; Chaudhury, Saurabh; Tripathy, Susanta K.

    2018-05-01

    This paper presents an analysis of the structural and optical properties of cubic titanium dioxide (TiO2) using the Orthogonalized Linear Combinations of Atomic Orbitals (OLCAO) basis set within the framework of Density Functional Theory (DFT). The structural property, especially the lattice constant `a', and the optical properties such as the refractive index, extinction coefficient, and reflectivity are investigated and discussed in the energy range of 0-16 eV. Further, the results have been compared with previous theoretical as well as experimental results. It was found that the DFT-based simulation results are a good approximation to the experimental results.
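    The optical quantities named above follow from the computed complex dielectric function ε = ε₁ + iε₂ through standard relations; these formulas are general optics, not specific to OLCAO or to this paper:

```python
import numpy as np

def optical_constants(eps1, eps2):
    """Refractive index n, extinction coefficient k, and normal-incidence
    reflectivity R from the complex dielectric function eps1 + i*eps2."""
    mod = np.hypot(eps1, eps2)           # |eps|
    n = np.sqrt((mod + eps1) / 2.0)
    k = np.sqrt((mod - eps1) / 2.0)
    R = ((n - 1.0)**2 + k**2) / ((n + 1.0)**2 + k**2)
    return n, k, R

# Transparent limit: eps2 = 0, eps1 = 4  ->  n = 2, k = 0, R = 1/9.
n, k, R = optical_constants(4.0, 0.0)
```

    In practice a DFT code evaluates ε₁(ω) and ε₂(ω) over the energy range of interest (here 0-16 eV) and applies these relations pointwise to obtain the spectra.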

  2. Values based practice: a framework for thinking with.

    PubMed

    Mohanna, Kay

    2017-07-01

    Values are those principles that govern behaviours, and values-based practice has been described as a theory and skills base for effective healthcare decision-making where different (and hence potentially conflicting) values are in play. The emphasis is on good process rather than pre-set right outcomes, aiming to achieve balanced decision-making. In this article we will consider the utility of this model by looking at leadership development, a current area of much interest and investment in healthcare. Copeland points out that 'values based leadership behaviors are styles with a moral, authentic and ethical dimension', important qualities in healthcare decision-making.

  3. Stories to Make Thermodynamics and Related Subjects More Palatable

    NASA Astrophysics Data System (ADS)

    Bartell, Lawrence S.

    2001-08-01

    A collection of vignettes either recounting the personalities of some of the architects of thermodynamics or noting steps and missteps in the development of thermodynamics and the kinetic theory is combined with a set of stories illustrating thermodynamic principles. These offerings turned out to be much more easily remembered by students and were more effective in conveying certain points than a direct, unadorned exposition of thermodynamic laws and applications. For one thing, the stories kept the students awake and receptive to ideas. Students had invariably entered the class having heard horror stories about how tedious and impossibly difficult thermodynamics courses are.

  4. Interaction of TGA with CdSe nanoparticles

    NASA Astrophysics Data System (ADS)

    Bharti, Shivani; Singh, Satvinder; Jain, Shikshita; Kaur, Gurvir; Gupta, Shikha; Tripathi, S. K.

    2018-05-01

    In this paper, the interaction of thioglycolic acid (TGA) with a CdSe atomic cluster has been studied using first-principles calculations, and the nanoparticles have been synthesized experimentally via a chemical route. Density Functional Theory (DFT) has been used for all the calculations. Structural and electronic properties have been studied theoretically, and the results have been compared with micrographs obtained experimentally from TEM microscopy. The most stable interaction of the CdSe cluster is obtained with the thiol group of TGA, owing to the higher bond dissociation energy of Cd-S compared with Cd-O. Theoretical calculations have been performed using a Gaussian basis set approach.

  5. A Prior for Neural Networks utilizing Enclosing Spheres for Normalization

    NASA Astrophysics Data System (ADS)

    v. Toussaint, U.; Gori, S.; Dose, V.

    2004-11-01

    Neural networks are famous for their advantageous flexibility in problems where there is insufficient knowledge to set up a proper model. On the other hand, this flexibility can cause over-fitting and can hamper the generalization properties of neural networks. Many approaches to regularize neural networks have been suggested, but most of them are based on ad hoc arguments. Employing the principle of transformation invariance, we derive a general prior in accordance with Bayesian probability theory for a class of feedforward networks. Optimal networks are determined by Bayesian model comparison, verifying the applicability of this approach.
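    For readers unfamiliar with priors as regularizers, the simplest example (not the enclosing-spheres prior of the abstract, which is derived from transformation invariance) is an isotropic Gaussian prior on the weights, whose negative log is the familiar weight-decay penalty:

```python
import numpy as np

def neg_log_gaussian_prior(weights, alpha):
    """-log p(w) for an isotropic Gaussian prior w ~ N(0, (1/alpha) I),
    dropping the w-independent normalization: the L2 penalty (alpha/2)||w||^2."""
    w = np.asarray(weights, dtype=float)
    return 0.5 * alpha * float(np.dot(w, w))

penalty = neg_log_gaussian_prior([1.0, -2.0, 2.0], alpha=0.1)
```

    In the maximum-a-posteriori view, adding this penalty to the data misfit is exactly weight decay; Bayesian model comparison, as in the abstract, then selects the hyperparameter and network by evidence rather than by hand.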

  6. Top-down, bottom-up, and around the jungle gym: a social exchange and networks approach to engaging afterschool programs in implementing evidence-based practices.

    PubMed

    Smith, Emilie Phillips; Wise, Eileen; Rosen, Howard; Rosen, Alison; Childs, Sharon; McManus, Margaret

    2014-06-01

    This paper uses concepts from social networks and social exchange theories to describe the implementation of evidence-based practices in afterschool programs. The members of the LEGACY Together Afterschool Project team have been involved in conducting collaborative research to migrate a behavioral strategy that has been documented to reduce disruptive behaviors in classroom settings to a new setting: that of afterschool programs. We adapted the Paxis Institute's version of the Good Behavior Game to afterschool settings, which differ from in-school settings, including more fluid attendance, multiple age groupings, diverse activities that may take place simultaneously, and differences in staff training and experience (Barrish et al. in J Appl Behav Anal 2(2):119-124, 1969; Embry et al. in The Pax Good Behavior Game. Hazelden, Center City, 2003; Hynes et al. in J Child Serv 4(3):4-20, 2009; Kellam et al. in Drug Alcohol Depend 95:S5-S28, 2008; Tingstrom et al. in Behav Modif 30(2):225-253, 2006). This paper presents the experiences of the three adult groups involved in the implementation process who give first-person accounts of implementation: (1) university-based scientist-practitioners, (2) community partners who trained and provided technical assistance/coaching, and (3) an afterschool program administrator. We introduce here the AIMS model used to frame the implementation process conceptualized by this town-gown collaborative team. AIMS builds upon previous work in implementation science using four phases in which the three collaborators have overlapping roles: approach/engagement, implementation, monitoring, and sustainability. Within all four phases, principles of Social Exchange Theory and Social Network Theory are highlighted.

  7. Removing barriers to rehabilitation: Theory-based family intervention in community settings after brain injury.

    PubMed

    Stejskal, Taryn M

    2012-01-01

    Rehabilitation professionals have become increasingly aware that family members play a critical role in the recovery process of individuals after brain injury. In addition, researchers have begun to identify a relationship between family member caregivers' well-being and survivors' outcomes. The idea of a continuum of care or following survivors from inpatient care to community reintegration has become an important model of treatment across many hospital and community-based settings. In concert with the continuum of care, present research literature indicates that family intervention may be a key component to successful rehabilitation after brain injury. Yet, clinicians interacting with family members and survivors often feel confounded about how exactly to intervene with the broader family system beyond the individual survivor. Drawing on the systemic nature of the field of marriage and family therapy (MFT), this article provides information to assist clinicians in effectively intervening with families using theory-based interventions in community settings. First, a rationale for the utilization of systems-based, as opposed to individual-based, therapies will be uncovered. Second, historically relevant publications focusing on family psychotherapy and intervention after brain injury are reviewed and their implications discussed. Recommendations for the utilization of systemic theory-based principles and strategies, specifically cognitive behavioral therapy (CBT), narrative therapy (NT), and solution-focused therapy (SFT) will be examined. Descriptions of common challenges families and couples face will be presented along with case examples to illustrate how these theoretical frameworks might be applied to these special concerns postinjury. 
Finally, the article concludes with an overview of the ideas presented in this manuscript to assist practitioners and systems of care in community-based settings to more effectively intervene with the family system as a whole after brain injury.

  8. Full-scale computation for all the thermoelectric property parameters of half-Heusler compounds

    DOE PAGES

    Hong, A. J.; Li, L.; He, R.; ...

    2016-03-07

    The thermoelectric performance of materials relies substantially on the band structures that determine the electronic and phononic transports, while the transport behaviors compete and counteract in the power factor PF and figure-of-merit ZT. These issues make a full-scale computation of the whole set of thermoelectric parameters particularly attractive, while a calculation scheme for the electronic and phononic contributions to thermal conductivity remains challenging. In this work, we present a full-scale computation scheme based on first-principles calculations, choosing a set of doped half-Heusler compounds as examples for illustration. The electronic structure is computed using the WIEN2k code, and the carrier relaxation times for electrons and holes are calculated using Bardeen and Shockley's deformation potential (DP) theory. The finite-temperature electronic transport is evaluated within the framework of Boltzmann transport theory. In sequence, density functional perturbation theory combined with the quasi-harmonic approximation and Klemens' equation is implemented to calculate the lattice thermal conductivity of carrier-doped thermoelectric materials such as Ti-doped NbFeSb compounds without loss of generality. The calculated results show good agreement with experimental data. The present methodology thus represents an effective and powerful approach to calculating the whole set of thermoelectric properties of thermoelectric materials.
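    As a rough illustration of how the computed quantities combine (this is not the paper's code, and all numbers below are assumed order-of-magnitude values), the power factor and figure of merit follow directly from the Seebeck coefficient S, electrical conductivity sigma, temperature T, and the electronic and lattice thermal conductivities:

```python
# Minimal sketch of the standard thermoelectric figures of merit; the
# input values are illustrative assumptions, not results from the paper.

def figure_of_merit(S, sigma, kappa_e, kappa_l, T):
    """ZT = S^2 * sigma * T / (kappa_e + kappa_l)."""
    power_factor = S**2 * sigma          # W / (m K^2)
    return power_factor * T / (kappa_e + kappa_l)

# Plausible magnitudes for a doped half-Heusler compound near 900 K
# (assumed numbers, for illustration only):
zt = figure_of_merit(S=2.0e-4, sigma=2.0e5, kappa_e=2.0, kappa_l=3.0, T=900)
print(round(zt, 2))  # → 1.44
```

    The sketch shows why the transport behaviors "compete": raising sigma usually raises kappa_e as well, so the numerator and denominator of ZT move together.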

  10. MPR

    PubMed Central

    Killeen, Peter R.; Sitomer, Matthew T.

    2008-01-01

    Mathematical Principles of Reinforcement (MPR) is a theory of reinforcement schedules. This paper reviews the origin of the principles constituting MPR: arousal, association and constraint. Incentives invigorate responses, in particular those preceding and predicting the incentive. The process that generates an associative bond between stimuli, responses and incentives is called coupling. The combination of arousal and coupling constitutes reinforcement. Models of coupling play a central role in the evolution of the theory. The time required to respond constrains the maximum response rates, and generates a hyperbolic relation between rate of responding and rate of reinforcement. Models of control by ratio schedules are developed to illustrate the interaction of the principles. Correlations among parameters are incorporated into the structure of the models, and assumptions that were made in the original theory are refined in light of current data. PMID:12729968
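    The hyperbolic relation mentioned above can be sketched numerically. The form below is a generic Herrnstein-style hyperbola b = k*r/(r + r0) with illustrative parameter values; it is not MPR's exact published equations, only the qualitative shape they share:

```python
# Hedged sketch: response rate b saturates hyperbolically toward a
# ceiling k (set by the time each response takes) as reinforcement
# rate r grows.  k and r0 are assumed illustrative parameters.

def response_rate(r, k=100.0, r0=20.0):
    """Responses per minute as a hyperbolic function of reinforcement rate r."""
    return k * r / (r + r0)

# Rate approaches the ceiling k as reinforcement becomes more frequent:
rates = [response_rate(r) for r in (5, 20, 80, 320)]
print([round(b, 1) for b in rates])  # → [20.0, 50.0, 80.0, 94.1]
```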

  11. Ergodicity, Maximum Entropy Production, and Steepest Entropy Ascent in the Proofs of Onsager's Reciprocal Relations

    NASA Astrophysics Data System (ADS)

    Benfenati, Francesco; Beretta, Gian Paolo

    2018-04-01

    We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).
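    For readers unfamiliar with the relations being proved, a small numerical sketch (with assumed coefficients, not taken from the paper): in the linear regime the fluxes are J_i = sum_j L_ij X_j, the reciprocal relations assert L_ij = L_ji, and the entropy production sigma = sum_i J_i X_i is non-negative:

```python
# Illustrative linear-response check of Onsager reciprocity and of the
# sign of the entropy production.  The coupling matrix L is an assumed
# symmetric positive-definite example, e.g. thermoelectric-style.

L = [[2.0, 0.5],
     [0.5, 1.0]]

def fluxes(L, X):
    """J_i = sum_j L_ij X_j."""
    return [sum(L[i][j] * X[j] for j in range(len(X))) for i in range(len(L))]

X = [0.3, -0.1]                 # thermodynamic forces
J = fluxes(L, X)
sigma = sum(Ji * Xi for Ji, Xi in zip(J, X))

assert L[0][1] == L[1][0]       # reciprocity: L_12 = L_21
assert sigma >= 0               # second law in the linear regime
print(round(sigma, 4))          # → 0.16
```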

  12. High resolution transmission electron microscope Imaging and first-principles simulations of atomic-scale features in graphene membrane

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Bhandari, Sagar; Yi, Wei; Bell, David; Westervelt, Robert; Kaxiras, Efthimios

    2012-02-01

    Ultra-thin membranes such as graphene[1] are of great importance for basic science and technology applications. Graphene sets the ultimate limit of thinness, demonstrating that a free-standing single atomic layer not only exists but can be extremely stable and strong [2--4]. However, both theory [5, 6] and experiments [3, 7] suggest that the existence of graphene relies on intrinsic ripples that suppress the long-wavelength thermal fluctuations which otherwise spontaneously destroy long range order in a two dimensional system. Here we show direct imaging of the atomic features in graphene including the ripples resolved using monochromatic aberration-corrected transmission electron microscopy (TEM). We compare the images observed in TEM with simulated images based on an accurate first-principles total potential. We show that these atomic scale features can be mapped through accurate first-principles simulations into high resolution TEM contrast. [1] Geim, A. K. & Novoselov, K. S. Nat. Mater. 6, 183-191, (2007). [2] Novoselov, K. S.et al. Science 306, 666-669, (2004). [3] Meyer, J. C. et al. Nature 446, 60-63, (2007). [4] Lee, C., Wei, X. D., Kysar, J. W. & Hone, J. Science 321, 385-388, (2008). [5] Nelson, D. R. & Peliti, L. J Phys-Paris 48, 1085-1092, (1987). [6] Fasolino, A., Los, J. H. & Katsnelson, M. I. Nat. Mater. 6, 858-861, (2007). [7] Meyer, J. C. et al. Solid State Commun. 143, 101-109, (2007).

  13. Winning Before the Fight: An Armed Suasion Approach to Countering Near Peer Competition

    DTIC Science & Technology

    2017-05-25

    This monograph proposes a tailored set of principles, separate from the principles of joint operations, which allow a planning staff to balance achieving success with managing the risk of unintended escalation. Subject terms: Suasion; Deterrence; Compellence; Armed Conflict; Conflict Continuum; Principles of Joint Operations.

  14. Pure field theories and MACSYMA algorithms

    NASA Technical Reports Server (NTRS)

    Ament, W. S.

    1977-01-01

    A pure field theory attempts to describe physical phenomena through singularity-free solutions of field equations resulting from an action principle. The physics goes into forming the action principle and interpreting specific results. Algorithms for the intervening mathematical steps are sketched. Vacuum general relativity is a pure field theory, serving as a model and providing checks for generalizations. The fields of general relativity are the 10 components of a symmetric Riemannian metric tensor; those of the Einstein-Straus generalization are the 16 components of a nonsymmetric one. Algebraic properties are exploited in top-level MACSYMA commands toward performing some of the algorithms of that generalization. The light cone for the theory as left by Einstein and Straus is found, and simplifications of that theory are discussed.

  15. First-principles multiple-barrier diffusion theory. The case study of interstitial diffusion in CdTe

    DOE PAGES

    Yang, Ji-Hui; Park, Ji-Sang; Kang, Joongoo; ...

    2015-02-17

    The diffusion of particles in solid-state materials generally involves several sequential thermal-activation processes. Presently, however, diffusion coefficient theory deals only with a single barrier; it lacks an accurate description of multiple-barrier diffusion. Here, we develop a general diffusion coefficient theory for multiple-barrier diffusion. Using our diffusion theory and first-principles calculated hopping rates for each barrier, we calculate the diffusion coefficients of Cd, Cu, Te, and Cl interstitials in CdTe for their full multiple-barrier diffusion pathways. As a result, we find that the calculated diffusivity agrees well with the experimental measurements, thus justifying our theory, which is general for many other systems.
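    A toy numerical sketch of the sequential-barrier idea (a deliberate simplification with assumed parameters, not the authors' full theory): each barrier contributes an Arrhenius hopping rate, and for strictly sequential steps the mean waiting times add, so the rates combine harmonically and the tallest barrier dominates:

```python
import math

KB = 8.617e-5   # Boltzmann constant, eV/K
NU = 1.0e13     # assumed attempt frequency, 1/s

def hop_rate(E_eV, T):
    """Arrhenius rate for a single barrier of height E_eV at temperature T (K)."""
    return NU * math.exp(-E_eV / (KB * T))

def effective_rate(barriers_eV, T):
    """Sequential steps: mean times add, so rates combine like parallel resistors."""
    return 1.0 / sum(1.0 / hop_rate(E, T) for E in barriers_eV)

# Two sequential barriers (illustrative heights); the 0.8 eV step dominates.
# A diffusion coefficient would then follow as D ~ f * a**2 * r_eff for an
# assumed hop distance a and geometric factor f.
r_eff = effective_rate([0.4, 0.8], T=600.0)
print(r_eff < hop_rate(0.8, 600.0))  # → True: the effective rate sits below the slowest step
```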

  16. [Discussion on theory and index system of Chinese materia medica regionalization].

    PubMed

    Zhang, Xiaobo; Guo, Lanping; Zhou, Tao; Huang, Luqi

    2010-09-01

    This paper discusses the theory underlying Chinese materia medica (CMM) regionalization, based on a study of the literature and practical experience in the field of CMM regionalization. The basic theories of CMM regionalization are the laws of territorial differentiation and location theory. The basic principles are excellent quality of CMM, difference, similarity, and practicability. The study objects are CMM resources, the natural environment, and the social environment. CMM regionalization is defined as the study of the laws of the spatial pattern of resources and regional systems in the field of CMM, followed by regionalization based on that spatial pattern and those laws. The index system is built on the study of the theory, principles, objects, and indices of CMM regionalization.

  17. Improving delivery of a health-promoting-environments program: experiences from Queensland Health.

    PubMed

    Dwyer, S

    1997-01-01

    The purpose of this paper is to outline the key components of a statewide multisite health-promoting-environments program. Contemporary health-promotion programs in settings such as schools, workplaces and hospitals use organisational development theory to address the health issues of the setting, including the physical environment, the organisational environment, and the specific health needs of the employees and consumers of the service. Program principles include management of each project by the participant organisation or site (for example, a school or workplace), using resources available within the organisation and the local community, voluntary participation, social justice and participant-based priority setting, and evaluation and monitoring. Adoption of these principles implies a shift in the role of the health worker from implementer to facilitator. Based on the experience of Queensland Health, it is proposed that the essential building blocks of the health-promoting-environments program are an intersectoral policy base, a model for action, training and resources, local facilitators, support from local organisations, a supportive network of sites, marketing of the program, and a state-based evaluation and monitoring system. The program in Queensland was able to develop a significant number of these components over the 1990-1996 period. In regard to evaluation, process measures can be built around the program components; however, further research is required for development of impact indicators and benchmarks on quality.

  18. The dynamics of copper intercalated molybdenum ditelluride

    NASA Astrophysics Data System (ADS)

    Onofrio, Nicolas; Guzman, David; Strachan, Alejandro

    2016-11-01

    Layered transition metal dichalcogenides are emerging as key materials in nanoelectronics and energy applications. Predictive models to understand their growth, thermomechanical properties, and interaction with metals are needed in order to accelerate their incorporation into commercial products. Interatomic potentials enable large-scale atomistic simulations connecting first principle methods and devices. We present a ReaxFF reactive force field to describe molybdenum ditelluride and its interactions with copper. We optimized the force field parameters to describe the energetics, atomic charges, and mechanical properties of (i) layered MoTe2, Mo, and Cu in various phases, (ii) the intercalation of Cu atoms and small clusters within the van der Waals gap of MoTe2, and (iii) bond dissociation curves. The training set consists of an extensive set of first principles calculations computed using density functional theory (DFT). We validate the force field via the prediction of the adhesion of a single layer MoTe2 on a Cu(111) surface and find good agreement with DFT results not used in the training set. We characterized the mobility of the Cu ions intercalated into MoTe2 under the presence of an external electric field via finite temperature molecular dynamics simulations. The results show a significant increase in drift velocity for electric fields of approximately 0.4 V/Å and that mobility increases with Cu ion concentration.

  19. Lagrangian approach to the Barrett-Crane spin foam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonzom, Valentin; Laboratoire de Physique, ENS Lyon, CNRS UMR 5672, 46 Allee d'Italie, 69007 Lyon; Livine, Etera R.

    2009-03-15

    We provide the Barrett-Crane spin foam model for quantum gravity with a discrete action principle, consisting of the usual BF term with discretized simplicity constraints, which in the continuum turn topological BF theory into gravity. The setting is the same as usually considered in the literature: space-time is cut into 4-simplices, the connection describes how to glue these 4-simplices together, and the action is a sum of terms depending on the holonomies around each triangle. We impose the discretized simplicity constraints on disjoint tetrahedra, and we show how the Lagrange multipliers distort the parallel transport and the correlations between neighboring simplices. We then construct the discretized BF action using a noncommutative * product between SU(2) plane waves. We show how this naturally leads to the Barrett-Crane model. This clears up the geometrical meaning of the model. We discuss the natural generalization of this action principle and the spin foam models it leads to. We show how the recently introduced spin foam fusion coefficients emerge with a nontrivial measure. In particular, we recover the Engle-Pereira-Rovelli spin foam model by weakening the discretized simplicity constraints. Finally, we identify the two sectors of Plebanski's theory and we give the analog of the Barrett-Crane model in the nongeometric sector.

  20. Social Learning Theory: its application in the context of nurse education.

    PubMed

    Bahn, D

    2001-02-01

    Cognitive theories are fundamental to enable problem solving and the ability to understand and apply principles in a variety of situations. This article looks at Social Learning Theory, critically analysing its principles, which are based on observational learning and modelling, and considering its value and application in the context of nurse education. It also considers the component processes that will determine the outcome of observed behaviour, other than reinforcement, as identified by Bandura, namely: attention, retention, motor reproduction, and motivation. Copyright 2001 Harcourt Publishers Ltd.

  1. Feminist Social Work: Practice and Theory of Practice.

    PubMed

    Eyal-Lubling, Roni; Krumer-Nevo, Michal

    2016-07-01

    Although feminist social work has been practiced in Israel since the 1970s, little has been written about it. This qualitative study aims to fill this gap by documenting and conceptualizing feminist theory of practice and actual practice based on interviews with 12 feminist social workers. Findings reveal that the interviewees perceive feminist practice as significantly different from traditional social work practice based on four analytical principles: (1) gender analysis, (2) awareness of power relations, (3) analysis of welfare services as structures of oppression, and (4) utilization of feminist language, as well as 10 principles of action. The principles are discussed in the context of feminist social work in Israel and in light of feminist principles described in international literature.

  2. The Application of Context Theory in English Teaching of Reading

    ERIC Educational Resources Information Center

    Zhu, Jiang; Han, Lemeng

    2010-01-01

    Context theory is a very important theory in English teaching, especially the teaching of reading. This paper first analyzes the theory of context, including the features of context and some principles in context theory. Then the paper discusses the application of context theory in English teaching of reading, including some problems met in…

  3. Constructivism, Optimality Theory and Language Acquisition. The Shapes We Make in Each Other's Heads.

    ERIC Educational Resources Information Center

    Whincop, Chris

    1996-01-01

    This paper identifies a feature of human brain neural nets that may be described as the principle of ease of processing (PEP), and that, it is argued, is the primary force guiding a learner towards a target grammar. It is suggested that the same principle lies at the heart of Optimality Theory, which characterizes the course of language…

  4. Problems with McAdams and Pals's (2006) Proposal of a Framework for an Integrative Theory of Personality

    ERIC Educational Resources Information Center

    Epstein, Seymour

    2007-01-01

    Comments on the original article "A New Big Five: Fundamental Principles for an Integrative Science of Personality," by Dan P. McAdams and Jennifer L. Pals (see record 2006-03947-002). Here, the current author begins with a critique of McAdams and Pals's (April 2006) five principles for a framework for an integrative theory of personality. The…

  5. Black-box Brain Experiments, Causal Mathematical Logic, and the Thermodynamics of Intelligence

    NASA Astrophysics Data System (ADS)

    Pissanetzky, Sergio; Lanzalaco, Felix

    2013-12-01

    Awareness of the possible existence of a yet-unknown principle of Physics that explains cognition and intelligence does exist in several projects of emulation, simulation, and replication of the human brain currently under way. Brain simulation projects define their success partly in terms of the emergence of non-explicitly programmed biophysical signals such as self-oscillation and spreading cortical waves. We propose that a recently discovered theory of Physics known as Causal Mathematical Logic (CML) that links intelligence with causality and entropy and explains intelligent behavior from first principles, is the missing link. We further propose the theory as a roadway to understanding more complex biophysical signals, and to explain the set of intelligence principles. The new theory applies to information considered as an entity by itself. The theory proposes that any device that processes information and exhibits intelligence must satisfy certain theoretical conditions irrespective of the substrate where it is being processed. The substrate can be the human brain, a part of it, a worm's brain, a motor protein that self-locomotes in response to its environment, a computer. Here, we propose to extend the causal theory to systems in Neuroscience, because of its ability to model complex systems without heuristic approximations, and to predict emerging signals of intelligence directly from the models. The theory predicts the existence of a large number of observables (or "signals"), all of which emerge and can be directly and mathematically calculated from non-explicitly programmed detailed causal models. This approach is aiming for a universal and predictive language for Neuroscience and AGI based on causality and entropy, detailed enough to describe the finest structures and signals of the brain, yet general enough to accommodate the versatility and wholeness of intelligence. 
Experiments are focused on a black-box as one of the devices described above of which both the input and the output are precisely known, but not the internal implementation. The same input is separately supplied to a causal virtual machine, and the calculated output is compared with the measured output. The virtual machine, described in a previous paper, is a computer implementation of CML, fixed for all experiments and unrelated to the device in the black box. If the two outputs are equivalent, then the experiment has quantitatively succeeded and conclusions can be drawn regarding details of the internal implementation of the device. Several small black-box experiments were successfully performed and demonstrated the emergence of non-explicitly programmed cognitive function in each case

  6. The Practical Value of Translation Theory.

    ERIC Educational Resources Information Center

    Komissarov, Vilen

    1985-01-01

    Discusses why translation theory has had an inadequate impact on translation practice and gives specific examples of ways in which translation theory can provide the translator with general principles and methods of translating idioms. (SED)

  7. Globalizing rehabilitation psychology: Application of foundational principles to global health and rehabilitation challenges.

    PubMed

    Bentley, Jacob A; Bruyère, Susanne M; LeBlanc, Jeanne; MacLachlan, Malcolm

    2016-02-01

    This article reviewed foundational principles in rehabilitation psychology and explored their application to global health imperatives as outlined in the World Report on Disability (World Health Organization & World Bank, 2011). Historical theories and perspectives are used to assist with conceptual formulation as applied to emerging international rehabilitation psychology topics. According to the World Report on Disability (World Health Organization & World Bank, 2011), there are approximately 1 billion individuals living with some form of disability globally. An estimated 80% of persons with disabilities live in low- to middle-income countries (WHO, 2006). The primary messages and recommendations of the World Report on Disability have been previously summarized as it relates to potential opportunities for contribution within the field of rehabilitation psychology (MacLachlan & Mannan, 2014). Yet, undeniable barriers remain to realizing the full potential for contributions in low- to middle-income country settings. A vision for engaging in international capacity building and public health efforts is needed within the field of rehabilitation psychology. Foundational rehabilitation psychology principles have application to the service of individuals with disabilities in areas of the world facing complex socioeconomic and sociopolitical challenges. Foundational principles of person-environment interaction, importance of social context, and need for involvement of persons with disabilities can provide guidance to the field as it relates to global health and rehabilitation efforts. The authors illustrate the application of rehabilitation psychology foundational principles through case examples and description of ongoing work, and link foundational principles to discreet domains of intervention going forward. (c) 2016 APA, all rights reserved).

  8. Development of practice principles for the management of ongoing suicidal ideation in young people diagnosed with major depressive disorder.

    PubMed

    Rice, Simon M; Simmons, Magenta B; Bailey, Alan P; Parker, Alexandra G; Hetrick, Sarah E; Davey, Christopher G; Phelan, Mark; Blaikie, Simon; Edwards, Jane

    2014-01-01

    There is a lack of clear guidance regarding the management of ongoing suicidality in young people experiencing major depressive disorder. This study utilised an expert consensus approach in identifying practice principles to complement relevant clinical guidelines for the treatment of major depressive disorder in young people. The study also sought to outline a broad treatment framework for clinical intervention with young people experiencing ongoing suicidal ideation. In-depth focus groups were undertaken with a specialist multidisciplinary clinical team (the Youth Mood Clinic at Orygen Youth Health Clinical Program, Melbourne) working with young people aged 15-25 years experiencing ongoing suicidal ideation. Each focus group was audio recorded and transcribed verbatim using orthographic conventions. Principles of grounded theory and thematic analysis were used to analyse and code the resultant data. The identified codes were subsequently synthesised into eight practice principles reflecting engagement and consistency of care, ongoing risk assessment and documentation, individualised crisis planning, engaging systems of support, engendering hopefulness, development of adaptive coping, management of acute risk, and consultation and supervision. The identified practice principles provide a broad management framework, and may assist to improve treatment consistency and clinical management of young people experiencing ongoing suicidal ideation. The practice principles may be of use to health professionals working within a team-based setting involved in the provision of care, even if peripherally, to young people with ongoing suicidal ideation. Findings address the lack of treatment consistency and shared terminology and may provide containment and guidance to multidisciplinary clinicians working with this at-risk group.

  9. The Structuring Principle: Political Socialization and Belief Systems

    ERIC Educational Resources Information Center

    Searing, Donald D.; And Others

    1973-01-01

    Assesses the significance of data on childhood political learning to political theory by testing the "structuring principle," considered one of the central assumptions of political socialization research. This principle asserts that "basic orientations acquired during childhood structure the later learning of specific issue beliefs." The…

  10. A new electrophoretic focusing principle: focusing of nonamphoteric weak ionogenic analytes using inverse electromigration dispersion profiles.

    PubMed

    Gebauer, Petr; Malá, Zdena; Bocek, Petr

    2010-03-01

    This contribution introduces a new separation principle in CE which offers focusing of weak nonamphoteric ionogenic species and their inherent transport to the detector. The prerequisite condition for application of this principle is the existence of an inverse electromigration dispersion profile, i.e. a profile where pH is decreasing toward the anode or cathode for focusing of anionic or cationic weak analytes, respectively. The theory presented defines the principal conditions under which an analyte is focused on a profile of this type. Since electromigration dispersion profiles are migrating ones, the new principle offers inherent transport of focused analytes into the detection cell. The focusing principle described utilizes a mechanism different from both CZE (where separation is based on the difference in mobilities) and IEF (where separation is based on difference in pI), and hence, offers another separation dimension in CE. The new principle and its theory presented here are supplemented by convincing experiments as their proof.

  11. Utilization of the Behavior Change Wheel framework to develop a model to improve cardiometabolic screening for people with severe mental illness.

    PubMed

    Mangurian, Christina; Niu, Grace C; Schillinger, Dean; Newcomer, John W; Dilley, James; Handley, Margaret A

    2017-11-14

    Individuals with severe mental illness (e.g., schizophrenia, bipolar disorder) die 10-25 years earlier than the general population, primarily from premature cardiovascular disease (CVD). Contributing factors are complex, but include system-level factors such as poorly integrated primary care and mental health services. Although evidence-based models exist for integrating mental health care into primary care settings, the evidence base for integrating medical care into specialty mental health settings is limited. Such models are referred to as "reverse" integration. In this paper, we describe the application of an implementation science framework in designing a model to improve CVD outcomes for individuals with severe mental illness (SMI) who receive services in a community mental health setting. Using principles from the theory of planned behavior, focus groups were conducted to understand stakeholder perspectives on barriers to CVD risk factor screening and treatment and to identify potential target behaviors. We then applied the results to the overarching Behavior Change Wheel framework, a systematic and theory-driven approach that incorporates the COM-B model (capability, opportunity, motivation, and behavior), to build an intervention to improve CVD risk factor screening and treatment for people with SMI. Following a stepped approach from the Behavior Change Wheel framework, a model was developed to deliver primary preventive care for people who use community mental health settings as their de facto health home. The CRANIUM (cardiometabolic risk assessment and treatment through a novel integration model for underserved populations with mental illness) model focuses on engaging community psychiatrists to expand their scope of practice to become responsible for CVD risk, with significant clinical decision support. The CRANIUM model was designed by integrating behavioral change theory and implementation theory.
CRANIUM is feasible to implement, is highly acceptable to, and targets provider behavior change, and is replicable and efficient for helping to integrate primary preventive care services in community mental health settings. CRANIUM can be scaled up to increase CVD preventive care delivery and ultimately improve health outcomes among people with SMI served within a public mental health care system.

  12. Irreversibility and entropy production in transport phenomena, IV: Symmetry, integrated intermediate processes and separated variational principles for multi-currents

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-10-01

    The mechanism of entropy production in transport phenomena is discussed again by emphasizing the role of symmetry of non-equilibrium states and also by reformulating Einstein's theory of Brownian motion to derive entropy production from it. This yields conceptual reviews of the previous papers [M. Suzuki, Physica A 390 (2011) 1904; 391 (2012) 1074; 392 (2013) 314]. Separated variational principles of steady states for multiple external fields {X_i} and induced currents {J_i} are proposed by extending the principle of minimum integrated entropy production found by the present author for a single external field. The basic strategy of our theory of steady states is to take in all the intermediate processes from the equilibrium state to the final possible steady states in order to study the irreversible physics even in the steady states. As an application of this principle, the Glansdorff-Prigogine evolution criterion inequality (or stability condition) d_X P ≡ ∫ dr Σ_i J_i dX_i ≤ 0 is derived in the stronger form dQ_i ≡ ∫ dr J_i dX_i ≤ 0 for each individual force X_i and current J_i, even for nonlinear responses which depend on all the external forces {X_k} nonlinearly. This is called the "separated evolution criterion". Explicit demonstrations of the present general theory for simple electric circuits with multiple external fields are given in order to clarify the physical essence of the new theory and to establish the condition of its validity, namely the existence of solutions of the simultaneous equations obtained from the separated variational principles. It is also instructive to compare the two results obtained by the new variational theory and by the old scheme based on the instantaneous entropy production. This seems suggestive even for the energy problem in the world.
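    Written out explicitly in LaTeX (our transcription of the abstract's inline formulas; P is the integrated entropy production, J_i the currents, X_i the conjugate forces):

```latex
% Glansdorff-Prigogine evolution criterion (summed over all force/current pairs)
d_X P \equiv \int d\mathbf{r} \sum_i J_i \, dX_i \le 0
% Separated evolution criterion: one inequality per individual pair
dQ_i \equiv \int d\mathbf{r} \, J_i \, dX_i \le 0 \quad \text{for each } i
```

    The separated form is strictly stronger: summing the per-pair inequalities over i recovers the Glansdorff-Prigogine inequality, but not conversely.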

  13. Applying Theory Y to Library Management

    ERIC Educational Resources Information Center

    Morton, Donald J.

    1975-01-01

    Reviews the principles of the Theory Y approach, reports upon its coverage in library literature, distinguishes between the concepts of Theory Y and participative management, and discusses how Theory Y's application in a small academic library recommends its use for library operations in general. (Author)

  14. Islam and the four principles of medical ethics.

    PubMed

    Mustafa, Yassar

    2014-07-01

    The principles underpinning Islam's ethical framework applied to routine clinical scenarios remain insufficiently understood by many clinicians, thereby unfortunately permitting the delivery of culturally insensitive healthcare. This paper summarises the foundations of the Islamic ethical theory, elucidating the principles and methodology employed by the Muslim jurist in deriving rulings in the field of medical ethics. The four-principles approach, as espoused by Beauchamp and Childress, is also interpreted through the prism of Islamic ethical theory. Each of the four principles (beneficence, non-maleficence, justice and autonomy) is investigated in turn, looking in particular at the extent to which each is rooted in the Islamic paradigm. This will provide an important insight into Islamic medical ethics, enabling the clinician to have a better informed discussion with the Muslim patient. It will also allow for a higher degree of concordance in consultations and consequently optimise culturally sensitive healthcare delivery.

  15. Testing the quantum superposition principle: matter waves and beyond

    NASA Astrophysics Data System (ADS)

    Ulbricht, Hendrik

    2015-05-01

    New technological developments allow us to explore the quantum properties of very complex systems, bringing the question of whether macroscopic systems also share such features within experimental reach. Interest in this question is heightened by the fact that, on the theory side, many suggest that the quantum superposition principle is not exact, with departures expected to grow larger the more macroscopic the system. Testing the superposition principle therefore also means testing suggested extensions of quantum theory, so-called collapse models. We will report on three new proposals to experimentally test the superposition principle with nanoparticle interferometry, optomechanical devices, and spectroscopic experiments in the frequency domain. We will also report on the status of optical levitation and cooling experiments with nanoparticles in our labs, working towards an Earth-bound matter-wave interferometer to test the superposition principle for a particle mass of one million amu (atomic mass units).

  16. Irreversibility and entropy production in transport phenomena, III—Principle of minimum integrated entropy production including nonlinear responses

    NASA Astrophysics Data System (ADS)

    Suzuki, Masuo

    2013-01-01

    A new variational principle for steady states is found by introducing an integrated form of energy dissipation (or entropy production) instead of the instantaneous energy dissipation. This new principle is valid in both linear and nonlinear transport phenomena. Prigogine's dream has now been realized by this new general principle of minimum "integrated" entropy production (or energy dissipation). The new principle does not contradict the Onsager-Prigogine principle of minimum instantaneous entropy production in the linear regime, but it is conceptually different from the latter, which does not hold in the nonlinear regime. Applications of this theory to electric conduction, heat conduction, particle diffusion and chemical reactions are presented. The irreversibility (or positive entropy production) and the long-time-tail problem in Kubo's formula are also discussed in the Introduction and the last section. This constitutes a complementary explanation of our theory of entropy production given in the previous papers (M. Suzuki, Physica A 390 (2011) 1904 and M. Suzuki, Physica A 391 (2012) 1074) and motivated the present investigation of the variational principle.

  17. First-principles analysis of anharmonic nuclear motion and thermal transport in thermoelectric materials

    NASA Astrophysics Data System (ADS)

    Tadano, Terumasa; Tsuneyuki, Shinji

    2015-12-01

    We present a first-principles approach for analyzing anharmonic properties of lattice vibrations in solids. We first extract harmonic and anharmonic force constants from accurate first-principles calculations based on density functional theory. Using the many-body perturbation theory of phonons, we then estimate the phonon scattering probability due to anharmonic phonon-phonon interactions. We demonstrate the validity of the approach by computing the lattice thermal conductivity of Si, a typical covalent semiconductor, and of the thermoelectric materials PbTe and Bi2Te3, based on the Boltzmann transport equation. We also show that the phonon lifetimes and the lattice thermal conductivity of the high-temperature phase of SrTiO3 can be estimated by employing perturbation theory on top of the solution of the self-consistent phonon equation.
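    In the relaxation-time picture underlying such Boltzmann-transport calculations, each phonon mode contributes C v^2 τ to the lattice thermal conductivity, κ = (1/V) Σ_λ C_λ v_λ^2 τ_λ, where the lifetimes τ_λ come from the anharmonic scattering rates. A minimal sketch of this final summation step; the mode data and cell volume below are illustrative placeholders, not values from the paper:

```python
# Toy relaxation-time estimate of lattice thermal conductivity:
# kappa = (1/V) * sum over modes of C_mode * v_mode^2 * tau_mode.
# All numbers are illustrative placeholders, not real material data.

def lattice_thermal_conductivity(modes, volume):
    """modes: list of (heat_capacity [J/K], group_velocity [m/s], lifetime [s])."""
    return sum(c * v * v * tau for c, v, tau in modes) / volume

modes = [
    (1.0e-23, 6000.0, 2.0e-11),   # acoustic-like mode: fast, long-lived
    (1.0e-23, 1500.0, 5.0e-12),   # optical-like mode: slow, short-lived
]
volume = 1.0e-28  # m^3, illustrative cell volume

kappa = lattice_thermal_conductivity(modes, volume)
print(kappa)  # W/(m K), of order tens here
```

    With the force constants in hand, the hard part is computing the lifetimes from the scattering probabilities; the final summation itself is as simple as above.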

  18. Understanding consumers' initial expectations of community-based residential mental health rehabilitation in the context of past experiences of care: A mixed-methods pragmatic grounded theory analysis.

    PubMed

    Parker, Stephen; Meurk, Carla; Newman, Ellie; Fletcher, Clayton; Swinson, Isabella; Dark, Frances

    2018-04-16

    This study explores how consumers expect community-based residential mental health rehabilitation to compare with previous experiences of care. Understanding what consumers hope to receive from mental health services, and listening to their perspectives about what has and has not worked in previous care settings, may illuminate pathways to improved service engagement and outcomes. A mixed-methods research design taking a pragmatic approach to grounded theory guided the analysis of 24 semi-structured interviews with consumers on commencement at three Community Care Units (CCUs) in Australia. Two of these CCUs were trialling a staffing model integrating peer support work with clinical care. All interviews were conducted by an independent interviewer within the first 6 weeks of the consumer's stay. All participants expected the CCU to offer an improvement on previous experiences of care. Comparisons were made to acute and subacute inpatient settings, supported accommodation, and outpatient care. Consumers expected differences in the people (staff and co-residents), the focus of care, the physical environment, and rules and regulations. Participants from the integrated staffing model sites articulated the expected value of a less clinical approach to care. Overall, consumers' expectations aligned with the principles articulated in policy frameworks for recovery-oriented practice. However, their reflections on past care suggest that these services continue to face significant challenges realizing these principles in practice. Paying attention to the kind of working relationship consumers want to have with mental health services, such as the provision of choice and maintaining a practical and therapeutic supportive focus, could improve their engagement and outcomes. © 2018 Australian College of Mental Health Nurses Inc.

  19. On the 'principle of the quantumness', the quantumness of Relativity, and the computational grand-unification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Ariano, Giacomo Mauro

    2010-05-04

    I will argue that the proposal of establishing operational foundations of Quantum Theory should have top priority, and that Lucien Hardy's program on Quantum Gravity should be paralleled by an analogous program on Quantum Field Theory (QFT), which needs to be reformulated notwithstanding its experimental success. In this paper, after reviewing recently suggested operational 'principles of the quantumness', I address the problem of whether Quantum Theory and Special Relativity are unrelated theories, or whether one implies the other. I show how Special Relativity can indeed be derived from the causality of Quantum Theory, within the computational paradigm 'the universe is a huge quantum computer', reformulating QFT as a Quantum-Computational Field Theory (QCFT). In QCFT Special Relativity emerges from the fabric of the computational network, which also naturally embeds gauge invariance. In this scheme even the quantization rule and the Planck constant can in principle be derived as emergent from the underlying causal tapestry of space-time. In this way Quantum Theory remains the only theory operating the huge computer of the universe. Is the computational paradigm only a speculative tautology (theory as simulation of reality), or does it have a scientific value? The answer will come from Occam's razor, depending on the mathematical simplicity of QCFT. Here I will just start scratching the surface of QCFT, analyzing simple field theories, including Dirac's. The number of problems and unmotivated recipes that plague QFT strongly motivates us to undertake the QCFT project, since QCFT makes all such problems manifest and forces a re-foundation of QFT.

  20. What Feynman Could Not yet Use: The Generalised Hong-Ou-Mandel Experiment to Improve the QED Explanation of the Pauli Exclusion Principle

    ERIC Educational Resources Information Center

    Malgieri, Massimiliano; Tenni, Antonio; Onorato, Pasquale; De Ambrosis, Anna

    2016-01-01

    In this paper we present a reasoning line for introducing the Pauli exclusion principle in the context of an introductory course on quantum theory based on the sum over paths approach. We start from the argument originally introduced by Feynman in "QED: The Strange Theory of Light and Matter" and improve it by discussing with students…

  1. Principles of Catholic Social Teaching, Critical Pedagogy, and the Theory of Intersectionality: An Integrated Framework to Examine the Roles of Social Status in the Formation of Catholic Teachers

    ERIC Educational Resources Information Center

    Eick, Caroline Marie; Ryan, Patrick A.

    2014-01-01

    This article discusses the relevance of an analytic framework that integrates principles of Catholic social teaching, critical pedagogy, and the theory of intersectionality to explain attitudes toward marginalized youth held by Catholic students preparing to become teachers. The framework emerges from five years of action research data collected…

  2. Thermodynamic resource theories, non-commutativity and maximum entropy principles

    NASA Astrophysics Data System (ADS)

    Lostaglio, Matteo; Jennings, David; Rudolph, Terry

    2017-04-01

    We discuss some features of thermodynamics in the presence of multiple conserved quantities. We prove a generalisation of the Landauer principle illustrating tradeoffs between erasure costs paid in different ‘currencies’. We then show how the maximum entropy and complete passivity approaches give different answers in the presence of multiple observables. We discuss how this seems to prevent current resource theories from fully capturing the thermodynamic aspects of non-commutativity.
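    For orientation, the single-observable Landauer bound that the paper generalises can be stated as follows (this is the textbook form, not the paper's multi-currency result): erasing one bit of information in contact with a bath at temperature T costs at least

```latex
W_{\mathrm{erase}} \;\ge\; k_B T \ln 2
```

    The generalisation discussed above replaces this single energetic cost with tradeoffs between the charges of several, possibly non-commuting, conserved quantities.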

  3. Maximum Principle in the Optimal Design of Plates with Stratified Thickness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roubicek, Tomas

    2005-03-15

    An optimal design problem for a plate governed by a linear, elliptic equation with bounded thickness varying only in a single prescribed direction and with unilateral isoperimetrical-type constraints is considered. Using Murat-Tartar's homogenization theory for stratified plates and Young-measure relaxation theory, smoothness of the extended cost and constraint functionals is proved, and then the maximum principle necessary for an optimal relaxed design is derived.

  4. Behavioral and neural Darwinism: selectionist function and mechanism in adaptive behavior dynamics.

    PubMed

    McDowell, J J

    2010-05-01

    An evolutionary theory of behavior dynamics and a theory of neuronal group selection share a common selectionist framework. The theory of behavior dynamics instantiates abstractly the idea that behavior is selected by its consequences. It implements Darwinian principles of selection, reproduction, and mutation to generate adaptive behavior in virtual organisms. The behavior generated by the theory has been shown to be quantitatively indistinguishable from that of live organisms. The theory of neuronal group selection suggests a mechanism whereby the abstract principles of the evolutionary theory may be implemented in the nervous systems of biological organisms. According to this theory, groups of neurons subserving behavior may be selected by synaptic modifications that occur when the consequences of behavior activate value systems in the brain. Together, these theories constitute a framework for a comprehensive account of adaptive behavior that extends from brain function to the behavior of whole organisms in quantitative detail. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  5. Psychosomatic medicine and cybernetics.

    PubMed

    Ishikawa, H

    1979-01-01

    In our daily psychosomatic medicine clinics, we have adopted four principles from Wiener's cybernetics and von Bertalanffy's general system theory. We use the polygraphic method for the diagnosis of psychosomatic disease (black box principle). For the control of psychosomatic symptoms, we use the biofeedback method (feedback principle). We use systematic desensitization to relieve the social stresses which cause psychosomatic disease (open and closed system principle). Lastly, we use transactional analysis, which corresponds to the information and energy principle.

  6. Computational and empirical simulations of selective memory impairments: Converging evidence for a single-system account of memory dissociations.

    PubMed

    Curtis, Evan T; Jamieson, Randall K

    2018-04-01

    Current theory has divided memory into multiple systems, resulting in a fractionated account of human behaviour. On an alternative view, memory is a single system. However, debate over the details of different single-system theories has overshadowed the converging agreement among them, slowing the reunification of memory. Evidence in favour of dividing memory often takes the form of dissociations observed in amnesia, where amnesic patients are impaired on some memory tasks but not others. These dissociations are taken as evidence for separate explicit and implicit memory systems. We argue against this perspective. We simulate two key dissociations between classification and recognition in a computational model of memory, A Theory of Nonanalytic Association. We assume that amnesia reflects a quantitative difference in the quality of encoding. We also present empirical evidence that replicates the dissociations in healthy participants, simulating amnesic behaviour by reducing study time. In both analyses, we successfully reproduce the dissociations. We integrate our computational and empirical successes with the successes of alternative models and manipulations, and argue that our demonstrations, taken together with similar demonstrations using similar models, provide converging evidence for a general set of single-system analyses supporting the conclusion that a wide variety of memory phenomena can be explained by a unified and coherent set of principles.
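    Single-system instance models in this tradition descend from MINERVA 2, in which a probe's familiarity (echo intensity) is the summed cubed similarity of the probe to every stored trace; degrading encoding quality then impairs all tasks in a graded way rather than switching off a separate system. A minimal sketch; the encoding parameter, toy vectors, and seed are our illustrative assumptions, not the model reported in the paper:

```python
import random

random.seed(0)

def encode(item, quality):
    # Each feature is stored with probability `quality`; otherwise lost (0).
    # Lower quality stands in for degraded encoding (e.g. reduced study time).
    return [f if random.random() < quality else 0 for f in item]

def echo_intensity(probe, memory):
    # MINERVA-2-style familiarity: summed cubed similarity to every trace.
    def sim(a, b):
        n = sum(1 for x, y in zip(a, b) if x or y) or 1
        return sum(x * y for x, y in zip(a, b)) / n
    return sum(sim(probe, t) ** 3 for t in memory)

item = [1, -1, 1, 1, -1, 1, -1, 1]
good = [encode(item, 0.9) for _ in range(20)]   # intact encoding
poor = [encode(item, 0.3) for _ in range(20)]   # degraded ("amnesic") encoding

print(echo_intensity(item, good) > echo_intensity(item, poor))  # True
```

    Lowering the encoding-quality parameter mimics the amnesic manipulation described above: familiarity drops for every task that consults the same store, producing graded rather than system-selective impairment.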

  7. The Theory of Quantized Fields. II

    DOE R&D Accomplishments Database

    Schwinger, J.

    1951-01-01

    The arguments leading to the formulation of the Action Principle for a general field are presented. In association with the complete reduction of all numerical matrices into symmetrical and anti-symmetrical parts, the general field is decomposed into two sets, which are identified with Bose-Einstein and Fermi-Dirac fields. The spin restriction on the two kinds of fields is inferred from the time reflection invariance requirement. The consistency of the theory is verified in terms of a criterion involving the various generators of infinitesimal transformations. Following a discussion of charged fields, the electromagnetic field is introduced to satisfy the postulate of general gauge invariance. As an aspect of the latter, it is recognized that the electromagnetic field and charged fields are not kinematically independent. After a discussion of the field-strength commutation relations, the independent dynamical variables of the electromagnetic field are exhibited in terms of a special gauge.

  8. Nuclear and particle physics in the early universe

    NASA Technical Reports Server (NTRS)

    Schramm, D. N.

    1981-01-01

    Basic principles and implications of Big Bang cosmology are reviewed, noting the physical evidence of a previous universe temperature of 10,000 K and theoretical arguments, such as grand unification decoupling, indicating a primal temperature of 10^15 eV. The Planck time of 10^-43 sec after the Big Bang is set as the limit before which gravity was quantized and nothing is known. Gauge theories of elementary particle physics are reviewed for successful predictions of similarity in weak and electromagnetic interactions and quantum chromodynamic predictions for strong interactions. The large number of photons in the universe relative to the baryons is considered, and grand unified theories are cited as showing the existence of baryon nonconservation as an explanation. Further attention is given to the quark-hadron phase transition, the decoupling of the weak interaction and relic neutrinos, and Big Bang nucleosynthesis.
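    The Planck time quoted above follows from dimensional analysis of the fundamental constants, t_P = sqrt(ħG/c^5); a quick numerical check using standard constant values:

```python
import math

# Planck time: t_P = sqrt(hbar * G / c^5)
hbar = 1.054571817e-34  # J s (reduced Planck constant)
G = 6.67430e-11         # m^3 kg^-1 s^-2 (gravitational constant)
c = 2.99792458e8        # m/s (speed of light)

t_planck = math.sqrt(hbar * G / c**5)
print(t_planck)  # ~5.4e-44 s, i.e. of order 10^-43 s
```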

  9. The grounded theory of "trust building".

    PubMed

    Ramezani, Monir; Ahmadi, Fazlollah; Mohammadi, Eesa; Kazemnejad, Anoshirvan

    2017-01-01

    Despite the growing importance of spiritual care, the delivery of spiritual care is still an area of disagreement among healthcare providers. Aim: To develop a grounded theory of spiritual care delivery based on Iranian nurses' perceptions and experiences. Design: A qualitative study using the grounded theory approach. Participants and research context: Data were collected through 27 interviews with 25 participants (17 staff nurses, 3 physicians, 3 patients, 1 family member, and 1 nurse assistant). The study setting was the Imam Khomeini Hospital Complex. Sampling was started purposively and continued theoretically. Data analysis was performed using the method proposed by Strauss and Corbin. Ethical considerations: The study was approved by the Ethics Committee of Tarbiat Modares University, and the agreement of the administrators of the study setting was obtained before starting the study. Findings: The core category of the study was "trust building", which reflected the nature of spiritual care delivery by nurses. Trust building was the result of eight main categories or strategies: creating a positive mentality at hospital admission, understanding patients in their care circumstances, having a caring presence, adhering to care ethics, developing meaningful relationships, promoting positive thinking and energy, establishing effective communication with patients, and attempting to create a safe therapeutic environment. Poor interprofessional coordination negatively affected this process, while developing greater cognizance of divinity and adhering to the principles of professional ethics facilitated it. The outcome of the process was a sense of partial psychological security. The "trust building" theory can be used as a guide for describing and expanding nurses' roles in spiritual care delivery, developing care documentation systems and clinical guidelines, and planning educational programs for nursing students and staff nurses.

  10. Health behaviour change theories: contributions to an ICF-based behavioural exercise therapy for individuals with chronic diseases.

    PubMed

    Geidl, Wolfgang; Semrau, Jana; Pfeifer, Klaus

    2014-01-01

    The purpose of this perspective is (1) to incorporate recent psychological health behaviour change (HBC) theories into exercise therapeutic programmes, and (2) to introduce the International Classification of Functioning (ICF)-based concept of a behavioural exercise therapy (BET). Relevant personal modifiable factors of physical activity (PA) were identified based on three recent psychological HBC theories. Following the principles of intervention mapping, a matrix of proximal programme objectives specifies desirable parameter values for each personal factor. As a result of analysing reviews on behavioural techniques and intervention programmes of the German rehabilitation setting, we identified exercise-related techniques that impact the personal determinants. Finally, the techniques were integrated into an ICF-based BET concept. Individuals' attitudes, skills, emotions, beliefs and knowledge are important personal factors of PA behaviour. BET systematically addresses these personal factors by a systematic combination of adequate exercise contents with related behavioural techniques. The presented 28 intervention techniques serve as a theory-driven "tool box" for designing complex BET programmes to promote PA. The current paper highlights the usefulness of theory-based integrative research in the field of exercise therapy, offers explicit methods and contents for physical therapists to promote PA behaviour, and introduces the ICF-based conceptual idea of a BET. Implications for Rehabilitation Irrespective of the clients' indication, therapeutic exercise programmes should incorporate effective, theory-based approaches to promote physical activity. Central determinants of physical activity behaviour are a number of personal factors: individuals' attitudes, skills, emotions, beliefs and knowledge. Clinicians implementing exercise therapy should set it within a wider theoretical framework including the personal factors that influence physical activity. 
To increase exercise adherence and promote long-term physical activity behaviour change, the concept of a behavioural exercise therapy (BET) offers a theory-based approach for systematically addressing relevant personal factors by combining adequate exercise contents with exercise-related behaviour change techniques.

  11. Grand canonical electronic density-functional theory: Algorithms and applications to electrochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Goddard, III, William A.; Arias, Tomas A.

    First-principles calculations combining density-functional theory and continuum solvation models enable realistic theoretical modeling and design of electrochemical systems. When a reaction proceeds in such systems, the number of electrons in the portion of the system treated quantum mechanically changes continuously, with a balancing charge appearing in the continuum electrolyte. A grand-canonical ensemble of electrons at a chemical potential set by the electrode potential is therefore the ideal description of such systems that directly mimics the experimental condition. We present two distinct algorithms: a self-consistent field method and a direct variational free energy minimization method using auxiliary Hamiltonians (GC-AuxH), to solve the Kohn-Sham equations of electronic density-functional theory directly in the grand canonical ensemble at fixed potential. Both methods substantially improve performance compared to a sequence of conventional fixed-number calculations targeting the desired potential, with the GC-AuxH method additionally exhibiting reliable and smooth exponential convergence of the grand free energy. Lastly, we apply grand-canonical density-functional theory to the under-potential deposition of copper on platinum from chloride-containing electrolytes and show that chloride desorption, not partial copper monolayer formation, is responsible for the second voltammetric peak.

  12. Solvatochromic shifts from coupled-cluster theory embedded in density functional theory

    NASA Astrophysics Data System (ADS)

    Höfener, Sebastian; Gomes, André Severo Pereira; Visscher, Lucas

    2013-09-01

    Building on the framework recently reported for determining general response properties for frozen-density embedding [S. Höfener, A. S. P. Gomes, and L. Visscher, J. Chem. Phys. 136, 044104 (2012)], 10.1063/1.3675845, in this work we report a first implementation of an embedded coupled-cluster in density-functional theory (CC-in-DFT) scheme for electronic excitations, where only the response of the active subsystem is taken into account. The formalism is applied to the calculation of coupled-cluster excitation energies of water and uracil in aqueous solution. We find that the CC-in-DFT results are in good agreement with reference calculations and experimental results. The accuracy of the calculations is mainly sensitive to factors influencing the correlation treatment (basis set quality, truncation of the cluster operator) and to the embedding treatment of the ground state (choice of density functionals). This allows for efficient approximations at the excited-state calculation step without compromising accuracy. This approximate scheme makes it possible to use a first-principles approach to investigate environment effects with specific interactions at the coupled-cluster level of theory, at a cost comparable to that of calculations of the individual subsystems in vacuum.

  13. Grand canonical electronic density-functional theory: Algorithms and applications to electrochemistry.

    PubMed

    Sundararaman, Ravishankar; Goddard, William A; Arias, Tomas A

    2017-03-21

    First-principles calculations combining density-functional theory and continuum solvation models enable realistic theoretical modeling and design of electrochemical systems. When a reaction proceeds in such systems, the number of electrons in the portion of the system treated quantum mechanically changes continuously, with a balancing charge appearing in the continuum electrolyte. A grand-canonical ensemble of electrons at a chemical potential set by the electrode potential is therefore the ideal description of such systems that directly mimics the experimental condition. We present two distinct algorithms: a self-consistent field method and a direct variational free energy minimization method using auxiliary Hamiltonians (GC-AuxH), to solve the Kohn-Sham equations of electronic density-functional theory directly in the grand canonical ensemble at fixed potential. Both methods substantially improve performance compared to a sequence of conventional fixed-number calculations targeting the desired potential, with the GC-AuxH method additionally exhibiting reliable and smooth exponential convergence of the grand free energy. Finally, we apply grand-canonical density-functional theory to the under-potential deposition of copper on platinum from chloride-containing electrolytes and show that chloride desorption, not partial copper monolayer formation, is responsible for the second voltammetric peak.
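    The fixed-potential idea can be illustrated with a toy grand-free-energy minimization: at fixed chemical potential μ (set by the electrode potential), the electron number N adjusts to minimize Φ(N) = F(N) − μN, i.e. to satisfy dF/dN = μ. The quadratic F below is a made-up stand-in for the Kohn-Sham free energy, purely to show the mechanics, not the paper's algorithms:

```python
# Toy grand-canonical minimization: pick the electron count N that minimizes
# Phi(N) = F(N) - mu * N at fixed chemical potential mu (electrode potential).
# F is an illustrative quadratic stand-in for the Kohn-Sham free energy.

def F(n, n0=10.0, stiffness=2.0, f0=-50.0):
    return f0 + 0.5 * stiffness * (n - n0) ** 2

def minimize_grand_potential(mu, n=8.0, step=0.01, iters=10000):
    # Steepest descent on Phi(N) = F(N) - mu * N, with a numeric dF/dN.
    for _ in range(iters):
        grad_F = (F(n + 1e-6) - F(n - 1e-6)) / 2e-6
        n -= step * (grad_F - mu)
    return n

mu = 1.0
n_star = minimize_grand_potential(mu)
# Analytic optimum: stiffness * (N - n0) = mu  =>  N = n0 + mu / stiffness = 10.5
print(round(n_star, 3))  # 10.5
```

    Real GC-DFT does the analogous thing with the full Kohn-Sham free energy, and the paper's GC-AuxH method replaces this naive descent with a variational scheme that converges smoothly.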

  14. Multivariate Cholesky models of human female fertility patterns in the NLSY.

    PubMed

    Rodgers, Joseph Lee; Bard, David E; Miller, Warren B

    2007-03-01

    Substantial evidence now exists that variables measuring or correlated with human fertility outcomes have a heritable component. In this study, we define a series of age-sequenced fertility variables, and fit multivariate models to account for underlying shared genetic and environmental sources of variance. We make predictions based on a theory developed by Udry [(1996) Biosocial models of low-fertility societies. In: Casterline, JB, Lee RD, Foote KA (eds) Fertility in the United States: new patterns, new theories. The Population Council, New York] suggesting that biological/genetic motivations can be more easily realized and measured in settings in which fertility choices are available. Udry's theory, along with principles from molecular genetics and certain tenets of life history theory, allow us to make specific predictions about biometrical patterns across age. Consistent with predictions, our results suggest that there are different sources of genetic influence on fertility variance at early compared to later ages, but that there is only one source of shared environmental influence that occurs at early ages. These patterns are suggestive of the types of gene-gene and gene-environment interactions for which we must account to better understand individual differences in fertility outcomes.
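    The Cholesky model named in the title decomposes the covariance matrix of age-ordered variables as Σ = LLᵀ, so each later-age variable loads on the factors introduced at earlier ages plus one new factor of its own; the genetic and environmental components each receive such a decomposition. A minimal 2×2 sketch with illustrative numbers (not NLSY estimates):

```python
import math

# Cholesky factorization of a 2x2 covariance matrix: Sigma = L @ L^T.
# The earlier-age variable defines factor 1; the later-age variable loads
# on factor 1 (l21) and adds its own new factor (l22).
# Values are illustrative, not NLSY estimates.

def cholesky_2x2(s11, s12, s22):
    l11 = math.sqrt(s11)
    l21 = s12 / l11
    l22 = math.sqrt(s22 - l21 ** 2)
    return l11, l21, l22

sigma = (1.0, 0.6, 1.0)  # variances 1.0 and 1.0, covariance 0.6
l11, l21, l22 = cholesky_2x2(*sigma)

# Share of the later variable's variance carried over from the earlier factor:
shared = l21 ** 2 / (l21 ** 2 + l22 ** 2)
print(round(shared, 2))  # 0.36
```

    In the biometrical setting this partition is what licenses statements like "different sources of genetic influence at early versus later ages": a large l22 in the genetic decomposition signals new genetic variance appearing at the later age.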

  15. Grand canonical electronic density-functional theory: Algorithms and applications to electrochemistry

    DOE PAGES

    Sundararaman, Ravishankar; Goddard, III, William A.; Arias, Tomas A.

    2017-03-16

    First-principles calculations combining density-functional theory and continuum solvation models enable realistic theoretical modeling and design of electrochemical systems. When a reaction proceeds in such systems, the number of electrons in the portion of the system treated quantum mechanically changes continuously, with a balancing charge appearing in the continuum electrolyte. A grand-canonical ensemble of electrons at a chemical potential set by the electrode potential is therefore the ideal description of such systems that directly mimics the experimental condition. We present two distinct algorithms: a self-consistent field method and a direct variational free energy minimization method using auxiliary Hamiltonians (GC-AuxH), to solve the Kohn-Sham equations of electronic density-functional theory directly in the grand canonical ensemble at fixed potential. Both methods substantially improve performance compared to a sequence of conventional fixed-number calculations targeting the desired potential, with the GC-AuxH method additionally exhibiting reliable and smooth exponential convergence of the grand free energy. Lastly, we apply grand-canonical density-functional theory to the under-potential deposition of copper on platinum from chloride-containing electrolytes and show that chloride desorption, not partial copper monolayer formation, is responsible for the second voltammetric peak.

  16. Merging Brain Research with Educational Learning Principles.

    ERIC Educational Resources Information Center

    Saunders, Allyson D.; Vawdrey, Colleen

    2002-01-01

    Discusses how findings of recent neuroscience research can be combined with learning theories to derive brain-based learning principles. Suggests ways in which teachers can promote deeper learning. (SK)

  17. The evolution of consciousness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stapp, H.P.

    1996-08-16

    It is argued that the principles of classical physics are inimical to the development of an adequate science of consciousness. The problem is that insofar as the classical principles are valid consciousness can have no effect on the behavior, and hence on the survival prospects, of the organisms in which it inheres. Thus within the classical framework it is not possible to explain in natural terms the development of consciousness to the high-level form found in human beings. In quantum theory, on the other hand, consciousness can be dynamically efficacious: quantum theory does allow consciousness to influence behavior, and thence to evolve in accordance with the principles of natural selection. However, this evolutionary requirement places important constraints upon the details of the formulation of the quantum dynamical principles.

  18. A Model for Designing Instructional Narratives for Adult Learners: Connecting the Dots

    ERIC Educational Resources Information Center

    Smith, Debra M.

    2013-01-01

    The purpose of this study was to develop a research-based model for designing and deploying instructional narratives based on principles derived from narrative theory, development theory, communication theory, learning theory and instructional design theory to enable adult learning and retention and the effective transfer of that retained learning…

  19. Implementing Set Based Design into Department of Defense Acquisition

    DTIC Science & Technology

    2016-12-01

    ...perennial challenges for the DOD. This report identifies the original SBD principles and characteristics based on Toyota Motor Corporation’s Set Based Concurrent Engineering Model. Additionally, the team reviewed DOD case studies that implemented SBD. The SBD principles, along with the common themes from the...

  20. Ethical issues and accountability in pressure ulcer prevention.

    PubMed

    Welsh, Lynn

    2014-10-28

    Pressure ulcers represent a considerable cost, both in terms of healthcare spending and quality of life. They are increasingly viewed in terms of patient harm. For clinicians involved in pressure ulcer prevention, ethical issues surrounding accountability may arise from both policy and practice perspectives. It may be useful for clinicians to refer to ethical theories and principles to create frameworks when addressing ethical dilemmas. However, such theories and principles have been criticised for their simplicity and over-generalisation. Alternative theories, for example, virtue ethics and experiential learning, can provide more comprehensive guidance and promote a pluralistic approach to tackling ethical dilemmas.

  1. Generalized Success-Breeds-Success Principle Leading to Time-Dependent Informetric Distributions.

    ERIC Educational Resources Information Center

    Egghe, Leo; Rousseau, Ronald

    1995-01-01

    Reformulates the success-breeds-success (SBS) principle in informetrics in order to generate a general theory of source-item relationships. Topics include a time-dependent probability, a new model for the expected probability that is compared with the SBS principle with exact combinatorial calculations, classical frequency distributions, and…

  2. Science 101: What, Exactly, Is the Heisenberg Uncertainty Principle?

    ERIC Educational Resources Information Center

    Robertson, Bill

    2016-01-01

    Bill Robertson is the author of the NSTA Press book series, "Stop Faking It! Finally Understanding Science So You Can Teach It." In this month's issue, Robertson describes and explains the Heisenberg Uncertainty Principle. The Heisenberg Uncertainty Principle was discussed on "The Big Bang Theory," the lead character in…

  3. The evolutionary theory of asymmetry by V. Geodakyan

    NASA Astrophysics Data System (ADS)

    Geodakyan, Sergey V.

    2015-08-01

    For more than 150 years, all biological theories, including those of C. Darwin and Mendel, were based on the idea of synchronous evolution. They fit unitary monomodal systems (asexual, symmetrical) but do not work for binary (dioecious, asymmetrical) ones. Examples of such binary conjugated differentiations are two sexes, DNA-proteins, autosomes-sex chromosomes, right and left brain hemispheres, and hands. For their understanding, "asynchronous" theories are needed. Such theories were proposed by Russian theoretical biologist Vigen A. Geodakyan for sexual, brain and body, and chromosomal differentiations. All theories are interconnected and are based on the principle of conjugated subsystems. This article covers the basic tenets of the evolutionary theory of asymmetry and answers the following questions: What benefits does lateralization provide? What logic, what principle is it based on? Why do brain hemispheres control the opposite sides of the body? Why is laterality closely related to sex? What are the biological prerequisites of terrorism?

  4. Degeneracy relations in QCD and the equivalence of two systematic all-orders methods for setting the renormalization scale

    DOE PAGES

    Bi, Huan -Yu; Wu, Xing -Gang; Ma, Yang; ...

    2015-06-26

    The Principle of Maximum Conformality (PMC) eliminates QCD renormalization scale-setting uncertainties using fundamental renormalization group methods. The resulting scale-fixed pQCD predictions are independent of the choice of renormalization scheme and show rapid convergence. The coefficients of the scale-fixed couplings are identical to the corresponding conformal series with zero β-function. Two all-orders methods for systematically implementing the PMC scale-setting procedure for existing high order calculations are discussed in this article. One implementation is based on the PMC-BLM correspondence (PMC-I); the other, more recent, method (PMC-II) uses the R_δ-scheme, a systematic generalization of the minimal subtraction renormalization scheme. Both approaches satisfy all of the principles of the renormalization group and lead to scale-fixed and scheme-independent predictions at each finite order. In this work, we show that the PMC-I and PMC-II scale-setting methods are in practice equivalent to each other. We illustrate this equivalence for the four-loop calculations of the annihilation ratio R_{e+e−} and the Higgs partial width Γ(H → bb̄). Both methods lead to the same resummed (‘conformal’) series up to all orders. The small scale differences between the two approaches are reduced as additional renormalization group {β_i}-terms in the pQCD expansion are taken into account. In addition, we show that special degeneracy relations, which underlie the equivalence of the two PMC approaches and the resulting conformal features of the pQCD series, are in fact general properties of non-Abelian gauge theory.

  5. 78 FR 72475 - Derivatives Clearing Organizations and International Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... principles set forth in the Commodity Exchange Act (``CEA'') for systemically important DCOs (``SIDCOs'') and... all DCOs, which are set forth in the Commission's DCO regulations on compliance with core principles... are consistent with the Principles for Financial Market Infrastructures (``PFMIs'') published by the...

  6. Reflections on bioethics: consolidation of the principle of autonomy and legal aspects.

    PubMed

    Segre, M

    1999-01-01

    The author highlights the importance of emotions in all ethical reflections. He describes the most common positions of ethicists employing duties and rights as the basis for ethical thought. The author goes to Freudian theory as viewed by the utilitarians, stating that the 'quest for pleasure' is not necessarily egocentric, especially for adults. For example, the feeling of solidarity emerges 'from the inside out', making irrelevant all the emphasis laid on obedience to duty (from the outside in). The article questions the essence of Kantian theory, based exclusively on 'reason' with disregard for feelings, by establishing what he considers a 'positivist' view of rational thought. It emphasizes the principle of autonomy, which is seen as basically opposing the principles of beneficence and fairness. It is proposed that the latter should be seen as what he calls heteronomy (a concept different from that of the rational ethicists). In theory, autonomy is not assigned to anyone on the basis of an external assessment. Any intervention in individual autonomy must be made (by the intervenor) only when it becomes imperative in the defense of social or cultural values. The article distinguishes between ethics and morals, and states that the sole acceptable ethical principle is that ethics (theoretically) has no principle.

  7. A roadmap for optimal control: the right way to commute.

    PubMed

    Ross, I Michael

    2005-12-01

    Optimal control theory is the foundation for many problems in astrodynamics. Typical examples are trajectory design and optimization, relative motion control of distributed space systems and attitude steering. Many such problems in astrodynamics are solved by an alternative route of mathematical analysis and deep physical insight, in part because of the perception that an optimal control framework generates hard problems. Although this is indeed true of the Bellman and Pontryagin frameworks, the covector mapping principle provides a neoclassical approach that renders hard problems easy. That is, although the origins of this philosophy can be traced back to Bernoulli and Euler, it is essentially modern as a result of the strong linkage between approximation theory, set-valued analysis and computing technology. Motivated by the broad success of this approach, mission planners are now conceiving and demanding higher performance from space systems. This has resulted in a new set of theoretical and computational problems. Recently, under the leadership of NASA-GRC, several workshops were held to address some of these problems. This paper outlines the theoretical issues stemming from practical problems in astrodynamics. Emphasis is placed on how it pertains to advanced mission design problems.

  8. Effective visual design and communication practices for research posters: Exemplars based on the theory and practice of multimedia learning and rhetoric.

    PubMed

    Pedwell, Rhianna K; Hardy, James A; Rowland, Susan L

    2017-05-01

    Evidence shows that science graduates often do not have the communication skills they need to meet workplace standards and expectations. One common mode of science communication is the poster. In a review of the literature we show that poster design is historically problematic, and that the guidance provided to students as they create posters for assessment is frequently inconsistent. To address this inconsistency we provide some guiding design principles for posters that are grounded in communication theory and the fundamentals of rhetoric. We also present three nondiscipline-specific example posters with accompanying notes that explain why the posters are examples of poor, average, and excellent poster design. The subject matter for the posters is a fabricated set of experiments on a topic that could not actually be the subject of research. Instructors may use these resources with their students, secure in the knowledge that they do not and will never represent an answer set to an extant assessment item. © 2016 by The International Union of Biochemistry and Molecular Biology, 45(3):249-261, 2017.

  9. Unprincipled microgravity

    NASA Astrophysics Data System (ADS)

    Mattingly, James

    2014-05-01

    I argue that the key principle of microgravity is what I have called elsewhere the Lorentzian strategy. This strategy may be seen as either a reverse-engineering approach or a descent-with-modification approach, but however one sees it, the method works neither by attempting to propound a theory that is the quantum version of either an extant or generalized gravitation theory, nor by attempting to propound a theory that is the final version of quantum mechanics and to find gravity within it. Instead the method works by beginning with what we are pretty sure is a good approximation to the low-energy limit of whatever the real microprocesses are that generate what we experience as gravitation. This method is powerful, fruitful, and not committed to principles for which we have, as yet, only scant evidence; the method begins with what we do know and teases out what we can know next. The principle is methodological, not ontological.

  10. SENSE IT: Student Enabled Network of Sensors for the Environment using Innovative Technology

    NASA Astrophysics Data System (ADS)

    Hotaling, L. A.; Stolkin, R.; Kirkey, W.; Bonner, J. S.; Lowes, S.; Lin, P.; Ojo, T.

    2010-12-01

    SENSE IT is a project funded by the National Science Foundation (NSF) which strives to enrich science, technology, engineering and mathematics (STEM) education by providing teacher professional development and classroom projects in which high school students build from first principles, program, test and deploy sensors for water quality monitoring. Sensor development is a broad and interdisciplinary area, providing motivating scenarios in which to teach a multitude of STEM subjects, from mathematics and physics to biology and environmental science, while engaging students with hands on problems that reinforce conventional classroom learning by re-presenting theory as practical tools for building real-life working devices. The SENSE IT program is currently developing and implementing a set of high school educational modules which teach environmental science and basic engineering through the lens of fundamental STEM principles, at the same time introducing students to a new set of technologies that are increasingly important in the world of environmental research. Specifically, the project provides students with the opportunity to learn the engineering design process through the design, construction, programming and testing of a student-implemented water monitoring network in the Hudson and St. Lawrence Rivers in New York. These educational modules are aligned to state and national technology and science content standards and are designed to be compatible with standard classroom curricula to support a variety of core science, technology and mathematics classroom material. For example, while designing, programming and calibrating the sensors, the students are led through a series of tasks in which they must use core mathematics and physics theory to solve the real problems of making their sensors work. In later modules, students can explore environmental science and environmental engineering curricula while deploying and monitoring their sensors in local rivers. 
This presentation will provide an overview of the educational modules. A variety of sensors will be described, which are suitably simple for design and construction from first principles by high school students while being accurate enough for students to make meaningful environmental measurements. The presentation will also describe how the sensor building activities can be tied to core curricula classroom theory, enabling the modules to be utilized in regular classes by mathematics, science and computing teachers without disrupting their semester’s teaching goals. Furthermore, the presentation will address the first two years of the SENSE IT project, during which 39 teachers have been equipped, trained on these materials, and have implemented the modules with approximately 2,000 high school students.

  11. Spontaneous Symmetry Breaking as a Basis of Particle Mass

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quigg, Chris; /Fermilab /CERN

    2007-04-01

    Electroweak theory joins electromagnetism with the weak force in a single quantum field theory, ascribing the two fundamental interactions--so different in their manifestations--to a common symmetry principle. How the electroweak gauge symmetry is hidden is one of the most urgent and challenging questions facing particle physics. The provisional answer incorporated in the ''standard model'' of particle physics was formulated in the 1960s by Higgs, by Brout & Englert, and by Guralnik, Hagen, & Kibble: The agent of electroweak symmetry breaking is an elementary scalar field whose self-interactions select a vacuum state in which the full electroweak symmetry is hidden, leaving a residual phase symmetry of electromagnetism. By analogy with the Meissner effect of the superconducting phase transition, the Higgs mechanism, as it is commonly known, confers masses on the weak force carriers W± and Z. It also opens the door to masses for the quarks and leptons, and shapes the world around us. It is a good story--though an incomplete story--and we do not know how much of the story is true. Experiments that explore the Fermi scale (the energy regime around 1 TeV) during the next decade will put the electroweak theory to decisive test, and may uncover new elements needed to construct a more satisfying completion of the electroweak theory. The aim of this article is to set the stage by reporting what we know and what we need to know, and to set some ''Big Questions'' that will guide our explorations.

  12. Is Teaching Neoclassical Economics as "the" Science of Economics Moral?

    ERIC Educational Resources Information Center

    Parvin, Manoucher

    1992-01-01

    Discusses the morality of teaching neoclassical theory as the only science of economics. Argues that the teaching of neoclassical theory violates moral principles unless each and every attribute of neoclassical theory is proven superior to corresponding attributes of competing theories. Criticizes neoclassical economics for teaching what rather…

  13. Optimizing Computer Assisted Instruction By Applying Principles of Learning Theory.

    ERIC Educational Resources Information Center

    Edwards, Thomas O.

    The development of learning theory and its application to computer-assisted instruction (CAI) are described. Among the early theoretical constructs thought to be important are E. L. Thorndike's concept of learning connectisms, Neal Miller's theory of motivation, and B. F. Skinner's theory of operant conditioning. Early devices incorporating those…

  14. Thermodynamic restrictions on the constitutive equations of electromagnetic theory

    NASA Technical Reports Server (NTRS)

    Coleman, B. D.; Dill, E. H.

    1971-01-01

    Thermodynamics second law restrictions on constitutive equations of electromagnetic theory for nonlinear materials with long-range gradually fading memory, considering dissipation principle consequences

  15. On Nomological Validity and Auxiliary Assumptions: The Importance of Simultaneously Testing Effects in Social Cognitive Theories Applied to Health Behavior and Some Guidelines

    PubMed Central

    Hagger, Martin S.; Gucciardi, Daniel F.; Chatzisarantis, Nikos L. D.

    2017-01-01

    Tests of social cognitive theories provide informative data on the factors that relate to health behavior, and the processes and mechanisms involved. In the present article, we contend that tests of social cognitive theories should adhere to the principles of nomological validity, defined as the degree to which predictions in a formal theoretical network are confirmed. We highlight the importance of nomological validity tests to ensure theory predictions can be disconfirmed through observation. We argue that researchers should be explicit on the conditions that lead to theory disconfirmation, and identify any auxiliary assumptions on which theory effects may be conditional. We contend that few researchers formally test the nomological validity of theories, or outline conditions that lead to model rejection and the auxiliary assumptions that may explain findings that run counter to hypotheses, raising potential for ‘falsification evasion.’ We present a brief analysis of studies (k = 122) testing four key social cognitive theories in health behavior to illustrate deficiencies in reporting theory tests and evaluations of nomological validity. Our analysis revealed that few articles report explicit statements suggesting that their findings support or reject the hypotheses of the theories tested, even when findings point to rejection. We illustrate the importance of explicit a priori specification of fundamental theory hypotheses and associated auxiliary assumptions, and identification of the conditions which would lead to rejection of theory predictions. We also demonstrate the value of confirmatory analytic techniques, meta-analytic structural equation modeling, and Bayesian analyses in providing robust converging evidence for nomological validity. We provide a set of guidelines for researchers on how to adopt and apply the nomological validity approach to testing health behavior models. PMID:29163307

  16. No-Hypersignaling Principle

    NASA Astrophysics Data System (ADS)

    Dall'Arno, Michele; Brandsen, Sarah; Tosini, Alessandro; Buscemi, Francesco; Vedral, Vlatko

    2017-07-01

    A paramount topic in quantum foundations, rooted in the study of the Einstein-Podolsky-Rosen (EPR) paradox and Bell inequalities, is that of characterizing quantum theory in terms of the spacelike correlations it allows. Here, we show that to focus only on spacelike correlations is not enough: we explicitly construct a toy model theory that, while not contradicting classical and quantum theories at the level of spacelike correlations, still displays an anomalous behavior in its timelike correlations. We call this anomaly, quantified in terms of a specific communication game, the "hypersignaling" phenomena. We hence conclude that the "principle of quantumness," if it exists, cannot be found in spacelike correlations alone: nontrivial constraints need to be imposed also on timelike correlations, in order to exclude hypersignaling theories.

  17. No-Hypersignaling Principle.

    PubMed

    Dall'Arno, Michele; Brandsen, Sarah; Tosini, Alessandro; Buscemi, Francesco; Vedral, Vlatko

    2017-07-14

    A paramount topic in quantum foundations, rooted in the study of the Einstein-Podolsky-Rosen (EPR) paradox and Bell inequalities, is that of characterizing quantum theory in terms of the spacelike correlations it allows. Here, we show that to focus only on spacelike correlations is not enough: we explicitly construct a toy model theory that, while not contradicting classical and quantum theories at the level of spacelike correlations, still displays an anomalous behavior in its timelike correlations. We call this anomaly, quantified in terms of a specific communication game, the "hypersignaling" phenomena. We hence conclude that the "principle of quantumness," if it exists, cannot be found in spacelike correlations alone: nontrivial constraints need to be imposed also on timelike correlations, in order to exclude hypersignaling theories.

  18. Bayesian learning for spatial filtering in an EEG-based brain-computer interface.

    PubMed

    Zhang, Haihong; Yang, Huijuan; Guan, Cuntai

    2013-07-01

    Spatial filtering for EEG feature extraction and classification is an important tool in brain-computer interface. However, there is generally no established theory that links spatial filtering directly to Bayes classification error. To address this issue, this paper proposes and studies a Bayesian analysis theory for spatial filtering in relation to Bayes error. Following the maximum entropy principle, we introduce a gamma probability model for describing single-trial EEG power features. We then formulate and analyze the theoretical relationship between Bayes classification error and the so-called Rayleigh quotient, which is a function of spatial filters and basically measures the ratio in power features between two classes. This paper also reports our extensive study that examines the theory and its use in classification, using three publicly available EEG data sets and state-of-the-art spatial filtering techniques and various classifiers. Specifically, we validate the positive relationship between Bayes error and Rayleigh quotient in real EEG power features. Finally, we demonstrate that the Bayes error can be practically reduced by applying a new spatial filter with lower Rayleigh quotient.
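
    The link between spatial filters and the Rayleigh quotient described above can be sketched numerically. The following is an illustrative reconstruction with synthetic data, not the authors' code: class covariances are estimated from trials, and the filter extremizing the quotient w^T S1 w / w^T S2 w is obtained from the generalized eigenproblem S1 w = lam S2 w, as in CSP-style filtering.

```python
import numpy as np

# Illustrative sketch (synthetic data, not the paper's EEG sets): the Rayleigh
# quotient of a spatial filter w is the ratio of filtered power between two
# classes; its extremizers solve the generalized eigenproblem S1 w = lam S2 w.
rng = np.random.default_rng(0)

def class_covariance(trials):
    # trials: (n_trials, n_channels, n_samples); trace-normalized average
    covs = [t @ t.T / np.trace(t @ t.T) for t in trials]
    return np.mean(covs, axis=0)

def rayleigh_quotient(w, s1, s2):
    return (w @ s1 @ w) / (w @ s2 @ w)

# two synthetic classes; class 2 carries extra power on channel 0
n_trials, n_chan, n_samp = 30, 4, 200
class1 = rng.standard_normal((n_trials, n_chan, n_samp))
class2 = rng.standard_normal((n_trials, n_chan, n_samp))
class2[:, 0, :] *= 3.0

s1, s2 = class_covariance(class1), class_covariance(class2)

# solve S1 w = lam S2 w by whitening with the Cholesky factor of S2;
# eigenvalues come back in ascending order, so column 0 minimizes R(w)
Linv = np.linalg.inv(np.linalg.cholesky(s2))
vals, U = np.linalg.eigh(Linv @ s1 @ Linv.T)
vecs = Linv.T @ U
w_min = vecs[:, 0]
```

    The smallest eigenvalue equals the minimal Rayleigh quotient, matching the paper's observation that filters with lower Rayleigh quotient give better class separation in power features.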

  19. Modeling L2,3-Edge X-ray Absorption Spectroscopy with Real-Time Exact Two-Component Relativistic Time-Dependent Density Functional Theory.

    PubMed

    Kasper, Joseph M; Lestrange, Patrick J; Stetina, Torin F; Li, Xiaosong

    2018-04-10

    X-ray absorption spectroscopy is a powerful technique to probe local electronic and nuclear structure. There has been extensive theoretical work modeling K-edge spectra from first principles. However, modeling L-edge spectra directly with density functional theory poses a unique challenge requiring further study. Spin-orbit coupling must be included in the model, and a noncollinear density functional theory is required. Using the real-time exact two-component method, we are able to variationally include one-electron spin-orbit coupling terms when calculating the absorption spectrum. The abilities of different basis sets and density functionals to model spectra for both closed- and open-shell systems are investigated using SiCl4 and three transition metal complexes, TiCl4, CrO2Cl2, and [FeCl6]3−. Although we are working in the real-time framework, individual molecular orbital transitions can still be recovered by projecting the density onto the ground state molecular orbital space and separating contributions to the time evolving dipole moment.

  20. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    2017-06-01

    Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short density functional theory (DFT) trajectories on to simple Chebyshev polynomial series. We examine the importance of including greater than 2-body interactions, model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
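
    The force-matching step described in this abstract can be sketched in a few lines. The following is a simplified, hypothetical illustration (a Lennard-Jones force stands in for the DFT reference forces; the distance range, polynomial degree, and units are invented), not the actual fitting code: reference forces sampled over a distance range are mapped onto a Chebyshev polynomial series by linear least squares.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Simplified sketch of force matching (stand-in data, not the actual workflow):
# fit a pairwise force model F(r) as a Chebyshev series by least squares
# against reference forces, here a Lennard-Jones stand-in for DFT forces.
rng = np.random.default_rng(1)

r = rng.uniform(1.0, 2.5, size=400)          # sampled pair distances (arb. units)
f_ref = 24.0 * (2.0 / r**13 - 1.0 / r**7)    # LJ force, placeholder for DFT data

# map distances onto [-1, 1], the natural Chebyshev domain
x = 2.0 * (r - r.min()) / (r.max() - r.min()) - 1.0

coeffs = C.chebfit(x, f_ref, deg=16)         # linear least-squares fit
f_fit = C.chebval(x, coeffs)
rmse = np.sqrt(np.mean((f_fit - f_ref) ** 2))
```

    Because the fit is linear in the coefficients, higher n-body terms can in principle enter as extra columns of the same least-squares problem; controlling the model's shape outside the sampled interval is the extrapolation issue the abstract raises.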

  1. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    Understanding chemistry at extreme conditions is crucial in fields including geochemistry, astrobiology, and alternative energy. First principles methods can provide valuable microscopic insights into such systems while circumventing the risks of physical experiments; however, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short density functional theory (DFT) trajectories on to simple Chebyshev polynomial series. We examine the importance of including greater than 2-body interactions, model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  2. Why do we need theories?

    PubMed Central

    Longo, Giuseppe; Soto, Ana M.

    2017-01-01

    Theories organize knowledge and construct objectivity by framing observations and experiments. The elaboration of theoretical principles is examined in the light of the rich interactions between physics and mathematics. These two disciplines share common principles of construction of concepts and of the proper objects of inquiry. Theory construction in physics relies on mathematical symmetries that preserve the key invariants observed and proposed by such theory; these invariants buttress the idea that the objects of physics are generic and thus interchangeable and they move along specific trajectories which are uniquely determined, in classical and relativistic physics. In contrast to physics, biology is a historical science that centers on the changes that organisms experience while undergoing ontogenesis and phylogenesis. Biological objects, namely organisms, are not generic but specific; they are individuals. The incessant changes they undergo represent the breaking of symmetries, and thus the opposite of symmetry conservation, a central component of physical theories. This instability corresponds to the changes of the environment and the phenotypes. Inspired by Galileo’s principle of inertia, the “default state” of inert matter, we propose a “default state” for biological dynamics following Darwin’s first principle, “descent with modification” that we transform into “proliferation with variation and motility” as a property that spans life, including cells in an organism. These dissimilarities between theories of the inert and of biology also apply to causality: biological causality is to be understood in relation to the distinctive role that constraints assume in this discipline. Consequently, the notion of cause will be reframed in a context where constraints to activity are seen as the core component of biological analyses. Finally, we assert that the radical materiality of life rules out distinctions such as “software vs. hardware.” PMID:27390105

  3. Universe or Multiverse?

    NASA Astrophysics Data System (ADS)

    Carr, Bernard

    2009-08-01

    Part I. Overviews: 1. Introduction and overview Bernard Carr; 2. Living in the multiverse Steven Weinberg; 3. Enlightenment, knowledge, ignorance, temptation Frank Wilczek; Part II. Cosmology and Astrophysics: 4. Cosmology and the multiverse Martin J. Rees; 5. The anthropic principle revisited Bernard Carr; 6. Cosmology from the top down Stephen Hawking; 7. The multiverse hierarchy Max Tegmark; 8. The inflationary universe Andrei Linde; 9. A model of anthropic reasoning: the dark to ordinary matter ratio Frank Wilczek; 10. Anthropic predictions: the case of the cosmological constant Alexander Vilenkin; 11. The definition and classification of universes James D. Bjorken; 12. M/string theory and anthropic reasoning Renata Kallosh; 13. The anthropic principle, dark energy and the LHC Savas Dimopoulos and Scott Thomas; Part III. Particle Physics and Quantum Theory: 14. Quarks, electrons and atoms in closely related universes Craig J. Hogan; 15. The fine-tuning problems of particle physics and anthropic mechanisms John F. Donoghue; 16. The anthropic landscape of string theory Leonard Susskind; 17. Cosmology and the many worlds interpretation of quantum mechanics Viatcheslav Mukhanov; 18. Anthropic reasoning and quantum cosmology James B. Hartle; 19. Micro-anthropic principle for quantum theory Brandon Carter; Part IV. More General Philosophical Issues: 20. Scientific alternatives to the anthropic principle Lee Smolin; 21. Making predictions in a multiverse: conundrums, dangers, coincidences Anthony Aguirre; 22. Multiverses: description, uniqueness and testing George Ellis; 23. Predictions and tests of multiverse theories Don N. Page; 24. Observation selection theory and cosmological fine-tuning Nick Bostrom; 25. Are anthropic arguments, involving multiverses and beyond, legitimate? William R. Stoeger; 26. The multiverse hypothesis: a theistic perspective Robin Collins; 27. Living in a simulated universe John D. Barrow; 28. Universes galore: where will it all end? Paul Davies; Index.

  4. Why do we need theories?

    PubMed

    Longo, Giuseppe; Soto, Ana M

    2016-10-01

    Theories organize knowledge and construct objectivity by framing observations and experiments. The elaboration of theoretical principles is examined in the light of the rich interactions between physics and mathematics. These two disciplines share common principles of construction of concepts and of the proper objects of inquiry. Theory construction in physics relies on mathematical symmetries that preserve the key invariants observed and proposed by such theory; these invariants buttress the idea that the objects of physics are generic and thus interchangeable and they move along specific trajectories which are uniquely determined, in classical and relativistic physics. In contrast to physics, biology is a historical science that centers on the changes that organisms experience while undergoing ontogenesis and phylogenesis. Biological objects, namely organisms, are not generic but specific; they are individuals. The incessant changes they undergo represent the breaking of symmetries, and thus the opposite of symmetry conservation, a central component of physical theories. This instability corresponds to the changes of the environment and the phenotypes. Inspired by Galileo's principle of inertia, the "default state" of inert matter, we propose a "default state" for biological dynamics following Darwin's first principle, "descent with modification" that we transform into "proliferation with variation and motility" as a property that spans life, including cells in an organism. These dissimilarities between theories of the inert and of biology also apply to causality: biological causality is to be understood in relation to the distinctive role that constraints assume in this discipline. Consequently, the notion of cause will be reframed in a context where constraints to activity are seen as the core component of biological analyses. Finally, we assert that the radical materiality of life rules out distinctions such as "software vs. hardware." 

  5. How far do EPR-Bell experiments constrain physical collapse theories?

    NASA Astrophysics Data System (ADS)

    Leggett, A. J.

    2007-03-01

    A class of theories alternative to standard quantum mechanics, including that of Ghirardi et al ('GRWP'), postulates that when a quantum superposition becomes amplified to the point that the superposed states reach some level of 'macroscopic distinctness', then some non-quantum-mechanical principle comes into play and realizes one or other of the two macroscopic outcomes. Without specializing to any particular theory of this class, I ask how far such 'macrorealistic' theories are generically constrained, if one insists that the physical reduction process should respect Einstein locality, by the results of existing EPR-Bell experiments. I conclude that provided one does not demand that the prescription for reduction respects Lorentz invariance, at least some theories of this type, while in principle inevitably making some predictions that conflict with those of standard quantum mechanics, are not refuted by any existing experiment.
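
    As a point of reference for the Bell-type constraints invoked above (an illustration added here, not part of the record), the quantum prediction for the standard CHSH test can be computed directly from the singlet-state correlation E(a, b) = -cos(a - b):

```python
import math

def E(a, b):
    # Quantum correlation for spin measurements on the singlet state
    # along directions a and b (angles in radians): E(a, b) = -cos(a - b).
    return -math.cos(a - b)

def chsh(a, ap, b, bp):
    # CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
    return E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)

# Standard settings that maximize the violation: a=0, a'=pi/2, b=pi/4, b'=3*pi/4.
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))  # 2.8284271247461903, i.e. 2*sqrt(2) > 2
```

Any local-realist (and, generically, macrorealist-with-locality) model is bounded by |S| <= 2, which is what the EPR-Bell experiments discussed here probe.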

  6. Can Gravity Probe B usefully constrain torsion gravity theories?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flanagan, Eanna E.; Rosenthal, Eran

    2007-06-15

    In most theories of gravity involving torsion, the source for torsion is the intrinsic spin of matter. Since the spins of fermions are normally randomly oriented in macroscopic bodies, the amount of torsion generated by macroscopic bodies is normally negligible. However, in a recent paper, Mao et al. (arXiv:gr-qc/0608121) point out that there is a class of theories, including the Hayashi-Shirafuji (1979) theory, in which the angular momentum of macroscopic spinning bodies generates a significant amount of torsion. They further argue that, by the principle of action equals reaction, one would expect the angular momentum of test bodies to couple to a background torsion field, and therefore the precession of the Gravity Probe B gyroscopes should be affected in these theories by the torsion generated by the Earth. We show that in fact the principle of action equals reaction does not apply to these theories, essentially because the torsion is not an independent dynamical degree of freedom. We examine in detail a generalization of the Hayashi-Shirafuji theory suggested by Mao et al. called Einstein-Hayashi-Shirafuji theory. There are a variety of different versions of this theory, depending on the precise form of the coupling to matter chosen for the torsion. We show that, for any coupling to matter that is compatible with the spin transport equation postulated by Mao et al., the theory has either ghosts or an ill-posed initial-value formulation. These theoretical problems can be avoided by specializing the parameters of the theory and in addition choosing the standard minimal coupling to matter of the torsion tensor. This yields a consistent theory, but one in which the action equals reaction principle is violated, and in which the angular momentum of the gyroscopes does not couple to the Earth's torsion field. Thus, the Einstein-Hayashi-Shirafuji theory does not predict a detectable torsion signal for Gravity Probe B. There may be other torsion theories which do.

  7. The Singular Universe and the Reality of Time

    NASA Astrophysics Data System (ADS)

    Mangabeira Unger, Roberto; Smolin, Lee

    2015-01-01

    Introduction; Part I. Roberto Mangabeira Unger: 1. The science of the one universe in time; 2. The context and consequences of the argument; 3. The singular existence of the universe; 4. The inclusive reality of time; 5. The mutability of the laws of nature; 6. The selective realism of mathematics; Part II. Lee Smolin: 1. Cosmology in crisis; 2. Principles for a cosmological theory; 3. The setting: the puzzles of contemporary cosmology; 4. Hypotheses for a new cosmology; 5. Mathematics; 6. Approaches to solving the metalaw dilemma; 7. Implications of temporal naturalism for philosophy of mind; 8. An agenda for science; 9. Concluding remarks; A note concerning disagreements between our views.

  8. Development of a switched integrator amplifier for high-accuracy optical measurements.

    PubMed

    Mountford, John; Porrovecchio, Geiland; Smid, Marek; Smid, Radislav

    2008-11-01

    In the field of low flux optical measurements, the development and use of large area silicon detectors is becoming more frequent. The current/voltage conversion of their photocurrent presents a set of problems for traditional transimpedance amplifiers. The switched integration principle overcomes these limitations. We describe the development of a fully characterized current-voltage amplifier using the switched integrator technique. Two distinct systems have been developed in parallel at the United Kingdom's National Physical Laboratory (NPL) and Czech Metrology Institute (CMI) laboratories. We present the circuit theory and best practice in the design and construction of switched integrators. In conclusion the results achieved and future developments are discussed.
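
    The switched-integration principle described above can be sketched numerically: during a gated window the photocurrent charges a feedback capacitor, so the measurement becomes a voltage ramp rather than a tiny drop across an impractically large feedback resistor. The component values below are hypothetical, not taken from the NPL or CMI instruments:

```python
# Illustrative model of a switched integrator: a constant input current I
# charges a feedback capacitor C for t seconds, giving V(t) = I * t / C.

def integrated_voltage(current_a, t_s, capacitance_f):
    """Output voltage after integrating a constant current for t_s seconds."""
    return current_a * t_s / capacitance_f

# A 10 pA photocurrent integrated for 1 s on a 10 nF capacitor:
v_out = integrated_voltage(10e-12, 1.0, 10e-9)
print(v_out)  # 0.001, i.e. a 1 mV ramp from a 10 pA input
```

The same 10 pA through a transimpedance stage would need a 100 GΩ-class feedback resistor to reach 1 V, which is where the noise and leakage problems mentioned above arise.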

  9. A first course in optimum design of yacht sails

    NASA Astrophysics Data System (ADS)

    Sugimoto, Takeshi

    1993-03-01

    The optimum sail geometry is analytically obtained for the case of maximizing the thrust under equality and inequality constraints on the lift and the heeling moment. A single mainsail is assumed to be set close-hauled in uniform wind and upright on the flat sea surface. The governing parameters are the mast height and the gap between the sail foot and the sea surface. The lifting line theory is applied to analyze the aerodynamic forces acting on a sail. The design method consists of the variational principle and a feasibility study. Almost triangular sails are found to be optimum. Their advantages are discussed.
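
    The kind of constrained design problem the abstract describes can be caricatured in a few lines: maximize a thrust-like objective over a single loading parameter while enforcing an inequality constraint on the heeling moment. The model functions below are invented for illustration and have nothing to do with the paper's lifting-line aerodynamics:

```python
# Toy constrained maximization: thrust(s) has an unconstrained optimum at
# s = 2, but the heeling-moment limit makes the constraint active first.
# All functional forms and numbers here are hypothetical.

def thrust(s):
    return s - 0.25 * s**2          # saturating thrust model

def heeling_moment(s):
    return 0.8 * s                  # moment grows with loading

def best_loading(moment_limit, step=0.001):
    best = 0.0
    s = 0.0
    while s <= 2.0:
        if heeling_moment(s) <= moment_limit and thrust(s) > thrust(best):
            best = s
        s += step
    return best

# A moment limit of 1.2 binds at s = 1.5, below the unconstrained optimum s = 2.
print(round(best_loading(1.2), 2))  # 1.5
```

When the constraint is active the optimum sits on the constraint boundary, which is exactly the structure a variational treatment with inequality constraints (here solved by brute-force search) exhibits.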

  10. Applications of Support Vector Machines In Chemo And Bioinformatics

    NASA Astrophysics Data System (ADS)

    Jayaraman, V. K.; Sundararajan, V.

    2010-10-01

    Conventional linear and nonlinear tools for classification, regression, and data-driven modeling are rapidly being replaced by newer techniques and tools based on artificial intelligence and machine learning. While linear techniques are not applicable to inherently nonlinear problems, the newer methods serve as attractive alternatives for solving real-life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward network-based classification algorithms formulated from statistical learning theory and the structural risk minimization principle. SVM regression closely follows the classification methodology. In this work, recent applications of SVM in chemo- and bioinformatics are described with suitable illustrative examples.
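
    A minimal sketch of the ingredients named above: hinge loss (the empirical risk) plus an L2 margin penalty (the structural term), minimized by subgradient descent. This toy linear SVM is illustrative only and is not one of the chemo/bioinformatics applications the paper surveys:

```python
# Minimal linear soft-margin SVM trained by subgradient descent on the
# regularized hinge loss: lam*||w||^2 + mean(max(0, 1 - y*(w.x + b))).
# This is the structural-risk-minimization objective in its simplest form.

def train_svm(points, labels, lam=0.01, lr=0.1, epochs=200):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(points, labels):
            margin = y * (w[0] * x1 + w[1] * x2 + b)
            if margin < 1:
                # Point violates the margin: hinge subgradient plus decay.
                w[0] += lr * (y * x1 - 2 * lam * w[0])
                w[1] += lr * (y * x2 - 2 * lam * w[1])
                b += lr * y
            else:
                # Point satisfies the margin: only the regularizer acts.
                w[0] -= lr * 2 * lam * w[0]
                w[1] -= lr * 2 * lam * w[1]
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Linearly separable toy data: class +1 near (2, 2), class -1 near (-2, -2).
pts = [(2, 2), (3, 2), (2, 3), (-2, -2), (-3, -2), (-2, -3)]
lbl = [1, 1, 1, -1, -1, -1]
w, b = train_svm(pts, lbl)
print([predict(w, b, p) for p in pts])  # all six training points classified correctly
```

Kernelized SVMs replace the dot product with a kernel function, which is how the nonlinear chemo/bioinformatics problems mentioned above are handled in practice.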

  11. Harm reduction principles for healthcare settings.

    PubMed

    Hawk, Mary; Coulter, Robert W S; Egan, James E; Fisk, Stuart; Reuel Friedman, M; Tula, Monique; Kinsky, Suzanne

    2017-10-24

    Harm reduction refers to interventions aimed at reducing the negative effects of health behaviors without necessarily extinguishing the problematic health behaviors completely. The vast majority of the harm reduction literature focuses on the harms of drug use and on specific harm reduction strategies, such as syringe exchange, rather than on the harm reduction philosophy as a whole. Given that a harm reduction approach can address other risk behaviors that often occur alongside drug use and that harm reduction principles have been applied to harms such as sex work, eating disorders, and tobacco use, a natural evolution of the harm reduction philosophy is to extend it to other health risk behaviors and to a broader healthcare audience. Building on the extant literature, we used data from in-depth qualitative interviews with 23 patients and 17 staff members from an HIV clinic in the USA to describe harm reduction principles for use in healthcare settings. We defined six principles of harm reduction and generalized them for use in healthcare settings with patients beyond those who use illicit substances. The principles include humanism, pragmatism, individualism, autonomy, incrementalism, and accountability without termination. For each of these principles, we present a definition, a description of how healthcare providers can deliver interventions informed by the principle, and examples of how each principle may be applied in the healthcare setting. This paper is one of the first to provide a comprehensive set of principles for universal harm reduction as a conceptual approach for healthcare provision. Applying harm reduction principles in healthcare settings may improve clinical care outcomes given that the quality of the provider-patient relationship is known to impact health outcomes and treatment adherence. Harm reduction can be a universal precaution applied to all individuals regardless of their disclosure of negative health behaviors, given that health behaviors are not binary or linear but operate along a continuum based on a variety of individual and social determinants.

  12. Self-organization, free energy minimization, and optimal grip on a field of affordances

    PubMed Central

    Bruineberg, Jelle; Rietveld, Erik

    2014-01-01

    In this paper, we set out to develop a theoretical and conceptual framework for the new field of Radical Embodied Cognitive Neuroscience. This framework should be able to integrate insights from several relevant disciplines: theory on embodied cognition, ecological psychology, phenomenology, dynamical systems theory, and neurodynamics. We suggest that the main task of Radical Embodied Cognitive Neuroscience is to investigate the phenomenon of skilled intentionality from the perspective of the self-organization of the brain-body-environment system, while doing justice to the phenomenology of skilled action. In previous work, we have characterized skilled intentionality as the organism's tendency toward an optimal grip on multiple relevant affordances simultaneously. Affordances are possibilities for action provided by the environment. In the first part of this paper, we introduce the notion of skilled intentionality and the phenomenon of responsiveness to a field of relevant affordances. Second, we use Friston's work on neurodynamics, but embed a very minimal version of his Free Energy Principle in the ecological niche of the animal. Thus amended, this principle is helpful for understanding the embeddedness of neurodynamics within the dynamics of the system “brain-body-landscape of affordances.” Next, we show how we can use this adjusted principle to understand the neurodynamics of selective openness to the environment: interacting action-readiness patterns at multiple timescales contribute to the organism's selective openness to relevant affordances. In the final part of the paper, we emphasize the important role of metastable dynamics in both the brain and the brain-body-environment system for adequate affordance-responsiveness. We exemplify our integrative approach by presenting research on the impact of Deep Brain Stimulation on affordance responsiveness of OCD patients. PMID:25161615
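
    The "very minimal version" of the Free Energy Principle the authors invoke can be illustrated, in a deliberately stripped-down form not taken from the paper, as gradient descent on a precision-weighted prediction error that balances a prior expectation against an observation:

```python
# Toy free-energy minimization: an internal estimate mu descends the
# gradient of F = (obs - mu)^2 / (2*var_obs) + (mu - prior)^2 / (2*var_prior),
# trading off sensory evidence against a prior, each weighted by precision.

def minimize_free_energy(obs, prior, var_obs=1.0, var_prior=1.0,
                         lr=0.1, steps=200):
    mu = prior
    for _ in range(steps):
        dF_dmu = -(obs - mu) / var_obs + (mu - prior) / var_prior
        mu -= lr * dF_dmu
    return mu

# With equal precisions the estimate settles midway between prior and data.
mu = minimize_free_energy(obs=2.0, prior=0.0)
print(round(mu, 3))  # 1.0
```

Raising the observation precision (lowering var_obs) pulls the fixed point toward the data, which is the precision-weighting mechanism behind the "selective openness" discussed above.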

  14. Bond Order Conservation Strategies in Catalysis Applied to the NH3 Decomposition Reaction

    DOE PAGES

    Yu, Liang; Abild-Pedersen, Frank

    2016-12-14

    On the basis of an extensive set of density functional theory calculations, it is shown that a simple scheme provides a fundamental understanding of variations in the transition state energies and structures of reaction intermediates on transition metal surfaces across the periodic table. The scheme is built on the bond order conservation principle and requires a limited set of input data, still achieving transition state energies as a function of simple descriptors with an error smaller than those of approaches based on linear fits to a set of calculated transition state energies. Here, we have applied this approach together with linear scaling of adsorption energies to obtain the energetics of the NH3 decomposition reaction on a series of stepped fcc(211) transition metal surfaces. Moreover, this information is used to establish a microkinetic model for the formation of N2 and H2, thus providing insight into the components of the reaction that determine the activity.
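
    The descriptor-based scheme summarized above ultimately reduces to linear relations of the form E_TS = a·ΔE + b: one adsorption-energy descriptor per metal fixes the whole energy landscape. The sketch below shows only the mechanics, with made-up slope, intercept, and descriptor values; the paper's fitted parameters are not reproduced here:

```python
# Hypothetical linear-scaling relation for transition-state energies.
# Slope, intercept, and the per-metal descriptor values are invented
# for illustration, not the DFT-fitted numbers from the paper.

def ts_energy(e_n, slope=0.8, intercept=1.2):
    """Transition-state energy (eV) from a linear scaling in the nitrogen
    adsorption energy e_n (eV), a common descriptor for NH3 decomposition."""
    return slope * e_n + intercept

# Evaluate the scaling for three hypothetical stepped surfaces:
for metal, e_n in [("Ru", -0.6), ("Ni", -0.4), ("Pd", 0.3)]:
    print(metal, round(ts_energy(e_n), 2))
```

Feeding such barriers into rate expressions is what turns the scaling relations into the microkinetic model for N2 and H2 formation mentioned above.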

  15. Small Atomic Orbital Basis Set First‐Principles Quantum Chemical Methods for Large Molecular and Periodic Systems: A Critical Analysis of Error Sources

    PubMed Central

    Sure, Rebecca; Brandenburg, Jan Gerit

    2015-01-01

    Abstract In quantum chemical computations the combination of Hartree–Fock or a density functional theory (DFT) approximation with relatively small atomic orbital basis sets of double‐zeta quality is still widely used, for example, in the popular B3LYP/6‐31G* approach. In this Review, we critically analyze the two main sources of error in such computations, that is, the basis set superposition error on the one hand and the missing London dispersion interactions on the other. We review various strategies to correct those errors and present exemplary calculations on mainly noncovalently bound systems of widely varying size. Energies and geometries of small dimers, large supramolecular complexes, and molecular crystals are covered. We conclude that it is not justified to rely on fortunate error compensation, as the main inconsistencies can be cured by modern correction schemes which clearly outperform the plain mean‐field methods. PMID:27308221
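
    The basis set superposition error discussed in this Review is conventionally estimated with the counterpoise recipe: each monomer is recomputed in the full dimer basis (ghost functions on the partner), and the interaction energy is formed from those consistent pieces. A sketch with invented energies:

```python
# Counterpoise-corrected interaction energy:
#   E_int(CP) = E_AB(AB basis) - E_A(AB basis) - E_B(AB basis).
# The energies below (in hartree) are made-up numbers for illustration only.

def counterpoise_interaction(e_dimer, e_a_dimer_basis, e_b_dimer_basis):
    return e_dimer - e_a_dimer_basis - e_b_dimer_basis

e_ab = -152.100      # dimer computed in the dimer basis
e_a_full = -76.045   # monomer A in the dimer basis (ghost functions on B)
e_b_full = -76.048   # monomer B in the dimer basis (ghost functions on A)

e_int = counterpoise_interaction(e_ab, e_a_full, e_b_full)
print(round(e_int, 3))  # -0.007
```

Using monomer energies computed in the small monomer-only basis instead would overestimate the binding, which is the BSSE artifact the correction schemes reviewed here are designed to remove.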

  16. Health psychology in primary care: recent research and future directions.

    PubMed

    Thielke, Stephen; Thompson, Alexander; Stuart, Richard

    2011-01-01

    Over the last decade, research about health psychology in primary care has reiterated its contributions to mental and physical health promotion, and its role in addressing gaps in mental health service delivery. Recent meta-analyses have generated mixed results about the effectiveness and cost-effectiveness of health psychology interventions. There have been few studies of health psychology interventions in real-world treatment settings. Several key challenges exist: determining the degree of penetration of health psychology into primary care settings; clarifying the specific roles of health psychologists in integrated care; resolving reimbursement issues; and adapting to the increased prescription of psychotropic medications. Identifying and exploring these issues can help health psychologists and primary care providers to develop the most effective ways of applying psychological principles in primary care settings. In a changing health care landscape, health psychologists must continue to articulate the theories and techniques of health psychology and integrated care, to put their beliefs into practice, and to measure the outcomes of their work.

  17. Search for Effects of an Electrostatic Potential on Clocks in the Frame of Reference of a Charged Particle

    NASA Technical Reports Server (NTRS)

    Ringermacher, Harry I.; Conradi, Mark S.; Cassenti, Brice

    2005-01-01

    Results of experiments to confirm a theory that links classical electromagnetism with the geometry of spacetime are described. The theory, based on the introduction of a Torsion tensor into Einstein's equations and following the approach of Schroedinger, predicts effects on clocks attached to charged particles, subject to intense electric fields, analogous to the effects on clocks in a gravitational field. We show that in order to interpret this theory, one must re-interpret all clock changes, both gravitational and electromagnetic, as arising from changes in potential energy and not merely potential. The clock is provided naturally by proton spins in hydrogen atoms subject to Nuclear Magnetic Resonance trials. No frequency change of clocks was observed to a resolution of 6×10(exp -9). A new "Clock Principle" was postulated to explain the null result. There are two possible implications of the experiments: (a) The Clock Principle is invalid and, in fact, no metric theory incorporating electromagnetism is possible; (b) The Clock Principle is valid and it follows that a negative rest mass cannot exist.

  18. Brans-Dicke Galileon and the variational principle

    NASA Astrophysics Data System (ADS)

    Quiros, Israel; García-Salcedo, Ricardo; Gonzalez, Tame; Horta-Rangel, F. Antonio; Saavedra, Joel

    2016-09-01

    This paper is aimed at a (mostly) pedagogical exposition of the derivation of the motion equations of certain modifications of general relativity. Here we derive in all detail the motion equations in the Brans-Dicke theory with cubic self-interaction. This is a modification of the Brans-Dicke theory by the addition of a term in the Lagrangian which is non-linear in the derivatives of the scalar field: it contains second-order derivatives. This is the basis of the so-called Brans-Dicke Galileon. We pay special attention to the variational principle and to the algebraic details of the derivation. It is shown how higher order derivatives of the fields appearing in the intermediate computations cancel out leading to second order motion equations. The reader will find useful tips for the derivation of the field equations of modifications of general relativity such as the scalar-tensor theories and f(R) theories, by means of the (stationary action) variational principle. The content of this paper is particularly recommended to those graduate and postgraduate students who are interested in the study of the mentioned modifications of general relativity.
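
    The cancellation of higher derivatives that the paper works through can be illustrated, in a simplified flat-space setting rather than the full Brans-Dicke case, by the cubic Galileon Lagrangian, whose field equation stays second order despite the second derivatives in the action (signs depend on metric-signature conventions):

```latex
\mathcal{L} \;=\; -\tfrac{1}{2}\,\partial_\mu\pi\,\partial^\mu\pi
  \;-\; \frac{1}{\Lambda^{3}}\,\Box\pi\,\partial_\mu\pi\,\partial^\mu\pi ,
\qquad\text{whose variation gives}\qquad
\Box\pi \;+\; \frac{2}{\Lambda^{3}}\Bigl[(\Box\pi)^{2}
  \;-\; \partial_\mu\partial_\nu\pi\,\partial^{\mu}\partial^{\nu}\pi\Bigr] \;=\; 0 .
```

All third- and fourth-order derivatives generated in the intermediate integrations by parts cancel pairwise, which is the same mechanism behind the second-order equations of the Brans-Dicke Galileon derived in the paper.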

  19. Fault Management Design Strategies

    NASA Technical Reports Server (NTRS)

    Day, John C.; Johnson, Stephen B.

    2014-01-01

    Development of dependable systems relies on the ability of the system to determine and respond to off-nominal system behavior. Specification and development of these fault management capabilities must be done in a structured and principled manner to improve our understanding of these systems, and to make significant gains in dependability (safety, reliability and availability). Prior work has described a fundamental taxonomy and theory of System Health Management (SHM), and of its operational subset, Fault Management (FM). This conceptual foundation provides a basis for developing a framework to design and implement FM design strategies that protect mission objectives and account for system design limitations. Selection of an SHM strategy has implications for the functions required to perform the strategy, and it places constraints on the set of possible design solutions. The framework developed in this paper provides a rigorous and principled approach to classifying SHM strategies, as well as methods for determination and implementation of SHM strategies. An illustrative example is used to describe the application of the framework and the resulting benefits to system and FM design and dependability.
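
    One way to picture the claim that strategy selection constrains the design space is a simple mapping from strategies to the FM functions they require. The strategy and function names below are invented placeholders for illustration, not the paper's actual SHM/FM taxonomy:

```python
# Hypothetical strategy-to-function mapping: choosing a set of FM
# strategies determines the union of functions the design must provide.

REQUIRED_FUNCTIONS = {
    "failure_masking":  {"detect"},
    "failure_recovery": {"detect", "diagnose", "respond"},
    "goal_change":      {"detect", "diagnose", "replan"},
}

def functions_for(strategies):
    """Union of FM functions implied by the chosen strategies."""
    needed = set()
    for s in strategies:
        needed |= REQUIRED_FUNCTIONS[s]
    return needed

print(sorted(functions_for(["failure_masking", "failure_recovery"])))
# ['detect', 'diagnose', 'respond']
```

Read forward, the mapping derives requirements from strategy; read backward, it shows how a missing function (say, no onboard diagnosis) rules out whole classes of strategies, which is the constraint structure described above.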

  20. Cosmic evolution: the context for astrobiology and its cultural implications

    NASA Astrophysics Data System (ADS)

    Dick, Steven J.

    2012-10-01

    Astrobiology must be seen in the context of cosmic evolution, the 13.7 billion-year master narrative of the universe. The idea of an evolving universe dates back only to the 19th century, and became a guiding principle for astronomical research only in the second half of the 20th century. The modern synthesis in evolutionary biology hastened the acceptance of the idea in its cosmic setting, as did the confirmation of the Big Bang theory for the origin of the universe. NASA programmes such as Origins incorporated it as a guiding principle. Cosmic evolution encompasses physical, biological and cultural evolution, and may result in a physical, biological or postbiological universe, each with its own implications for long-term human destiny, and each imbuing the meaning of life with different values. It has the status of an increasingly accepted worldview that is beginning to have a profound effect not only in science but also in religion and philosophy.
