Sample records for dissymmetry combining theory

  1. Louis Pasteur, language, and molecular chirality. I. Background and dissymmetry.

    PubMed

    Gal, Joseph

    2011-01-01

    Louis Pasteur resolved sodium ammonium (±)-tartrate in 1848, thereby discovering molecular chirality. Although hindered by the primitive state of organic chemistry, he introduced new terminology and nomenclature for his new science of molecular and crystal chirality. He was well prepared for this task by his rigorous education and innate abilities, and his linguistic achievements eventually earned him membership in the supreme institution for the French language, the Académie française. Dissymmetry had been in use in French from the early 1820s for disruption or absence of symmetry or for dissimilarity or difference in appearance between two objects, and Pasteur initially used it in the latter connotation, without any reference to handedness or enantiomorphism. Soon, however, he adopted it in the meaning of chirality. Asymmetry had been in use in French since 1691 but Pasteur ignored it in favor of dissymmetry. The two terms are not synonymous but it is not clear whether Pasteur recognized this difference in choosing the former over the latter. However, much of the literature mistranslates his dissymmetry as asymmetry. Twenty years before Pasteur the British polymath John Herschel proposed that optical rotation in the noncrystalline state is due to the "unsymmetrical" [his term] nature of the molecules and later used dissymmetrical for handed. Chirality, coined by Lord Kelvin in 1894 and introduced into chemistry by Mislow in 1962, has nearly completely replaced dissymmetry in the meaning of handedness, but the use of dissymmetry continues today in other contexts for lack of symmetry, reduction of symmetry, or dissimilarity. Copyright © 2010 Wiley-Liss, Inc.

  2. Chiral quantum supercrystals with total dissymmetry of optical response

    NASA Astrophysics Data System (ADS)

    Baimuratov, Anvar S.; Gun'Ko, Yurii K.; Baranov, Alexander V.; Fedorov, Anatoly V.; Rukhlenko, Ivan D.

    2016-03-01

    Since chiral nanoparticles are much smaller than the optical wavelength, their enantiomers show little difference in the interaction with circularly polarized light. This scale mismatch makes the enhancement of enantioselectivity in optical excitation of nanoobjects a fundamental challenge in modern nanophotonics. Here we demonstrate that a strong dissymmetry of optical response from achiral nanoobjects can be achieved through their arrangement into chiral superstructures with a length scale comparable to the optical wavelength. This concept is illustrated by the example of a simple helix supercrystal made of semiconductor quantum dots. We show that this supercrystal almost fully absorbs light with one circular polarization and does not absorb the other. The giant circular dichroism of the supercrystal comes from the formation of chiral bright excitons, which are the optically active collective excitations of the entire supercrystal. Owing to the recent advances in assembly and self-organization of nanocrystals in large superparticle structures, the proposed principle of enantioselectivity enhancement has great potential to benefit various chiral and analytical methods, which are used in biophysics, chemistry, and pharmaceutical science.

  3. Optical activity of chirally distorted nanocrystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tepliakov, Nikita V.; Baimuratov, Anvar S.; Baranov, Alexander V.

    2016-05-21

    We develop a general theory of optical activity of semiconductor nanocrystals whose chirality is induced by a small perturbation of their otherwise achiral electronic subsystems. The optical activity is described using the quantum-mechanical expressions for the rotatory strengths and dissymmetry factors introduced by Rosenfeld. We show that the rotatory strengths of optically active transitions are decomposed into electric dipole and magnetic dipole contributions, which correspond to the electric dipole and magnetic dipole transitions between the unperturbed quantum states. Remarkably, while the two kinds of rotatory strengths are of the same order of magnitude, the corresponding dissymmetry factors can differ by a factor of 10⁵. By maximizing the dissymmetry of magnetic dipole absorption one can significantly enhance the enantioselectivity in the interaction of semiconductor nanocrystals with circularly polarized light. This feature may advance chiral and analytical methods, which will benefit biophysics, chemistry, and pharmaceutical science. The developed theory is illustrated by an example of intraband transitions inside a semiconductor nanocuboid, whose rotatory strengths and dissymmetry factors are calculated analytically.

  4. Optical activity of chirally distorted nanocrystals

    NASA Astrophysics Data System (ADS)

    Tepliakov, Nikita V.; Baimuratov, Anvar S.; Baranov, Alexander V.; Fedorov, Anatoly V.; Rukhlenko, Ivan D.

    2016-05-01

    We develop a general theory of optical activity of semiconductor nanocrystals whose chirality is induced by a small perturbation of their otherwise achiral electronic subsystems. The optical activity is described using the quantum-mechanical expressions for the rotatory strengths and dissymmetry factors introduced by Rosenfeld. We show that the rotatory strengths of optically active transitions are decomposed into electric dipole and magnetic dipole contributions, which correspond to the electric dipole and magnetic dipole transitions between the unperturbed quantum states. Remarkably, while the two kinds of rotatory strengths are of the same order of magnitude, the corresponding dissymmetry factors can differ by a factor of 10⁵. By maximizing the dissymmetry of magnetic dipole absorption one can significantly enhance the enantioselectivity in the interaction of semiconductor nanocrystals with circularly polarized light. This feature may advance chiral and analytical methods, which will benefit biophysics, chemistry, and pharmaceutical science. The developed theory is illustrated by an example of intraband transitions inside a semiconductor nanocuboid, whose rotatory strengths and dissymmetry factors are calculated analytically.
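
    For reference, the Rosenfeld quantities named in this abstract are standard. The rotatory strength of a transition 0 → n and the associated (Kuhn) absorption dissymmetry factor are usually written as

    $$R_{0n} = \operatorname{Im}\left[\langle 0|\hat{\boldsymbol{\mu}}|n\rangle \cdot \langle n|\hat{\mathbf{m}}|0\rangle\right], \qquad g_{0n} = \frac{4\,R_{0n}}{|\langle 0|\hat{\boldsymbol{\mu}}|n\rangle|^{2} + |\langle n|\hat{\mathbf{m}}|0\rangle|^{2}},$$

    where $\hat{\boldsymbol{\mu}}$ and $\hat{\mathbf{m}}$ are the electric and magnetic dipole operators. The electric dipole term usually dominates the denominator, which is why transitions of mainly magnetic dipole character can reach far larger $g$ even when the rotatory strengths are comparable, consistent with the 10⁵ contrast quoted above.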

  5. Combined linear theory/impact theory method for analysis and design of high speed configurations

    NASA Technical Reports Server (NTRS)

    Brooke, D.; Vondrasek, D. V.

    1980-01-01

    Pressure distributions on a wing body at Mach 4.63 are calculated. The combined theory is shown to give improved predictions over either linear theory or impact theory alone. The combined theory is also applied in the inverse design mode to calculate optimum camber slopes at Mach 4.63. Comparisons with optimum camber slopes obtained from unmodified linear theory show large differences. Analysis of the results indicates that the combined theory correctly predicts the effect of thickness on the loading distributions at high Mach numbers, and that finite-thickness wings optimized at high Mach numbers using unmodified linear theory will not achieve the minimum drag characteristics for which they are designed.

  6. Evidence Combination From an Evolutionary Game Theory Perspective

    PubMed Central

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2017-01-01

    Dempster-Shafer evidence theory is a primary methodology for multi-source information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multi-evidence system. Within the proposed ECR, we develop a Jaccard matrix game (JMG) to formalize the interaction between propositions in evidences, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well. PMID:26285231
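
    A minimal sketch of the classical Dempster's rule that the ECR is designed to improve upon (the frame, masses, and helper function below are illustrative, not the paper's code); Zadeh's well-known example reproduces the counter-intuitive behavior the abstract mentions:

    ```python
    # Minimal sketch of Dempster's rule of combination for two basic
    # probability assignments (BPAs) over a small frame of discernment.
    # Focal elements are represented as frozensets.

    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two BPAs (dict: frozenset -> mass) with Dempster's rule."""
        combined = {}
        conflict = 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass assigned to the empty set
        if conflict >= 1.0:
            raise ValueError("Total conflict: Dempster's rule is undefined.")
        # Normalize by the non-conflicting mass (1 - K)
        return {a: v / (1.0 - conflict) for a, v in combined.items()}

    # Zadeh-style example that yields a counter-intuitive result:
    A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
    m1 = {A: 0.99, B: 0.01}
    m2 = {C: 0.99, B: 0.01}
    print(dempster_combine(m1, m2))  # essentially all mass ends up on B
    ```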

  7. Evidence Combination From an Evolutionary Game Theory Perspective.

    PubMed

    Deng, Xinyang; Han, Deqiang; Dezert, Jean; Deng, Yong; Shyr, Yu

    2016-09-01

    Dempster-Shafer evidence theory is a primary methodology for multisource information fusion because it is good at dealing with uncertain information. This theory provides Dempster's rule of combination to synthesize multiple evidences from various information sources. However, in some cases, counter-intuitive results may be obtained based on that combination rule. Numerous new or improved methods have been proposed to suppress these counter-intuitive results based on perspectives such as minimizing the information loss or deviation. Inspired by evolutionary game theory, this paper considers a biological and evolutionary perspective to study the combination of evidences. An evolutionary combination rule (ECR) is proposed to help find the most biologically supported proposition in a multievidence system. Within the proposed ECR, we develop a Jaccard matrix game to formalize the interaction between propositions in evidences, and utilize replicator dynamics to mimic the evolution of propositions. Experimental results show that the proposed ECR can effectively suppress the counter-intuitive behaviors that appear in typical paradoxes of evidence theory, compared with many existing methods. Properties of the ECR, such as the solution's stability and convergence, have been mathematically proved as well.
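
    The replicator dynamics invoked above is the standard evolutionary-game-theory equation; in its usual form, the share $x_i$ of proposition $i$ evolves as

    $$\dot{x}_i = x_i\left[f_i(\mathbf{x}) - \bar{f}(\mathbf{x})\right], \qquad \bar{f}(\mathbf{x}) = \sum_j x_j f_j(\mathbf{x}),$$

    where $f_i$ is the fitness of proposition $i$ (in the paper, derived from the Jaccard matrix game; the common matrix-game choice $f_i(\mathbf{x}) = (A\mathbf{x})_i$ for a payoff matrix $A$ is an assumption here) and $\bar{f}$ is the population-average fitness.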

  8. Explaining Academic Progress via Combining Concepts of Integration Theory and Rational Choice Theory.

    ERIC Educational Resources Information Center

    Beekhoven, S.; De Jong, U.; Van Hout, H.

    2002-01-01

    Compared elements of rational choice theory and integration theory on the basis of their power to explain variance in academic progress. Asserts that the concepts should be combined, and the distinction between social and academic integration abandoned. Empirical analysis showed that an extended model, comprising both integration and rational…

  9. Challenges in combining different data sets during analysis when using grounded theory.

    PubMed

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data from interviews with people with diabetes (n=19) and their family members (n=19). Combining two data sets. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory but there is little written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  10. Feasibility of combining linear theory and impact theory methods for the analysis and design of high speed configurations

    NASA Technical Reports Server (NTRS)

    Brooke, D.; Vondrasek, D. V.

    1978-01-01

    The aerodynamic influence coefficients calculated using an existing linear theory program were used to modify the pressures calculated using impact theory. Application of the combined approach to several wing-alone configurations shows that the combined approach gives improved predictions of the local pressure and loadings over either linear theory alone or impact theory alone. The approach not only removes most of the shortcomings of the individual methods, as applied in the Mach 4 to 8 range, but also provides the basis for an inverse design procedure applicable to high speed configurations.
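
    The abstract does not spell out either ingredient, but in this Mach-number range "impact theory" conventionally means Newtonian (or modified Newtonian) impact theory, which estimates the pressure coefficient on a windward panel from its local inclination $\delta$ to the free stream:

    $$C_p = C_{p,\max}\,\sin^{2}\delta, \qquad C_{p,\max} = 2 \ \ \text{(classical Newtonian limit)}.$$

    Linear theory then supplies aerodynamic influence coefficients relating panel loadings to local surface slopes; how the two ingredients are blended is specific to the combined method described in the report.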

  11. Projected Hartree-Fock theory as a polynomial of particle-hole excitations and its combination with variational coupled cluster theory

    NASA Astrophysics Data System (ADS)

    Qiu, Yiheng; Henderson, Thomas M.; Scuseria, Gustavo E.

    2017-05-01

    Projected Hartree-Fock theory provides an accurate description of many kinds of strong correlations but does not properly describe weakly correlated systems. Coupled cluster theory, in contrast, does the opposite. It therefore seems natural to combine the two so as to describe both strong and weak correlations with high accuracy in a relatively black-box manner. Combining the two approaches, however, is made more difficult by the fact that the two techniques are formulated very differently. In earlier work, we showed how to write spin-projected Hartree-Fock in a coupled-cluster-like language. Here, we fill in the gaps in that earlier work. Further, we combine projected Hartree-Fock and coupled cluster theory in a variational formulation and show how the combination performs for the description of the Hubbard Hamiltonian and for several small molecular systems.

  12. Simultaneous control of emission localization and two-photon absorption efficiency in dissymmetrical chromophores

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tretiak, Sergei

    2009-01-01

    The aim of the present work is to demonstrate that combined spectral tuning of fluorescence and two-photon absorption (TPA) properties of multipolar chromophores can be achieved by introduction of slight electronic chemical dissymmetry. In that perspective, two novel series of structurally related chromophores have been designed and studied: a first series based on rod-like quadrupolar chromophores bearing different electron-donating (D) end groups and a second series based on three-branched octupolar chromophores built from a trigonal donating moiety and bearing various acceptor (A) peripheral groups. The influence of the electronic dissymmetry is investigated by combined experimental and theoretical studies of the linear and nonlinear optical properties of dissymmetric chromophores compared to their symmetrical counterparts. In both types of systems (i.e. quadrupoles and octupoles) experiments and theory reveal that excitation is essentially delocalized and that excitation involves synchronized charge redistribution between the different D and A moieties within the multipolar structure (i.e. concerted intramolecular charge transfer). In contrast, the emission stems only from a particular dipolar subunit bearing the strongest D or A moieties, due to fast localization of the excitation prior to emission. Hence control of emission characteristics (polarization and emission spectrum) in addition to localization can be achieved by controlled introduction of electronic dissymmetry (i.e. replacement of one of the D or A end-groups by a slightly stronger D′ or A′ unit). Interestingly, dissymmetrical functionalization of both quadrupolar and octupolar compounds does not lead to significant loss in TPA responses and can even be beneficial due to the spectral broadening and peak position tuning that it allows. This study thus reveals an original molecular engineering strategy allowing major TPA enhancement in multipolar structures due to

  13. Theory of electronically phased coherent beam combination without a reference beam

    NASA Astrophysics Data System (ADS)

    Shay, Thomas M.

    2006-12-01

    The first theory is presented for two novel coherent beam combination architectures, the first electronic beam-combination architectures that completely eliminate the need for a separate reference beam. Detailed theoretical models are developed and presented for the first time.

  14. Atomic Theory and Multiple Combining Proportions: The Search for Whole Number Ratios.

    PubMed

    Usselman, Melvyn C; Brown, Todd A

    2015-04-01

    John Dalton's atomic theory, with its postulate of compound formation through atom-to-atom combination, brought a new perspective to weight relationships in chemical reactions. A presumed one-to-one combination of atoms A and B to form a simple compound AB allowed Dalton to construct his first table of relative atomic weights from literature analyses of appropriate binary compounds. For such simple binary compounds, the atomic theory had little advantage over affinity theory as an explanation of fixed proportions by weight. For ternary compounds of the form AB2, however, atomic theory made quantitative predictions that were not deducible from affinity theory. Atomic theory required that the weight of B in the compound AB2 be exactly twice that in the compound AB. Dalton, Thomas Thomson and William Hyde Wollaston all published, within a few years of each other, experimental data that claimed to give the predicted results with the required accuracy. There are nonetheless several experimental barriers to obtaining the desired integral multiple proportions. In this paper I will discuss replication experiments which demonstrate that only Wollaston's results are experimentally reliable. It is likely that such replicability explains why Wollaston's experiments were so influential.
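
    A modern arithmetic illustration of the AB/AB₂ prediction (carbon oxides are used here purely as an example, not necessarily one of the test cases replicated in the paper): in CO the mass of oxygen combined with one gram of carbon is 16/12 ≈ 1.33 g, while in CO₂ it is 32/12 ≈ 2.67 g, so the two oxygen weights stand in the exact whole-number ratio

    $$\frac{32/12}{16/12} = 2,$$

    which is the kind of integral multiple proportion Dalton, Thomson, and Wollaston sought to verify experimentally.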

  15. Towards a Theory for Combining Information From Related Experiments

    DTIC Science & Technology

    2002-11-11

    Report documentation page (no abstract available). Final report on ARO Contract # DAAD 19-99-1-1082, "Towards a theory for combining information from related...". Author of the report: Francisco J. Samaniego, Principal Investigator. Performing organization: University of... Publications in preparation under the support of ARO Contract # DAAD 19-99-1-1082 include [1] "Linear Data Fusion", Proceedings of the Fourth Army Conference on Applied...

  16. An improved exceedance theory for combined random stresses

    NASA Technical Reports Server (NTRS)

    Lester, H. C.

    1974-01-01

    An extension is presented of Rice's classic solution for the exceedances of a constant level by a single random process to its counterpart for an n-dimensional vector process. An interaction boundary, analogous to the constant level considered by Rice for the one-dimensional case, is assumed in the form of a hypersurface. The theory for the numbers of boundary exceedances is developed by using a joint statistical approach which fully accounts for all cross-correlation effects. An exact expression is derived for the n-dimensional exceedance density function, which is valid for an arbitrary interaction boundary. For application to biaxial states of combined random stress, the general theory is reduced to the two-dimensional case. An elliptical stress interaction boundary is assumed and the exact expression for the density function is presented. The equations are expressed in a format which facilitates calculating the exceedances by numerically evaluating a line integral. The behavior of the density function for the two-dimensional case is briefly discussed.
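
    For orientation, the classic one-dimensional result being extended here is Rice's formula for the expected rate of up-crossings of a constant level $a$ by a stationary random process $X(t)$:

    $$\nu^{+}(a) = \int_{0}^{\infty} \dot{x}\, p_{X\dot{X}}(a, \dot{x})\, d\dot{x},$$

    which for a zero-mean Gaussian process reduces to $\nu^{+}(a) = \frac{1}{2\pi}\,\frac{\sigma_{\dot{X}}}{\sigma_{X}}\, e^{-a^{2}/2\sigma_{X}^{2}}$. The n-dimensional theory described in the abstract replaces the constant level with an interaction hypersurface and the point evaluation with a boundary integral (a line integral in the two-dimensional, biaxial-stress case) over the joint density of the vector process and its derivative.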

  17. Combining Density Functional Theory and Green's Function Theory: Range-Separated, Nonlocal, Dynamic, and Orbital-Dependent Hybrid Functional.

    PubMed

    Kananenka, Alexei A; Zgid, Dominika

    2017-11-14

    We present a rigorous framework which combines single-particle Green's function theory with density functional theory based on a separation of electron-electron interactions into short- and long-range components. Short-range contribution to the total energy and exchange-correlation potential is provided by a density functional approximation, while the long-range contribution is calculated using an explicit many-body Green's function method. Such a hybrid results in a nonlocal, dynamic, and orbital-dependent exchange-correlation functional of a single-particle Green's function. In particular, we present a range-separated hybrid functional called srSVWN5-lrGF2 which combines the local-density approximation and the second-order Green's function theory. We illustrate that similarly to density functional approximations, the new functional is weakly basis-set dependent. Furthermore, it offers an improved description of the short-range dynamic correlation. The many-body contribution to the functional mitigates the many-electron self-interaction error present in many density functional approximations and provides a better description of molecular properties. Additionally, we illustrate that the new functional can be used to scale down the self-energy and, therefore, introduce an additional sparsity to the self-energy matrix that in the future can be exploited in calculations for large molecules or periodic systems.
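
    The abstract does not write out the partition, but the standard device in range-separated hybrids is the error-function split of the Coulomb interaction,

    $$\frac{1}{r_{12}} = \underbrace{\frac{\operatorname{erfc}(\omega r_{12})}{r_{12}}}_{\text{short range}} + \underbrace{\frac{\operatorname{erf}(\omega r_{12})}{r_{12}}}_{\text{long range}},$$

    with range-separation parameter $\omega$; in srSVWN5-lrGF2 the short-range part is treated by the SVWN5 local-density approximation and the long-range part by second-order Green's function theory (GF2), per the abstract. Whether this particular splitting function is the one used is an assumption here.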

  18. Elemental representation and configural mappings: combining elemental and configural theories of associative learning.

    PubMed

    McLaren, I P L; Forrest, C L; McLaren, R P

    2012-09-01

    In this article, we present our first attempt at combining an elemental theory designed to model representation development in an associative system (based on McLaren, Kaye, & Mackintosh, 1989) with a configural theory that models associative learning and memory (McLaren, 1993). After considering the possible advantages of such a combination (and some possible pitfalls), we offer a hybrid model that allows both components to produce the phenomena that they are capable of without introducing unwanted interactions. We then successfully apply the model to a range of phenomena, including latent inhibition, perceptual learning, the Espinet effect, and first- and second-order retrospective revaluation. In some cases, we present new data for comparison with our model's predictions. In all cases, the model replicates the pattern observed in our experimental results. We conclude that this line of development is a promising one for arriving at general theories of associative learning and memory.

  19. Combining symmetry collective states with coupled-cluster theory: Lessons from the Agassi model Hamiltonian

    NASA Astrophysics Data System (ADS)

    Hermes, Matthew R.; Dukelsky, Jorge; Scuseria, Gustavo E.

    2017-06-01

    The failures of single-reference coupled-cluster theory for strongly correlated many-body systems are flagged at the mean-field level by the spontaneous breaking of one or more physical symmetries of the Hamiltonian. Restoring the symmetry of the mean-field determinant by projection reveals that coupled-cluster theory fails because it factorizes high-order excitation amplitudes incorrectly. However, symmetry-projected mean-field wave functions do not account sufficiently for dynamic (or weak) correlation. Here we pursue a merger of symmetry projection and coupled-cluster theory, following previous work along these lines that utilized the simple Lipkin model system as a test bed [J. Chem. Phys. 146, 054110 (2017), 10.1063/1.4974989]. We generalize the concept of a symmetry-projected mean-field wave function to the concept of a symmetry-projected state, in which the factorization of high-order excitation amplitudes in terms of low-order ones is guided by symmetry projection and is not exponential, and combine them with coupled-cluster theory in order to model the ground state of the Agassi Hamiltonian. This model has two separate channels of correlation and two separate physical symmetries which are broken under strong correlation. We show how the combination of symmetry collective states and coupled-cluster theory is effective in obtaining correlation energies and order parameters of the Agassi model throughout its phase diagram.

  20. Information Theory and Voting Based Consensus Clustering for Combining Multiple Clusterings of Chemical Structures.

    PubMed

    Saeed, Faisal; Salim, Naomie; Abdo, Ammar

    2013-07-01

    Many consensus clustering methods have been applied in different areas such as pattern recognition, machine learning, information theory and bioinformatics. However, few methods have been used for clustering chemical compounds. In this paper, an information-theory- and voting-based algorithm (the Adaptive Cumulative Voting-based Aggregation Algorithm, A-CVAA) was examined for combining multiple clusterings of chemical structures. The effectiveness of clusterings was evaluated based on the ability of the clustering method to separate active from inactive molecules in each cluster, and the results were compared with Ward's method. The chemical dataset MDL Drug Data Report (MDDR) and the Maximum Unbiased Validation (MUV) dataset were used. Experiments suggest that the adaptive cumulative voting-based consensus method can improve the effectiveness of combining multiple clusterings of chemical structures. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. A combinational theory for maintenance of sex

    PubMed Central

    Hörandl, E

    2010-01-01

    Sexual reproduction implies high costs, but it is difficult to give evidence for evolutionary advantages that would explain the predominance of meiotic sex in eukaryotes. A combinational theory discussing evolution, maintenance and loss of sex may resolve the problem. The main function of sex is the restoration of DNA and consequently a higher quality of offspring. Recombination at meiosis evolved, perhaps, as a repair mechanism for DNA strand damage. This mechanism is most efficient for DNA restoration in multicellular eukaryotes, because the initial cell starts with a re-optimized genome, which is passed to all the daughter cells. Meiosis also acts as a creator of variation in haploid stages, in which selection can most efficiently purge deleterious mutations. A prolonged diploid phase buffers the effects of deleterious recessive alleles as well as epigenetic defects and is thus optimal for prolonged growth periods. For complex multicellular organisms, the main advantage of sexuality is thus the alternation of diploid and haploid stages, combining advantages of both. A loss of sex is constrained by several, partly group-specific, developmental features. Hybridization may trigger shifts from sexual to asexual reproduction, but crossing barriers of the parental sexual species limit this process. For the concerted break-up of meiosis-outcrossing cycles plus silencing of secondary features, various group-specific changes in the regulatory system may be required. The establishment of asexuals requires special functional modifications and environmental opportunities. Costs for the maintenance of meiotic sex are consequently lower than those of a shift to asexual reproduction. PMID:19623209

  2. Analysing collaboration among HIV agencies through combining network theory and relational coordination.

    PubMed

    Khosla, Nidhi; Marsteller, Jill Ann; Hsu, Yea Jen; Elliott, David L

    2016-02-01

    Agencies with different foci (e.g. nutrition, social, medical, housing) serve people living with HIV (PLHIV). Serving the needs of PLHIV comprehensively requires a high degree of coordination among agencies, which often benefits from more frequent communication. We combined Social Network theory and Relational Coordination theory to study coordination among HIV agencies in Baltimore. Social Network theory implies that actors (e.g., HIV agencies) establish linkages amongst themselves in order to access resources (e.g., information). Relational Coordination theory suggests that high-quality coordination among agencies or teams relies on the seven dimensions of frequency, timeliness and accuracy of communication, problem-solving communication, knowledge of agencies' work, mutual respect and shared goals. We collected data on frequency of contact from 57 agencies using a roster method. Response options were ordinal, ranging from 'not at all' to 'daily'. We analyzed data using social network measures. Next, we selected agencies with which at least one-third of the sample reported monthly or more frequent interaction. This yielded 11 agencies that we surveyed on the seven relational coordination dimensions, with questions scored on a Likert scale of 1-5. Network density, defined as the proportion of existing connections to all possible connections, was 20% when considering monthly or higher interaction. Relational coordination scores from individual agencies to others ranged between 1.17 and 5.00 (maximum possible score 5). The average scores for different dimensions across all agencies ranged between 3.30 and 4.00. Shared goals (4.00) and mutual respect (3.91) scores were highest, while scores such as knowledge of each other's work and problem-solving communication were relatively lower. Combining theoretically driven analyses in this manner offers an innovative way to provide a comprehensive picture of inter-agency coordination and the quality of exchange that underlies
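
    A minimal sketch of the density computation described above, assuming a directed contact roster thresholded at "monthly or more frequent" (the agency names and threshold handling here are hypothetical, not the study's code):

    ```python
    # Illustration of network density from a roster survey.
    # Density = existing ties / possible ties; for a directed network with
    # n agencies, possible ties = n * (n - 1).

    def network_density(ties, n_agencies):
        """ties: set of (reporting_agency, partner_agency) pairs whose
        reported contact frequency meets the chosen threshold."""
        possible = n_agencies * (n_agencies - 1)
        return len(ties) / possible

    # Toy example with 4 agencies and 3 reported monthly-or-more ties:
    ties = {("A", "B"), ("B", "C"), ("C", "A")}
    print(network_density(ties, 4))  # 0.25
    ```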

  3. Analysis of surface segregation in polymer mixtures: A combination of mean field and statistical associated fluid theories

    NASA Astrophysics Data System (ADS)

    Krawczyk, Jaroslaw; Croce, Salvatore; Chakrabarti, Buddhapriya; Tasche, Jos

    The surface segregation in polymer mixtures remains a challenging problem for both academic exploration and industrial applications. Despite its ubiquity and several theoretical attempts, good agreement between computed and experimentally observed profiles has not yet been achieved. A simple theoretical model proposed in this context by Schmidt and Binder combines the Flory-Huggins free energy of mixing with the square-gradient theory of wetting of a wall by a fluid. While the theory gives a qualitative understanding of surface-induced segregation and surface enrichment, it lacks quantitative agreement with experiment. The statistical associating fluid theory (SAFT) allows us to calculate accurate free energies for real polymeric materials. In earlier work we had shown, using Schmidt-Binder and self-consistent field theories, that increasing the bulk modulus of a polymer matrix through which small molecules migrate to the free surface reduces the surface migrant fraction. In this work we validate this idea by combining mean field theories and SAFT to identify parameter ranges where such an effect should be observable.
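
    For context, the Schmidt-Binder-type construction referred to above combines a square-gradient surface term with the bulk Flory-Huggins free energy of mixing, whose standard form per lattice site is

    $$\frac{f_{\mathrm{FH}}(\phi)}{k_B T} = \frac{\phi}{N_A}\ln\phi + \frac{1-\phi}{N_B}\ln(1-\phi) + \chi\,\phi(1-\phi),$$

    where $\phi$ is the volume fraction of one component, $N_A$ and $N_B$ are polymerization indices, and $\chi$ is the interaction parameter (generic notation, not values from this work); SAFT replaces this simple bulk free energy with a more accurate equation of state.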

  4. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    PubMed

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of

  5. Exploring the Combination of Dempster-Shafer Theory and Neural Network for Predicting Trust and Distrust

    PubMed Central

    Wang, Xin; Wang, Ying; Sun, Hongbin

    2016-01-01

    In social media, trust and distrust among users are important factors in helping users make decisions, dissect information, and receive recommendations. However, the sparsity and imbalance of social relations bring great difficulties and challenges in predicting trust and distrust. Meanwhile, there are numerous inducing factors that determine trust and distrust relations. The relationship among these inducing factors may be one of dependency, independence, or conflict. Dempster-Shafer theory and neural networks are effective and efficient strategies to deal with these difficulties and challenges. In this paper, we study trust and distrust prediction based on the combination of Dempster-Shafer theory and neural networks. We first analyze the inducing factors of trust and distrust, namely homophily, status theory, and emotion tendency. Then, we quantify the inducing factors of trust and distrust, take these features as evidences, and construct evidence prototypes as input nodes of a multilayer neural network. Finally, we propose a framework for predicting trust and distrust which uses a multilayer neural network to model the implementation of Dempster-Shafer theory in different hidden layers, aiming to overcome the disadvantage that Dempster-Shafer theory lacks an optimization method. Experimental results on a real-world dataset demonstrate the effectiveness of the proposed framework. PMID:27034651

  6. Combined theory of reflectance and emittance spectroscopy

    NASA Technical Reports Server (NTRS)

    Hapke, Bruce

    1995-01-01

    The case in which reflected sunlight, thermally emitted radiation, or both contribute to the power received by a detector viewing a particulate medium, such as a powder in the laboratory or a planetary regolith, is treated theoretically. This theory is of considerable interest for the interpretation of data from field or spacecraft instruments that are sensitive to the near-infrared region of the spectrum, such as NIMS (near-infrared mapping spectrometer) and VIMS (visual and infrared mapping spectrometer), as well as thermal infrared detectors.

  7. A novel method for multifactorial bio-chemical experiments design based on combinational design theory.

    PubMed

    Wang, Xun; Sun, Beibei; Liu, Boyang; Fu, Yaping; Zheng, Pan

    2017-01-01

    Experimental design focuses on describing or explaining the multifactorial interactions that are hypothesized to reflect the variation. The design introduces conditions that may directly affect the variation, where particular conditions are purposely selected for observation. Combinatorial design theory deals with the existence, construction and properties of systems of finite sets whose arrangements satisfy generalized concepts of balance and/or symmetry. In this work, borrowing the concept of "balance" from combinatorial design theory, a novel method for multifactorial biochemical experiment design is proposed, where balanced templates in combinatorial design are used to select the conditions for observation. Balanced experimental data that cover all the influencing factors of the experiments can be obtained for further processing, such as training sets for machine learning models. Finally, a software tool based on the proposed method is developed for designing experiments in which the influencing factors are covered a certain number of times.
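
    As a toy illustration of a "balanced" template (the paper's actual templates are not given in the abstract), a cyclic Latin square assigns each of n factor levels exactly once to every row and column:

    ```python
    # Toy balanced template: a cyclic n x n Latin square. Each level
    # appears exactly once per row and per column, a simple instance of
    # the "balance" borrowed from combinatorial design theory.

    def cyclic_latin_square(n):
        return [[(i + j) % n for j in range(n)] for i in range(n)]

    for row in cyclic_latin_square(4):
        print(row)
    # [0, 1, 2, 3]
    # [1, 2, 3, 0]
    # [2, 3, 0, 1]
    # [3, 0, 1, 2]
    ```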

  8. Combining evidence and diffusion of innovation theory to enhance influenza immunization.

    PubMed

    Britto, Maria T; Pandzik, Geralyn M; Meeks, Connie S; Kotagal, Uma R

    2006-08-01

    Children and adolescents with chronic conditions such as asthma, diabetes, and HIV are at high risk of influenza-related morbidity, and there are recommendations to immunize these populations annually. At Cincinnati Children's Hospital Medical Center, the influenza immunization rate increased to 90.4% (5% declined) among 200 patients with cystic fibrosis (CF). Diffusion of innovation theory was used to guide the design and implementation of spread to other clinics. The main intervention strategies were: (1) engagement of interested, nurse-led teams, (2) a collaborative learning session, (3) a tool kit including literature, sample goals, reminder postcards, communication strategies, and team member roles and processes, (4) open-access scheduling and standing orders, (5) a simple Web-based registry, (6) facilitated vaccine ordering, (7) recall phone calls, and (8) weekly results posting. Clinic-specific immunization rates ranged from 32.7% to 92.8%, with the highest rate reported in the CF clinic. All teams used multiple strategies, with six of the seven using four or more. Overall, 60.0% (762/1,269) of the population was immunized. Barriers included vaccine shortages, lack of time for reminder calls, and lack of physician support in one clinic. A combination of interventions, guided by evidence and diffusion of innovation theory, led to immunization rates higher than those reported in the literature.

  9. Baryon chiral perturbation theory combined with the 1/Nc expansion in SU(3): Framework

    NASA Astrophysics Data System (ADS)

    Fernando, I. P.; Goity, J. L.

    2018-03-01

    Baryon chiral perturbation theory combined with the 1/Nc expansion is implemented for three flavors. Baryon masses, vector charges and axial vector couplings are studied to one loop and organized according to the ξ-expansion, in which the 1/Nc and the low-energy power countings are linked according to 1/Nc = O(ξ) = O(p). The renormalization to O(ξ³) necessary for the mentioned observables is provided, along with applications to the baryon masses and axial couplings as obtained in lattice QCD calculations.

  10. Characterizing representational learning: A combined simulation and tutorial on perturbation theory

    NASA Astrophysics Data System (ADS)

    Kohnle, Antje; Passante, Gina

    2017-12-01

    Analyzing, constructing, and translating between graphical, pictorial, and mathematical representations of physics ideas and reasoning flexibly through them ("representational competence") is a key characteristic of expertise in physics but is a challenge for learners to develop. Interactive computer simulations and University of Washington style tutorials both have affordances to support representational learning. This article describes work to characterize students' spontaneous use of representations before and after working with a combined simulation and tutorial on first-order energy corrections in the context of quantum-mechanical time-independent perturbation theory. Data were collected from two institutions using pre-, mid-, and post-tests to assess short- and long-term gains. A representational competence level framework was adapted to devise level descriptors for the assessment items. The results indicate an increase in the number of representations used by students and the consistency between them following the combined simulation tutorial. The distributions of representational competence levels suggest a shift from perceptual to semantic use of representations based on their underlying meaning. In terms of activity design, this study illustrates the need to support students in making sense of the representations shown in a simulation and in learning to choose the most appropriate representation for a given task. In terms of characterizing representational abilities, this study illustrates the usefulness of a framework focusing on perceptual, syntactic, and semantic use of representations.
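
    For reference, the first-order energy correction targeted by the simulation and tutorial is the standard result of time-independent perturbation theory,

    $$E_n^{(1)} = \langle \psi_n^{(0)} | \hat{H}' | \psi_n^{(0)} \rangle,$$

    where $\hat{H}'$ is the perturbation and $\psi_n^{(0)}$ the unperturbed eigenstate; the representational task is to connect this expression to its graphical and pictorial counterparts.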

  11. Economic theory and nursing administration research--is this a good combination?

    PubMed

    Jones, Terry L; Yoder, Linda

    2010-01-01

    Economic theory is used to describe and explain decision making in the context of scarce resources. This paper presents two applications of economic theory to the delivery of nursing services in acute care hospitals and evaluates its usefulness in guiding nursing administration research. The description of economic theory and the proposed applications for nursing are based on current nursing, healthcare, and economic literature. Evaluation of the potential usefulness of economic theory in guiding nursing administration research is based on the criteria of significance and testability as described by Fawcett and Downs. While economic theory can be very useful in explaining how decisions about nursing time allocation and nursing care production are made, it will not address the issue of how they should be made. Normative theories and ethical frameworks also must be incorporated in the decision-making process around these issues. Economic theory and nursing administration are a good fit when balanced with the values and goals of nursing.

  12. Economic Theory and Nursing Administration Research—Is This a Good Combination?

    PubMed Central

    Jones, Terry L.; Yoder, Linda

    2017-01-01

    TOPIC: Economic theory is used to describe and explain decision making in the context of scarce resources. PURPOSE: This paper presents two applications of economic theory to the delivery of nursing services in acute care hospitals and evaluates its usefulness in guiding nursing administration research. SOURCES OF INFORMATION: The description of economic theory and the proposed applications for nursing are based on current nursing, healthcare, and economic literature. Evaluation of the potential usefulness of economic theory in guiding nursing administration research is based on the criteria of significance and testability as described by Fawcett and Downs. CONCLUSIONS: While economic theory can be very useful in explaining how decisions about nursing time allocation and nursing care production are made, it will not address the issue of how they should be made. Normative theories and ethical frameworks also must be incorporated in the decision-making process around these issues. Economic theory and nursing administration are a good fit when balanced with the values and goals of nursing. PMID:20137023

  13. Blade mistuning coupled with shaft flexibility effects in rotor aeroelasticity

    NASA Technical Reports Server (NTRS)

    Khader, Naim; Loewy, Robert G.

    1989-01-01

    The effect of bladed-disk polar dissymmetry, resulting from variations in mass from one blade to another, on aeroelastic stability boundaries for a fan stage is presented. In addition to both in-plane and out-of-plane deformations of the bladed-disk, bending of the supporting shaft in two planes is considered, and the resulting Coriolis forces and gyroscopic moments are included in the analysis. A quasi-steady aerodynamics approach is combined with the Lagrangian method to develop the governing equations of motion for the flexible bladed-disk-shaft assembly. Calculations are performed for an actual fan stage.

  14. Baryon chiral perturbation theory combined with the 1/Nc expansion in SU(3): Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fernando, I. P.; Goity, J. L.

    Baryon chiral perturbation theory combined with the 1/Nc expansion is implemented for three flavors. Here, baryon masses, vector charges and axial vector couplings are studied to one loop and organized according to the ξ-expansion, in which the 1/Nc and the low-energy power countings are linked according to 1/Nc = O(ξ) = O(p). The renormalization to O(ξ³) necessary for the mentioned observables is provided, along with applications to the baryon masses and axial couplings as obtained in lattice QCD calculations.

  15. Baryon chiral perturbation theory combined with the 1/Nc expansion in SU(3): Framework

    DOE PAGES

    Fernando, I. P.; Goity, J. L.

    2018-03-14

    Baryon chiral perturbation theory combined with the 1/Nc expansion is implemented for three flavors. Here, baryon masses, vector charges and axial vector couplings are studied to one loop and organized according to the ξ-expansion, in which the 1/Nc and the low-energy power countings are linked according to 1/Nc = O(ξ) = O(p). The renormalization to O(ξ³) necessary for the mentioned observables is provided, along with applications to the baryon masses and axial couplings as obtained in lattice QCD calculations.

  16. A multimedia adult literacy program: Combining NASA technology, instructional design theory, and authentic literacy concepts

    NASA Technical Reports Server (NTRS)

    Willis, Jerry W.

    1993-01-01

    For a number of years, the Software Technology Branch of the Information Systems Directorate has been involved in the application of cutting edge hardware and software technologies to instructional tasks related to NASA projects. The branch has developed intelligent computer aided training shells, instructional applications of virtual reality and multimedia, and computer-based instructional packages that use fuzzy logic for both instructional and diagnostic decision making. One outcome of the work on space-related technology-supported instruction has been the creation of a significant pool of human talent in the branch with current expertise on the cutting edges of instructional technologies. When the human talent is combined with advanced technologies for graphics, sound, video, CD-ROM, and high speed computing, the result is a powerful research and development group that both contributes to the applied foundations of instructional technology and creates effective instructional packages that take advantage of a range of advanced technologies. Several branch projects are currently underway that combine NASA-developed expertise to significant instructional problems in public education. The branch, for example, has developed intelligent computer aided software to help high school students learn physics and staff are currently working on a project to produce educational software for young children with language deficits. This report deals with another project, the adult literacy tutor. Unfortunately, while there are a number of computer-based instructional packages available for adult literacy instruction, most of them are based on the same instructional models that failed these students when they were in school. The teacher-centered, discrete skill and drill-oriented, instructional strategies, even when they are supported by color computer graphics and animation, that form the foundation for most of the computer-based literacy packages currently on the market may not

  17. Universal prescriptivism: traditional moral decision-making theory revisited.

    PubMed

    Crigger, N J

    1994-09-01

    Universal prescriptivism is a recently developed moral decision-making theory that combines utilitarian and Kantian theories with two levels of moral thinking. A combined approach offers a creative solution to the weaknesses inherent in traditional moral theories. The paper describes the theory and discusses important implications for nursing education, practical ethical decision-making, and research. The relationship of an ethical theory of caring to traditional moral theory is discussed.

  18. Combined Molecular Dynamics Simulation-Molecular-Thermodynamic Theory Framework for Predicting Surface Tensions.

    PubMed

    Sresht, Vishnu; Lewandowski, Eric P; Blankschtein, Daniel; Jusufi, Arben

    2017-08-22

    A molecular modeling approach is presented with a focus on quantitative predictions of the surface tension of aqueous surfactant solutions. The approach combines classical Molecular Dynamics (MD) simulations with a molecular-thermodynamic theory (MTT) [Y. J. Nikas, S. Puvvada, D. Blankschtein, Langmuir 1992, 8, 2680]. The MD component is used to calculate thermodynamic and molecular parameters that are needed in the MTT model to determine the surface tension isotherm. The MD/MTT approach provides the important link between the surfactant bulk concentration, the experimental control parameter, and the surfactant surface concentration, the MD control parameter. We demonstrate the capability of the MD/MTT modeling approach on nonionic alkyl polyethylene glycol surfactants at the air-water interface and observe reasonable agreement of the predicted surface tensions and the experimental surface tension data over a wide range of surfactant concentrations below the critical micelle concentration. Our modeling approach can be extended to ionic surfactants and their mixtures with both ionic and nonionic surfactants at liquid-liquid interfaces.
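
    The surface tension isotherm mentioned above links the bulk concentration (the experimental control parameter) to the surface concentration (the MD control parameter). A standard thermodynamic statement of that link, though not necessarily the specific MTT expression used by the authors, is the Gibbs adsorption equation for a dilute nonionic surfactant:

    $$\Gamma = -\frac{1}{RT}\left(\frac{\partial \gamma}{\partial \ln c}\right)_T,$$

    where $\Gamma$ is the surface excess concentration, $\gamma$ the surface tension, and $c$ the bulk surfactant concentration.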

  19. Aerodynamic characteristics of wings designed with a combined-theory method to cruise at a Mach number of 4.5

    NASA Technical Reports Server (NTRS)

    Mack, Robert J.

    1988-01-01

    A wind-tunnel study was conducted to determine the capability of a method combining linear theory and shock-expansion theory to design optimum camber surfaces for wings that will fly at high-supersonic/low-hypersonic speeds. Three force models (a flat-plate reference wing and two cambered and twisted wings) were used to obtain aerodynamic lift, drag, and pitching-moment data. A fourth pressure-orifice model was used to obtain surface-pressure data. All four wing models had the same planform, airfoil section, and centerbody area distribution. The design Mach number was 4.5, but data were also obtained at Mach numbers of 3.5 and 4.0. Results of these tests indicated that the use of airfoil thickness as a theoretical optimum camber-surface design constraint did not improve the aerodynamic efficiency or performance of a wing as compared with a wing that was designed with a zero-thickness airfoil (linear-theory) constraint.

  20. Graph-based linear scaling electronic structure theory.

    PubMed

    Niklasson, Anders M N; Mniszewski, Susan M; Negre, Christian F A; Cawkwell, Marc J; Swart, Pieter J; Mohd-Yusof, Jamal; Germann, Timothy C; Wall, Michael E; Bock, Nicolas; Rubensson, Emanuel H; Djidjev, Hristo

    2016-06-21

    We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.

  1. Graph-based linear scaling electronic structure theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niklasson, Anders M. N., E-mail: amn@lanl.gov; Negre, Christian F. A.; Cawkwell, Marc J.

    2016-06-21

    We show how graph theory can be combined with quantum theory to calculate the electronic structure of large complex systems. The graph formalism is general and applicable to a broad range of electronic structure methods and materials, including challenging systems such as biomolecules. The methodology combines well-controlled accuracy, low computational cost, and natural low-communication parallelism. This combination addresses substantial shortcomings of linear scaling electronic structure theory, in particular with respect to quantum-based molecular dynamics simulations.

  2. Combining motivational and volitional interventions to promote exercise participation: protection motivation theory and implementation intentions.

    PubMed

    Milne, Sarah; Orbell, Sheina; Sheeran, Paschal

    2002-05-01

    This study compared a motivational intervention based on protection motivation theory (PMT, Rogers, 1975, 1983) with the same motivational intervention augmented by a volitional intervention based on implementation intentions (Gollwitzer, 1993). The study had a longitudinal design, involving three waves of data collection over a 2-week period, incorporating an experimental manipulation of PMT variables at Time 1 and a volitional, implementation intention intervention at Time 2. Participants (N=248) were randomly allocated to a control group or one of two intervention groups. Cognitions and exercise behaviour were measured at three time-points over a 2-week period. The motivational intervention significantly increased threat and coping appraisal and intentions to engage in exercise but did not bring about a significant increase in subsequent exercise behaviour. In contrast, the combined protection motivation theory/implementation intention intervention had a dramatic effect on subsequent exercise behaviour. This volitional intervention did not influence behavioural intention or any other motivational variables. It is concluded that supplementing PMT with implementation intentions strengthens the ability of the model to explain behaviour. This has implications for health education programmes, which should aim to increase both participants' motivation and their volition.

  3. Symmetric tops in combined electric fields: Conditional quasisolvability via the quantum Hamilton-Jacobi theory

    NASA Astrophysics Data System (ADS)

    Schatz, Konrad; Friedrich, Bretislav; Becker, Simon; Schmidt, Burkhard

    2018-05-01

    We make use of the quantum Hamilton-Jacobi (QHJ) theory to investigate conditional quasisolvability of the quantum symmetric top subject to combined electric fields (symmetric top pendulum). We derive the conditions of quasisolvability of the time-independent Schrödinger equation as well as the corresponding finite sets of exact analytic solutions. We do so for this prototypical trigonometric system as well as for its anti-isospectral hyperbolic counterpart. An examination of the algebraic and numerical spectra of these two systems reveals mutually closely related patterns. The QHJ approach allows us to retrieve the closed-form solutions for the spherical and planar pendula and the Razavy system that had been obtained in our earlier work via supersymmetric quantum mechanics as well as to find a cornucopia of additional exact analytic solutions.
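
    For orientation, the quantum Hamilton-Jacobi equation underlying the QHJ approach follows from substituting $\psi = e^{iS/\hbar}$ into the time-independent Schrödinger equation; in one dimension, with the quantum momentum function $p(x) = S'(x) = -i\hbar\,\psi'(x)/\psi(x)$, it reads

    $$\frac{p^{2}(x)}{2m} + V(x) - E = \frac{i\hbar}{2m}\,\frac{dp}{dx}.$$

    Quasisolvability conditions are then, roughly, statements about when $p(x)$, and hence a finite set of eigenfunctions, can be obtained in closed form for the potential at hand.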

  4. Combining Theory and Practice in the Gymnasium: "Constraints" within an Ecological Perspective

    ERIC Educational Resources Information Center

    Gagen, Linda; Getchell, Nancy

    2004-01-01

    Preservice students do not always see the relationship between the theories they learn in motor development class and the practical applications of those theories in the gymnasium. This article begins to bridge the gap between theory and practice. Within the theoretical viewpoint known as the ecological perspective, the authors identify the…

  5. Combination of uncertainty theories and decision-aiding methods for natural risk management in a context of imperfect information

    NASA Astrophysics Data System (ADS)

    Tacnet, Jean-Marc; Dupouy, Guillaume; Carladous, Simon; Dezert, Jean; Batton-Hubert, Mireille

    2017-04-01

    In mountain areas, natural phenomena such as snow avalanches, debris flows and rock falls put people and objects at risk, with sometimes dramatic consequences. Risk is classically considered as a combination of hazard (the combination of the intensity and frequency of the phenomenon) and vulnerability, which corresponds to the consequences of the phenomenon for exposed people and material assets. Risk management consists in identifying the risk level as well as choosing the best strategies for risk prevention, i.e. mitigation. In the context of natural phenomena in mountainous areas, technical and scientific knowledge is often lacking. Risk management decisions are therefore based on imperfect information. This information comes from more or less reliable sources: historical data, expert assessments, numerical simulations, etc. Finally, risk management decisions are the result of complex knowledge management and reasoning processes. Tracing the information and propagating information quality from data acquisition to decisions are therefore important steps in the decision-making process. One major goal today is therefore to assist decision-making while considering the availability, quality and reliability of information content and sources. A global integrated framework is proposed to improve the risk management process in a context of information imperfection provided by more or less reliable sources: uncertainty as well as imprecision, inconsistency and incompleteness are considered. Several methods are used and associated in an original way: sequential decision context description, development of specific multi-criteria decision-making methods, imperfection propagation in numerical modeling and information fusion. This framework not only assists in decision-making but also traces the process and evaluates the impact of information quality on decision-making. We focus on and present two main developments. The first one relates to uncertainty and imprecision

  6. Generalizing Prototype Theory: A Formal Quantum Framework

    PubMed Central

    Aerts, Diederik; Broekaert, Jan; Gabora, Liane; Sozzo, Sandro

    2016-01-01

    Theories of natural language and concepts have been unable to model the flexibility, creativity, context-dependence, and emergence exhibited by words, concepts and their combinations. The mathematical formalism of quantum theory has instead been successful in capturing phenomena such as graded membership, situational meaning, composition of categories, and also more complex decision-making situations, which cannot be modeled with traditional probabilistic approaches. We show how a formal quantum approach to concepts and their combinations can provide a powerful extension of prototype theory. We explain how prototypes can interfere in conceptual combinations as a consequence of their contextual interactions, and provide an illustration of this using an intuitive wave-like diagram. This quantum-conceptual approach gives new life to original prototype theory, without however making it a privileged concept theory, as we explain at the end of our paper. PMID:27065436

  7. Recent advances in combination of capillary electrophoresis with mass spectrometry: methodology and theory.

    PubMed

    Klepárník, Karel

    2015-01-01

    This review focuses on the latest developments in microseparation electromigration methods in capillaries and microfluidic devices with MS detection and identification. A wide selection of 183 relevant articles covers the literature published from June 2012 to May 2014 as a continuation of the review article on the same topic by Kleparnik [Electrophoresis 2013, 34, 70-86]. Special attention is paid to the new improvements in the theory of instrumentation and methodology of MS interfacing with capillary versions of zone electrophoresis, ITP, and IEF. Ionization methods in MS include ESI, MALDI, and ICP. Although the main attention is paid to the development of instrumentation and methodology, representative examples also illustrate applications in proteomics, glycomics, metabolomics, biomarker research, forensics, pharmacology, food analysis, and single-cell analysis. The combinations of MS with capillary versions of electrochromatography and micellar electrokinetic chromatography are not included. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Some considerations concerning the theory of combined toxicity: a case study of subchronic experimental intoxication with cadmium and lead.

    PubMed

    Varaksin, Anatoly N; Katsnelson, Boris A; Panov, Vladimir G; Privalova, Larisa I; Kireyeva, Ekaterina P; Valamina, Irene E; Beresneva, Olga Yu

    2014-02-01

    Rats were exposed intraperitoneally (3 times a week, up to 20 injections) to cadmium and lead salts, either separately in doses equivalent to 0.05 LD50 or combined in the same or halved doses. Toxic effects were assessed by more than 40 functional, biochemical and morphometric indices. We analysed the results obtained, aiming to determine the type of combined toxicity using either common-sense considerations based on descriptive statistics or two mathematical models based (a) on ANOVA and (b) on the Mathematical Theory of Experimental Design, which correspond, respectively, to the widely recognised paradigms of effect additivity and dose additivity. Nevertheless, these approaches led us unanimously to the following conclusions: (1) the above paradigms are virtually interchangeable and should be regarded as different methods of modelling combined toxicity rather than as reflecting fundamentally different processes; (2) within both models there exist not merely the three traditionally used types of combined toxicity (additivity, subadditivity and superadditivity) but at least 10 variants of it, depending on exactly which effect is considered and on its level, as well as on the dose levels and their ratio. Copyright © 2013 Elsevier Ltd. All rights reserved.
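
    For orientation, the two additivity paradigms named in the abstract are usually written as follows (reference forms with my own symbols, not the authors' specific models): dose additivity (Loewe) and effect, or response, additivity.

        \[
        \underbrace{\frac{d_{1}}{D_{1}(E)}+\frac{d_{2}}{D_{2}(E)}=1}_{\text{dose additivity (Loewe)}}
        \qquad\qquad
        \underbrace{E_{12}=E_{1}+E_{2}}_{\text{effect (response) additivity}}
        \]

    Here d_i are the doses applied in combination, D_i(E) is the dose of agent i alone that produces the same effect E, and E_i are the single-agent effects (for fractional responses the Bliss-independence variant E_12 = E_1 + E_2 - E_1 E_2 is often used instead).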

  9. School Psychology Research: Combining Ecological Theory and Prevention Science

    ERIC Educational Resources Information Center

    Burns, Matthew K.

    2011-01-01

    The current article comments on the importance of theoretical implications within school psychological research, and proposes that ecological theory and prevention science could provide the conceptual framework for school psychology research and practice. Articles published in "School Psychology Review" should at least discuss potential…

  10. The effect of an intervention combining self-efficacy theory and pedometers on promoting physical activity among adolescents.

    PubMed

    Lee, Ling-Ling; Kuo, Yu-Chi; Fanaw, Dilw; Perng, Shoa-Jen; Juang, Ian-Fei

    2012-04-01

    To study the effect of an intervention combining self-efficacy theory and pedometers on promoting physical activity among adolescents. The beneficial effects of regular physical activity on health in youths are well-documented. However, adolescence is found to be the age of greatest decline in physical activity participation. Physical activity participation among girls was generally less frequent and less intense than boys. Therefore, there is a strong need for effective interventions that can help promote physical activity in this population. An experimental design. Two classes of female junior college students (mean age = 16) were randomly sampled from a total of four classes and, of those, one each was randomly assigned to either the intervention (n = 46) or the control group (n = 48). Self-efficacy was used as a core theoretical foundation of the intervention design, and pedometers were provided to the students in the intervention group. Distances between each domestic scenic spot were illustrated graphically in a walking log for students to mark the extent of their walking or running. Students in the control group participated in a usual physical education programme. The primary outcome was a change in the number of aerobic steps. The secondary outcomes were changes in cardiopulmonary endurance and exercise self-efficacy. At 12-week follow-up, the mean change in aerobic steps was 371 steps and 108 steps in the intervention and control group, respectively. The difference in mean change between the two groups was 467 steps. Effects of the intervention on changes of cardiopulmonary endurance and perceived exercise self-efficacy scores were not found. Among adolescent girls, a 12-week intervention designed on the theoretical foundation of self-efficacy theory and provision of pedometers was found to have an effect on increasing their physical activity. The intervention, using graphs of domestic scenic spots to represent the distance of walking or running as

  11. E(lementary)-strings in six-dimensional heterotic F-theory

    NASA Astrophysics Data System (ADS)

    Choi, Kang-Sin; Rey, Soo-Jong

    2017-09-01

    Using E-strings, we can analyze not only six-dimensional superconformal field theories but also probe vacua of the non-perturbative heterotic string. We study strings made of D3-branes wrapped on various two-cycles in the global F-theory setup. We claim that E-strings are elementary in the sense that various combinations of E-strings can form M-strings as well as heterotic strings and a new kind of strings, called G-strings. Using them, we show that emissions and combinations of heterotic small instantons generate most of the known six-dimensional superconformal theories, their affinizations and little string theories. Taking account of the global structure of the compact internal geometry, we also show that special combinations of E-strings play an important role in constructing six-dimensional theories of D- and E-types. We check global consistency conditions from anomaly cancellation conditions, both from five-branes and strings, and show that they are given in terms of elementary E-string combinations.

  12. Stochastic generation of complex crystal structures combining group and graph theory with application to carbon

    NASA Astrophysics Data System (ADS)

    Shi, Xizhi; He, Chaoyu; Pickard, Chris J.; Tang, Chao; Zhong, Jianxin

    2018-01-01

    A method is introduced to stochastically generate crystal structures with defined structural characteristics. Reasonable quotient graphs for symmetric crystals are constructed using a random strategy combined with space group and graph theory. Our algorithm enables the search for large and complex crystal structures with a specified connectivity, such as threefold sp2 carbon, fourfold sp3 carbon, as well as mixed sp2-sp3 carbon. To demonstrate the method, we randomly construct initial structures adhering to space groups 75 to 230 and a range of lattice constants, and we identify 281 new sp3 carbon crystals. First-principles optimization of these structures shows that most of them are dynamically and mechanically stable and are energetically comparable to those previously proposed. Some of the new structures can be considered candidates to explain the experimental cold compression of graphite.
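
    A loose, heavily simplified analogue of the generation step (the real method builds symmetric quotient graphs constrained by a chosen space group; the sketch below ignores symmetry entirely and only enforces connectivity):

        # Toy proxy for "structures with a specified connectivity": draw random
        # 4-regular graphs as crude sp3 bonding-network candidates and keep the
        # connected ones.  Parameters are illustrative, not from the paper.
        import random
        import networkx as nx

        def random_sp3_candidate(n_atoms, seed=None):
            """Random connected 4-regular graph (every vertex has 4 bonds)."""
            rng = random.Random(seed)
            while True:
                g = nx.random_regular_graph(4, n_atoms, seed=rng.randint(0, 10**9))
                if nx.is_connected(g):
                    return g

        g = random_sp3_candidate(16, seed=1)
        print(g.number_of_nodes(), g.number_of_edges())   # 16 nodes, 32 edges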

  13. Combined Diffraction and Density Functional Theory Calculations of Halogen-Bonded Cocrystal Monolayers

    PubMed Central

    2013-01-01

    This work describes the combined use of synchrotron X-ray diffraction and density functional theory (DFT) calculations to understand the cocrystal formation or phase separation in 2D monolayers capable of halogen bonding. The solid monolayer structure of 1,4-diiodobenzene (DIB) has been determined by X-ray synchrotron diffraction. The mixing behavior of DIB with 4,4′-bipyridyl (BPY) has also been studied and interestingly is found to phase-separate rather than form a cocrystal, as observed in the bulk. DFT calculations are used to establish the underlying origin of this interesting behavior. The DFT calculations are demonstrated to agree well with the recently proposed monolayer structure for the cocrystal of BPY and 1,4-diiodotetrafluorobenzene (DITFB) (the perfluorinated analogue of DIB), where halogen bonding has also been identified by diffraction. Here we have calculated an estimate of the halogen bond strength by DFT calculations for the DITFB/BPY cocrystal monolayer, which is found to be ∼20 kJ/mol. Computationally, we find that the nonfluorinated DIB and BPY are not expected to form a halogen-bonded cocrystal in a 2D layer; for this pair of species, phase separation of the components is calculated to be lower energy, in good agreement with the diffraction results. PMID:24215390
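
    One common way such a bond-strength estimate is formed from DFT total energies is sketched below; the exact partitioning used by the authors is not stated in the abstract, so this expression should be read as an assumption rather than their working equation (n_XB is the number of halogen bonds per unit cell).

        \[
        \Delta E_{\mathrm{XB}}\approx
        \frac{E(\text{DITFB/BPY monolayer})-E(\text{DITFB monolayer})-E(\text{BPY monolayer})}{n_{\mathrm{XB}}}
        \approx -20\ \mathrm{kJ\,mol^{-1}}
        \]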

  14. Combined diffraction and density functional theory calculations of halogen-bonded cocrystal monolayers.

    PubMed

    Sacchi, Marco; Brewer, Adam Y; Jenkins, Stephen J; Parker, Julia E; Friščić, Tomislav; Clarke, Stuart M

    2013-12-03

    This work describes the combined use of synchrotron X-ray diffraction and density functional theory (DFT) calculations to understand the cocrystal formation or phase separation in 2D monolayers capable of halogen bonding. The solid monolayer structure of 1,4-diiodobenzene (DIB) has been determined by X-ray synchrotron diffraction. The mixing behavior of DIB with 4,4'-bipyridyl (BPY) has also been studied and interestingly is found to phase-separate rather than form a cocrystal, as observed in the bulk. DFT calculations are used to establish the underlying origin of this interesting behavior. The DFT calculations are demonstrated to agree well with the recently proposed monolayer structure for the cocrystal of BPY and 1,4-diiodotetrafluorobenzene (DITFB) (the perfluorinated analogue of DIB), where halogen bonding has also been identified by diffraction. Here we have calculated an estimate of the halogen bond strength by DFT calculations for the DITFB/BPY cocrystal monolayer, which is found to be ∼20 kJ/mol. Computationally, we find that the nonfluorinated DIB and BPY are not expected to form a halogen-bonded cocrystal in a 2D layer; for this pair of species, phase separation of the components is calculated to be lower energy, in good agreement with the diffraction results.

  15. First time combination of frozen density embedding theory with the algebraic diagrammatic construction scheme for the polarization propagator of second order

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prager, Stefan, E-mail: stefan.prager@iwr.uni-heidelberg.de; Dreuw, Andreas, E-mail: dreuw@uni-heidelberg.de; Zech, Alexander, E-mail: alexander.zech@unige.ch

    The combination of Frozen Density Embedding Theory (FDET) and the Algebraic Diagrammatic Construction (ADC) scheme for the polarization propagator for describing environmental effects on electronically excited states is presented. Two different ways of interfacing and expressing the so-called embedding operator are introduced. The resulting excited states are compared with supermolecular calculations of the total system at the ADC(2) level of theory. Molecular test systems were chosen to investigate molecule–environment interactions of varying strength, from dispersion interaction up to multiple hydrogen bonds. The overall difference between the supermolecular and the FDE-ADC calculations in excitation energies is at most 0.09 eV and 0.032 eV on average, which is well below the intrinsic error of the ADC(2) method itself.
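
    For context, the "embedding operator" of FDET corresponds to an embedding potential whose standard form (written here from the general FDET literature, not quoted from this record) reads:

        \[
        v_{\mathrm{emb}}[\rho_{A},\rho_{B}](\mathbf r)=
        v_{\mathrm{ext}}^{B}(\mathbf r)
        +\int\frac{\rho_{B}(\mathbf r')}{|\mathbf r-\mathbf r'|}\,d\mathbf r'
        +\frac{\delta E_{\mathrm{xc}}^{\mathrm{nad}}[\rho_{A},\rho_{B}]}{\delta\rho_{A}(\mathbf r)}
        +\frac{\delta T_{s}^{\mathrm{nad}}[\rho_{A},\rho_{B}]}{\delta\rho_{A}(\mathbf r)}
        \]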

  16. Comparison and combination of "direct" and fragment based local correlation methods: Cluster in molecules and domain based local pair natural orbital perturbation and coupled cluster theories

    NASA Astrophysics Data System (ADS)

    Guo, Yang; Becker, Ute; Neese, Frank

    2018-03-01

    Local correlation theories have been developed in two main flavors: (1) "direct" local correlation methods apply local approximations to the canonical equations and (2) fragment-based methods reconstruct the correlation energy from a series of smaller calculations on subsystems. The present work serves two purposes. First, we investigate the relative efficiencies of the two approaches using the domain-based local pair natural orbital (DLPNO) approach as the "direct" method and the cluster-in-molecule (CIM) approach as the fragment-based approach. Both approaches are applied in conjunction with second-order many-body perturbation theory (MP2) as well as coupled-cluster theory with single, double and perturbative triple excitations [CCSD(T)]. Second, we have investigated the possible merits of combining the two approaches by performing CIM calculations with DLPNO methods serving as the method of choice for the subsystem calculations. Our cluster-in-molecule approach is closely related to, but slightly deviates from, approaches in the literature, since we have avoided real-space cutoffs. Moreover, the distant pair correlations neglected in the previous CIM approach are here considered approximately. Six very large molecules (503-2380 atoms) were studied. At both the MP2 and CCSD(T) levels of theory, the CIM and DLPNO methods show similar efficiency. However, DLPNO methods are more accurate for 3-dimensional systems. While we have found little incentive for the combination of CIM with DLPNO-MP2, the situation is different for CIM-DLPNO-CCSD(T). This combination is attractive because (1) CIM offers better parallelization opportunities; (2) the methodology is less memory intensive than the genuine DLPNO-CCSD(T) method and hence allows for large calculations on more modest hardware; and (3) the methodology is applicable and efficient in the frequently met cases where the largest subsystem calculation is too large for the canonical CCSD(T) method.

  17. Study of homogeneous bubble nucleation in liquid carbon dioxide by a hybrid approach combining molecular dynamics simulation and density gradient theory

    NASA Astrophysics Data System (ADS)

    Langenbach, K.; Heilig, M.; Horsch, M.; Hasse, H.

    2018-03-01

    A new method for predicting homogeneous bubble nucleation rates of pure compounds from vapor-liquid equilibrium (VLE) data is presented. It combines molecular dynamics simulation on the one side with density gradient theory using an equation of state (EOS) on the other. The new method is applied here to predict bubble nucleation rates in metastable liquid carbon dioxide (CO2). The molecular model of CO2 is taken from previous work of our group. PC-SAFT is used as an EOS. The consistency between the molecular model and the EOS is achieved by adjusting the PC-SAFT parameters to VLE data obtained from the molecular model. The influence parameter of density gradient theory is fitted to the surface tension of the molecular model. Massively parallel molecular dynamics simulations are performed close to the spinodal to compute bubble nucleation rates. From these simulations, the kinetic prefactor of the hybrid nucleation theory is estimated, whereas the nucleation barrier is calculated from density gradient theory. This enables the extrapolation of molecular simulation data to the whole metastable range including technically relevant densities. The results are tested against available experimental data and found to be in good agreement. The new method does not suffer from typical deficiencies of classical nucleation theory concerning the thermodynamic barrier at the spinodal and the bubble size dependence of surface tension, which is typically neglected in classical nucleation theory. In addition, the density in the center of critical bubbles and their surface tension is determined as a function of their radius. The usual linear Tolman correction to the capillarity approximation is found to be invalid.
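
    The hybrid scheme described in the abstract can be summarized by the generic nucleation-rate expression below (symbols mine): the kinetic prefactor is estimated from the molecular dynamics simulations near the spinodal, while the barrier, i.e. the work of forming the critical bubble, comes from density gradient theory.

        \[
        J = J_{0}\,\exp\!\left(-\frac{\Delta\Omega^{*}}{k_{\mathrm B}T}\right)
        \]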

  18. Study of homogeneous bubble nucleation in liquid carbon dioxide by a hybrid approach combining molecular dynamics simulation and density gradient theory.

    PubMed

    Langenbach, K; Heilig, M; Horsch, M; Hasse, H

    2018-03-28

    A new method for predicting homogeneous bubble nucleation rates of pure compounds from vapor-liquid equilibrium (VLE) data is presented. It combines molecular dynamics simulation on the one side with density gradient theory using an equation of state (EOS) on the other. The new method is applied here to predict bubble nucleation rates in metastable liquid carbon dioxide (CO2). The molecular model of CO2 is taken from previous work of our group. PC-SAFT is used as an EOS. The consistency between the molecular model and the EOS is achieved by adjusting the PC-SAFT parameters to VLE data obtained from the molecular model. The influence parameter of density gradient theory is fitted to the surface tension of the molecular model. Massively parallel molecular dynamics simulations are performed close to the spinodal to compute bubble nucleation rates. From these simulations, the kinetic prefactor of the hybrid nucleation theory is estimated, whereas the nucleation barrier is calculated from density gradient theory. This enables the extrapolation of molecular simulation data to the whole metastable range including technically relevant densities. The results are tested against available experimental data and found to be in good agreement. The new method does not suffer from typical deficiencies of classical nucleation theory concerning the thermodynamic barrier at the spinodal and the bubble size dependence of surface tension, which is typically neglected in classical nucleation theory. In addition, the density in the center of critical bubbles and their surface tension is determined as a function of their radius. The usual linear Tolman correction to the capillarity approximation is found to be invalid.

  19. String Theory: Big Problem for Small Size

    ERIC Educational Resources Information Center

    Sahoo, S.

    2009-01-01

    String theory is the most promising candidate theory for a unified description of all the fundamental forces that exist in nature. It provides a mathematical framework that combines quantum theory with Einstein's general theory of relativity. The typical size of a string is of the order of 10^-33 cm, called the Planck length. But due…

  20. Innovation in healthcare services – creating a Combined Contingency Theory and Ecosystems Approach

    NASA Astrophysics Data System (ADS)

    Engelseth, Per; Kritchanchai, Duangpun

    2018-04-01

    The purpose of this conceptual paper is to develop an analytical framework for process development in healthcare services. Healthcare services imply a form of operations management that demands an adapted research approach. The introduction therefore first highlights the challenges of healthcare services as the rationale for this study. Healthcare is a type of service with high societal and therefore ethical concern, but at the same time it needs to be carried out efficiently to economise the use of service production resources. Business and ethics concerns need to be balanced in this service supply system. In the literature review, which forms the bulk of this paper, the particularities of service industry processes are considered first. This is followed by a review of the contingency theory literature to characterise the supply chain context of healthcare service processes, highlighting interdependencies and appropriate technology use. This view is then expanded with an ecosystems approach that encompasses the environment, extending the analysis to consider, in a balanced manner, features of business, society and nature. A research model is introduced for directing both further research on the healthcare service industry and innovation of such services in practice.

  1. Universal dissymmetry and the origin of biomolecular chirality.

    PubMed

    Mason, S F

    1987-01-01

    Handed systems are distributed over four general domains. These span the fundamental particles, the molecular enantiomers, the crystal enantiomorphs, and the spiral galaxies. The characterisation of the molecular enantiomers followed from the identification of the crystal enantiomorphs and revealed a chiral homogeneity in the biomolecules of the organic world. The origin of the homogeneity has been variously ascribed to a universal dissymmetric force, from Pasteur, or to a chance choice of the initial enantiomer perpetuated by the stereoselection of diastereomer production with recycling, from Fischer's "key and lock" hypothesis. The classical chiral fields identified by Curie require a particular time or location on the Earth's surface for a determinate molecular enantioselection, as do the weak charged current agencies of the non-classical weak interaction. The weak neutral current of the electroweak interaction provides a constant and uniform chiral agency which favours both the L-series of amino acids and polypeptides and the parent aldotriose of the D-series of sugars. The enantiomeric bias of the electroweak interaction is small at the molecular level: it may become significant either as a trigger-perturbation guiding the transition from a metastable autocatalytic racemic process to one of the two constituent enantiomeric reaction channels, or by cumulative amplification in a large chirally-homogeneous aggregate of enantiomer units.

  2. Analyzing Test-Taking Behavior: Decision Theory Meets Psychometric Theory.

    PubMed

    Budescu, David V; Bo, Yuanchao

    2015-12-01

    We investigate the implications of penalizing incorrect answers to multiple-choice tests, from the perspective of both test-takers and test-makers. To do so, we use a model that combines a well-known item response theory model with prospect theory (Kahneman and Tversky, Prospect theory: An analysis of decision under risk, Econometrica 47:263-91, 1979). Our results reveal that when test-takers are fully informed of the scoring rule, the use of any penalty has detrimental effects for both test-takers (they are always penalized in excess, particularly those who are risk averse and loss averse) and test-makers (the bias of the estimated scores, as well as the variance and skewness of their distribution, increase as a function of the severity of the penalty).
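
    A minimal sketch of the trade-off the abstract describes: a test-taker who can only eliminate some options weighs guessing against omitting, with gains and losses filtered through a prospect-theory value function. All parameter values are illustrative assumptions, not estimates from the paper.

        def pt_value(x, alpha=0.88, lam=2.25):
            """Kahneman-Tversky value function: concave for gains, loss averse."""
            return x**alpha if x >= 0 else -lam * ((-x) ** alpha)

        def value_of_guessing(p_correct, penalty):
            """Subjective value of guessing on an item scored +1 / -penalty."""
            return p_correct * pt_value(1.0) + (1 - p_correct) * pt_value(-penalty)

        penalty = 0.25          # classical formula-scoring penalty for 5 options
        for k in range(1, 6):   # k = number of options the examinee cannot rule out
            decision = "guess" if value_of_guessing(1.0 / k, penalty) > 0 else "omit"
            print(f"{k} live options: {decision}")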

  3. Contributions of treatment theory and enablement theory to rehabilitation research and practice.

    PubMed

    Whyte, John

    2014-01-01

    Scientific theory is crucial to the advancement of clinical research. The breadth of rehabilitation treatment requires that many different theoretical perspectives be incorporated into the design and testing of treatment interventions. In this article, the 2 broad classes of theory relevant to rehabilitation research and practice are defined, and their distinct but complementary contributions to research and clinical practice are explored. These theory classes are referred to as treatment theories (theories about how to effect change in clinical targets) and enablement theories (theories about how changes in a proximal clinical target will influence distal clinical aims). Treatment theories provide the tools for inducing clinical change but do not specify how far reaching the ultimate impact of the change will be. Enablement theories model the impact of changes on other areas of function but provide no insight as to how treatment can create functional change. Treatment theories are more critical in the early stages of treatment development, whereas enablement theories become increasingly relevant in specifying the clinical significance and practical effectiveness of more mature treatments. Understanding the differences in the questions these theory classes address and how to combine their insights is crucial for effective research development and clinical practice. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  4. Phase transition analysis of V-shaped liquid crystal: Combined temperature-dependent FTIR and density functional theory approach

    NASA Astrophysics Data System (ADS)

    Singh, Swapnil; Singh, Harshita; Karthick, T.; Tandon, Poonam; Prasad, Veena

    2018-01-01

    Temperature-dependent Fourier transform infrared spectroscopy (FTIR) combined with density functional theory (DFT) is employed to study the mechanism of phase transitions of a V-shaped bent-core liquid crystal. Since it has a large number of flexible bonds, a one-dimensional potential energy scan (PES) was performed on the flexible bonds and predicted the most stable conformer I. A detailed analysis of the vibrational normal modes of conformer I has been done on the basis of the potential energy distribution. The good agreement between the calculated spectrum of conformer I and the observed FTIR spectrum at room temperature validates our theoretical structure model. Furthermore, the prominent changes observed in the stretching vibrational bands of CH3/CH2, C=O, ring CC, ring CO, ring CH in-plane bending, and ring CH out-of-plane bending at the Iso → nematic phase transition (at 155 °C) have been illustrated. However, the minor changes in the spectral features observed for the other phase transitions might be due to the shape or bulkiness of the molecules. The combined FTIR and PES study beautifully explains the dynamics of the molecules, molecular realignment, H-bonding, and conformational changes at the phase transitions.

  5. Advanced Learning Theories Applied to Leadership Development

    DTIC Science & Technology

    2006-11-01

    We combined the cognitive, experiential, and motivational components of advanced learning theories to develop a training application… (Center for Army Leadership Technical Report 2006-2, "Advanced Learning Theories Applied to Leadership Development," Christina Curnow, 2006; Contract Number W91QF4-05-F-0026.)

  6. Eigenvalue Detonation of Combined Effects Aluminized Explosives

    NASA Astrophysics Data System (ADS)

    Capellos, Christos; Baker, Ernest; Balas, Wendy; Nicolich, Steven; Stiel, Leonard

    2007-06-01

    This paper reports on the development of theory and performance for recently developed combined effects aluminized explosives. Traditional high energy explosives used for metal pushing incorporate high loading percentages of HMX or RDX, whereas blast explosives incorporate some percentage of aluminum. However, the high blast explosives produce increased blast energies, with reduced metal pushing capability due to late time aluminum reaction. Metal pushing capability refers to the early volume expansion work produced during the first few volume expansions, associated with cylinder wall velocities and Gurney energies. Our recently developed combined effects aluminized explosives (PAX-29C, PAX-30, PAX-42) are capable of achieving excellent metal pushing and high blast energies. Traditional Chapman-Jouguet detonation theory does not explain the observed detonation states achieved by these combined effects explosives. This work demonstrates, with the use of cylinder expansion data and thermochemical code calculations (JAGUAR and CHEETAH), that eigenvalue detonation theory explains the observed behavior.

  7. Comparison of experimental and calculated chiroptical spectra for chiral molecular structure determination.

    PubMed

    Polavarapu, Prasad L; Covington, Cody L

    2014-09-01

    For three different chiroptical spectroscopic methods, namely, vibrational circular dichroism (VCD), electronic circular dichroism (ECD), and Raman optical activity (ROA), the measures of similarity of the experimental spectra to the corresponding spectra predicted using quantum chemical theories are summarized. In determining the absolute configuration and/or predominant conformations of chiral molecules, these similarity measures provide numerical estimates of agreement between experimental observations and theoretical predictions. Selected applications illustrating the similarity measures for absorption, circular dichroism, and corresponding dissymmetry factor (DF) spectra, in the case of VCD and ECD, and for Raman, ROA, and circular intensity differential (CID) spectra in the case of ROA, are presented. The analysis of similarity in DF or CID spectra is considered to be much more discerning and accurate than that in absorption (or Raman) and circular dichroism (or ROA) spectra, undertaken individually. © 2014 Wiley Periodicals, Inc.
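
    The quantities whose similarity is analyzed are the standard dissymmetry factor (for VCD/ECD) and the circular intensity differential (for ROA); their usual definitions are (standard forms, not quoted from the record):

        \[
        g_{\mathrm{abs}}=\frac{\Delta A}{A}=\frac{\Delta\varepsilon}{\varepsilon},
        \qquad
        \Delta=\frac{I^{\mathrm R}-I^{\mathrm L}}{I^{\mathrm R}+I^{\mathrm L}},
        \]

    where ΔA (Δε) is the differential absorbance (molar absorptivity) for left- versus right-circularly polarized light and I^R, I^L are the Raman intensities associated with right- and left-circularly polarized light.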

  8. Hegemonic masculinity: combining theory and practice in gender interventions

    PubMed Central

    Jewkes, Rachel; Morrell, Robert; Hearn, Jeff; Lundqvist, Emma; Blackbeard, David; Lindegger, Graham; Quayle, Michael; Sikweyiya, Yandisa; Gottzén, Lucas

    2015-01-01

    The concept of hegemonic masculinity has been used in gender studies since the early-1980s to explain men’s power over women. Stressing the legitimating power of consent (rather than crude physical or political power to ensure submission), it has been used to explain men’s health behaviours and the use of violence. Gender activists and others seeking to change men’s relations with women have mobilised the concept of hegemonic masculinity in interventions, but the links between gender theory and activism have often not been explored. The translation of ‘hegemonic masculinity’ into interventions is little examined. We show how, in South Africa and Sweden, the concept has been used to inform theoretically-based gender interventions and to ensure that men are brought into broader social efforts to build gender equity. We discuss the practical translational challenges of using gender theory broadly, and hegemonic masculinity in particular, in a Swedish case study, of the intervention Machofabriken [The Macho Factory], and illustrate how the concept is brought to life in this activist work with men. The concept has considerable practical application in developing a sustainable praxis of theoretically grounded interventions that are more likely to have enduring effect, but evaluating broader societal change in hegemonic masculinity remains an enduring challenge. PMID:26680535

  9. Hegemonic masculinity: combining theory and practice in gender interventions.

    PubMed

    Jewkes, Rachel; Morrell, Robert; Hearn, Jeff; Lundqvist, Emma; Blackbeard, David; Lindegger, Graham; Quayle, Michael; Sikweyiya, Yandisa; Gottzén, Lucas

    2015-01-01

    The concept of hegemonic masculinity has been used in gender studies since the early-1980s to explain men's power over women. Stressing the legitimating power of consent (rather than crude physical or political power to ensure submission), it has been used to explain men's health behaviours and the use of violence. Gender activists and others seeking to change men's relations with women have mobilised the concept of hegemonic masculinity in interventions, but the links between gender theory and activism have often not been explored. The translation of 'hegemonic masculinity' into interventions is little examined. We show how, in South Africa and Sweden, the concept has been used to inform theoretically-based gender interventions and to ensure that men are brought into broader social efforts to build gender equity. We discuss the practical translational challenges of using gender theory broadly, and hegemonic masculinity in particular, in a Swedish case study, of the intervention Machofabriken [The Macho Factory], and illustrate how the concept is brought to life in this activist work with men. The concept has considerable practical application in developing a sustainable praxis of theoretically grounded interventions that are more likely to have enduring effect, but evaluating broader societal change in hegemonic masculinity remains an enduring challenge.

  10. Job Satisfaction: A Possible Integration of Two Theories

    ERIC Educational Resources Information Center

    Hazer, John T.

    1976-01-01

    A rationale for deciding which motivation methods to use for employees who have differing levels of satisfaction. Discusses pros and cons of two theories on job satisfaction--Herzberg's theory and the traditional theory--suggesting strongly a need to combine both ideas. (WL)

  11. Towards a theory of tiered testing.

    PubMed

    Hansson, Sven Ove; Rudén, Christina

    2007-06-01

    Tiered testing is an essential part of any resource-efficient strategy for the toxicity testing of a large number of chemicals, as required for instance in the risk management of general (industrial) chemicals. In spite of this, no general theory seems to be available for the combination of single tests into efficient tiered testing systems. A first outline of such a theory is developed. It is argued that chemical, toxicological, and decision-theoretical knowledge should be combined in the construction of such a theory. A decision-theoretical approach for the optimization of test systems is introduced. It is based on expected utility maximization with simplified assumptions covering factual and value-related information that is usually missing in the development of test systems.
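
    A minimal decision-theoretic sketch of the kind of optimization the authors call for: compare testing strategies by expected cost (equivalently, expected utility). The sensitivities, specificities, costs, and prevalence below are illustrative assumptions, not values from the paper.

        def expected_cost(prevalence, sens, spec, test_cost,
                          cost_missed_toxicant=1000.0, cost_false_alarm=50.0):
            """Expected per-chemical cost of a (possibly tiered) testing strategy."""
            missed = prevalence * (1 - sens) * cost_missed_toxicant
            false_alarm = (1 - prevalence) * (1 - spec) * cost_false_alarm
            return test_cost + missed + false_alarm

        # Cheap screen alone, expensive full test alone, and a two-tier scheme in
        # which the expensive test is only run on screen positives.
        prev = 0.05
        screen = expected_cost(prev, sens=0.90, spec=0.70, test_cost=1.0)
        full = expected_cost(prev, sens=0.99, spec=0.95, test_cost=20.0)
        tiered = expected_cost(
            prev,
            sens=0.90 * 0.99,                       # must be flagged by both tiers
            spec=1 - (1 - 0.70) * (1 - 0.95),       # false alarm needs both tiers
            test_cost=1.0 + (prev * 0.90 + (1 - prev) * 0.30) * 20.0,
        )
        print(screen, full, tiered)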

  12. Autism, theory of mind, and the reactive attitudes.

    PubMed

    Richman, Kenneth A; Bidshahri, Raya

    2018-01-01

    Whether to treat autism as exculpatory in any given circumstance appears to be influenced both by models of autism and by theories of moral responsibility. This article looks at one particular combination of theories: autism as theory of mind challenges and moral responsibility as requiring appropriate experience of the reactive attitudes. In pursuing this particular combination of ideas, we do not intend to endorse them. Our goal is, instead, to explore the implications of this combination of especially prominent ideas about autism and about moral responsibility. These implications can be quite serious and practical for autists and those who interact directly with autists, as well as for broader communities as they attend to the fair, compassionate, and respectful treatment of increasing numbers of autistic adults. We find that these theories point to a limited range of situations in which autists should not be blamed for transgressive actions for which neurotypical individuals should be blamed. We build on what others have written on these issues by bringing in a recent cognitive model of the role theory of mind plays in empathy, by discussing the social implications of the theoretical findings, and by raising questions about the compatibility of reactive attitude theories of moral responsibility with the neurodiversity approach to autism. © 2017 John Wiley & Sons Ltd.

  13. Strength of orthotropic materials subjected to combined stresses

    Treesearch

    Charles B. Norris

    1962-01-01

    A theory of the strength of orthotropic materials subjected to combined stresses, based on the Hencky-von Mises theory of energy due to change of shape, is presented. When this theory is applied to macroscopically isotropic materials, it yields the diagram currently used in design with metals. Equations relating the strength of orthotropic materials subjected to a...
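
    The plane-stress interaction equation commonly quoted from this line of work (often called the Norris criterion) has the form below; this is a standard statement written with my own symbols, not a quotation from the truncated record.

        \[
        \left(\frac{\sigma_{1}}{F_{1}}\right)^{2}
        -\frac{\sigma_{1}\sigma_{2}}{F_{1}F_{2}}
        +\left(\frac{\sigma_{2}}{F_{2}}\right)^{2}
        +\left(\frac{\tau_{12}}{F_{12}}\right)^{2}=1
        \]

    Here σ1 and σ2 are the normal stresses along the material axes, τ12 is the in-plane shear stress, and F1, F2, F12 are the corresponding uniaxial and shear strengths.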

  14. Strongly contracted canonical transformation theory

    NASA Astrophysics Data System (ADS)

    Neuscamman, Eric; Yanai, Takeshi; Chan, Garnet Kin-Lic

    2010-01-01

    Canonical transformation (CT) theory describes dynamic correlation in multireference systems with large active spaces. Here we discuss CT theory's intruder state problem and why our previous approach of overlap matrix truncation becomes infeasible for sufficiently large active spaces. We propose the use of strongly and weakly contracted excitation operators as alternatives for dealing with intruder states in CT theory. The performance of these operators is evaluated for the H2O, N2, and NiO molecules, with comparisons made to complete active space second order perturbation theory and Davidson-corrected multireference configuration interaction theory. Finally, using a combination of strongly contracted CT theory and orbital-optimized density matrix renormalization group theory, we evaluate the singlet-triplet gap of free base porphin using an active space containing all 24 out-of-plane 2p orbitals. Modeling dynamic correlation with an active space of this size is currently only possible using CT theory.

  15. GIS and Game Theory for Water Resource Management

    NASA Astrophysics Data System (ADS)

    Ganjali, N.; Guney, C.

    2017-11-01

    In this study, aspects of game theory and its application to water resources management, combined with GIS techniques, are detailed. First, each term is explained and the advantages and limitations of each aspect are discussed. Then the nature of the combinations between each pair and the literature on previous studies are given. Several cases were investigated and their results examined in order to draw conclusions on the applicability and combination of GIS, game theory, and water resources management. It is concluded that game theory has been used in a relatively limited set of water management fields, such as cost/benefit allocation among users, water allocation among trans-boundary users, water quality management, groundwater management, analysis of water policies, fair allocation of water resources development costs, and some other narrow fields. Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by diverse stakeholder views. Most of the literature on water allocation and conflict problems uses traditional optimization models to identify the most efficient scheme, while game theory, as an optimization method, combined with GIS provides a beneficial platform for agent-based models for solving water resources management problems in further studies.
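
    One of the allocation problems named in the abstract, fair sharing of a water-development cost among users, is classically handled with cooperative game theory; the sketch below computes a Shapley-value allocation for a made-up three-user cost game (all numbers are invented for illustration).

        from itertools import permutations
        from math import factorial

        def shapley(players, cost):
            """Average each player's marginal cost over all join orders."""
            value = {p: 0.0 for p in players}
            for order in permutations(players):
                coalition = frozenset()
                for p in order:
                    value[p] += cost[coalition | {p}] - cost[coalition]
                    coalition = coalition | {p}
            n_orders = factorial(len(players))
            return {p: v / n_orders for p, v in value.items()}

        # Stand-alone and joint costs of a hypothetical shared water-supply project.
        cost = {frozenset(): 0, frozenset("A"): 60, frozenset("B"): 50,
                frozenset("C"): 40, frozenset("AB"): 90, frozenset("AC"): 80,
                frozenset("BC"): 70, frozenset("ABC"): 105}
        print(shapley(["A", "B", "C"], cost))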

  16. Splines and control theory

    NASA Technical Reports Server (NTRS)

    Zhang, Zhimin; Tomlinson, John; Martin, Clyde

    1994-01-01

    In this work, the relationship between splines and control theory is analyzed. We show that spline functions can be constructed naturally from control theory. By establishing a framework based on control theory, we provide a simple and systematic way to construct splines. We have constructed the traditional spline functions, including polynomial splines and the classical exponential spline. We have also discovered some new spline functions, such as trigonometric splines and combinations of polynomial, exponential and trigonometric splines. The method proposed in this paper is easy to implement. Some numerical experiments are performed to investigate the properties of different spline approximations.
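
    The connection can be stated compactly (a standard formulation of control-theoretic splines; notation mine): interpolating data with minimum control energy for a double integrator recovers the natural cubic spline, and replacing the double integrator by other linear systems yields exponential, trigonometric, and mixed splines.

        \[
        \min_{u}\int_{0}^{T}u(t)^{2}\,dt
        \quad\text{subject to}\quad
        \ddot{x}(t)=u(t),\qquad x(t_{i})=y_{i},\; i=1,\dots,N
        \]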

  17. Whiteheadian Actual Entities and String Theory

    NASA Astrophysics Data System (ADS)

    Bracken, Joseph A.

    2012-06-01

    In the philosophy of Alfred North Whitehead, the ultimate units of reality are actual entities, momentary self-constituting subjects of experience which are too small to be sensibly perceived. Their combination into "societies" with a "common element of form" produces the organisms and inanimate things of ordinary sense experience. According to the proponents of string theory, tiny vibrating strings are the ultimate constituents of physical reality which in harmonious combination yield perceptible entities at the macroscopic level of physical reality. Given that the number of Whiteheadian actual entities and of individual strings within string theory are beyond reckoning at any given moment, could they be two ways to describe the same non-verifiable foundational reality? For example, if one could establish that the "superject" or objective pattern of self- constitution of an actual entity vibrates at a specific frequency, its affinity with the individual strings of string theory would be striking. Likewise, if one were to claim that the size and complexity of Whiteheadian 'societies" require different space-time parameters for the dynamic interrelationship of constituent actual entities, would that at least partially account for the assumption of 10 or even 26 instead of just 3 dimensions within string theory? The overall conclusion of this article is that, if a suitably revised understanding of Whiteheadian metaphysics were seen as compatible with the philosophical implications of string theory, their combination into a single world view would strengthen the plausibility of both schemes taken separately. Key words: actual entities, subject/superjects, vibrating strings, structured fields of activity, multi-dimensional physical reality.

  18. A Partial Theory of Executive Succession.

    ERIC Educational Resources Information Center

    Thiemann, Francis C.

    This study has two purposes: (1) To construct a partial theory of succession, and (2) to utilize a method of theory construction which combines some of the concepts of Hans Zetterberg with the principles of formal symbolic logic. A bibliography on succession in complex organizations with entries on descriptive and empirical studies from various…

  19. Rhetorical structure theory and text analysis

    NASA Astrophysics Data System (ADS)

    Mann, William C.; Matthiessen, Christian M. I. M.; Thompson, Sandra A.

    1989-11-01

    Recent research on text generation has shown that there is a need for stronger linguistic theories that tell in detail how texts communicate. The prevailing theories are very difficult to compare, and it is also very difficult to see how they might be combined into stronger theories. To make comparison and combination a bit more approachable, we have created a book which is designed to encourage comparison. A dozen different authors or teams, all experienced in discourse research, are given exactly the same text to analyze. The text is an appeal for money by a lobbying organization in Washington, DC. It informs, stimulates and manipulates the reader in a fascinating way. The joint analysis is far more insightful than any one team's analysis alone. This paper is our contribution to the book. Rhetorical Structure Theory (RST), the focus of this paper, is a way to account for the functional potential of text, its capacity to achieve the purposes of speakers and produce effects in hearers. It also shows a way to distinguish coherent texts from incoherent ones, and identifies consequences of text structure.

  20. Predicting the Oxygen-Binding Properties of Platinum Nanoparticle Ensembles by Combining High-Precision Electron Microscopy and Density Functional Theory.

    PubMed

    Aarons, Jolyon; Jones, Lewys; Varambhia, Aakash; MacArthur, Katherine E; Ozkaya, Dogan; Sarwar, Misbah; Skylaris, Chris-Kriton; Nellist, Peter D

    2017-07-12

    Many studies of heterogeneous catalysis, both experimental and computational, make use of idealized structures such as extended surfaces or regular polyhedral nanoparticles. This simplification neglects the morphological diversity in real commercial oxygen reduction reaction (ORR) catalysts used in fuel-cell cathodes. Here we introduce an approach that combines 3D nanoparticle structures obtained from high-throughput high-precision electron microscopy with density functional theory. Discrepancies between experimental observations and cuboctahedral/truncated-octahedral particles are revealed and discussed using a range of widely used descriptors, such as electron-density, d-band centers, and generalized coordination numbers. We use this new approach to determine the optimum particle size for which both detrimental surface roughness and particle shape effects are minimized.
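
    A minimal sketch of the "generalized coordination number" descriptor mentioned in the abstract, computed from a set of atom positions (not the authors' pipeline; the cutoff and the toy coordinates are assumptions, and cn_max = 12 is the fcc bulk value):

        import numpy as np

        def coordination(positions, cutoff=3.0):
            """Boolean neighbour matrix and per-atom coordination numbers."""
            d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
            neighbours = (d < cutoff) & (d > 1e-6)
            return neighbours, neighbours.sum(axis=1)

        def generalized_cn(positions, cutoff=3.0, cn_max=12):
            """GCN(i) = sum over neighbours j of cn(j) / cn_max."""
            neighbours, cn = coordination(positions, cutoff)
            return np.array([cn[neighbours[i]].sum() / cn_max for i in range(len(cn))])

        # Tiny toy cluster (coordinates in angstrom, invented for illustration).
        pts = np.array([[0, 0, 0], [2.77, 0, 0], [1.39, 2.40, 0], [1.39, 0.80, 2.26]])
        print(generalized_cn(pts))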

  1. Information Design Theories

    ERIC Educational Resources Information Center

    Pettersson, Rune

    2014-01-01

    Information design has practical and theoretical components. As an academic discipline we may view information design as a combined discipline, a practical theory, or as a theoretical practice. So far information design has incorporated facts, influences, methods, practices, principles, processes, strategies, and tools from a large number of…

  2. On the vicissitudes of combining individual and group psychotherapy.

    PubMed

    Schermer, Victor L

    2009-01-01

    Reviewing a series of articles in this special issue on combined treatment incorporating individual and group therapy, and in one instance couples therapy and a relationship group, the author provides a synoptic history of combined treatment, noting that the relational perspectives of psychoanalysis and systems theories of groups now provide conceptual frameworks for combined therapies. Taking up each article in turn, he considers that combined treatment poses particular problems while having certain advantages. He discusses the theoretical emphases and biases of each article, whether relational psychology, classical psychoanalysis, object relations theory, or self-psychology. He considers the roles of transference and countertransference, patient-therapist enactments, empathic attunement, "surrogate therapy" by the group members, and ethics in determining outcomes. In addition, the author notes how so-called difficult or regressed patients impact upon and are affected by the use of combined therapy modalities.

  3. A Study Of Facial Asymmetries By The Stereometric Method

    NASA Astrophysics Data System (ADS)

    Crete, N.; Deloison, Y.; Mollard, R.

    1980-07-01

    In order to determine the contribution to facial dissymmetry observed on a living person made by the various constitutive elements of the head (the soft parts, i.e. skin and muscles, and the underlying bone structure), we undertook, using a biostereometric method, to evaluate asymmetries between homologous right and left dimensions on a living person's face and on a skeleton. While a marked degree of facial dissymmetry can sometimes be observed in an individual, average differences between the right and left sides of the face may nevertheless balance out and remain slight. Conventional anthropometric techniques do not reveal such slight values. With a view to securing a higher degree of accuracy, we turned to the stereometric technique of measurement. Using this technique, quasi-imperceptible differences between the right and left sides of the face on a living person as well as on a skeleton, together with variations in the orientation or angulation of anatomical segments in three-dimensional space, can be measured. We were thus able to detect, in a number of dry skulls, average differences of approximately a millimetre between the two sides of the face which cannot be attributed to lack of accuracy in the measurements. Although the differences are not always statistically significant, the parametric values of facial dimensions are invariably greater for the left side. On the other hand, for the sample of living subjects as a whole, the differences between homologous distances are not statistically significant. It may be, however, that on a living subject the experimenter is inclined to take measurements that are susceptible to symmetrization (for instance, placing the nasion in the median sagittal plane), whereas on a dry skull anatomical reference marks can be determined with the utmost accuracy. It may be inferred from these results that the soft parts tend, as a rule, to correct the dissymmetry of the underlying skeleton.

  4. Analytic energy gradients in combined second order Møller-Plesset perturbation theory and conductorlike polarizable continuum model calculation.

    PubMed

    Si, Dejun; Li, Hui

    2011-10-14

    The analytic energy gradients in combined second order Møller-Plesset perturbation theory and conductorlike polarizable continuum model calculations are derived and implemented for spin-restricted closed shell (RMP2), Z-averaged spin-restricted open shell (ZAPT2), and spin-unrestricted open shell (UMP2) cases. Using these methods, the geometries of the S(0) ground state and the T(1) state of three nucleobase pairs (guanine-cytosine, adenine-thymine, and adenine-uracil) in the gas phase and aqueous solution phase are optimized. It is found that in both the gas phase and the aqueous solution phase the hydrogen bonds in the T(1) state pairs are weakened by ~1 kcal/mol as compared to those in the S(0) state pairs. © 2011 American Institute of Physics

  5. Toward a Unified Theory of Human Reasoning.

    ERIC Educational Resources Information Center

    Sternberg, Robert J.

    1986-01-01

    The goal of this unified theory of human reasoning is to specify what constitutes reasoning and to characterize the psychological distinction between inductive and deductive reasoning. The theory views reasoning as the controlled and mediated application of three processes (encoding, comparison and selective combination) to inferential rules. (JAZ)

  6. Question 6: coevolution theory of the genetic code: a proven theory.

    PubMed

    Wong, Jeffrey Tze-Fei

    2007-10-01

    The coevolution theory proposes that primordial proteins consisted only of those amino acids readily obtainable from the prebiotic environment, representing about half the twenty encoded amino acids of today, and the missing amino acids entered the system as the code expanded along with pathways of amino acid biosynthesis. The isolation of genetic code mutants, and the antiquity of pretran synthesis revealed by the comparative genomics of tRNAs and aminoacyl-tRNA synthetases, have combined to provide a rigorous proof of the four fundamental tenets of the theory, thus solving the riddle of the structure of the universal genetic code.

  7. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory.

    PubMed

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities.
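
    The abstract's key quantitative prediction can be written out as a one-line check: the proportion correct on the false-belief (FB) task is approximated by the product of the proportions correct on the diverse-beliefs (DB) and knowledge-access (KA) tasks. The numbers below are invented for illustration.

        def predicted_false_belief(p_diverse_beliefs, p_knowledge_access):
            """P(FB correct) ~ P(DB correct) * P(KA correct), per the model."""
            return p_diverse_beliefs * p_knowledge_access

        print(predicted_false_belief(0.85, 0.70))   # -> 0.595, i.e. about 60%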

  8. Optimal control of open quantum systems: A combined surrogate Hamiltonian optimal control theory approach applied to photochemistry on surfaces

    NASA Astrophysics Data System (ADS)

    Asplund, Erik; Klüner, Thorsten

    2012-03-01

    In this paper, control of open quantum systems with emphasis on the control of surface photochemical reactions is presented. A quantum system in a condensed phase undergoes strong dissipative processes. From a theoretical viewpoint, it is important to model such processes in a rigorous way. In this work, the description of open quantum systems is realized within the surrogate Hamiltonian approach [R. Baer and R. Kosloff, J. Chem. Phys. 106, 8862 (1997)], 10.1063/1.473950. An efficient and accurate method to find control fields is optimal control theory (OCT) [W. Zhu, J. Botina, and H. Rabitz, J. Chem. Phys. 108, 1953 (1998), 10.1063/1.475576; Y. Ohtsuki, G. Turinici, and H. Rabitz, J. Chem. Phys. 120, 5509 (2004)], 10.1063/1.1650297. To gain control of open quantum systems, the surrogate Hamiltonian approach and OCT, with time-dependent targets, are combined. Three open quantum systems are investigated by the combined method, a harmonic oscillator immersed in an ohmic bath, CO adsorbed on a platinum surface, and NO adsorbed on a nickel oxide surface. Throughout this paper, atomic units, i.e., ℏ = me = e = a0 = 1, have been used unless otherwise stated.

  9. Theory of cortical function

    PubMed Central

    Heeger, David J.

    2017-01-01

    Most models of sensory processing in the brain have a feedforward architecture in which each stage comprises simple linear filtering operations and nonlinearities. Models of this form have been used to explain a wide range of neurophysiological and psychophysical data, and many recent successes in artificial intelligence (with deep convolutional neural nets) are based on this architecture. However, neocortex is not a feedforward architecture. This paper proposes a first step toward an alternative computational framework in which neural activity in each brain area depends on a combination of feedforward drive (bottom-up from the previous processing stage), feedback drive (top-down context from the next stage), and prior drive (expectation). The relative contributions of feedforward drive, feedback drive, and prior drive are controlled by a handful of state parameters, which I hypothesize correspond to neuromodulators and oscillatory activity. In some states, neural responses are dominated by the feedforward drive and the theory is identical to a conventional feedforward model, thereby preserving all of the desirable features of those models. In other states, the theory is a generative model that constructs a sensory representation from an abstract representation, like memory recall. In still other states, the theory combines prior expectation with sensory input, explores different possible perceptual interpretations of ambiguous sensory inputs, and predicts forward in time. The theory, therefore, offers an empirically testable framework for understanding how the cortex accomplishes inference, exploration, and prediction. PMID:28167793
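
    A deliberately simplified caricature of the central idea (not the paper's actual dynamical equations; the weighting scheme and rectifying nonlinearity are my assumptions): each area's activity mixes feedforward drive, feedback drive, and a prior, with state parameters setting the mix.

        import numpy as np

        def area_response(feedforward, feedback, prior, alpha=0.7, beta=0.2):
            """alpha = 1, beta = 0 reduces to a purely feedforward model."""
            drive = alpha * feedforward + beta * feedback + (1 - alpha - beta) * prior
            return np.maximum(drive, 0.0)   # simple rectifying nonlinearity

        x = area_response(np.array([1.0, 0.2]), np.array([0.5, 0.5]), np.array([0.1, 0.9]))
        print(x)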

  10. Multiconfiguration Pair-Density Functional Theory Outperforms Kohn-Sham Density Functional Theory and Multireference Perturbation Theory for Ground-State and Excited-State Charge Transfer.

    PubMed

    Ghosh, Soumen; Sonnenberger, Andrew L; Hoyer, Chad E; Truhlar, Donald G; Gagliardi, Laura

    2015-08-11

    The correct description of charge transfer in ground and excited states is very important for molecular interactions, photochemistry, electrochemistry, and charge transport, but it is very challenging for Kohn-Sham (KS) density functional theory (DFT). KS-DFT exchange-correlation functionals without nonlocal exchange fail to describe both ground- and excited-state charge transfer properly. We have recently proposed a theory called multiconfiguration pair-density functional theory (MC-PDFT), which is based on a combination of multiconfiguration wave function theory with a new type of density functional called an on-top density functional. Here we have used MC-PDFT to study challenging ground- and excited-state charge-transfer processes by using on-top density functionals obtained by translating KS exchange-correlation functionals. For ground-state charge transfer, MC-PDFT performs better than either the PBE exchange-correlation functional or CASPT2 wave function theory. For excited-state charge transfer, MC-PDFT (unlike KS-DFT) shows qualitatively correct behavior at long-range with great improvement in predicted excitation energies.
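
    For orientation, the MC-PDFT energy can be written schematically as the classical (one-electron plus Coulomb) energy of the multiconfiguration wave function supplemented by an on-top density functional; this is a paraphrase of the published working equation, with D the one-electron density matrix, (pq|rs) two-electron integrals, ρ the density and Π the on-top pair density.

    ```latex
    E_{\mathrm{MC\text{-}PDFT}} = V_{NN} + \sum_{pq} h_{pq} D_{pq}
      + \tfrac{1}{2} \sum_{pqrs} (pq|rs)\, D_{pq} D_{rs}
      + E_{\mathrm{ot}}[\rho, \Pi] .
    ```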

  11. Recent advances in analytical satellite theory

    NASA Technical Reports Server (NTRS)

    Gaposchkin, E. M.

    1978-01-01

    Recent work on analytical satellite perturbation theory has involved the completion of a revision to 4th order for zonal harmonics, the addition of a treatment for ocean tides, an extension of the treatment for the noninertial reference system, and the completion of a theory for direct solar-radiation pressure and earth-albedo pressure. Combined with a theory for tesseral-harmonics, lunisolar, and body-tide perturbations, these formulations provide a comprehensive orbit-computation program. Detailed comparisons with numerical integration and observations are presented to assess the accuracy of each theoretical development.

  12. [Origin of lifting and lowering theory and its herb pair study].

    PubMed

    Guo, Zhao-Juan; Yuan, Yi-Ping; Kong, Li-Ting; Jia, Xiao-Yu; Wang, Ning-Ning; Dai, Ying; Zhai, Hua-Qiang

    2017-08-01

    Lifting and lowering theory is one of the important bases for guiding clinical medication. Through the study of ancient books and literature, we learned that lifting and lowering theory originated in Huangdi Neijing, was put into wider practice in the Shanghan Zabing Lun, was established in Yixue Qiyuan, and has continued to develop from the Compendium of Materia Medica to the present. However, lifting and lowering theory has mostly stagnated at the theoretical stage, with little experimental research. In clinical studies, the guiding role of lifting and lowering theory for prescriptions mainly includes the opposing use of lifting and lowering medicine properties, the mutual promotion of lifting and lowering medicine properties, a primary role for lifting medicine properties, and a primary role for lowering medicine properties. Under the guidance of lifting and lowering theory, herb pair compatibility includes combinations of herbs with lifting properties, combinations of herbs with lifting and lowering properties, and combinations of herbs with lowering properties. In this study, modern biological techniques were used to carry out experimental research on lifting and lowering theory, revealing its scientific connotation, which will help to promote rational clinical drug use. Copyright© by the Chinese Pharmaceutical Association.

  13. Group Theory, Computational Thinking, and Young Mathematicians

    ERIC Educational Resources Information Center

    Gadanidis, George; Clements, Erin; Yiu, Chris

    2018-01-01

    In this article, we investigate the artistic puzzle of designing mathematics experiences (MEs) to engage young children with ideas of group theory, using a combination of hands-on and computational thinking (CT) tools. We elaborate on: (1) group theory and why we chose it as a context for young mathematicians' experiences with symmetry and…

  14. JDFTx: Software for joint density-functional theory

    DOE PAGES

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...

    2017-11-14

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  15. JDFTx: Software for joint density-functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.

    Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.

  16. Application of fuzzy set and Dempster-Shafer theory to organic geochemistry interpretation

    NASA Technical Reports Server (NTRS)

    Kim, C. S.; Isaksen, G. H.

    1993-01-01

    An application of fuzzy sets and Dempster-Shafer theory (DST) in modeling the interpretational process of organic geochemistry data for predicting the levels of maturity of oil and source rock samples is presented. This was accomplished by (1) representing linguistic imprecision and imprecision associated with experience by fuzzy set theory, (2) capturing the probabilistic nature of imperfect evidence by DST, and (3) combining multiple pieces of evidence by utilizing John Yen's generalized Dempster-Shafer theory (GDST), which allows DST to deal with fuzzy information. The current prototype provides collective beliefs on the predicted levels of maturity by combining multiple pieces of evidence through GDST's rule of combination.
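
    To make the combination step concrete, the following is a minimal sketch of Dempster's classical rule of combination for two basic probability assignments over a small frame of discernment. It is a toy illustration only, not the fuzzy GDST extension used in the study, and the maturity labels and mass values are hypothetical.

    ```python
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions given as dicts {frozenset(hypotheses): mass}."""
        combined, conflict = {}, 0.0
        for (b, mb), (c, mc) in product(m1.items(), m2.items()):
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc            # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict: the evidences cannot be combined")
        return {a: m / (1.0 - conflict) for a, m in combined.items()}

    # Two pieces of evidence about thermal maturity (labels and masses are hypothetical).
    frame = frozenset({"immature", "mature", "overmature"})
    m1 = {frozenset({"mature"}): 0.6, frame: 0.4}
    m2 = {frozenset({"mature", "overmature"}): 0.7, frame: 0.3}
    print(dempster_combine(m1, m2))
    ```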

  17. Effective charges of ionic liquid determined self-consistently through combination of molecular dynamics simulation and density-functional theory.

    PubMed

    Ishizuka, Ryosuke; Matubayasi, Nobuyuki

    2017-11-15

    A self-consistent scheme combining molecular dynamics (MD) simulation and density functional theory (DFT) was recently proposed to incorporate the effects of charge transfer and polarization of ions into non-polarizable force fields of ionic liquids for an improved description of energetics and dynamics. The purpose of the present work is to analyze the detailed setup of the MD/DFT scheme by focusing on how the basis set, exchange-correlation (XC) functional, charge-fitting method, and force field for the intramolecular and Lennard-Jones interactions affect the MD/DFT results for 1,3-dimethylimidazolium bis(trifluoromethylsulfonyl)imide ([C1mim][NTf2]) and 1-ethyl-3-methylimidazolium glycinate ([C2mim][Gly]). It was found that a basis set of double-zeta valence polarized quality or larger is required for convergence of the effective charge of the ion. The choice of XC functional was not influential as long as a generalized gradient approximation is used. The charge-fitting method and force field, on the other hand, govern the accuracy of the MD/DFT scheme. We examined the charge-fitting methods of Blöchl, the iterative Hirshfeld (Hirshfeld-I), and REPEAT in combination with Lopes et al.'s force field and the general AMBER force field. No single combination of charge fitting and force field provides good agreement with the experiments, while the MD/DFT scheme reduces the effective charges of the ions and leads to a better description of energetics and dynamics compared to the original force field with unit charges. © 2017 Wiley Periodicals, Inc.

  18. Propeller theory of Professor Joukowski and his pupils

    NASA Technical Reports Server (NTRS)

    Margoulis, W

    1922-01-01

    This report gives a summary of the work done in Russia from 1911 to 1914, by Professor Joukowski and his pupils. This summary will show that these men were the true originators of the theory, which combines the theory of the wing element and of the slipstream.

  19. Hepatitis disease detection using Bayesian theory

    NASA Astrophysics Data System (ADS)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, for a better understanding of the theory. In this research, we used Bayesian theory to detect hepatitis disease and display the result of the diagnosis process. Bayesian algorithm theory was rediscovered and perfected by Laplace; the basic idea is to use the known prior probability and the conditional probability density parameters and, based on Bayes' theorem, to calculate the corresponding posterior probability, which is then used for inference and decision making. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever, and headache; we compute the probability of hepatitis given the presence of malaise, fever, and headache. The result revealed that Bayesian theory successfully identified the existence of hepatitis disease.
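
    As an illustration of the computation described above, the sketch below applies Bayes' theorem under a naive conditional-independence assumption for the three symptoms; all prior and likelihood values are invented for illustration and are not taken from the study.

    ```python
    # Posterior probability of hepatitis given malaise, fever and headache,
    # assuming conditionally independent symptoms (naive Bayes).
    prior = {"hepatitis": 0.10, "no_hepatitis": 0.90}            # hypothetical priors
    likelihood = {                                               # hypothetical P(symptom | class)
        "hepatitis":    {"malaise": 0.80, "fever": 0.70, "headache": 0.60},
        "no_hepatitis": {"malaise": 0.20, "fever": 0.10, "headache": 0.30},
    }
    observed = ["malaise", "fever", "headache"]

    unnormalized = {}
    for cls, p in prior.items():
        for symptom in observed:
            p *= likelihood[cls][symptom]
        unnormalized[cls] = p

    total = sum(unnormalized.values())
    posterior = {cls: p / total for cls, p in unnormalized.items()}
    print(posterior)   # roughly {'hepatitis': 0.86, 'no_hepatitis': 0.14} with these numbers
    ```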

  20. The combination of perception of other individuals and exogenous manipulation of arousal enhances social facilitation as an aftereffect: re-examination of Zajonc's drive theory.

    PubMed

    Ukezono, Masatoshi; Nakashima, Satoshi F; Sudo, Ryunosuke; Yamazaki, Akira; Takano, Yuji

    2015-01-01

    Zajonc's drive theory postulates that arousal enhanced through the perception of the presence of other individuals plays a crucial role in social facilitation (Zajonc, 1965). Here, we conducted two experiments to examine whether the elevation of arousal through a stepping exercise performed in front of others as an exogenous factor causes social facilitation of a cognitive task in a condition where the presence of others does not elevate the arousal level. In the main experiment, as an "aftereffect of social stimulus," we manipulated the presence or absence of others and arousal enhancement before participants conducted the primary cognitive task. The results showed that the strongest social facilitation was induced by the combination of the perception of others and arousal enhancement. In a supplementary experiment, we manipulated these factors by adding the presence of another person during the task. The results showed that the effect of the presence of the other during the primary task is enough on its own to produce facilitation of task performance regardless of the arousal enhancement as an aftereffect of social stimulus. Our study therefore extends the framework of Zajonc's drive theory in that the combination of the perception of others and enhanced arousal as an "aftereffect" was found to induce social facilitation especially when participants did not experience the presence of others while conducting the primary task.

  1. A Bayesian Framework for False Belief Reasoning in Children: A Rational Integration of Theory-Theory and Simulation Theory

    PubMed Central

    Asakura, Nobuhiko; Inui, Toshio

    2016-01-01

    Two apparently contrasting theories have been proposed to account for the development of children's theory of mind (ToM): theory-theory and simulation theory. We present a Bayesian framework that rationally integrates both theories for false belief reasoning. This framework exploits two internal models for predicting the belief states of others: one of self and one of others. These internal models are responsible for simulation-based and theory-based reasoning, respectively. The framework further takes into account empirical studies of a developmental ToM scale (e.g., Wellman and Liu, 2004): developmental progressions of various mental state understandings leading up to false belief understanding. By representing the internal models and their interactions as a causal Bayesian network, we formalize the model of children's false belief reasoning as probabilistic computations on the Bayesian network. This model probabilistically weighs and combines the two internal models and predicts children's false belief ability as a multiplicative effect of their early-developed abilities to understand the mental concepts of diverse beliefs and knowledge access. Specifically, the model predicts that children's proportion of correct responses on a false belief task can be closely approximated as the product of their proportions correct on the diverse belief and knowledge access tasks. To validate this prediction, we illustrate that our model provides good fits to a variety of ToM scale data for preschool children. We discuss the implications and extensions of our model for a deeper understanding of developmental progressions of children's ToM abilities. PMID:28082941
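
    The multiplicative prediction can be stated compactly: if P_DB and P_KA denote a group's proportions correct on the diverse belief and knowledge access tasks, the model predicts P_FB ≈ P_DB × P_KA for the false belief task. A minimal numerical check with made-up proportions:

    ```python
    # Hypothetical proportions correct on the two precursor tasks.
    p_diverse_belief = 0.80
    p_knowledge_access = 0.70

    # Model prediction for the false belief task (multiplicative effect).
    p_false_belief_pred = p_diverse_belief * p_knowledge_access
    print(f"predicted false-belief proportion correct: {p_false_belief_pred:.2f}")   # 0.56
    ```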

  2. An Integrated Theory for Predicting the Hydrothermomechanical Response of Advanced Composite Structural Components

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Lark, R. F.; Sinclair, J. H.

    1977-01-01

    An integrated theory is developed for predicting the hydrothermomechanical (HDTM) response of fiber composite components. The integrated theory is based on a combined theoretical and experimental investigation. In addition to predicting the HDTM response of components, the theory is structured to assess the combined hydrothermal effects on the mechanical properties of unidirectional composites loaded along the material axis and off-axis, and those of angleplied laminates. The theory developed predicts values which are in good agreement with measured data at the micromechanics, macromechanics, laminate analysis and structural analysis levels.

  3. Towards a General Theory of Immunity?

    PubMed

    Eberl, Gérard; Pradeu, Thomas

    2018-04-01

    Theories are indispensable to organize immunological data into coherent, explanatory, and predictive frameworks. We propose to combine different models to develop a unifying theory of immunity which situates immunology in the wider context of physiology. We believe that the immune system will be increasingly understood as a central component of a network of partner physiological systems that interconnect to maintain homeostasis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Uncovering stability mechanisms in microbial ecosystems - combining microcosm experiments, computational modelling and ecological theory in a multidisciplinary approach

    NASA Astrophysics Data System (ADS)

    Worrich, Anja; König, Sara; Banitz, Thomas; Centler, Florian; Frank, Karin; Kästner, Matthias; Miltner, Anja; Thullner, Martin; Wick, Lukas

    2015-04-01

    Although bacterial degraders in soil are commonly exposed to fluctuating environmental conditions, the functional performance of the biodegradation processes can often be maintained by resistance and resilience mechanisms. However, there is still a gap in the mechanistic understanding of key factors contributing to the stability of such an ecosystem service. Therefore we developed an integrated approach combining microcosm experiments, simulation models and ecological theory to directly make use of the strengths of these disciplines. In a continuous interplay process, data, hypotheses, and central questions are exchanged between disciplines to initiate new experiments and models to ultimately identify buffer mechanisms and factors providing functional stability. We focus on drying and rewetting-cycles in soil ecosystems, which are a major abiotic driver for bacterial activity. Functional recovery of the system was found to depend on different spatial processes in the computational model. In particular, bacterial motility is a prerequisite for biodegradation if either bacteria or substrate are heterogeneously distributed. Hence, laboratory experiments focussing on bacterial dispersal processes were conducted and confirmed this finding also for functional resistance. Obtained results will be incorporated into the model in the next step. Overall, the combination of computational modelling and laboratory experiments identified spatial processes as the main driving force for functional stability in the considered system, and has proved a powerful methodological approach.

  5. Venkatapuram's Capability theory of Health: A Critical Discussion.

    PubMed

    Tengland, Per-Anders

    2016-01-01

    The discussion about theories of health has recently had an important new input through the work of Sridhar Venkatapuram. He proposes a combination of Lennart Nordenfelt's holistic theory of health and Martha Nussbaum's version of the capability approach. The aim of the present article is to discuss and evaluate this proposal. The article starts with a discussion of Nordenfelt's theory and evaluates Venkatapuram' critique of it, that is, of its relativism, both regarding goals and environment, and of the subjectivist theory of happiness used. Then the article explains why Nordenfelt's idea of a reasonable environment is not a problem for the theory, and it critiques Venkatapuram's own incorporation of the environment into the concept of health, suggesting that this makes the concept too wide. It contends, moreover, that Venkatapuram's alternative theory retains a problem inherent in Nordenfelt's theory, namely, that health is conceived of as a second-order ability. It is argued that health should, instead, be defined as first-order abilities. This means that health cannot be seen as a capability, and also that health cannot be seen as a meta-capability of the kind envisioned by Venkatapuram. It is, furthermore, argued that the theory lacks one crucial aspect of health, namely, subjective wellbeing. Finally, the article tries to illustrate how health, in the suggested alternative sense, as first-order abilities, fits into Nussbaum's capability theory, since health as an 'actuality' is part of all the 'combined capabilities' suggested by Nussbaum. © 2016 John Wiley & Sons Ltd.

  6. Theory and performance of plated thermocouples.

    NASA Technical Reports Server (NTRS)

    Pesko, R. N.; Ash, R. L.; Cupschalk, S. G.; Germain, E. F.

    1972-01-01

    A theory has been developed to describe the performance of thermocouples which have been formed by electroplating portions of one thermoelectric material with another. The electroplated leg of the thermocouple was modeled as a collection of infinitesimally small homogeneous thermocouples connected in series. Experiments were performed using several combinations of Constantan wire sizes and copper plating thicknesses. A transient method was used to develop the thermoelectric calibrations, and the theory was found to be in quite good agreement with the experiments. In addition, data gathered in a Soviet experiment were also found to be in close agreement with the theory.

  7. Demystifying theory and its use in improvement

    PubMed Central

    Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan

    2015-01-01

    The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified—and alienated—by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory (‘reason-giving’), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of ‘good’ theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. PMID:25616279

  8. Conceptualizing patient empowerment in cancer follow-up by combining theory and qualitative data.

    PubMed

    Johnsen, Anna Thit; Eskildsen, Nanna Bjerg; Thomsen, Thora Grothe; Grønvold, Mogens; Ross, Lone; Jørgensen, Clara R

    2017-02-01

    Patient empowerment (PE) may be defined as the opportunity for patients to master issues important to their own health. The aim of this study was to conceptualize PE and how the concept manifests itself for cancer patients attending follow-up, in order to develop a relevant and sensitive questionnaire for this population. A theoretical model of PE was made, based on Zimmerman's theory of psychological empowerment. Patients who were in follow-up after first line treatment for their cancer (n = 16) were interviewed about their experiences with follow-up. A deductive thematic analysis was conducted to contextualize the theory and find concrete manifestations of empowerment. Data were analyzed to find situations that expressed empowerment or lack of empowerment. We then analyzed what abilities these situations called for and we further analyzed how these abilities fitted Zimmerman's theory. In all, 16 patients from two different hospitals participated in the interviews. PE in cancer follow-up was conceptualized as: (1) the perception that one had the possibility of mastering treatment and care (e.g. the possibility of 'saying no' to treatment and getting in contact with health care when needed); (2) having knowledge and skills regarding, for example treatment, care, plan of treatment and care, normal reactions and late effects, although knowledge and information was not always considered positively; and (3) being able to make the health care system address one's concerns and needs and, for some patients, also being able to monitor one's treatment, tests and care. We conceptualized PE based on Zimmerman's theory and empirical data to contextualize the concept in cancer follow-up. When developing a patient reported outcome measure measuring PE for this group of patients, one needs to be attentive to differences in wishes regarding mastery.

  9. Reprint of: Combining theory and experiment for X-ray absorption spectroscopy and resonant X-ray scattering characterization of polymers

    DOE PAGES

    Su, Gregory M.; Cordova, Isvar A.; Brady, Michael A.; ...

    2016-11-01

    An improved understanding of fundamental chemistry, electronic structure, morphology, and dynamics in polymers and soft materials requires advanced characterization techniques that are amenable to in situ and operando studies. Soft X-ray methods are especially useful in their ability to non-destructively provide information on specific materials or chemical moieties. Analysis of these experiments, which can be very dependent on X-ray energy and polarization, can quickly become complex. Complementary modeling and predictive capabilities are required to properly probe these critical features. Here, we present relevant background on this emerging suite of techniques. We focus on how the combination of theory and experiment has been applied and can be further developed to drive our understanding of how these methods probe relevant chemistry, structure, and dynamics in soft materials.

  10. The combination of perception of other individuals and exogenous manipulation of arousal enhances social facilitation as an aftereffect: re-examination of Zajonc’s drive theory

    PubMed Central

    Ukezono, Masatoshi; Nakashima, Satoshi F.; Sudo, Ryunosuke; Yamazaki, Akira; Takano, Yuji

    2015-01-01

    Zajonc’s drive theory postulates that arousal enhanced through the perception of the presence of other individuals plays a crucial role in social facilitation (Zajonc, 1965). Here, we conducted two experiments to examine whether the elevation of arousal through a stepping exercise performed in front of others as an exogenous factor causes social facilitation of a cognitive task in a condition where the presence of others does not elevate the arousal level. In the main experiment, as an “aftereffect of social stimulus,” we manipulated the presence or absence of others and arousal enhancement before participants conducted the primary cognitive task. The results showed that the strongest social facilitation was induced by the combination of the perception of others and arousal enhancement. In a supplementary experiment, we manipulated these factors by adding the presence of another person during the task. The results showed that the effect of the presence of the other during the primary task is enough on its own to produce facilitation of task performance regardless of the arousal enhancement as an aftereffect of social stimulus. Our study therefore extends the framework of Zajonc’s drive theory in that the combination of the perception of others and enhanced arousal as an “aftereffect” was found to induce social facilitation especially when participants did not experience the presence of others while conducting the primary task. PMID:25999906

  11. Deterministic theory of Monte Carlo variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ueki, T.; Larsen, E.W.

    1996-12-31

    The theoretical estimation of variance in Monte Carlo transport simulations, particularly those using variance reduction techniques, is a substantially unsolved problem. In this paper, the authors describe a theory that predicts the variance in a variance reduction method proposed by Dwivedi. Dwivedi's method combines the exponential transform with angular biasing. The key element of this theory is a new modified transport problem, containing the Monte Carlo weight w as an extra independent variable, which simulates Dwivedi's Monte Carlo scheme. The (deterministic) solution of this modified transport problem yields an expression for the variance. The authors give computational results that validate this theory.
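
    The flavor of the problem can be conveyed with a toy example: estimating a slab transmission probability with exponential-transform-biased free-path sampling and comparing the empirical variance of the weighted estimator with that of analog Monte Carlo. This is only a one-dimensional illustration of the exponential transform with made-up parameters, not Dwivedi's combined exponential-transform/angular-biasing scheme or the deterministic variance theory itself.

    ```python
    import math
    import random

    random.seed(1)
    sigma, thickness = 1.0, 5.0                    # total cross section and slab thickness
    p_exact = math.exp(-sigma * thickness)         # exact uncollided transmission probability

    def analog(n):
        """Analog sampling: score 1 if the sampled free path exceeds the slab."""
        return [1.0 if random.expovariate(sigma) > thickness else 0.0 for _ in range(n)]

    def exp_transform(n, sigma_star=0.3):
        """Sample stretched free paths and carry the weight f(x)/g(x)."""
        scores = []
        for _ in range(n):
            x = random.expovariate(sigma_star)
            w = (sigma / sigma_star) * math.exp(-(sigma - sigma_star) * x)
            scores.append(w if x > thickness else 0.0)
        return scores

    def mean_var(scores):
        m = sum(scores) / len(scores)
        v = sum((s - m) ** 2 for s in scores) / (len(scores) - 1)
        return m, v

    for name, scores in (("analog", analog(100000)), ("exp. transform", exp_transform(100000))):
        m, v = mean_var(scores)
        print(f"{name:15s} mean={m:.5f} (exact {p_exact:.5f})  sample variance={v:.2e}")
    ```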

  12. The Evolution of Macroeconomic Theory and Implications for Teaching Intermediate Macroeconomics.

    ERIC Educational Resources Information Center

    Froyen, Richard T.

    1996-01-01

    Traces the development of macroeconomic theory from John Maynard Keynes to modern endogenous growth theory. Maintains that a combination of interest in growth theory and related policy questions will play a prominent role in macroeconomics in the future. Recommends narrowing the gap between graduate school and undergraduate economics instruction.…

  13. Diverse carrier mobility of monolayer BNC x : a combined density functional theory and Boltzmann transport theory study

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Deng, Kaiming; Deng, Weiqiao; Lu, Ruifeng

    2017-11-01

    The BNC x monolayer, as a kind of two-dimensional material, has numerous chemical atomic ratios and arrangements with different electronic structures. Via calculations based on density functional theory and Boltzmann transport theory under the deformation potential approximation, the band structures and carrier mobilities of BNC x (x = 1, 2, 3, 4) nanosheets are systematically investigated. The calculated results show that BNC2-1 is a material with a very small band gap (0.02 eV) among all the structures, while the other BNC x monolayers are semiconductors with band gaps ranging from 0.51 eV to 1.32 eV. The carrier mobility of BNC x varies considerably, from tens to millions of cm² V⁻¹ s⁻¹. For BNC2-1, the hole and electron mobilities along both the x and y directions can reach the order of 10⁵ cm² V⁻¹ s⁻¹, which is similar to the carrier mobility of graphene. Besides, all studied BNC x monolayers have clearly anisotropic hole and electron mobilities. In particular, for semiconductor BNC4, the hole mobility along the y direction and the electron mobility along the x direction unexpectedly reach the order of 10⁶ cm² V⁻¹ s⁻¹, even higher than that of graphene. Our findings suggest that BNC x layered materials with the proper ratio and arrangement of carbon atoms will possess desirable charge transport properties, exhibiting potential applications in nanoelectronic devices.
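
    For context, carrier mobilities in such studies are typically evaluated from the two-dimensional deformation potential expression below (a commonly used form under the deformation potential approximation, quoted for orientation rather than reproduced from the paper), where C_2D is the in-plane elastic modulus, E_1 the deformation potential constant, m* the effective mass along the transport direction and m_d the averaged (density-of-states) effective mass.

    ```latex
    \mu_{2\mathrm{D}} = \frac{e \hbar^{3} C_{2\mathrm{D}}}{k_{\mathrm{B}} T\, m^{*} m_{d}\, E_{1}^{2}},
    \qquad m_{d} = \sqrt{m_{x}^{*} m_{y}^{*}} .
    ```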

  14. Non-empirical Prediction of the Photophysical and Magnetic Properties of Systems with Open d- and f-Shells Based on Combined Ligand Field and Density Functional Theory (LFDFT).

    PubMed

    Daul, Claude

    2014-09-01

    Despite the important growth of ab initio and computational techniques, ligand field theory in molecular science, or crystal field theory in condensed matter, offers the most intuitive way to calculate multiplet energy levels arising from systems with open-shell d and/or f electrons. Over the past decade we have developed a ligand field treatment of inorganic molecular modelling taking advantage of the dominant localization of the frontier orbitals within the metal sphere. This feature, which is observed in any inorganic coordination compound, especially if treated by Density Functional Theory calculations, allows the determination of the electronic structure and properties with surprisingly good accuracy. In ligand field theory, the theoretical concepts consider only a single atomic center and treat its interaction with the chemical environment essentially as a perturbation. The success of the simple ligand field theory is therefore no longer questionable, while the more accurate molecular orbital theory does in general over-estimate the metal-ligand covalence and thus yields wave functions that are too delocalized. Although LF theory has always been popular as a semi-empirical method when dealing with molecules of high symmetry, e.g. cubic symmetry, where the number of parameters needed is reasonably small (3 or 5), this is no longer the case for molecules without symmetry that involve both an open d- and f-shell (∼90 parameters). However, the combination of LF theory and Density Functional (DF) theory that we introduced twenty years ago can easily deal with complex molecules of any symmetry with two and more open shells. These first-principles predictions achieve quite a high accuracy (<5%) in terms of state energies. Hence, this approach is well suited to predict the magnetic and photo-physical properties of arbitrary molecules and materials prior to their synthesis, which is the ultimate goal of each computational chemist. We will illustrate the

  15. Making Theory Relevant: The Gender Attitude and Belief Inventory

    ERIC Educational Resources Information Center

    McCabe, Janice

    2013-01-01

    This article describes and evaluates the Gender Attitude and Belief Inventory (GABI), a teaching tool designed to aid students in (a) realizing how sociological theory links to their personal beliefs and (b) exploring any combination of 11 frequently used theoretical perspectives on gender, including both conservative theories (physiological,…

  16. Teaching English Reading through MI Theory in Primary Schools

    ERIC Educational Resources Information Center

    Jing, Jinxiu

    2013-01-01

    The theory of Multiple Intelligences (MI theory), put forward by Gardner in 1983, claims that each person possesses different combinations of nine intelligences. In education, it advocates that teachers should address students' personal uniqueness and provide a wide range of intelligence-oriented activities and experiences to facilitate learning,…

  17. Combining linear polarization spectroscopy and the Representative Layer Theory to measure the Beer-Lambert law absorbance of highly scattering materials.

    PubMed

    Gobrecht, Alexia; Bendoula, Ryad; Roger, Jean-Michel; Bellon-Maurel, Véronique

    2015-01-01

    Visible and Near Infrared (Vis-NIR) Spectroscopy is a powerful non-destructive analytical method used to analyze major compounds in bulk materials and products, requiring no sample preparation. It is widely used in routine analysis and also in-line in industries, in vivo in biomedical applications, or in-field for agricultural and environmental applications. However, highly scattering samples subvert the Beer-Lambert law's linear relationship between spectral absorbance and concentration. Instead of spectral pre-processing, which is commonly used by Vis-NIR spectroscopists to mitigate the scattering effect, we put forward an optical method based on Polarized Light Spectroscopy to improve the absorbance signal measurement on highly scattering samples. This method selects the part of the signal that is less impacted by scattering. The resulting signal is combined in the Absorption/Remission function defined in Dahm's Representative Layer Theory to compute an absorbance signal fulfilling the Beer-Lambert law, i.e. being linearly related to the concentrations of the chemicals composing the sample. The underpinning theories have been experimentally evaluated on scattering samples in liquid form and in powdered form. The method produced more accurate spectra, and the Pearson coefficient assessing the linearity between the absorbance spectra and the concentration of the added dye improved from 0.94 to 0.99 for liquid samples and from 0.84 to 0.97 for powdered samples. Copyright © 2014 Elsevier B.V. All rights reserved.
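
    For reference, the linear relation being recovered is the Beer-Lambert law, and the scattering-corrected absorbance in the Representative Layer Theory is obtained through the Absorption/Remission function of the measured remission R and transmission T. The A(R,T) expression below is our paraphrase of the form usually attributed to Dahm and should be checked against the original papers before use.

    ```latex
    A_{\mathrm{BL}} = \varepsilon\, \ell\, c = -\log_{10} T \quad \text{(non-scattering limit)},
    \qquad
    A(R,T) = \frac{(1-R)^{2} - T^{2}}{R} .
    ```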

  18. Using generalizability theory to develop clinical assessment protocols.

    PubMed

    Preuss, Richard A

    2013-04-01

    Clinical assessment protocols must produce data that are reliable, with a clinically attainable minimal detectable change (MDC). In a reliability study, generalizability theory has 2 advantages over classical test theory. These advantages provide information that allows assessment protocols to be adjusted to match individual patient profiles. First, generalizability theory allows the user to simultaneously consider multiple sources of measurement error variance (facets). Second, it allows the user to generalize the findings of the main study across the different study facets and to recalculate the reliability and MDC based on different combinations of facet conditions. In doing so, clinical assessment protocols can be chosen based on minimizing the number of measures that must be taken to achieve a realistic MDC, using repeated measures to minimize the MDC, or simply based on the combination that best allows the clinician to monitor an individual patient's progress over a specified period of time.
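
    A minimal sketch of the kind of recalculation described above: given variance components from a G study, the generalizability coefficient and the MDC are recomputed for different numbers of repeated measures. The component values, the single-facet design and the 95% confidence criterion are illustrative assumptions, not taken from any specific protocol.

    ```python
    import math

    # Hypothetical variance components from a generalizability (G) study.
    var_person = 4.0        # true between-patient variance
    var_error = 2.5         # residual/facet error variance for a single measurement

    def g_coefficient(k):
        """Generalizability coefficient when averaging k repeated measures."""
        return var_person / (var_person + var_error / k)

    def mdc95(k):
        """Minimal detectable change (95% confidence) for the mean of k measures."""
        sem = math.sqrt(var_error / k)              # standard error of measurement
        return 1.96 * math.sqrt(2) * sem

    for k in (1, 2, 3, 5):
        print(f"k={k}: G={g_coefficient(k):.2f}, MDC95={mdc95(k):.2f}")
    ```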

  19. Grounded theory and feminist inquiry: revitalizing links to the past.

    PubMed

    Plummer, Marilyn; Young, Lynne E

    2010-04-01

    Grounded theory has served feminist research endeavors since the mid-1990s. Researchers from a variety of disciplines claim methodological compatibility and incorporate feminist principles into their grounded theory studies. This article seeks to demonstrate the epistemological affinity between feminist inquiry and grounded theory. Although this relationship is not necessarily unique, the authors contend that when combined, it loosens the androcentric moorings of the empirical processes underpinning grounded theory, enabling the researchers to design inquiry with greater potential to reveal issues particular to the lives and experiences of marginalized women. The article begins by retracing the roots of grounded theory and feminist inquiry to identify six key areas where the underpinnings of GT are enriched by a feminist perspective when working with women. In addition, the authors draw on the literature and their experience from a 2005 study of peer support and lone mothers' health to demonstrate the advantages of combining these theoretical perspectives. Finally, the authors recommend that nurse researchers draw on feminist principles to guide their use of grounded theory to better serve the interests of women by surfacing issues of gender and power that influence the health experience.

  20. On multiscale moving contact line theory.

    PubMed

    Li, Shaofan; Fan, Houfu

    2015-07-08

    In this paper, a multiscale moving contact line (MMCL) theory is presented and employed to simulate liquid droplet spreading and capillary motion. The proposed MMCL theory combines a coarse-grained adhesive contact model with a fluid interface membrane theory, so that it can couple molecular scale adhesive interaction and surface tension with hydrodynamics of microscale flow. By doing so, the intermolecular force, the van der Waals or double layer force, separates and levitates the liquid droplet from the supporting solid substrate, which avoids the shear stress singularity caused by the no-slip condition in conventional hydrodynamics theory of moving contact line. Thus, the MMCL allows the difference of the surface energies and surface stresses to drive droplet spreading naturally. To validate the proposed MMCL theory, we have employed it to simulate droplet spreading over various elastic substrates. The numerical simulation results obtained by using MMCL are in good agreement with the molecular dynamics results reported in the literature.

  1. Iridium(iii) phosphorescent complexes with dual stereogenic centers: single crystal, electronic circular dichroism evidence and circularly polarized luminescence properties.

    PubMed

    Li, Tian-Yi; Zheng, You-Xuan; Zhou, Yong-Hui

    2016-12-06

    Iridium complexes with a chiral metal center and chiral carbons, Λ/Δ-(dfppy)2Ir(chty-R) and Λ/Δ-(dfppy)2Ir(chty-S), were synthesized and characterized. These isomers have the same steady-state photophysical properties, and obvious offsets in the ECD spectra highlight both chiral sources. Each enantiomeric couple shows mirror-image CPL bands with a dissymmetry factor on the order of 10⁻³.
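
    For reference, the luminescence dissymmetry factor quoted here is conventionally defined from the intensities of left- and right-circularly polarized emission as

    ```latex
    g_{\mathrm{lum}} = \frac{2\,(I_{\mathrm{L}} - I_{\mathrm{R}})}{I_{\mathrm{L}} + I_{\mathrm{R}}} ,
    ```

    so that |g_lum| = 2 corresponds to fully circularly polarized emission.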

  2. Exploring the chemical kinetics of partially oxidized intermediates by combining experiments, theory, and kinetic modeling.

    PubMed

    Hoyermann, Karlheinz; Mauß, Fabian; Olzmann, Matthias; Welz, Oliver; Zeuch, Thomas

    2017-07-19

    Partially oxidized intermediates play a central role in combustion and atmospheric chemistry. In this perspective, we focus on the chemical kinetics of alkoxy radicals, peroxy radicals, and Criegee intermediates, which are key species in both combustion and atmospheric environments. These reactive intermediates feature a broad spectrum of chemical diversity. Their reactivity is central to our understanding of how volatile organic compounds are degraded in the atmosphere and converted into secondary organic aerosol. Moreover, they sensitively determine ignition timing in internal combustion engines. The intention of this perspective article is to provide the reader with information about the general mechanisms of reactions initiated by addition of atomic and molecular oxygen to alkyl radicals and ozone to alkenes. We will focus on critical branching points in the subsequent reaction mechanisms and discuss them from a consistent point of view. As a first example of our integrated approach, we will show how experiment, theory, and kinetic modeling have been successfully combined in the first infrared detection of Criegee intermediates during the gas phase ozonolysis. As a second example, we will examine the ignition timing of n-heptane/air mixtures at low and intermediate temperatures. Here, we present a reduced, fuel size independent kinetic model of the complex chemistry initiated by peroxy radicals that has been successfully applied to simulate standard n-heptane combustion experiments.

  3. Diverse carrier mobility of monolayer BNC x : a combined density functional theory and Boltzmann transport theory study.

    PubMed

    Wu, Tao; Deng, Kaiming; Deng, Weiqiao; Lu, Ruifeng

    2017-10-19

    The BNC x monolayer, as a kind of two-dimensional material, has numerous chemical atomic ratios and arrangements with different electronic structures. Via calculations based on density functional theory and Boltzmann transport theory under the deformation potential approximation, the band structures and carrier mobilities of BNC x (x = 1, 2, 3, 4) nanosheets are systematically investigated. The calculated results show that BNC2-1 is a material with a very small band gap (0.02 eV) among all the structures, while the other BNC x monolayers are semiconductors with band gaps ranging from 0.51 eV to 1.32 eV. The carrier mobility of BNC x varies considerably, from tens to millions of cm² V⁻¹ s⁻¹. For BNC2-1, the hole and electron mobilities along both the x and y directions can reach the order of 10⁵ cm² V⁻¹ s⁻¹, which is similar to the carrier mobility of graphene. Besides, all studied BNC x monolayers have clearly anisotropic hole and electron mobilities. In particular, for semiconductor BNC4, the hole mobility along the y direction and the electron mobility along the x direction unexpectedly reach the order of 10⁶ cm² V⁻¹ s⁻¹, even higher than that of graphene. Our findings suggest that BNC x layered materials with the proper ratio and arrangement of carbon atoms will possess desirable charge transport properties, exhibiting potential applications in nanoelectronic devices.

  4. Prediction of Exercise in Patients across Various Stages of Bariatric Surgery: A Comparison of the Merits of the Theory of Reasoned Action versus the Theory of Planned Behavior

    ERIC Educational Resources Information Center

    Hunt, Hillary R.; Gross, Alan M.

    2009-01-01

    Obesity is a world-wide health concern approaching epidemic proportions. Successful long-term treatment involves a combination of bariatric surgery, diet, and exercise. Social cognitive models, such as the Theory of Reasoned Action (TRA) and the Theory of Planned Behavior (TPB), are among the most commonly tested theories utilized in the…

  5. Receiver-Coupling Schemes Based On Optimal-Estimation Theory

    NASA Technical Reports Server (NTRS)

    Kumar, Rajendra

    1992-01-01

    Two schemes for reception of weak radio signals conveying digital data via phase modulation provide for mutual coupling of multiple receivers and coherent combination of the receiver outputs. In both schemes, the optimal mutual-coupling weights are computed according to Kalman-filter theory; the schemes differ in the manner in which the receiver outputs are transmitted and combined.
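
    The coherent-combination step can be illustrated with a simplified SNR-weighted (maximal-ratio-style) linear combiner. This is a stand-in illustration: the paper's Kalman-filter weight computation is not reproduced, and the channel gains, noise levels and modulation below are made-up values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical complex channel gains and noise variances for three receivers.
    h = np.array([0.9 + 0.2j, 0.5 - 0.4j, 0.3 + 0.1j])
    noise_var = np.array([0.05, 0.10, 0.20])

    # BPSK symbols observed through each receiver with independent complex noise.
    symbols = rng.choice([-1.0, 1.0], size=200)
    noise = (rng.normal(scale=np.sqrt(noise_var / 2), size=(200, 3))
             + 1j * rng.normal(scale=np.sqrt(noise_var / 2), size=(200, 3)))
    received = symbols[:, None] * h[None, :] + noise

    # Maximal-ratio combining: weight each branch by conj(gain) / noise variance.
    weights = np.conj(h) / noise_var
    combined = received @ weights
    decisions = np.sign(combined.real)
    print("bit error rate:", np.mean(decisions != symbols))
    ```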

  6. Coupling-parameter expansion in thermodynamic perturbation theory.

    PubMed

    Ramana, A Sai Venkata; Menon, S V G

    2013-02-01

    An approach to the coupling-parameter expansion in the liquid state theory of simple fluids is presented by combining the ideas of thermodynamic perturbation theory and integral equation theories. This hybrid scheme avoids the problems of the latter in the two-phase region. A method to compute the perturbation series to any arbitrary order is developed and applied to square-well fluids. Apart from the Helmholtz free energy, the method also gives the radial distribution function and the direct correlation function of the perturbed system. The theory is applied to square-well fluids of variable ranges and compared with simulation data. While the convergence of the perturbation series and the overall performance of the theory are good, improvements are needed for potentials with shorter ranges. Possible directions for further developments in the coupling-parameter expansion are indicated.
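
    Schematically, writing the pair potential as U(λ) = U₀ + λU₁ and expanding the Helmholtz free energy in the coupling parameter gives the familiar perturbation series (a textbook form quoted for orientation; the paper's scheme generates such terms to arbitrary order together with the correlation functions), with averages taken in the reference system:

    ```latex
    A(\lambda) = A_{0} + \lambda \,\langle U_{1} \rangle_{0}
      - \frac{\beta \lambda^{2}}{2} \left( \langle U_{1}^{2} \rangle_{0}
      - \langle U_{1} \rangle_{0}^{2} \right) + \mathcal{O}(\lambda^{3}) .
    ```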

  7. Understanding nursing units with data and theory.

    PubMed

    Diers, Donna; Hendrickson, Karrie; Rimar, Joan; Donovan, Donna

    2013-01-01

    Nursing units are social systems whose function depends on many variables. Available nursing data, combined with a theory of organizational diagnosis, can be used to understand nursing unit performance. One troubled unit served as a case study in organizational diagnosis and treatment using modern methods of data mining and performance improvement. Systems theory did not prescribe how to fix an underbounded system. The theory did suggest, however, that addressing the characteristics of overbounded and underbounded systems can provide some order and structure and identify helpful resources. In this instance, the data analysis served to help define the unit's problems in conjunction with information gained from talking with the nurses and touring the unit, but it was the theory that gave hints for direction for change.

  8. Demystifying theory and its use in improvement.

    PubMed

    Davidoff, Frank; Dixon-Woods, Mary; Leviton, Laura; Michie, Susan

    2015-03-01

    The role and value of theory in improvement work in healthcare has been seriously underrecognised. We join others in proposing that more informed use of theory can strengthen improvement programmes and facilitate the evaluation of their effectiveness. Many professionals, including improvement practitioners, are unfortunately mystified-and alienated-by theory, which discourages them from using it in their work. In an effort to demystify theory we make the point in this paper that, far from being discretionary or superfluous, theory ('reason-giving'), both informal and formal, is intimately woven into virtually all human endeavour. We explore the special characteristics of grand, mid-range and programme theory; consider the consequences of misusing theory or failing to use it; review the process of developing and applying programme theory; examine some emerging criteria of 'good' theory; and emphasise the value, as well as the challenge, of combining informal experience-based theory with formal, publicly developed theory. We conclude that although informal theory is always at work in improvement, practitioners are often not aware of it or do not make it explicit. The germane issue for improvement practitioners, therefore, is not whether they use theory but whether they make explicit the particular theory or theories, informal and formal, they actually use. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Hadronic Lorentz violation in chiral perturbation theory including the coupling to external fields

    NASA Astrophysics Data System (ADS)

    Kamand, Rasha; Altschul, Brett; Schindler, Matthias R.

    2018-05-01

    If any violation of Lorentz symmetry exists in the hadron sector, its ultimate origins must lie at the quark level. We continue the analysis of how the theories at these two levels are connected, using chiral perturbation theory. Considering a 2-flavor quark theory, with dimension-4 operators that break Lorentz symmetry, we derive a low-energy theory of pions and nucleons that is invariant under local chiral transformations and includes the coupling to external fields. The pure meson and baryon sectors, as well as the couplings between them and the couplings to external electromagnetic and weak gauge fields, contain forms of Lorentz violation which depend on linear combinations of quark-level coefficients. In particular, at leading order the electromagnetic couplings depend on the very same combinations as appear in the free particle propagators. This means that observations of electromagnetic processes involving hadrons—such as vacuum Cerenkov radiation, which may be allowed in Lorentz-violating theories—can only reliably constrain certain particular combinations of quark coefficients.

  10. A Complete Multimode Equivalent-Circuit Theory for Electrical Design

    PubMed Central

    Williams, Dylan F.; Hayden, Leonard A.; Marks, Roger B.

    1997-01-01

    This work presents a complete equivalent-circuit theory for lossy multimode transmission lines. Its voltages and currents are based on general linear combinations of standard normalized modal voltages and currents. The theory includes new expressions for transmission line impedance matrices, symmetry and lossless conditions, source representations, and the thermal noise of passive multiports. PMID:27805153

  11. A hybrid method for solutes in complex solvents: Density functional theory combined with empirical force fields

    NASA Astrophysics Data System (ADS)

    Eichinger, M.; Tavan, P.; Hutter, J.; Parrinello, M.

    1999-06-01

    We present a hybrid method for molecular dynamics simulations of solutes in complex solvents as represented, for example, by substrates within enzymes. The method combines a quantum mechanical (QM) description of the solute with a molecular mechanics (MM) approach for the solvent. The QM fragment of a simulation system is treated by ab initio density functional theory (DFT) based on plane-wave expansions. Long-range Coulomb interactions within the MM fragment and between the QM and the MM fragment are treated by a computationally efficient fast multipole method. For the description of covalent bonds between the two fragments, we introduce the scaled position link atom method (SPLAM), which removes the shortcomings of related procedures. The various aspects of the hybrid method are scrutinized through test calculations on liquid water, the water dimer, ethane and a small molecule related to the retinal Schiff base. In particular, the extent to which vibrational spectra obtained by DFT for the solute can be spoiled by the lower quality force field of the solvent is checked, including cases in which the two fragments are covalently joined. The results demonstrate that our QM/MM hybrid method is especially well suited for the vibrational analysis of molecules in condensed phase.
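
    At the schematic level, the total energy in such a QM/MM scheme separates as shown below, with the coupling term carrying the electrostatic contribution (handled here by the fast multipole method), the van der Waals interaction and the bonded terms treated by the SPLAM link-atom construction. This is a generic partition quoted for orientation, not the paper's exact expressions.

    ```latex
    E_{\mathrm{tot}} = E_{\mathrm{QM}}\big[\psi; \{\mathbf{R}_{\mathrm{QM}}\}\big]
      + E_{\mathrm{MM}}\big(\{\mathbf{R}_{\mathrm{MM}}\}\big)
      + E_{\mathrm{QM/MM}}^{\mathrm{elec}} + E_{\mathrm{QM/MM}}^{\mathrm{vdW}} + E_{\mathrm{QM/MM}}^{\mathrm{bond}} .
    ```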

  12. High Power Combiner/Divider with Coupled Lines for Broadband Applications

    DTIC Science & Technology

    2017-03-20

    A novel isolation structure will also be presented. Power divider/combiners are traditionally used in the development of high power ... a novel Gysel divider/combiner structure has been demonstrated. The divider/combiner is applicable to various high-power, broadband radar, EW ... "Gysel Power Divider With Arbitrary Power Ratios and Filtering Responses Using Coupling Structure," IEEE Transactions on Microwave Theory and Tech., vol

  13. Brilliant Sm, Eu, Tb, and Dy Chiral Lanthanide Complexes with Strong Circularly Polarized Luminescence

    PubMed Central

    Petoud, Stéphane; Muller, Gilles; Moore, Evan G.; Xu, Jide; Sokolnicki, Jurek; Riehl, James P.; Le, Uyen N.; Cohen, Seth M.; Raymond, Kenneth N.

    2009-01-01

    The synthesis, characterization, and luminescent behavior of trivalent Sm, Eu, Dy, and Tb complexes of two enantiomeric, octadentate, chiral, 2-hydroxyisophthalamide ligands are reported. These complexes are highly luminescent in solution. Functionalization of the achiral parent ligand with a chiral 1-phenylethylamine substituent on the open face of the complex in close proximity to the metal center yields complexes with strong circularly polarized luminescence (CPL) activity. This appears to be the first example of a system utilizing the same ligand architecture to sensitize four different lanthanide cations and display CPL activity. The luminescence dissymmetry factor, g_lum, recorded for the Eu(III) complex is one of the highest values reported, and this is the first time the CPL effect has been demonstrated for a Sm(III) complex with a chiral ligand. The combination of high luminescence intensity with CPL activity should enable new bioanalytical applications of macromolecules in chiral environments. PMID:17199285

  14. Theory of Self- vs. Externally-Regulated Learning™: Fundamentals, Evidence, and Applicability.

    PubMed

    de la Fuente-Arias, Jesús

    2017-01-01

    The Theory of Self- vs. Externally-Regulated Learning™ has integrated the variables of SRL theory, the DEDEPRO model, and the 3P model. This new Theory has proposed: (a) in general, the importance of the cyclical model of individual self-regulation (SR) and of external regulation stemming from the context (ER), as two different and complementary variables, both in combination and in interaction; (b) specifically, in the teaching-learning context, the relevance of different types of combinations between levels of self-regulation (SR) and of external regulation (ER) in the prediction of self-regulated learning (SRL), and of cognitive-emotional achievement. This review analyzes the assumptions, conceptual elements, empirical evidence, benefits and limitations of SRL vs. ERL Theory. Finally, professional fields of application and future lines of research are suggested.

  15. Theory of Self- vs. Externally-Regulated Learning™: Fundamentals, Evidence, and Applicability

    PubMed Central

    de la Fuente-Arias, Jesús

    2017-01-01

    The Theory of Self- vs. Externally-Regulated Learning™ has integrated the variables of SRL theory, the DEDEPRO model, and the 3P model. This new Theory has proposed: (a) in general, the importance of the cyclical model of individual self-regulation (SR) and of external regulation stemming from the context (ER), as two different and complementary variables, both in combination and in interaction; (b) specifically, in the teaching-learning context, the relevance of different types of combinations between levels of self-regulation (SR) and of external regulation (ER) in the prediction of self-regulated learning (SRL), and of cognitive-emotional achievement. This review analyzes the assumptions, conceptual elements, empirical evidence, benefits and limitations of SRL vs. ERL Theory. Finally, professional fields of application and future lines of research are suggested. PMID:29033872

  16. Loop amplitudes in an extended gravity theory

    NASA Astrophysics Data System (ADS)

    Dunbar, David C.; Godwin, John H.; Jehu, Guy R.; Perkins, Warren B.

    2018-05-01

    We extend the S-matrix of gravity by the addition of the minimal three-point amplitude or, equivalently, by adding R³ terms to the Lagrangian. We demonstrate how unitarity can be used to examine the renormalisability of this theory simply and to determine the R⁴ counter-terms that arise at one loop. We find that the combination of R⁴ terms that arises in the extended theory is complementary to the R⁴ counter-term associated with supersymmetric Lagrangians.

  17. Quasi-local conserved charges in the Einstein-Maxwell theory

    NASA Astrophysics Data System (ADS)

    Setare, M. R.; Adami, H.

    2017-05-01

    In this paper we consider the Einstein-Maxwell theory and define a combined transformation composed of a diffeomorphism and a U(1) gauge transformation. For generality, we assume that the generator χ of such a transformation is field-dependent. We define the extended off-shell ADT current and then the off-shell ADT charge such that they are conserved off-shell for the asymptotically field-dependent symmetry generator χ. We then define the conserved charge corresponding to the asymptotically field-dependent symmetry generator χ. We apply the presented method to find the conserved charges of asymptotically AdS3 spacetimes in the context of the Einstein-Maxwell theory in three dimensions. Although the usual proposal for the quasi-local charges provides divergent global charges for the Einstein-Maxwell theory with negative cosmological constant in three dimensions, here we avoid this problem by introducing the proposed combined transformation χ.

  18. Combining disparate data for decision making

    NASA Astrophysics Data System (ADS)

    Gettings, M. E.

    2010-12-01

    Combining information of disparate types from multiple data or model sources is a fundamental task in decision making theory. Procedures for combining and utilizing quantitative data with uncertainties are well-developed in several approaches, but methods for including qualitative and semi-quantitative data are much less so. Possibility theory offers an approach to treating all three data types in an objective and repeatable way. In decision making, biases are frequently present in several forms, including those arising from data quality, data spatial and temporal distribution, and the analyst's knowledge and beliefs as to which data or models are most important. The latter bias is particularly evident in the case of qualitative data and there are numerous examples of analysts feeling that a qualitative dataset is more relevant than a quantified one. Possibility theory and fuzzy logic now provide fairly general rules for quantifying qualitative and semi-quantitative data in ways that are repeatable and minimally biased. Once a set of quantified data and/or model layers is obtained, there are several methods of combining them to obtain insight useful in decision making. These include: various combinations of layers using formal fuzzy logic (for example, layer A and (layer B or layer C) but not layer D); connecting the layers with varying influence links in a Fuzzy Cognitive Map; and using the set of layers for the universe of discourse for agent based model simulations. One example of logical combinations that have proven useful is the definition of possible habitat for valley fever fungus (Coccidioides sp.) using variables such as soil type, altitude, aspect, moisture and temperature. A second example is the delineation of the lithology and possible mineralization of several areas beneath basin fill in southern Arizona. A Fuzzy Cognitive Map example is the impacts of development and operation of a hypothetical mine in an area adjacent to a city. In this model
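    As a minimal illustration of the kind of fuzzy-logic layer combination described above (e.g. "layer A and (layer B or layer C) but not layer D"), the sketch below uses the standard min/max/complement operators; the layer names and membership values are hypothetical, not taken from the study.

```python
# Minimal sketch (not the study's code): combining quantified evidence layers
# with the standard fuzzy-logic operators. Membership values are hypothetical
# stand-ins for layers quantified on a [0, 1] possibility scale.

def fuzzy_and(a, b):   # min operator
    return min(a, b)

def fuzzy_or(a, b):    # max operator
    return max(a, b)

def fuzzy_not(a):      # complement
    return 1.0 - a

# Hypothetical membership values for one map cell (e.g. soil, altitude,
# moisture, temperature suitability in a habitat model).
layers = {"A": 0.8, "B": 0.4, "C": 0.7, "D": 0.2}

# "layer A and (layer B or layer C) but not layer D"
combined = fuzzy_and(layers["A"],
                     fuzzy_and(fuzzy_or(layers["B"], layers["C"]),
                               fuzzy_not(layers["D"])))
print(f"combined possibility for this cell: {combined:.2f}")  # 0.70
```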

  19. High-throughput design and optimization of fast lithium ion conductors by the combination of bond-valence method and density functional theory

    NASA Astrophysics Data System (ADS)

    Xiao, Ruijuan; Li, Hong; Chen, Liquan

    2015-09-01

    Finding solid-state electrolytes with fast lithium-ion conduction is an important prerequisite for developing all-solid-state lithium secondary batteries. By combining simulation techniques at different levels of accuracy, namely the bond-valence (BV) method and density functional theory (DFT), a high-throughput design and optimization scheme is proposed for searching for fast lithium-ion conductors as candidate solid-state electrolytes for lithium rechargeable batteries. The screening of more than 1000 compounds is performed with the BV-based method, and its ability to predict reliable trends in the Li+ migration energy barriers is confirmed by comparison with DFT calculations. β-Li3PS4 is taken as a model system to demonstrate the application of this combined method to optimizing the properties of solid electrolytes. By applying high-throughput DFT simulations to more than 200 doping derivatives of β-Li3PS4, the effects of doping on the ionic conductivity of this material are predicted from the BV calculations. O-doping is proposed as a promising way to improve the kinetic properties of this material, and the validity of the optimization is confirmed by first-principles molecular dynamics (FPMD) simulations.
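    The following sketch illustrates the two-tier screening logic described above: a cheap bond-valence estimate filters a large candidate pool, and only the shortlist is passed to expensive DFT calculations. The function bodies, candidate names, barrier values, and the 0.6 eV cutoff are illustrative placeholders, not the authors' code or data.

```python
# Schematic two-tier screening loop (not the authors' workflow). The barrier
# functions are stand-ins for a real bond-valence pathway analysis and a DFT
# (e.g. NEB) calculation; candidates, values and the 0.6 eV cutoff are
# illustrative placeholders only.

def estimate_barrier_bv(structure: str) -> float:
    """Fast, approximate Li+ migration barrier from a bond-valence map (placeholder)."""
    toy_values = {"candidate-1": 0.35, "candidate-2": 0.75, "candidate-3": 0.40}
    return toy_values.get(structure, 1.0)

def compute_barrier_dft(structure: str) -> float:
    """Slow, accurate DFT migration barrier (placeholder)."""
    toy_values = {"candidate-1": 0.30, "candidate-3": 0.33}
    return toy_values.get(structure, 1.0)

def screen(structures, bv_cutoff_ev=0.6, top_n=20):
    # Tier 1: keep only candidates whose cheap BV barrier looks promising.
    shortlist = sorted((s for s in structures
                        if estimate_barrier_bv(s) < bv_cutoff_ev),
                       key=estimate_barrier_bv)[:top_n]
    # Tier 2: refine the shortlist with expensive DFT calculations.
    return {s: compute_barrier_dft(s) for s in shortlist}

print(screen(["candidate-1", "candidate-2", "candidate-3"]))
```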

  20. Analysis of three-dimensional-cavity-backed aperture antennas using a Combined Finite Element Method/Method of Moments/Geometrical Theory of Diffraction technique

    NASA Technical Reports Server (NTRS)

    Reddy, C. J.; Deshpande, M. D.; Cockrell, C. R.; Beck, F. B.

    1995-01-01

    A combined finite element method (FEM) and method of moments (MoM) technique is presented to analyze the radiation characteristics of a cavity-fed aperture in three dimensions. Generalized feed modeling has been done using the modal expansion of fields in the feed structure. Numerical results for some feeding structures such as a rectangular waveguide, circular waveguide, and coaxial line are presented. The method also uses the geometrical theory of diffraction (GTD) to predict the effect of a finite ground plane on radiation characteristics. Input admittance calculations for open radiating structures such as a rectangular waveguide, a circular waveguide, and a coaxial line are shown. Numerical data for a coaxial-fed cavity with finite ground plane are verified with experimental data.

  1. Theory Z and Schools: What Can We Learn from Toyota?

    ERIC Educational Resources Information Center

    George, Paul S.

    1984-01-01

    William Ouchi's 1981 book designates as "Type Z" American firms that effectively combine domestic and Japanese management strategies. Applications of this theory to improving school effectiveness would involve combining in each school the components of vital philosophy, curricular clarity, instructional focus, social organization, and…

  2. Measurement of non-uniform residual stresses by combined Moiré interferometry and hole-drilling method: Theory, experimental method and applications

    NASA Astrophysics Data System (ADS)

    Ya, Min; Dai, Fulong; Xie, Huimin; Lü, Jian

    2003-12-01

    The hole-drilling method is one of the most convenient techniques for engineering residual stress measurement. Combined with moiré interferometry to obtain the relaxed whole-field displacement data, the hole-drilling technique can be used to solve non-uniform residual stress problems, both in-depth and in-plane. In this paper, the theory of moiré interferometry and incremental hole-drilling (MIIHD) for non-uniform residual stress measurement is introduced. A three-dimensional finite element model is constructed in ABAQUS to obtain the coefficients for the residual stress calculation. An experimental system including real-time measurement, automatic data processing and residual stress calculation is established. Two applications, to the non-uniform in-depth residual stress of a surface nanocrystalline material and the non-uniform in-plane residual stress of a friction stir weld, are presented. Experimental results show that MIIHD is effective for both non-uniform in-depth and in-plane residual stress measurements.
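    A minimal numerical sketch of the incremental calculation described above, under the usual assumption that the relaxed displacement measured after each drilling increment is a linear combination of the residual stresses in the increments already exposed, with calibration coefficients taken from the finite element model; all numbers are illustrative only, not the paper's data.

```python
# Illustrative only: recovering incremental residual stresses from relaxed
# displacements via FEM calibration coefficients. The measured displacement
# after increment i is modelled as sum_j C[i][j] * sigma[j] for j <= i, so the
# calibration matrix is lower-triangular; all numbers are made up for the sketch.

import numpy as np

C = np.array([[1.2, 0.0, 0.0],
              [0.9, 1.1, 0.0],
              [0.7, 0.8, 1.0]])              # hypothetical FEM coefficients

d_measured = np.array([60.0, 150.0, 250.0])  # hypothetical relaxed displacements

sigma = np.linalg.solve(C, d_measured)       # residual stress per depth increment
print("stress in each depth increment:", np.round(sigma, 1))
```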

  3. Towards a Theory of Organisational Culture.

    ERIC Educational Resources Information Center

    Owens, Robert G.; Steinhoff, Carl R.

    1989-01-01

    The development of the paper-and-pencil instrument called the Organizational Culture Assessment Inventory (OCAI) is based on the theory of organizational culture. Recent literature and organizational analysis are combined with Schein's model of organizational culture to provide the background for metaphorical analysis of organizational culture…

  4. Matching Theory - A Sampler: From Denes Koenig to the Present

    DTIC Science & Technology

    1991-01-01

    1079. [113] Matching Theory, Ann. Discrete Math. 29, North-Holland, Amsterdam, 1986. [114] M. Luby, A simple parallel algorithm for the maximal...311. [135] M.D. Plummer, On n-extendable graphs, Discrete Math. 31, 1980, 201-210. [136] Matching extension and the genus of a graph, J. Combin. Theory Ser. B, 44, 1988, 329-837. [137] A theorem on matchings in the plane, Graph Theory in Memory of G.A. Dirac, Ann. Discrete Math. 41, North-Holland

  5. Divisions of Labour: Activity Theory, Multi-Professional Working and Intervention Research

    ERIC Educational Resources Information Center

    Warmington, Paul

    2011-01-01

    This article draws upon, but also critiques, activity theory by combining analysis of how an activity theory derived research intervention attempted to address both everyday work practices and organisational power relationships among children's services professionals. It offers two case studies of developmental work research (DWR) interventions in…

  6. Community-Based Research: From Practice to Theory and Back Again.

    ERIC Educational Resources Information Center

    Stoecker, Randy

    2003-01-01

    Explores the theoretical strands being combined in community-based research--charity service learning, social justice service learning, action research, and participatory research. Shows how different models of community-based research, based in different theories of society and different approaches to community work, may combine or conflict. (EV)

  7. An integrative, experience-based theory of attentional control.

    PubMed

    Wilder, Matthew H; Mozer, Michael C; Wickens, Christopher D

    2011-02-09

    Although diverse, theories of visual attention generally share the notion that attention is controlled by some combination of three distinct strategies: (1) exogenous cuing from locally contrasting primitive visual features, such as abrupt onsets or color singletons (e.g., L. Itti, C. Koch, & E. Niebur, 1998), (2) endogenous gain modulation of exogenous activations, used to guide attention to task-relevant features (e.g., V. Navalpakkam & L. Itti, 2007; J. Wolfe, 1994, 2007), and (3) endogenous prediction of likely locations of interest, based on task and scene gist (e.g., A. Torralba, A. Oliva, M. Castelhano, & J. Henderson, 2006). However, little work has been done to synthesize these disparate theories. In this work, we propose a unifying conceptualization in which attention is controlled along two dimensions: the degree of task focus and the contextual scale of operation. Previously proposed strategies, and their combinations, can be viewed as instances of this one mechanism. Thus, this theory serves not as a replacement for existing models but as a means of bringing them into a coherent framework. We present an implementation of this theory and demonstrate its applicability to a wide range of attentional phenomena. The model accounts for key results in visual search with synthetic images and makes reasonable predictions for human eye movements in search tasks involving real-world images. In addition, the theory offers an unusual perspective on attention that places a fundamental emphasis on the role of experience and task-related knowledge.

  8. Combining theory and experiment in electrocatalysis: Insights into materials design

    DOE PAGES

    Seh, Zhi Wei; Kibsgaard, Jakob; Dickens, Colin F.; ...

    2017-01-12

    Electrocatalysis plays a central role in clean energy conversion, enabling a number of sustainable processes for future technologies. This review discusses design strategies for state-of-the-art heterogeneous electrocatalysts and associated materials for several different electrochemical transformations involving water, hydrogen, and oxygen, using theory as a means to rationalize catalyst performance. By examining the common principles that govern catalysis for different electrochemical reactions, we describe a systematic framework that clarifies trends in catalyzing these reactions, serving as a guide to new catalyst development while highlighting key gaps that need to be addressed. Here, we conclude by extending this framework to emerging clean energy reactions such as hydrogen peroxide production, carbon dioxide reduction, and nitrogen reduction, where the development of improved catalysts could allow for the sustainable production of a broad range of fuels and chemicals.

  9. Buckling analysis for anisotropic laminated plates under combined inplane loads

    NASA Technical Reports Server (NTRS)

    Viswanathan, A. V.; Tamekuni, M.; Baker, L. L.

    1974-01-01

    The buckling analysis presented considers rectangular flat or curved general laminates subjected to combined inplane normal and shear loads. Linear theory is used in the analysis. All prebuckling deformations and any initial imperfections are ignored. The analysis method can be readily extended to longitudinally stiffened structures subjected to combined inplane normal and shear loads.

  10. Compatible quantum theory

    NASA Astrophysics Data System (ADS)

    Friedberg, R.; Hohenberg, P. C.

    2014-09-01

    Formulations of quantum mechanics (QM) can be characterized as realistic, operationalist, or a combination of the two. In this paper a realistic theory is defined as describing a closed system entirely by means of entities and concepts pertaining to the system. An operationalist theory, on the other hand, requires in addition entities external to the system. A realistic formulation comprises an ontology, the set of (mathematical) entities that describe the system, and assertions, the set of correct statements (predictions) the theory makes about the objects in the ontology. Classical mechanics is the prime example of a realistic physical theory. A straightforward generalization of classical mechanics to QM is hampered by the inconsistency of quantum properties with classical logic, a circumstance that was noted many years ago by Birkhoff and von Neumann. The present realistic formulation of the histories approach originally introduced by Griffiths, which we call ‘compatible quantum theory (CQT)’, consists of a ‘microscopic’ part (MIQM), which applies to a closed quantum system of any size, and a ‘macroscopic’ part (MAQM), which requires the participation of a large (ideally, an infinite) system. The first (MIQM) can be fully formulated based solely on the assumption of a Hilbert space ontology and the noncontextuality of probability values, relying in an essential way on Gleason's theorem and on an application to dynamics due in large part to Nistico. Thus, the present formulation, in contrast to earlier ones, derives the Born probability formulas and the consistency (decoherence) conditions for frameworks. The microscopic theory does not, however, possess a unique corpus of assertions, but rather a multiplicity of contextual truths (‘c-truths’), each one associated with a different framework. This circumstance leads us to consider the microscopic theory to be physically indeterminate and therefore incomplete, though logically coherent. The

  11. General Systems Theory and Counterplan Competition.

    ERIC Educational Resources Information Center

    Madsen, Arnie

    1989-01-01

    Discusses the trend in academic debate on policy questions toward a wide acceptance of counterplans, encouraging combinations of proposals which appear at face value able to coexist but upon deeper analysis are incompatible. Argues in opposition to this trend by applying concepts from general systems theory to competition. (KEH)

  12. An advanced dissymmetric rolling model for online regulation

    NASA Astrophysics Data System (ADS)

    Cao, Trong-Son

    2017-10-01

    A roll-bite model is employed to predict the rolling force and torque as well as to estimate the forward slip for preset or online regulation at industrial rolling mills. The rolling process is often dissymmetric in terms of work-roll rotation speeds and diameters as well as the friction conditions at the upper and lower contact surfaces between the work-rolls and the strip. The roll-bite model must therefore account for these dissymmetries and at the same time be accurate and fast enough for online applications. In the present study, a new method, the Adapted Discretization Slab Method (ADSM), is proposed to obtain a robust roll-bite model, which takes the aforementioned dissymmetries into account and has a very short response time, below one millisecond. The model is based on the slab method, with an adaptive discretization and a global Newton-Raphson procedure to improve the convergence speed. It was validated against other dissymmetric models proposed in the literature, as well as finite element simulations and industrial pilot trials. Furthermore, a back-calculation tool was constructed for friction management in both offline and online applications. With its very short CPU time, the ADSM-based model is attractive for all online applications, in both cold and hot rolling.
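    The sketch below shows the kind of global Newton-Raphson iteration such a roll-bite model can rely on: all unknowns of the discretized slab equations are gathered into one vector and the full residual is driven to zero simultaneously. The residual used here is a toy stand-in with a known root, not the actual slab-method equations.

```python
# Toy sketch of a global Newton-Raphson solver: all unknowns in one vector,
# full residual driven to zero at once, Jacobian built by finite differences.
# The residual below is a stand-in, not the slab equations.

import numpy as np

def newton_raphson(residual, x0, tol=1e-10, max_iter=50, eps=1e-7):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        if np.linalg.norm(r) < tol:
            break
        jac = np.empty((r.size, x.size))
        for j in range(x.size):              # finite-difference Jacobian
            xp = x.copy()
            xp[j] += eps
            jac[:, j] = (residual(xp) - r) / eps
        x = x - np.linalg.solve(jac, r)      # full Newton step on all unknowns
    return x

toy_residual = lambda x: np.array([x[0]**2 + x[1] - 3.0,
                                   x[0] + x[1]**2 - 5.0])
print(newton_raphson(toy_residual, [2.0, 1.0]))   # converges to about [1, 2]
```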

  13. Tilted Kantowski-Sachs cosmological model in Brans-Dicke theory of gravitation

    NASA Astrophysics Data System (ADS)

    Pawar, D. D.; Shahare, S. P.; Dagwal, V. J.

    2018-02-01

    Tilted Kantowski-Sachs cosmological model in Brans-Dicke theory for perfect fluid has been investigated. The general solution of field equations in Brans-Dicke theory for the combined scalar and tensor field are obtained by using power law relation. Also, some physical and geometrical parameters are obtained and discussed.

  14. Focus theory of normative conduct and terror-management theory: the interactive impact of mortality salience and norm salience on social judgment.

    PubMed

    Jonas, Eva; Martens, Andy; Kayser, Daniela Niesta; Fritsche, Immo; Sullivan, Daniel; Greenberg, Jeff

    2008-12-01

    Research on terror-management theory has shown that after mortality salience (MS) people attempt to live up to cultural values. But cultures often value very different and sometimes even contradictory standards, leading to difficulties in predicting behavior as a consequence of terror-management needs. The authors report 4 studies to demonstrate that the effect of MS on people's social judgments depends on the salience of norms. In Study 1, making salient opposite norms (prosocial vs. proself) led to reactions consistent with the activated norms following MS compared with the control condition. Study 2 showed that, in combination with a pacifism prime, MS increased pacifistic attitudes. In Study 3, making salient a conservatism/security prime led people to recommend harsher bonds for an illegal prostitute when they were reminded of death, whereas a benevolence prime counteracted this effect. In Study 4 a help prime, combined with MS, increased people's helpfulness. Discussion focuses briefly on how these findings inform both terror-management theory and the focus theory of normative conduct.

  15. Adventures in Topological Field Theory

    NASA Astrophysics Data System (ADS)

    Horne, James H.

    1990-01-01

    This thesis consists of 5 parts. In part I, the topological Yang-Mills theory and the topological sigma model are presented in a superspace formulation. This greatly simplifies the field content of the theories, and makes the Q-invariance more obvious. The Feynman rules for the topological Yang-Mills theory are derived. We calculate the one-loop beta-functions of the topological sigma model in superspace. The lattice version of these theories is presented. The self-duality constraints of both models lead to spectrum doubling. In part II, we show that conformally invariant gravity in three dimensions is equivalent to the Yang-Mills gauge theory of the conformal group in three dimensions, with a Chern-Simons action. This means that conformal gravity is finite and exactly soluble. In part III, we derive the skein relations for the fundamental representations of SO(N), Sp(2n), SU(m|n), and OSp(m|2n). These relations can be used recursively to calculate the expectation values of Wilson lines in three-dimensional Chern-Simons gauge theory with these gauge groups. A combination of braiding and tying of Wilson lines completely describes the skein relations. In part IV, we show that the k = 1 two dimensional gravity amplitudes at genus 3 agree precisely with the results from intersection theory on moduli space. Predictions for the genus 4 intersection numbers follow from the two dimensional gravity theory. In part V, we discuss the partition function in two dimensional gravity. For the one matrix model at genus 2, we use the partition function to derive a recursion relation. We show that the k = 1 amplitudes completely determine the partition function at arbitrary genus. We present a conjecture for the partition function for the arbitrary topological field theory coupled to topological gravity.

  16. Adolescent Marijuana Use Intentions: Using Theory to Plan an Intervention

    ERIC Educational Resources Information Center

    Sayeed, Sarah; Fishbein, Martin; Hornik, Robert; Cappella, Joseph; Kirkland Ahern, R.

    2005-01-01

    This paper uses an integrated model of behavior change to predict intentions to use marijuana occasionally and regularly in a US-based national sample of male and female 12 to 18 year olds (n = 600). The model combines key constructs from the theory of reasoned action and social cognitive theory. The survey was conducted on laptop computers, and…

  17. AN EDUCATIONAL THEORY MODEL--(SIGGS), AN INTEGRATION OF SET THEORY, INFORMATION THEORY, AND GRAPH THEORY WITH GENERAL SYSTEMS THEORY.

    ERIC Educational Resources Information Center

    MACCIA, ELIZABETH S.; AND OTHERS

    AN ANNOTATED BIBLIOGRAPHY OF 20 ITEMS AND A DISCUSSION OF ITS SIGNIFICANCE WAS PRESENTED TO DESCRIBE CURRENT UTILIZATION OF SUBJECT THEORIES IN THE CONSTRUCTION OF AN EDUCATIONAL THEORY. ALSO, A THEORY MODEL WAS USED TO DEMONSTRATE CONSTRUCTION OF A SCIENTIFIC EDUCATIONAL THEORY. THE THEORY MODEL INCORPORATED SET THEORY (S), INFORMATION THEORY…

  18. Quasi-local conserved charges in Lorenz-diffeomorphism covariant theory of gravity

    NASA Astrophysics Data System (ADS)

    Adami, H.; Setare, M. R.

    2016-04-01

    In this paper, using the combined Lorenz-diffeomorphism symmetry, we find a general formula for the quasi-local conserved charge of the covariant gravity theories in a first order formalism of gravity. We simplify the general formula for the Lovelock theory of gravity. Afterwards, we apply the obtained formula on BHT gravity to obtain the energy and angular momentum of the rotating OTT black hole solution in the context of this theory.

  19. An Introduction to Multilinear Formula Score Theory. Measurement Series 84-4.

    ERIC Educational Resources Information Center

    Levine, Michael V.

    Formula score theory (FST) associates each multiple choice test with a linear operator and expresses all of the real functions of item response theory as linear combinations of the operator's eigenfunctions. Hard measurement problems can then often be reformulated as easier, standard mathematical problems. For example, the problem of estimating…
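    Since the abstract is truncated, the following is only a schematic of the representation it describes: an item-response-theory function written as a linear combination of the eigenfunctions of the test's operator. The symbols are assumed, not quoted from the paper.

```latex
% Schematic only (symbols assumed): a test-level linear operator L with
% eigenfunctions \phi_k, and an item-response-theory function f of ability
% \theta expressed as a linear combination of those eigenfunctions.
\[
  L\,\phi_k(\theta) = \lambda_k\,\phi_k(\theta), \qquad
  f(\theta) = \sum_k c_k\,\phi_k(\theta),
\]
% so that estimating f reduces to estimating the coefficients c_k.
```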

  20. Second-order subsonic airfoil theory including edge effects

    NASA Technical Reports Server (NTRS)

    Van Dyke, Milton D

    1956-01-01

    Several recent advances in plane subsonic flow theory are combined into a unified second-order theory for airfoil sections of arbitrary shape. The solution is reached in three steps: the incompressible result is found by integration, it is converted into the corresponding subsonic compressible result by means of the second-order compressibility rule, and it is rendered uniformly valid near stagnation points by further rules. Solutions for a number of airfoils are given and are compared with the results of other theories and of experiment. A straightforward computing scheme is outlined for calculating the surface velocities and pressures on any airfoil at any angle of attack.

  1. Non-perturbative String Theory from Water Waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iyer, Ramakrishnan; Johnson, Clifford V.; /Southern California U.

    2012-06-14

    We use a combination of a 't Hooft limit and numerical methods to find non-perturbative solutions of exactly solvable string theories, showing that perturbative solutions in different asymptotic regimes are connected by smooth interpolating functions. Our earlier perturbative work showed that a large class of minimal string theories arise as special limits of a Painleve IV hierarchy of string equations that can be derived by a similarity reduction of the dispersive water wave hierarchy of differential equations. The hierarchy of string equations contains new perturbative solutions, some of which were conjectured to be the type IIA and IIB string theories coupled to (4, 4k − 2) superconformal minimal models of type (A, D). Our present paper shows that these new theories have smooth non-perturbative extensions. We also find evidence for putative new string theories that were not apparent in the perturbative analysis.

  2. Diverse carrier mobility of monolayer BNCx: A combined density functional theory and Boltzmann transport theory study.

    PubMed

    Wu, Tao; Deng, Kaiming; Deng, Wei-Qiao; Lu, Ruifeng

    2017-09-19

    The BNCx monolayer, as a two-dimensional material, admits numerous chemical atomic ratios and arrangements with different electronic structures. Via calculations based on density functional theory and Boltzmann transport theory under the deformation potential approximation, the band structures and carrier mobilities of BNCx (x = 1, 2, 3, 4) nanosheets are systematically investigated. The calculated results show that BNC2-1 is a material with a very small band gap (0.02 eV) among all the structures, while the other BNCx monolayers are semiconductors with band gaps ranging from 0.51 to 1.32 eV. The carrier mobility of BNCx varies considerably, from tens to millions of cm^2 V^-1 s^-1. For BNC2-1, the hole and electron mobilities along both the x and y directions can reach the order of 10^5 cm^2 V^-1 s^-1, similar to the carrier mobility of graphene. Moreover, all studied BNCx monolayers have clearly anisotropic hole and electron mobilities. In particular, for the semiconductor BNC4, the hole mobility along the y direction and the electron mobility along the x direction unexpectedly reach the order of 10^6 cm^2 V^-1 s^-1, even higher than that of graphene. Our findings suggest that BNCx layered materials with a proper ratio and arrangement of carbon atoms will possess desirable charge transport properties, exhibiting potential applications in nanoelectronic devices. © 2017 IOP Publishing Ltd.
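    For reference, calculations of this type usually evaluate the standard two-dimensional deformation-potential expression for the acoustic-phonon-limited mobility; the form below is the commonly used one, with notation assumed rather than quoted from the paper.

```latex
% Commonly used 2D deformation-potential mobility (notation assumed):
\[
  \mu_{2D} \;=\; \frac{e\,\hbar^{3}\,C_{2D}}{k_{B}\,T\,m^{*}\,m_{d}\,E_{1}^{2}},
  \qquad m_{d} = \sqrt{m^{*}_{x}\,m^{*}_{y}},
\]
% where C_{2D} is the in-plane elastic modulus, m^{*} the effective mass along
% the transport direction, m_{d} the density-of-states average mass, and E_{1}
% the deformation-potential constant of the band edge along that direction.
```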

  3. The quality control theory of aging.

    PubMed

    Ladiges, Warren

    2014-01-01

    The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.

  4. Eigenvalue Detonation of Combined Effects Aluminized Explosives

    NASA Astrophysics Data System (ADS)

    Capellos, C.; Baker, E. L.; Nicolich, S.; Balas, W.; Pincay, J.; Stiel, L. I.

    2007-12-01

    Theory and performance for recently developed combined-effects aluminized explosives are presented. Our recently developed combined-effects aluminized explosives (PAX-29C, PAX-30, PAX-42) are capable of achieving excellent metal pushing, as well as high blast energies. Metal pushing capability refers to the early volume expansion work produced during the first few volume expansions associated with cylinder and wall velocities and Gurney energies. Eigenvalue detonation explains the observed detonation states achieved by these combined-effects explosives. Cylinder expansion data and thermochemical calculations (JAGUAR and CHEETAH) verify the eigenvalue detonation behavior.

  5. Baryon non-invariant couplings in Higgs effective field theory

    NASA Astrophysics Data System (ADS)

    Merlo, Luca; Saa, Sara; Sacristán-Barbero, Mario

    2017-03-01

    The basis of leading operators which are not invariant under baryon number is constructed within the Higgs effective field theory. This list contains 12 dimension six operators, which preserve the combination B-L, to be compared to only 6 operators for the standard model effective field theory. The discussion of the independent flavour contractions is presented in detail for a generic number of fermion families adopting the Hilbert series technique.

  6. A Novel Method of Enhancing Grounded Theory Memos with Voice Recording

    ERIC Educational Resources Information Center

    Stocker, Rachel; Close, Helen

    2013-01-01

    In this article the authors present the recent discovery of a novel method of supplementing written grounded theory memos with voice recording, the combination of which may provide significant analytical advantages over solely the traditional written method. Memo writing is an essential component of a grounded theory study, however it is often…

  7. Towards a Semantic E-Learning Theory by Using a Modelling Approach

    ERIC Educational Resources Information Center

    Yli-Luoma, Pertti V. J.; Naeve, Ambjorn

    2006-01-01

    In the present study, a semantic perspective on e-learning theory is advanced and a modelling approach is used. This modelling approach towards the new learning theory is based on the four SECI phases of knowledge conversion: Socialisation, Externalisation, Combination and Internalisation, introduced by Nonaka in 1994, and involving two levels of…

  8. Theoretical study of the ammonia nitridation rate on an Fe (100) surface: A combined density functional theory and kinetic Monte Carlo study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yeo, Sang Chul; Lee, Hyuck Mo, E-mail: hmlee@kaist.ac.kr; Lo, Yu Chieh

    2014-10-07

    Ammonia (NH3) nitridation on an Fe surface was studied by combining density functional theory (DFT) and kinetic Monte Carlo (kMC) calculations. A DFT calculation was performed to obtain the energy barriers (Eb) of the relevant elementary processes. The full mechanism of the exact reaction path was divided into five steps (adsorption, dissociation, surface migration, penetration, and diffusion) on an Fe (100) surface pre-covered with nitrogen. The energy barrier (Eb) depended on the N surface coverage. The DFT results were subsequently employed as a database for the kMC simulations. We then evaluated the NH3 nitridation rate on the N pre-covered Fe surface. To determine the conditions necessary for a rapid NH3 nitridation rate, the eight reaction events were considered in the kMC simulations: adsorption, desorption, dissociation, reverse dissociation, surface migration, penetration, reverse penetration, and diffusion. This study provides a real-time-scale simulation of NH3 nitridation influenced by nitrogen surface coverage that allowed us to theoretically determine a nitrogen coverage (0.56 ML) suitable for rapid NH3 nitridation. In this way, we were able to reveal the coverage dependence of the nitridation reaction using the combined DFT and kMC simulations.

  9. Theoretical study of the ammonia nitridation rate on an Fe (100) surface: a combined density functional theory and kinetic Monte Carlo study.

    PubMed

    Yeo, Sang Chul; Lo, Yu Chieh; Li, Ju; Lee, Hyuck Mo

    2014-10-07

    Ammonia (NH3) nitridation on an Fe surface was studied by combining density functional theory (DFT) and kinetic Monte Carlo (kMC) calculations. A DFT calculation was performed to obtain the energy barriers (Eb) of the relevant elementary processes. The full mechanism of the exact reaction path was divided into five steps (adsorption, dissociation, surface migration, penetration, and diffusion) on an Fe (100) surface pre-covered with nitrogen. The energy barrier (Eb) depended on the N surface coverage. The DFT results were subsequently employed as a database for the kMC simulations. We then evaluated the NH3 nitridation rate on the N pre-covered Fe surface. To determine the conditions necessary for a rapid NH3 nitridation rate, the eight reaction events were considered in the kMC simulations: adsorption, desorption, dissociation, reverse dissociation, surface migration, penetration, reverse penetration, and diffusion. This study provides a real-time-scale simulation of NH3 nitridation influenced by nitrogen surface coverage that allowed us to theoretically determine a nitrogen coverage (0.56 ML) suitable for rapid NH3 nitridation. In this way, we were able to reveal the coverage dependence of the nitridation reaction using the combined DFT and kMC simulations.
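    A minimal rejection-free kinetic Monte Carlo sketch of the kind of simulation described above: DFT barriers feed Arrhenius rates, one event is drawn with probability proportional to its rate, and time advances by an exponentially distributed increment. The barrier values and attempt frequency are illustrative placeholders, not the paper's data.

```python
# Minimal rejection-free kinetic Monte Carlo sketch (not the authors' code).
# Each event rate follows an Arrhenius law k = nu * exp(-Eb / (kB*T)) with the
# DFT barrier Eb; barriers and the attempt frequency below are hypothetical.

import math, random

KB = 8.617e-5    # Boltzmann constant, eV/K
NU = 1.0e13      # assumed attempt frequency, 1/s
T  = 700.0       # temperature, K

barriers = {"adsorption": 0.10, "dissociation": 0.90,
            "surface_migration": 0.45, "penetration": 1.10}   # eV, hypothetical

def kmc_step(barriers, t):
    rates = {e: NU * math.exp(-eb / (KB * T)) for e, eb in barriers.items()}
    total = sum(rates.values())
    pick = random.random() * total
    acc = 0.0
    for event, rate in rates.items():        # choose event proportional to rate
        acc += rate
        if pick <= acc:
            break
    t += -math.log(1.0 - random.random()) / total   # exponential waiting time
    return event, t

t = 0.0
for _ in range(5):
    event, t = kmc_step(barriers, t)
    print(f"{event:18s} at t = {t:.3e} s")
```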

  10. Rotary-wing aerodynamics. Volume 1: Basic theories of rotor aerodynamics with application to helicopters. [momentum, vortices, and potential theory

    NASA Technical Reports Server (NTRS)

    Stepniewski, W. Z.

    1979-01-01

    The concept of rotary-wing aircraft in general is defined. The energy effectiveness of helicopters is compared with that of other static thrust generators in hover, as well as with various air and ground vehicles in forward translation. The most important aspects of rotor-blade dynamics and rotor control are reviewed. The simple physicomathematical model of the rotor offered by the momentum theory is introduced and its usefulness and limitations are assessed. The combined blade-element and momentum theory approach, which provides greater accuracy in performance predictions, is described as well as the vortex theory which models a rotor blade by means of a vortex filament or vorticity surface. The application of the velocity and acceleration potential theory to the determination of flow fields around three dimensional, non-rotating bodies as well as to rotor aerodynamic problems is described. Airfoil sections suitable for rotors are also considered.
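    The central hover result of the simple momentum-theory model mentioned above, in its standard textbook form, is the induced velocity and ideal induced power for a rotor of disc area A producing thrust T:

```latex
\[
  v_i \;=\; \sqrt{\frac{T}{2\rho A}}, \qquad
  P_{\mathrm{ideal}} \;=\; T\,v_i \;=\; \frac{T^{3/2}}{\sqrt{2\rho A}},
\]
% which is why large rotor discs (low disc loading T/A) make helicopters
% comparatively efficient static-thrust generators.
```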

  11. International Test Comparisons: Reviewing Translation Error in Different Source Language-Target Language Combinations

    ERIC Educational Resources Information Center

    Zhao, Xueyu; Solano-Flores, Guillermo; Qian, Ming

    2018-01-01

    This article addresses test translation review in international test comparisons. We investigated the applicability of the theory of test translation error--a theory of the multidimensionality and inevitability of test translation error--across source language-target language combinations in the translation of PISA (Programme of International…

  12. Theory and design of electrical rotating machinery

    NASA Astrophysics Data System (ADS)

    Carr, W. J., Jr.

    1980-04-01

    The objective of this program was to contribute toward new and improved rotating machines for Naval applications, with emphasis on superconducting machinery. Work has been performed on the theory of ac losses in multifilament superconductors and experiments were made to check the theory. A list of publications and abstracts of scientific papers published under the contract is given, and a review is given of the theory of losses. A macroscopic theory for superconductivity in multifilament superconductors was developed, and the theory was used to calculate the hysteresis and eddy current losses which occur in the presence of changing magnetic fields. Both the transverse field and the longitudinal field cases were considered, and also the self-field loss of an alternating transport current, along with some examples of the combined loss due to alternating applied field and transport current. The results are useful for the design of superconducting devices, such as superconducting motors and generators. A small amount of additional work was done on studies of novel homo- and heteropolar motors.

  13. Generalised ballooning theory of two-dimensional tokamak modes

    NASA Astrophysics Data System (ADS)

    Abdoul, P. A.; Dickinson, D.; Roach, C. M.; Wilson, H. R.

    2018-02-01

    In this work, using solutions from a local gyrokinetic flux-tube code combined with higher order ballooning theory, a new analytical approach is developed to reconstruct the global linear mode structure with associated global mode frequency. In addition to the isolated mode (IM), which usually peaks on the outboard mid-plane, the higher order ballooning theory has also captured other types of less unstable global modes: (a) the weakly asymmetric ballooning theory (WABT) predicts a mixed mode (MM) that undergoes a small poloidal shift away from the outboard mid-plane, (b) a relatively more stable general mode (GM) balloons on the top (or bottom) of the tokamak plasma. In this paper, an analytic approach is developed to combine these disconnected analytical limits into a single generalised ballooning theory. This is used to investigate how an IM behaves under the effect of sheared toroidal flow. For small values of flow an IM initially converts into a MM where the results of WABT are recaptured, and eventually, as the flow increases, the mode asymptotically becomes a GM on the top (or bottom) of the plasma. This may be an ingredient in models for understanding why in some experimental scenarios, instead of large edge localised modes (ELMs), small ELMs are observed. Finally, our theory can have other important consequences, especially for calculations involving Reynolds stress driven intrinsic rotation through the radial asymmetry in the global mode structures. Understanding the intrinsic rotation is significant because external torque in a plasma the size of ITER is expected to be relatively low.

  14. Attention in the processing of complex visual displays: detecting features and their combinations.

    PubMed

    Farell, B

    1984-02-01

    The distinction between operations in visual processing that are parallel and preattentive and those that are serial and attentional receives both theoretical and empirical support. According to Treisman's feature-integration theory, independent features are available preattentively, but attention is required to veridically combine features into objects. Certain evidence supporting this theory is consistent with a different interpretation, which was tested in four experiments. The first experiment compared the detection of features and feature combinations while eliminating a factor that confounded earlier comparisons. The resulting priority of access to combinatorial information suggests that features and nonlocal combinations of features are not connected solely by a bottom-up hierarchical convergence. Causes of the disparity between the results of Experiment 1 and the results of previous research were investigated in three subsequent experiments. The results showed that of the two confounded factors, it was the difference in the mapping of alternatives onto responses, not the differing attentional demands of features and objects, that underlay the results of the previous research. The present results are thus counterexamples to the feature-integration theory. Aspects of this theory are shown to be subsumed by more general principles, which are discussed in terms of attentional processes in the detection of features, objects, and stimulus alternatives.

  15. Solution Structures of Highly Active Molecular Ir Water-Oxidation Catalysts from Density Functional Theory Combined with High-Energy X-ray Scattering and EXAFS Spectroscopy.

    PubMed

    Yang, Ke R; Matula, Adam J; Kwon, Gihan; Hong, Jiyun; Sheehan, Stafford W; Thomsen, Julianne M; Brudvig, Gary W; Crabtree, Robert H; Tiede, David M; Chen, Lin X; Batista, Victor S

    2016-05-04

    The solution structures of highly active Ir water-oxidation catalysts are elucidated by combining density functional theory, high-energy X-ray scattering (HEXS), and extended X-ray absorption fine structure (EXAFS) spectroscopy. We find that the catalysts are Ir dimers with mono-μ-O cores and terminal anionic ligands, generated in situ through partial oxidation of a common catalyst precursor. The proposed structures are supported by (1)H and (17)O NMR, EPR, resonance Raman and UV-vis spectra, electrophoresis, etc. Our findings are particularly valuable to understand the mechanism of water oxidation by highly reactive Ir catalysts. Importantly, our DFT-EXAFS-HEXS methodology provides a new in situ technique for characterization of active species in catalytic systems.

  16. Complex supramolecular interfacial tessellation through convergent multi-step reaction of a dissymmetric simple organic precursor

    NASA Astrophysics Data System (ADS)

    Zhang, Yi-Qi; Paszkiewicz, Mateusz; Du, Ping; Zhang, Liding; Lin, Tao; Chen, Zhi; Klyatskaya, Svetlana; Ruben, Mario; Seitsonen, Ari P.; Barth, Johannes V.; Klappenberger, Florian

    2018-03-01

    Interfacial supramolecular self-assembly represents a powerful tool for constructing regular and quasicrystalline materials. In particular, complex two-dimensional molecular tessellations, such as semi-regular Archimedean tilings with regular polygons, promise unique properties related to their nontrivial structures. However, their formation is challenging, because current methods are largely limited to the direct assembly of precursors, that is, where structure formation relies on molecular interactions without using chemical transformations. Here, we have chosen ethynyl-iodophenanthrene (which features dissymmetry in both geometry and reactivity) as a single starting precursor to generate the rare semi-regular (3.4.6.4) Archimedean tiling with long-range order on an atomically flat substrate through a multi-step reaction. Intriguingly, the individual chemical transformations converge to form a symmetric alkynyl-Ag-alkynyl complex as the new tecton in high yields. Using a combination of microscopy and X-ray spectroscopy tools, as well as computational modelling, we show that in situ generated catalytic Ag complexes mediate the tecton conversion.

  17. Brilliant Sm, Eu, Tb and Dy chiral lanthanide complexes withstrong circularly polarized luminescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petoud, Stephane; Muller, Gilles; Moore, Evan G.

    The synthesis, characterization and luminescent behavior of trivalent Sm, Eu, Dy and Tb complexes of two enantiomeric, octadentate, chiral, 2-hydroxyisophthalamide ligands are reported. These complexes are highly luminescent in solution. Functionalization of the achiral parent ligand with a chiral 1-phenylethylamine substituent on the open face of the complex in close proximity to the metal center yields complexes with strong circularly polarized luminescence (CPL) activity. This appears to be the first example of a system utilizing the same ligand architecture to sensitize four different lanthanide cations and display CPL activity. The luminescence dissymmetry factor, g_lum, recorded for the Eu(III) complex is one of the highest values reported, and this is the first time the CPL effect has been demonstrated for a Sm(III) complex with a chiral ligand. The combination of high luminescence intensity with CPL activity should enable new bioanalytical applications of macromolecules in chiral environments.
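    For context, the luminescence dissymmetry factor quoted above is conventionally defined from the left- and right-circularly polarized emission intensities as

```latex
\[
  g_{\mathrm{lum}} \;=\; \frac{2\,(I_{\mathrm{L}} - I_{\mathrm{R}})}{I_{\mathrm{L}} + I_{\mathrm{R}}},
\]
% so g_lum ranges from -2 to +2: |g_lum| = 2 corresponds to fully circularly
% polarized emission and g_lum = 0 to no net circular polarization.
```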

  18. Low-energy effective field theory below the electroweak scale: operators and matching

    NASA Astrophysics Data System (ADS)

    Jenkins, Elizabeth E.; Manohar, Aneesh V.; Stoffer, Peter

    2018-03-01

    The gauge-invariant operators up to dimension six in the low-energy effective field theory below the electroweak scale are classified. There are 70 Hermitian dimension-five and 3631 Hermitian dimension-six operators that conserve baryon and lepton number, as well as ΔB = ±ΔL = ±1, ΔL = ±2, and ΔL = ±4 operators. The matching onto these operators from the Standard Model Effective Field Theory (SMEFT) up to order 1/Λ^2 is computed at tree level. SMEFT imposes constraints on the coefficients of the low-energy effective theory, which can be checked experimentally to determine whether the electroweak gauge symmetry is broken by a single fundamental scalar doublet as in SMEFT. Our results, when combined with the one-loop anomalous dimensions of the low-energy theory and the one-loop anomalous dimensions of SMEFT, allow one to compute the low-energy implications of new physics to leading-log accuracy, and combine them consistently with high-energy LHC constraints.

  19. On the Minimal Length Uncertainty Relation and the Foundations of String Theory

    DOE PAGES

    Chang, Lay Nam; Lewis, Zachary; Minic, Djordje; ...

    2011-01-01

    We review our work on the minimal length uncertainty relation as suggested by perturbative string theory. We discuss simple phenomenological implications of the minimal length uncertainty relation and then argue that the combination of the principles of quantum theory and general relativity allow for a dynamical energy-momentum space. We discuss the implication of this for the problem of vacuum energy and the foundations of nonperturbative string theory.

  20. Towards program theory validation: Crowdsourcing the qualitative analysis of participant experiences.

    PubMed

    Harman, Elena; Azzam, Tarek

    2018-02-01

    This exploratory study examines a novel tool for validating program theory through crowdsourced qualitative analysis. It combines a quantitative pattern matching framework traditionally used in theory-driven evaluation with crowdsourcing to analyze qualitative interview data. A sample of crowdsourced participants are asked to read an interview transcript and identify whether program theory components (Activities and Outcomes) are discussed and to highlight the most relevant passage about that component. The findings indicate that using crowdsourcing to analyze qualitative data can differentiate between program theory components that are supported by a participant's experience and those that are not. This approach expands the range of tools available to validate program theory using qualitative data, thus strengthening the theory-driven approach. Copyright © 2017 Elsevier Ltd. All rights reserved.
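    The sketch below illustrates the aggregation step such a crowdsourced pattern-matching approach implies: several coders judge whether each program-theory component is discussed in a transcript, and the share of agreement per component is compared against a cutoff. Component names, judgements, and the cutoff are hypothetical.

```python
# Illustrative aggregation (not the study's instrument): each crowdsourced
# coder judges whether a program-theory component is discussed in a transcript;
# the share of "yes" judgements per component is compared to a cutoff.
# Component names, judgements and the cutoff are hypothetical.

from collections import defaultdict

judgements = [            # (coder_id, component, judged_present)
    (1, "Activity: weekly tutoring", True),
    (2, "Activity: weekly tutoring", True),
    (3, "Activity: weekly tutoring", False),
    (1, "Outcome: improved grades", False),
    (2, "Outcome: improved grades", False),
    (3, "Outcome: improved grades", True),
]

tallies = defaultdict(lambda: [0, 0])        # component -> [yes, total]
for _, component, present in judgements:
    tallies[component][0] += int(present)
    tallies[component][1] += 1

CUTOFF = 0.6                                 # hypothetical agreement threshold
for component, (yes, total) in tallies.items():
    share = yes / total
    verdict = "supported" if share >= CUTOFF else "not supported"
    print(f"{component}: {share:.0%} agreement -> {verdict}")
```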

  1. Molecular Dynamics of Dense Fluids: Simulation-Theory Symbiosis

    NASA Astrophysics Data System (ADS)

    Yip, Sidney

    35 years ago Berni J. Alder showed the Boltzmann-Enskog kinetic theory failed to adequately account for the viscosity of fluids near solid density as determined by molecular dynamics simulation. This work, along with other notable simulation findings, provided great stimulus to the statistical mechanical studies of transport phenomena, particularly in dealing with collective effects in the time correlation functions of liquids. An extended theoretical challenge that remains partially resolved at best is the shear viscosity of supercooled liquids. How can one give a unified explanation of the so-called fragile and strong characteristic temperature behavior, with implications for the dynamics of glass transition? In this tribute on the occasion of his 90th birthday symposium, we recount a recent study where simulation, combined with heuristic (transition-state) and first principles (linear response) theories, identifies the molecular mechanisms governing glassy-state relaxation. Such an interplay between simulation and theory is progress from the early days; instead of simulation challenging theory, now simulation and theory complement each other.

  2. Perspective of Postpartum Depression Theories: A Narrative Literature Review

    PubMed Central

    Abdollahi, Fatemeh; Lye, Munn-Sann; Zarghami, Mehran

    2016-01-01

    Postpartum depression is the most prevalent emotional problem during a women's lifespan. Untreated postpartum depression may lead to several consequences such as child, infant, fetal, and maternal effects. The main purpose of this article is to briefly describe different theoretical perspectives of postpartum depression. A literature search was conducted in Psych Info, PubMed, and Science Direct between 1950 and 2015. Additional articles and book chapters were referenced from these sources. Different theories were suggested for developing postpartum depression. Three theories, namely, biological, psychosocial, and evolutionary were discussed. One theory or combinations of psychosocial, biological, and evolutionary theories were considered for postpartum depression. The most important factor that makes clinicians’ choice of intervention is their theoretical perspectives. Healthcare providers and physicians should help women to make informed choices regarding their treatment based on related theories. PMID:27500126

  3. Avionics. Progress Record and Theory Outline.

    ERIC Educational Resources Information Center

    Connecticut State Dept. of Education, Hartford. Div. of Vocational-Technical Schools.

    This combination progress record and course outline is designed for use by individuals teaching a course in avionics that is intended to prepare students for employment in the field of aerospace electronics. Included among the topics addressed in the course are the following: shop practices, aircraft and the theory of flight, electron physics,…

  4. Leadership: Theory and Practice. Sixth Edition

    ERIC Educational Resources Information Center

    Northouse, Peter G.

    2012-01-01

    Adopted at more than 1,000 colleges and universities worldwide, the market-leading text owes its success to the unique way in which it combines an academically robust account of the major theories and models of leadership with an accessible style and practical exercises that help students apply what they learn. Each chapter of Peter…

  5. Combining Theory-Driven Evaluation and Causal Loop Diagramming for Opening the 'Black Box' of an Intervention in the Health Sector: A Case of Performance-Based Financing in Western Uganda.

    PubMed

    Renmans, Dimitri; Holvoet, Nathalie; Criel, Bart

    2017-09-03

    Increased attention to "complexity" in health systems evaluation has resulted in many different methodological responses. Theory-driven evaluations and systems thinking are two such responses that aim for better understanding of the mechanisms underlying given outcomes. Here, we studied the implementation of a performance-based financing intervention by the Belgian Technical Cooperation in Western Uganda to illustrate a methodological strategy of combining these two approaches. We utilized a system dynamics tool called causal loop diagramming (CLD) to generate hypotheses feeding into a theory-driven evaluation. Semi-structured interviews were conducted with 30 health workers from two districts (Kasese and Kyenjojo) and with 16 key informants. After CLD, we identified three relevant hypotheses: "success to the successful", "growth and underinvestment", and "supervision conundrum". The first hypothesis leads to increasing improvements in performance, as better performance leads to more incentives, which in turn leads to better performance. The latter two hypotheses point to potential bottlenecks. Thus, the proposed methodological strategy was a useful tool for identifying hypotheses that can inform a theory-driven evaluation. The hypotheses are represented in a comprehensible way while highlighting the underlying assumptions, and are more easily falsifiable than hypotheses identified without using CLD.
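    A minimal sketch of how a causal loop diagram can be encoded and classified: each edge carries a polarity, and a loop whose polarities multiply to +1 is reinforcing, like the "success to the successful" loop described above, while a product of -1 marks a balancing loop. The variable names are paraphrased for illustration, not the study's exact diagram.

```python
# Minimal encoding of a causal loop diagram (variable names paraphrased):
# edges carry a polarity of +1 or -1; a loop whose polarities multiply to +1
# is reinforcing, a product of -1 marks a balancing loop.

from math import prod

edges = {                                   # (cause, effect) -> polarity
    ("performance", "incentive payments"): +1,
    ("incentive payments", "staff motivation"): +1,
    ("staff motivation", "performance"): +1,
}

def loop_polarity(loop):
    """Product of edge polarities around a closed loop of variable names."""
    return prod(edges[pair] for pair in zip(loop, loop[1:] + loop[:1]))

loop = ["performance", "incentive payments", "staff motivation"]
kind = "reinforcing" if loop_polarity(loop) > 0 else "balancing"
print(f"'success to the successful' loop is {kind}")   # -> reinforcing
```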

  6. A mixability theory for the role of sex in evolution

    PubMed Central

    Livnat, Adi; Papadimitriou, Christos; Dushoff, Jonathan; Feldman, Marcus W.

    2008-01-01

    The question of what role sex plays in evolution is still open despite decades of research. It has often been assumed that sex should facilitate the increase in fitness. Hence, the fact that it may break down highly favorable genetic combinations has been seen as a problem. Here, we consider an alternative approach. We define a measure that represents the ability of alleles to perform well across different combinations and, using numerical iterations within a classical population-genetic framework, show that selection in the presence of sex favors this ability in a highly robust manner. We also show that the mechanism responsible for this effect has been out of the purview of previous theory, because it operates during the evolutionary transient, and that the breaking down of favorable genetic combinations is an integral part of it. Implications of these results and more to evolutionary theory are discussed. PMID:19073912

  7. A mixability theory for the role of sex in evolution.

    PubMed

    Livnat, Adi; Papadimitriou, Christos; Dushoff, Jonathan; Feldman, Marcus W

    2008-12-16

    The question of what role sex plays in evolution is still open despite decades of research. It has often been assumed that sex should facilitate the increase in fitness. Hence, the fact that it may break down highly favorable genetic combinations has been seen as a problem. Here, we consider an alternative approach. We define a measure that represents the ability of alleles to perform well across different combinations and, using numerical iterations within a classical population-genetic framework, show that selection in the presence of sex favors this ability in a highly robust manner. We also show that the mechanism responsible for this effect has been out of the purview of previous theory, because it operates during the evolutionary transient, and that the breaking down of favorable genetic combinations is an integral part of it. Implications of these results and more to evolutionary theory are discussed.

  8. A unified theory of bone healing and nonunion: BHN theory.

    PubMed

    Elliott, D S; Newman, K J H; Forward, D P; Hahn, D M; Ollivere, B; Kojima, K; Handley, R; Rossiter, N D; Wixted, J J; Smith, R M; Moran, C G

    2016-07-01

    This article presents a unified clinical theory that links established facts about the physiology of bone and homeostasis, with those involved in the healing of fractures and the development of nonunion. The key to this theory is the concept that the tissue that forms in and around a fracture should be considered a specific functional entity. This 'bone-healing unit' produces a physiological response to its biological and mechanical environment, which leads to the normal healing of bone. This tissue responds to mechanical forces and functions according to Wolff's law, Perren's strain theory and Frost's concept of the "mechanostat". In response to the local mechanical environment, the bone-healing unit normally changes with time, producing different tissues that can tolerate various levels of strain. The normal result is the formation of bone that bridges the fracture - healing by callus. Nonunion occurs when the bone-healing unit fails either due to mechanical or biological problems or a combination of both. In clinical practice, the majority of nonunions are due to mechanical problems with instability, resulting in too much strain at the fracture site. In most nonunions, there is an intact bone-healing unit. We suggest that this maintains its biological potential to heal, but fails to function due to the mechanical conditions. The theory predicts the healing pattern of multifragmentary fractures and the observed morphological characteristics of different nonunions. It suggests that the majority of nonunions will heal if the correct mechanical environment is produced by surgery, without the need for biological adjuncts such as autologous bone graft. Cite this article: Bone Joint J 2016;98-B:884-91. ©2016 The British Editorial Society of Bone & Joint Surgery.

  9. Reduction of parameters in Finite Unified Theories and the MSSM

    NASA Astrophysics Data System (ADS)

    Heinemeyer, Sven; Mondragón, Myriam; Tracas, Nicholas; Zoupanos, George

    2018-02-01

    The method of reduction of couplings developed by W. Zimmermann, combined with supersymmetry, can lead to realistic quantum field theories, where the gauge and Yukawa sectors are related. It is the basis to find all-loop Finite Unified Theories, where the β-function vanishes to all-loops in perturbation theory. It can also be applied to the Minimal Supersymmetric Standard Model, leading to a drastic reduction in the number of parameters. Both Finite Unified Theories and the reduced MSSM lead to successful predictions for the masses of the third generation of quarks and the Higgs boson, and also predict a heavy supersymmetric spectrum, consistent with the non-observation of supersymmetry so far.

  10. Vainshtein screening in scalar-tensor theories before and after GW170817: Constraints on theories beyond Horndeski

    NASA Astrophysics Data System (ADS)

    Dima, Alexandru; Vernizzi, Filippo

    2018-05-01

    Screening mechanisms are essential features of dark energy models mediating a fifth force on large scales. We study the regime of strong scalar field nonlinearities, known as Vainshtein screening, in the most general scalar-tensor theories propagating a single scalar degree of freedom. We first develop an effective approach to parametrize cosmological perturbations beyond linear order for these theories. In the quasistatic limit, the fully nonlinear effective Lagrangian contains six independent terms, one of which starts at cubic order in perturbations. We compute the two gravitational potentials around a spherical body. Outside and near the body, screening reproduces standard gravity, with a modified gravitational coupling. Inside the body, the two potentials are different and depend on the density profile, signalling the breaking of the Vainshtein screening. We provide the most general expressions for these modifications, revising and extending previous results. We apply our findings to show that the combination of the GW170817 event, the Hulse-Taylor pulsar and stellar structure physics, constrain the parameters of these general theories at the level of 10^-1, and of Gleyzes-Langlois-Piazza-Vernizzi theories at the level of 10^-2.

  11. Literacity: A multimedia adult literacy package combining NASA technology, recursive ID theory, and authentic instruction theory

    NASA Technical Reports Server (NTRS)

    Willis, Jerry; Willis, Dee Anna; Walsh, Clare; Stephens, Elizabeth; Murphy, Timothy; Price, Jerry; Stevens, William; Jackson, Kevin; Villareal, James A.; Way, Bob

    1994-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application under development is LiteraCity, a simulation-based instructional package for adults who do not have functional reading skills. Using fuzzy logic routines and other technologies developed by NASA's Information Systems Directorate, together with hypermedia sound, graphics, and animation technologies, the project attempts to overcome the limited impact of adult literacy assessment and instruction by involving the adult in an interactive simulation of real-life literacy activities. The project uses a recursive instructional development model and authentic instruction theory. This paper describes one component of a project to design, develop, and produce a series of computer-based, multimedia instructional packages. The packages are being developed for use in adult literacy programs, particularly in correctional education centers. They use the concepts of authentic instruction and authentic assessment to guide development. All the packages to be developed are instructional simulations. The first is a simulation of 'finding a friend a job.'

  12. Multiscale Monte Carlo equilibration: Pure Yang-Mills theory

    DOE PAGES

    Endres, Michael G.; Brower, Richard C.; Orginos, Kostas; ...

    2015-12-29

    In this study, we present a multiscale thermalization algorithm for lattice gauge theory, which enables efficient parallel generation of uncorrelated gauge field configurations. The algorithm combines standard Monte Carlo techniques with ideas drawn from real space renormalization group and multigrid methods. We demonstrate the viability of the algorithm for pure Yang-Mills gauge theory for both heat bath and hybrid Monte Carlo evolution, and show that it ameliorates the problem of topological freezing up to controllable lattice spacing artifacts.

  13. Ordering theories: Typologies and conceptual frameworks for sociotechnical change.

    PubMed

    Sovacool, Benjamin K; Hess, David J

    2017-10-01

    What theories or concepts are most useful for explaining sociotechnical change? How can these be integrated, or can they not be? To provide an answer, this study presents the results from 35 semi-structured research interviews with social science experts who also shared more than two hundred articles, reports and books on the topic of the acceptance, adoption, use, or diffusion of technology. This material led to the identification of 96 theories and conceptual approaches spanning 22 identified disciplines. The article begins by explaining its research terms and methods before honing in on a combination of fourteen theories deemed most relevant and useful by the material. These are: Sociotechnical Transitions, Social Practice Theory, Discourse Theory, Domestication Theory, Large Technical Systems, Social Construction of Technology, Sociotechnical Imaginaries, Actor-Network Theory, Social Justice Theory, Sociology of Expectations, Sustainable Development, Values Beliefs Norms Theory, Lifestyle Theory, and the Unified Theory of Acceptance and Use of Technology. It then positions these theories in terms of two distinct typologies. Theories can be placed into five general categories of being centered on agency, structure, meaning, relations or norms. They can also be classified based on their assumptions and goals rooted in functionalism, interpretivism, humanism or conflict. The article lays out tips for research methodology before concluding with insights about technology itself, analytical processes associated with technology, and the framing and communication of results. An interdisciplinary theoretical and conceptual inventory has much to offer students, analysts and scholars wanting to study technological change and society.

  14. Some directions in ecological theory.

    PubMed

    Kendall, Bruce E

    2015-12-01

    The role of theory within ecology has changed dramatically in recent decades. Once primarily a source of qualitative conceptual framing, ecological theories and models are now often used to develop quantitative explanations of empirical patterns and to project future dynamics of specific ecological systems. In this essay, I recount my own experience of this transformation, in which accelerating computing power and the widespread incorporation of stochastic processes into ecological theory combined to create some novel integration of mathematical and statistical models. This stronger integration drives theory towards incorporating more biological realism, and I explore ways in which we can grapple with that realism to generate new general theoretical insights. This enhanced realism, in turn, may lead to frameworks for projecting ecological responses to anthropogenic change, which is, arguably, the central challenge for 21st-century ecology. In an era of big data and synthesis, ecologists are increasingly seeking to infer causality from observational data; but conventional biometry provides few tools for this project. This is a realm where theorists can and should play an important role, and I close by pointing towards some analytical and philosophical approaches developed in our sister discipline of economics that address this very problem. While I make no grand prognostications about the likely discoveries of ecological theory over the coming century, you will find in this essay a scattering of more or less far-fetched ideas that I, at least, think are interesting and (possibly) fruitful directions for our field.

  15. Universal calculational recipe for solvent-mediated potential: based on a combination of integral equation theory and density functional theory

    NASA Astrophysics Data System (ADS)

    Zhou, Shiqi

    2004-07-01

    A universal formalism, which enables calculation of the solvent-mediated potential (SMP) between two equal or non-equal solute particles of any shape immersed in a solvent reservoir consisting of atomic particles and/or polymer chains or their mixture, is proposed by importing a density functional theory externally into the OZ equation systems. Provided that the size asymmetry of the solvent bath components is moderate, the present formalism can calculate the SMP in any complex fluid at the present stage of development of statistical mechanics, and therefore avoids all of the limitations of previous approaches to the SMP. Preliminary calculations indicate the reliability of the present formalism.

  16. Heavy dark matter annihilation from effective field theory.

    PubMed

    Ovanesyan, Grigory; Slatyer, Tracy R; Stewart, Iain W

    2015-05-29

    We formulate an effective field theory description for SU(2)_{L} triplet fermionic dark matter by combining nonrelativistic dark matter with gauge bosons in the soft-collinear effective theory. For a given dark matter mass, the annihilation cross section to line photons is obtained with 5% precision by simultaneously including Sommerfeld enhancement and the resummation of electroweak Sudakov logarithms at next-to-leading logarithmic order. Using these results, we present more accurate and precise predictions for the gamma-ray line signal from annihilation, updating both existing constraints and the reach of future experiments.

  17. Theory of electronic phase locking of an optical array without a reference beam

    NASA Astrophysics Data System (ADS)

    Shay, Thomas M.

    2006-08-01

    The first theory is presented for two novel coherent beam combination architectures, which are the first electronic beam combination architectures to completely eliminate the need for a separate reference beam. Detailed theoretical models are developed and presented for the first time.

  18. Deafness, Thought-Bubbles and Theory of Mind Development

    PubMed Central

    Wellman, Henry M.; Peterson, Candida C.

    2013-01-01

    The processes and mechanisms of theory of mind development were examined via a training study of false belief conceptions in deaf children of hearing parents (N = 43). In comparison to two different control conditions, training based on thought-bubble instruction about beliefs was linked with improved false belief understanding as well as progress on a broader theory-of-mind scale. By combining intervention, microgenetic, and developmental-scaling methods the findings provide informative data about the nature and mechanisms of theory-of-mind change in deaf children, as well as an initial demonstration of a useful intervention for enhancing social cognition in deaf children of hearing parents. The methods and results also point to possible avenues for the study of conceptual change more generally. PMID:23544856

  19. Derivation of Einstein-Cartan theory from general relativity

    NASA Astrophysics Data System (ADS)

    Petti, Richard

    2015-04-01

    General relativity cannot describe exchange of classical intrinsic angular momentum and orbital angular momentum. Einstein-Cartan theory fixes this problem in the least invasive way. In the late 20th century, the consensus view was that Einstein-Cartan theory requires inclusion of torsion without adequate justification, it has no empirical support (though it doesn't conflict with any known evidence), it solves no important problem, and it complicates gravitational theory with no compensating benefit. In 1986 the author published a derivation of Einstein-Cartan theory from general relativity, with no additional assumptions or parameters. Starting without torsion, Poincaré symmetry, classical or quantum spin, or spinors, it derives torsion and its relation to spin from a continuum limit of general relativistic solutions. The present work makes the case that this computation, combined with supporting arguments, constitutes a derivation of Einstein-Cartan theory from general relativity, not just a plausibility argument. This paper adds more and simpler explanations, more computational details, correction of a factor of 2, discussion of limitations of the derivation, and discussion of some areas of gravitational research where Einstein-Cartan theory is relevant.

  20. Future missions studies: Combining Schatten's solar activity prediction model with a chaotic prediction model

    NASA Technical Reports Server (NTRS)

    Ashrafi, S.

    1991-01-01

    K. Schatten (1991) recently developed a method for combining his prediction model with our chaotic model. The philosophy behind this combined model and his method of combination is explained. Because the Schatten solar prediction model (KS) uses a dynamo to mimic solar dynamics, accurate prediction is limited to long-term solar behavior (10 to 20 years). The Chaotic prediction model (SA) uses the recently developed techniques of nonlinear dynamics to predict solar activity. It can be used to predict activity only up to the horizon. In theory, the chaotic prediction should be several orders of magnitude better than statistical predictions up to that horizon; beyond the horizon, chaotic predictions would theoretically be just as good as statistical predictions. Therefore, chaos theory puts a fundamental limit on predictability.
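
    The fundamental limit on predictability mentioned here is conventionally quantified by the Lyapunov prediction-horizon estimate from nonlinear dynamics; the relation below is a standard textbook result quoted as background, not a formula taken from this report:

        \[
        t_{\text{horizon}} \;\approx\; \frac{1}{\lambda_{\max}} \,\ln\!\left(\frac{\Delta_{\text{tol}}}{\delta_0}\right),
        \]

    where \(\lambda_{\max}\) is the largest Lyapunov exponent of the solar-activity time series, \(\delta_0\) the initial observational uncertainty, and \(\Delta_{\text{tol}}\) the largest forecast error one is willing to tolerate; beyond this horizon, chaotic forecasts degrade to the level of statistical ones.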

  1. Charge redistribution in QM:QM ONIOM model systems: a constrained density functional theory approach

    NASA Astrophysics Data System (ADS)

    Beckett, Daniel; Krukau, Aliaksandr; Raghavachari, Krishnan

    2017-11-01

    The ONIOM hybrid method has found considerable success in QM:QM studies designed to approximate a high level of theory at a significantly reduced cost. This cost reduction is achieved by treating only a small model system with the target level of theory and the rest of the system with a low, inexpensive, level of theory. However, the choice of an appropriate model system is a limiting factor in ONIOM calculations and effects such as charge redistribution across the model system boundary must be considered as a source of error. In an effort to increase the general applicability of the ONIOM model, a method to treat the charge redistribution effect is developed using constrained density functional theory (CDFT) to constrain the charge experienced by the model system in the full calculation to the link atoms in the truncated model system calculations. Two separate CDFT-ONIOM schemes are developed and tested on a set of 20 reactions with eight combinations of levels of theory. It is shown that a scheme using a scaled Lagrange multiplier term obtained from the low-level CDFT model calculation outperforms standard ONIOM at every combination of levels of theory, with improvements ranging from 32% to 70%.
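
    For readers unfamiliar with the extrapolation that the CDFT correction modifies, the sketch below shows the standard two-layer ONIOM energy expression; the function name and example numbers are illustrative, and the CDFT constraint described in the abstract is not reproduced here.

        def oniom2_energy(e_low_real: float, e_high_model: float, e_low_model: float) -> float:
            """Standard two-layer ONIOM extrapolation:
            E(ONIOM) = E_low(real) + E_high(model) - E_low(model).
            The 'model' system is the chemically important region capped with link atoms;
            the 'real' system is the full molecule treated at the inexpensive level."""
            return e_low_real + e_high_model - e_low_model

        # Illustrative energies in hartree, not values from the paper:
        print(oniom2_energy(-155.032, -40.518, -40.210))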

  2. Validation of psychoanalytic theories: towards a conceptualization of references.

    PubMed

    Zachrisson, Anders; Zachrisson, Henrik Daae

    2005-10-01

    The authors discuss criteria for the validation of psychoanalytic theories and develop a heuristic and normative model of the references needed for this. Their core question in this paper is: can psychoanalytic theories be validated exclusively from within psychoanalytic theory (internal validation), or are references to sources of knowledge other than psychoanalysis also necessary (external validation)? They discuss aspects of the classic truth criteria correspondence and coherence, both from the point of view of contemporary psychoanalysis and of contemporary philosophy of science. The authors present arguments for both external and internal validation. Internal validation has to deal with the problems of subjectivity of observations and circularity of reasoning, external validation with the problem of relevance. They recommend a critical attitude towards psychoanalytic theories, which, by carefully scrutinizing weak points and invalidating observations in the theories, reduces the risk of wishful thinking. The authors conclude by sketching a heuristic model of validation. This model combines correspondence and coherence with internal and external validation into a four-leaf model for references for the process of validating psychoanalytic theories.

  3. Non-Markovian generalization of the Lindblad theory of open quantum systems

    NASA Astrophysics Data System (ADS)

    Breuer, Heinz-Peter

    2007-02-01

    A systematic approach to the non-Markovian quantum dynamics of open systems is given by the projection operator techniques of nonequilibrium statistical mechanics. Combining these methods with concepts from quantum information theory and from the theory of positive maps, we derive a class of correlated projection superoperators that take into account in an efficient way statistical correlations between the open system and its environment. The result is used to develop a generalization of the Lindblad theory to the regime of highly non-Markovian quantum processes in structured environments.
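
    For context, the standard (Markovian) Lindblad master equation that this work generalizes can be written in its usual textbook form, quoted here as background:

        \[
        \frac{d\rho}{dt} \;=\; -\frac{i}{\hbar}\,[H,\rho]
        \;+\; \sum_k \gamma_k\!\left( L_k \rho L_k^{\dagger} - \tfrac{1}{2}\{L_k^{\dagger}L_k,\rho\}\right),
        \]

    with H the system Hamiltonian, \(L_k\) the Lindblad (jump) operators and \(\gamma_k \ge 0\) the corresponding rates; roughly speaking, the correlated-projection approach replaces the single reduced density matrix \(\rho\) by a family of components that carry statistical correlations with the environment.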

  4. Structure of UV divergences in maximally supersymmetric gauge theories

    NASA Astrophysics Data System (ADS)

    Kazakov, D. I.; Borlakov, A. T.; Tolkachev, D. M.; Vlasenko, D. E.

    2018-06-01

    We consider the UV divergences up to sub-subleading order for the four-point on-shell scattering amplitudes in D = 8 supersymmetric Yang-Mills theory in the planar limit. We trace how the leading, subleading, etc. divergences appear in all orders of perturbation theory. The structure of these divergences is typical for any local quantum field theory independently of renormalizability. We show how the generalized renormalization group equations allow one to evaluate the leading, subleading, etc. contributions in all orders of perturbation theory starting from one-, two-, etc. loop diagrams, respectively. We then focus on the subtraction scheme dependence of the results and show that, in full analogy with renormalizable theories, the scheme dependence can be absorbed into a redefinition of the couplings. The only difference is that the role of the couplings is played by dimensionless combinations such as g²s² or g²t², where s and t are the Mandelstam variables.

  5. Considering methodological options for reviews of theory: illustrated by a review of theories linking income and health

    PubMed Central

    2014-01-01

    aim of the review is to scope out theories relating to a particular issue; to conduct in-depth analysis of key theoretical works with the aim of developing new, overarching theories and interpretations; or to combine both these processes in the review. This can help decide the most appropriate methodological approach to take at particular stages of the review. PMID:25312937

  6. Considering methodological options for reviews of theory: illustrated by a review of theories linking income and health.

    PubMed

    Campbell, Mhairi; Egan, Matt; Lorenc, Theo; Bond, Lyndal; Popham, Frank; Fenton, Candida; Benzeval, Michaela

    2014-10-13

    theories relating to a particular issue; to conduct in-depth analysis of key theoretical works with the aim of developing new, overarching theories and interpretations; or to combine both these processes in the review. This can help decide the most appropriate methodological approach to take at particular stages of the review.

  7. Using Classical Test Theory and Item Response Theory to Evaluate the LSCI

    NASA Astrophysics Data System (ADS)

    Schlingman, Wayne M.; Prather, E. E.; Collaboration of Astronomy Teaching Scholars CATS

    2011-01-01

    Analyzing the data from the recent national study using the Light and Spectroscopy Concept Inventory (LSCI), this project uses both Classical Test Theory (CTT) and Item Response Theory (IRT) to investigate the LSCI itself in order to better understand what it is actually measuring. We use Classical Test Theory to form a framework of results that can be used to evaluate the effectiveness of individual questions at measuring differences in student understanding and provide further insight into the prior results presented from this data set. In the second phase of this research, we use Item Response Theory to form a theoretical model that generates parameters accounting for a student's ability and a question's difficulty, and that estimates the level of guessing. The combined results from our investigations using both CTT and IRT are used to better understand the learning that is taking place in classrooms across the country. The analysis will also allow us to evaluate the effectiveness of individual questions and determine whether the item difficulties are appropriately matched to the abilities of the students in our data set. These results may require that some questions be revised, motivating the need for further development of the LSCI. This material is based upon work supported by the National Science Foundation under Grant No. 0715517, a CCLI Phase III Grant for the Collaboration of Astronomy Teaching Scholars (CATS). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
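
    An IRT model with ability, difficulty, and guessing parameters of the kind described is commonly written as the three-parameter logistic (3PL) model. The sketch below is a minimal illustration with made-up parameter values, not fitted LSCI item parameters:

        import math

        def p_correct_3pl(theta: float, a: float, b: float, c: float) -> float:
            """Three-parameter logistic (3PL) IRT model.
            theta: student ability, a: item discrimination,
            b: item difficulty, c: lower asymptote (pseudo-guessing)."""
            return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

        # Example: one item of moderate difficulty answered by students of varying ability.
        for theta in (-2.0, 0.0, 2.0):
            print(theta, round(p_correct_3pl(theta, a=1.2, b=0.5, c=0.2), 3))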

  8. Ordering theories: Typologies and conceptual frameworks for sociotechnical change

    PubMed Central

    Sovacool, Benjamin K; Hess, David J

    2017-01-01

    What theories or concepts are most useful for explaining sociotechnical change? How can these be integrated, or can they not be? To provide an answer, this study presents the results from 35 semi-structured research interviews with social science experts who also shared more than two hundred articles, reports and books on the topic of the acceptance, adoption, use, or diffusion of technology. This material led to the identification of 96 theories and conceptual approaches spanning 22 identified disciplines. The article begins by explaining its research terms and methods before honing in on a combination of fourteen theories deemed most relevant and useful by the material. These are: Sociotechnical Transitions, Social Practice Theory, Discourse Theory, Domestication Theory, Large Technical Systems, Social Construction of Technology, Sociotechnical Imaginaries, Actor-Network Theory, Social Justice Theory, Sociology of Expectations, Sustainable Development, Values Beliefs Norms Theory, Lifestyle Theory, and the Unified Theory of Acceptance and Use of Technology. It then positions these theories in terms of two distinct typologies. Theories can be placed into five general categories of being centered on agency, structure, meaning, relations or norms. They can also be classified based on their assumptions and goals rooted in functionalism, interpretivism, humanism or conflict. The article lays out tips for research methodology before concluding with insights about technology itself, analytical processes associated with technology, and the framing and communication of results. An interdisciplinary theoretical and conceptual inventory has much to offer students, analysts and scholars wanting to study technological change and society. PMID:28641502

  9. Bonding in Heavier Group 14 Zero-Valent Complexes-A Combined Maximum Probability Domain and Valence Bond Theory Approach.

    PubMed

    Turek, Jan; Braïda, Benoît; De Proft, Frank

    2017-10-17

    The bonding in heavier Group 14 zero-valent complexes of the general formula L2E (E = Si-Pb; L = phosphine, N-heterocyclic and acyclic carbene, cyclic tetrylene and carbon monoxide) is probed by combining valence bond (VB) theory and maximum probability domain (MPD) approaches. All studied complexes are initially evaluated on the basis of the structural parameters and the shape of frontier orbitals revealing a bent structural motif and the presence of two lone pairs at the central E atom. For the VB calculations three resonance structures are suggested, representing the "ylidone", "ylidene" and "bent allene" structures, respectively. The influence of both ligands and central atoms on the bonding situation is clearly expressed in different weights of the resonance structures for the particular complexes. In general, the bonding in the studied E(0) compounds, the tetrylones, is best described as a resonating combination of "ylidone" and "ylidene" structures with a minor contribution of the "bent allene" structure. Moreover, the VB calculations allow for a straightforward assessment of the π-backbonding (E→L) stabilization energy. The validity of the suggested resonance model is further confirmed by the complementary MPD calculations focusing on the E lone pair region as well as the E-L bonding region. Likewise, the MPD method reveals a strong influence of the σ-donating and π-accepting properties of the ligand. In particular, either one single domain or two symmetrical domains are found in the lone pair region of the central atom, supporting the predominance of either the "ylidene" or "ylidone" structures having one or two lone pairs at the central atom, respectively. Furthermore, the calculated average populations in the lone pair MPDs correlate very well with the natural bond orbital (NBO) populations, and can be related to the average number of electrons that is backdonated to the ligands. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Off-axis electron holography combining summation of hologram series with double-exposure phase-shifting: Theory and application.

    PubMed

    Boureau, Victor; McLeod, Robert; Mayall, Benjamin; Cooper, David

    2018-06-04

    In this paper we discuss developments for Lorentz mode or "medium resolution" off-axis electron holography such that it is now routinely possible to obtain very high sensitivity phase maps with high spatial resolution whilst maintaining a large field of view. Modifications of the usual Fourier reconstruction procedure have been used to combine series of holograms for sensitivity improvement with a phase-shifting method for doubling the spatial resolution. In the frame of these developments, specific attention is given to the phase standard deviation description and its interaction with the spatial resolution as well as the processing of reference holograms. An experimental study based on Dark-Field Electron Holography (DFEH), using a SiGe/Si multilayer epitaxy sample, is compared with theory. The method's efficiency in removing the autocorrelation term during hologram reconstruction is discussed. Software has been written in DigitalMicrograph that can be used to routinely perform these tasks. To illustrate the real improvements made using these methods, we show that a strain measurement sensitivity of ±0.025% can be achieved with a spatial resolution of 2 nm and ±0.13% with a spatial resolution of 1 nm whilst maintaining a useful field of view of 300 nm. In the frame of these measurements a model of strain noise for DFEH has also been developed. Copyright © 2018. Published by Elsevier B.V.

  11. Differentiating between precursor and control variables when analyzing reasoned action theories.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin; Brown, Larry; Diclemente, Ralph; Romer, Daniel; Valois, Robert; Vanable, Peter A; Carey, Michael P; Salazar, Laura

    2010-02-01

    This paper highlights the distinction between precursor and control variables in the context of reasoned action theory. Here the theory is combined with structural equation modeling to demonstrate how age and past sexual behavior should be situated in a reasoned action analysis. A two-wave longitudinal survey sample of African-American adolescents is analyzed where the target behavior is having vaginal sex. Results differ when age and past behavior are used as control variables and when they are correctly used as precursors. Because control variables do not appear in any form of reasoned action theory, this approach to including background variables is not correct when analyzing data sets based on the theoretical axioms of the Theory of Reasoned Action, the Theory of Planned Behavior, or the Integrative Model.

  12. Differentiating Between Precursor and Control Variables When Analyzing Reasoned Action Theories

    PubMed Central

    Hennessy, Michael; Bleakley, Amy; Fishbein, Martin; Brown, Larry; DiClemente, Ralph; Romer, Daniel; Valois, Robert; Vanable, Peter A.; Carey, Michael P.; Salazar, Laura

    2010-01-01

    This paper highlights the distinction between precursor and control variables in the context of reasoned action theory. Here the theory is combined with structural equation modeling to demonstrate how age and past sexual behavior should be situated in a reasoned action analysis. A two-wave longitudinal survey sample of African-American adolescents is analyzed where the target behavior is having vaginal sex. Results differ when age and past behavior are used as control variables and when they are correctly used as precursors. Because control variables do not appear in any form of reasoned action theory, this approach to including background variables is not correct when analyzing data sets based on the theoretical axioms of the Theory of Reasoned Action, the Theory of Planned Behavior, or the Integrative Model. PMID:19370408

  13. Deafness, thought bubbles, and theory-of-mind development.

    PubMed

    Wellman, Henry M; Peterson, Candida C

    2013-12-01

    The processes and mechanisms of theory-of-mind development were examined via a training study of false-belief conceptions in deaf children of hearing parents (N = 43). In comparison to 2 different control conditions, training based on thought-bubble instruction about beliefs was linked with improved false-belief understanding as well as progress on a broader theory-of-mind scale. By combining intervention, microgenetic, and developmental scaling methods, the findings provide informative data about the nature and mechanisms of theory-of-mind change in deaf children, as well as an initial demonstration of a useful intervention for enhancing social cognition in deaf children of hearing parents. The methods and results also point to possible avenues for the study of conceptual change more generally. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  14. Applying Differential Coercion and Social Support Theory to Intimate Partner Violence.

    PubMed

    Zavala, Egbert; Kurtz, Don L

    2017-09-01

    A review of the current body of literature on intimate partner violence (IPV) shows that the most common theories used to explain this public health issue are social learning theory, a general theory of crime, general strain theory, or a combination of these perspectives. Other criminological theories have received less empirical attention. Therefore, the purpose of this study is to apply Differential Coercion and Social Support (DCSS) theory to test its capability to explain IPV. Data collected from two public universities (N = 492) shows that three out of four measures of coercion (i.e., physical abuse, emotional abuse, and anticipated strain) predicted IPV perpetration, whereas social support was not found to be significant. Only two social-psychological deficits (anger and self-control) were found to be positive and significant in predicting IPV. Results, as well as the study's limitations and suggestions for future research, are discussed.

  15. Modeling donor/acceptor interactions: Combined roles of theory and computation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newton, M.D.

    2000-03-05

    An extended superexchange model for electron transfer (ET) matrix elements (H_DA) has been formulated as a superposition of McConnell-type pathways and implemented by combined use of configuration interaction wave functions (obtained using the INDO/s model of Zerner and co-workers) and the generalized Mulliken-Hush formulation of charge-localized diabatic states. Applications are made for ET (and hole transfer) in several donor/bridge/acceptor radical anion (and cation) systems, (DBA)^±, allowing detailed comparison with experimental H_DA estimates. For the case of oligo(phenylene ethynylene) (OPE) bridges, the role of the π and σ electronic manifolds for different distributions of phenylene torsion angles is analyzed in detail.

  16. Theories of Suggestion.

    PubMed

    Brown, W

    1928-02-01

    The word "suggestion" has been used in educational, scientific and medical literature in slightly different senses. In psychological medicine the use of suggestion has developed out of the earlier use of hypnotic influence.Charcot defined hypnosis as an artificial hysteria, Bernheim as an artificially increased suggestibility. The two definitions need to be combined to give an adequate account of hypnosis. Moreover, due allowance should be made for the factors of dissociation and of rapport in hypnotic phenomena.The relationships between dissociation, suggestibility, and hypnotizability.Theories of suggestion propounded by Pierre Janet, Freud, McDougall, Pawlow and others. Ernest Jones's theory of the nature of auto-suggestion. Janet explains suggestion in terms of ideo-motor action in which the suggested idea, because of the inactivity of competing ideas, produces its maximum effect. Freud explains rapport in terms of the sex instinct "inhibited in its aim" (transference) and brings in his distinction of "ego" and "ego-ideal" (or "super-ego") to supplement the theory. Jones explains auto-suggestion in terms of narcissism. McDougall explains hypnotic suggestion in terms of the instinct of self-abasement. But different instincts may supply the driving power to produce suggestion-effects in different circumstances. Such instincts as those of self-preservation (fear) and gregariousness may play their part. Auto-suggestion as a therapeutic factor is badly named. It supplements, but does not supplant the will, and makes complete volition possible.

  17. Theories of Suggestion

    PubMed Central

    Brown, William

    1928-01-01

    The word “suggestion” has been used in educational, scientific and medical literature in slightly different senses. In psychological medicine the use of suggestion has developed out of the earlier use of hypnotic influence. Charcot defined hypnosis as an artificial hysteria, Bernheim as an artificially increased suggestibility. The two definitions need to be combined to give an adequate account of hypnosis. Moreover, due allowance should be made for the factors of dissociation and of rapport in hypnotic phenomena. The relationships between dissociation, suggestibility, and hypnotizability. Theories of suggestion propounded by Pierre Janet, Freud, McDougall, Pawlow and others. Ernest Jones's theory of the nature of auto-suggestion. Janet explains suggestion in terms of ideo-motor action in which the suggested idea, because of the inactivity of competing ideas, produces its maximum effect. Freud explains rapport in terms of the sex instinct “inhibited in its aim” (transference) and brings in his distinction of “ego” and “ego-ideal” (or “super-ego”) to supplement the theory. Jones explains auto-suggestion in terms of narcissism. McDougall explains hypnotic suggestion in terms of the instinct of self-abasement. But different instincts may supply the driving power to produce suggestion-effects in different circumstances. Such instincts as those of self-preservation (fear) and gregariousness may play their part. Auto-suggestion as a therapeutic factor is badly named. It supplements, but does not supplant the will, and makes complete volition possible. PMID:19986306

  18. Situational theory of leadership.

    PubMed

    Waller, D J; Smith, S R; Warnock, J T

    1989-11-01

    The situational theory of leadership and the LEAD instruments for determining leadership style are explained, and the application of the situational leadership theory to the process of planning for and implementing organizational change is described. Early studies of leadership style identified two basic leadership styles: the task-oriented autocratic style and the relationship-oriented democratic style. Subsequent research found that most leaders exhibited one of four combinations of task and relationship behaviors. The situational leadership theory holds that the difference between the effectiveness and ineffectiveness of the four leadership styles is the appropriateness of the leader's behavior to the particular situation in which it is used. The task maturity of the individual or group being led must also be accounted for; follower readiness is defined in terms of the capacity to set high but attainable goals, willingness or ability to accept responsibility, and possession of the necessary education or experience for a specific task. A person's leadership style, range, and adaptability can be determined from the LEAD-Self and LEAD-Other questionnaires. By applying the principles of the situational leadership theory and adapting their managerial styles to specific tasks and levels of follower maturity, the authors were successful in implementing 24-hour pharmacokinetic dosing services provided by staff pharmacists with little previous experience in clinical services. The situational leadership model enables a leader to identify a task, set goals, determine the task maturity of the individual or group, select an appropriate leadership style, and modify the style as change occurs. Pharmacy managers can use this model when implementing clinical pharmacy services.

  19. Systemic Disaffection: A Three-Factor Theory of Political Alienation.

    ERIC Educational Resources Information Center

    Long, Samuel

    The paper develops a theory of political alienation based upon interactions among three antecedent conditions. Political alienation is interpreted as combining feelings of inefficacy, discontent, cynicism, estrangement, and hopelessness. The factors evaluated for their contribution to political alienation are: (1) critical perceptions of…

  20. Universal, computer facilitated, steady state oscillator, closed loop analysis theory and some applications to precision oscillators

    NASA Technical Reports Server (NTRS)

    Parzen, Benjamin

    1992-01-01

    The theory of oscillator analysis in the immittance domain should be read in conjunction with the additional theory presented here. The combined theory enables the computer simulation of the steady state oscillator. The simulation makes the calculation of the oscillator total steady state performance practical, including noise at all oscillator locations. Some specific precision oscillators are analyzed.

  1. Technology selection for ballast water treatment by multi-stakeholders: A multi-attribute decision analysis approach based on the combined weights and extension theory.

    PubMed

    Ren, Jingzheng

    2018-01-01

    The objective of this study is to develop a generic multi-attribute decision analysis framework for ranking technologies for ballast water treatment and determining their grades. An evaluation criteria system consisting of eight criteria in four categories was used to evaluate the technologies for ballast water treatment. The Best-Worst method, a subjective weighting method, and the Criteria Importance Through Inter-criteria Correlation (CRITIC) method, an objective weighting method, were combined to determine the weights of the evaluation criteria. The extension theory was employed to prioritize the technologies for ballast water treatment and determine their grades. An illustrative case including four technologies for ballast water treatment, i.e. Alfa Laval (T1), Hyde (T2), Unitor (T3), and NaOH (T4), was studied using the proposed method, and Hyde (T2) was recognized as the best technology. Sensitivity analysis was also carried out to investigate the effects of the combined coefficients and the weights of the evaluation criteria on the final priority order of the four technologies for ballast water treatment. The weighted sum method and TOPSIS were also employed to rank the four technologies, and the results determined by these two methods are consistent with those determined by the method proposed in this study. Copyright © 2017 Elsevier Ltd. All rights reserved.
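
    A minimal sketch of one common way to combine a subjective weight vector (e.g., from the Best-Worst method) with an objective one (e.g., from CRITIC): a renormalized convex combination. The combination form and the coefficient alpha are assumptions for illustration, not necessarily the exact scheme used in the paper.

        def combine_weights(w_subjective, w_objective, alpha=0.5):
            """Convex combination of subjective and objective criterion weights,
            renormalized so the combined weights sum to one."""
            combined = [alpha * ws + (1.0 - alpha) * wo
                        for ws, wo in zip(w_subjective, w_objective)]
            total = sum(combined)
            return [w / total for w in combined]

        # Illustrative weights for four criteria (not values from the paper):
        print(combine_weights([0.4, 0.3, 0.2, 0.1], [0.25, 0.25, 0.3, 0.2], alpha=0.6))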

  2. Prospect Theory and Interval-Valued Hesitant Set for Safety Evacuation Model

    NASA Astrophysics Data System (ADS)

    Kou, Meng; Lu, Na

    2018-01-01

    The study applies research results from prospect theory and multi-attribute decision-making theory, takes into account the complexity, uncertainty and multifactor influences of the underground mine fire system, and gives full consideration to decision makers' psychological behavior (emotion and intuition) in order to establish an intuitionistic fuzzy multiple attribute decision-making method based on prospect theory. The model established by this method can explain decision makers' safety evacuation decisions in the complex system of an underground mine fire under environmental uncertainty, imperfect information, human psychological behavior and other factors.
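
    For background, prospect-theory-based decision models of this kind typically build on the Tversky-Kahneman value and probability-weighting functions. The sketch below uses their commonly cited 1992 parameter estimates, which are generic values and not parameters calibrated for the mine-fire evacuation model described above.

        def pt_value(x: float, alpha: float = 0.88, beta: float = 0.88, lam: float = 2.25) -> float:
            """Prospect-theory value function: concave for gains, convex and
            steeper for losses (loss aversion, lambda > 1) relative to a reference point."""
            return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

        def pt_weight(p: float, gamma: float = 0.61) -> float:
            """Inverse-S probability weighting function (Tversky & Kahneman, 1992)."""
            return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

        # Losses loom larger than gains; small probabilities are overweighted.
        print(pt_value(100.0), pt_value(-100.0), round(pt_weight(0.1), 3))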

  3. Microscopic origin of the charge transfer in single crystals based on thiophene derivatives: A combined NEXAFS and density functional theory approach

    NASA Astrophysics Data System (ADS)

    Chernenkaya, A.; Morherr, A.; Backes, S.; Popp, W.; Witt, S.; Kozina, X.; Nepijko, S. A.; Bolte, M.; Medjanik, K.; Öhrwall, G.; Krellner, C.; Baumgarten, M.; Elmers, H. J.; Schönhense, G.; Jeschke, H. O.; Valentí, R.

    2016-07-01

    We have investigated the charge transfer mechanism in single crystals of DTBDT-TCNQ and DTBDT-F4TCNQ (where DTBDT is dithieno[2,3-d;2',3'-d'] benzo[1,2-b;4,5-b']dithiophene) using a combination of near-edge X-ray absorption spectroscopy (NEXAFS) and density functional theory (DFT) calculations including final state effects beyond the sudden state approximation. In particular, we find that a description that considers the partial screening of the electron-hole Coulomb correlation on a static level as well as the rearrangement of electronic density shows excellent agreement with experiment and allows us to uncover the details of the charge transfer mechanism in DTBDT-TCNQ and DTBDT-F4TCNQ, as well as a reinterpretation of previous NEXAFS data on pure TCNQ. Finally, we further show that almost the same quality of agreement between theoretical results and experiment is obtained by the much faster Z+1/2 approximation, where the core hole effects are simulated by replacing N or F, with atomic number Z, by the neighboring atom with atomic number Z+1/2.

  4. Combining Theory-Driven Evaluation and Causal Loop Diagramming for Opening the ‘Black Box’ of an Intervention in the Health Sector: A Case of Performance-Based Financing in Western Uganda

    PubMed Central

    Holvoet, Nathalie; Criel, Bart

    2017-01-01

    Increased attention on “complexity” in health systems evaluation has resulted in many different methodological responses. Theory-driven evaluations and systems thinking are two such responses that aim for better understanding of the mechanisms underlying given outcomes. Here, we studied the implementation of a performance-based financing intervention by the Belgian Technical Cooperation in Western Uganda to illustrate a methodological strategy of combining these two approaches. We utilized a systems dynamics tool called causal loop diagramming (CLD) to generate hypotheses feeding into a theory-driven evaluation. Semi-structured interviews were conducted with 30 health workers from two districts (Kasese and Kyenjojo) and with 16 key informants. After CLD, we identified three relevant hypotheses: “success to the successful”, “growth and underinvestment”, and “supervision conundrum”. The first hypothesis leads to increasing improvements in performance, as better performance leads to more incentives, which in turn leads to better performance. The latter two hypotheses point to potential bottlenecks. Thus, the proposed methodological strategy was a useful tool for identifying hypotheses that can inform a theory-driven evaluation. The hypotheses are represented in a comprehensible way while highlighting the underlying assumptions, and are more easily falsifiable than hypotheses identified without using CLD. PMID:28869518

  5. A Model-Based Investigation of Charge-Generation According to the Relative Diffusional Growth Rate Theory

    NASA Astrophysics Data System (ADS)

    Glassmeier, F.; Arnold, L.; Lohmann, U.; Dietlicher, R.; Paukert, M.

    2016-12-01

    Our current understanding of charge generation in thunderclouds is based on collisional charge transfer between graupel and ice crystals in the presence of liquid water droplets as dominant mechanism. The physical process of charge transfer and the sign of net charge generated on graupel and ice crystals under different cloud conditions are not yet understood. The Relative-Diffusional-Growth-Rate (RDGR) theory (Baker et al. 1987) suggests that the particle with the faster diffusional radius growth is charged positively. In this contribution, we use simulations of idealized thunderclouds with two-moment warm and cold cloud microphysics to generate realistic combinations of RDGR-parameters. We find that these realistic parameter combinations result in a relationship between sign of charge, cloud temperature and effective water content that deviates from previous theoretical and laboratory studies. This deviation indicates that the RDGR theory is sensitive to correlations between parameters that occur in clouds but are not captured in studies that vary temperature and water content while keeping other parameters at fixed values. In addition, our results suggest that diffusional growth from the riming-related local water vapor field, a key component of the RDGR theory, is negligible for realistic parameter combinations. Nevertheless, we confirm that the RDGR theory results in positive or negative charging of particles under different cloud conditions. Under specific conditions, charge generation via the RDGR theory alone might thus be sufficient to explain tripolar charge structures in thunderclouds. In general, however, additional charge generation mechanisms and adaptations to the RDGR theory that consider riming other than via local vapor deposition seem necessary.

  6. A Theory of Material Spike Formation in Flow Separation

    NASA Astrophysics Data System (ADS)

    Serra, Mattia; Haller, George

    2017-11-01

    We develop a frame-invariant theory of material spike formation during flow separation over a no-slip boundary in two-dimensional flows with arbitrary time dependence. This theory identifies both fixed and moving separation, is effective also over short-time intervals, and admits a rigorous instantaneous limit. Our theory is based on topological properties of material lines, combining objectively stretching- and rotation-based kinematic quantities. The separation profile identified here serves as the theoretical backbone for the material spike from its birth to its fully developed shape, and remains hidden to existing approaches. Finally, our theory can be used to rigorously explain the perception of off-wall separation in unsteady flows, and more importantly, provide the conditions under which such a perception is justified. We illustrate our results in several examples including steady, time-periodic and unsteady analytic velocity fields with flat and curved boundaries, and an experimental dataset.

  7. CO adsorption on W(100) during temperature-programmed desorption: A combined density functional theory and kinetic Monte Carlo study

    NASA Astrophysics Data System (ADS)

    Albao, Marvin A.; Padama, Allan Abraham B.

    2017-02-01

    Using combined density functional theory (DFT) and kinetic Monte Carlo (KMC) simulations, we study the adsorption of CO on W(100) at 800 K and its subsequent desorption at higher temperatures. The resulting TPD profiles are known experimentally to exhibit three desorption peaks, β1, β2, and β3, at 930 K, 1070 K, and 1375 K, respectively. Unlike more recent theoretical studies that propose that all three aforementioned peaks arise from molecular rather than associative desorption, our KMC analyses support the latter, since at 800 K dissociation is facile and CO exists as the dissociation fragments C and O. We show that these peaks arise from desorption from the same adsorption site but with a binding energy that varies depending on the local environment, that is, the presence of CO as well as the dissociation fragments C and O nearby. Furthermore, we show that several key parameters, such as the desorption, dissociation and recombination barriers, all play a key role in the TPD spectra: these parameters effectively control not only the locations of the TPD peaks but the shape and width of the desorption peaks as well. Moreover, our KMC simulations reveal that varying the heating rate shifts the peaks but leaves their shape intact.
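
    The link between desorption barrier, heating rate, and TPD peak position invoked above can be illustrated with a minimal first-order Polanyi-Wigner simulation. This is a generic textbook model with illustrative parameters (barrier, prefactor, ramp rate), not the authors' lattice-KMC model of W(100).

        import math

        def simulate_tpd(e_des_ev=2.7, prefactor=1e13, beta=5.0, t0=800.0, t_end=1600.0, dt=0.01):
            """First-order desorption during a linear temperature ramp:
            d(theta)/dt = -nu * theta * exp(-E_des / (k_B * T)),  T = T0 + beta * t.
            Returns the temperature at which the desorption rate peaks."""
            k_b = 8.617333e-5                 # Boltzmann constant in eV/K
            theta, t, temp = 1.0, 0.0, t0     # coverage (ML), time (s), temperature (K)
            peak_rate, peak_temp = 0.0, t0
            while temp < t_end:
                k_des = prefactor * math.exp(-e_des_ev / (k_b * temp))   # rate constant (1/s)
                rate = k_des * theta                                     # desorption rate (ML/s)
                if rate > peak_rate:
                    peak_rate, peak_temp = rate, temp
                theta *= math.exp(-k_des * dt)   # exact first-order decay over the time step
                t += dt
                temp = t0 + beta * t
            return peak_temp

        # Peak temperature shifts with the barrier and the heating rate.
        print(round(simulate_tpd(), 1))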

  8. A classical density-functional theory for describing water interfaces.

    PubMed

    Hughes, Jessica; Krebs, Eric J; Roundy, David

    2013-01-14

    We develop a classical density functional for water which combines the White Bear fundamental-measure theory (FMT) functional for the hard sphere fluid with attractive interactions based on the statistical associating fluid theory variable range (SAFT-VR). This functional reproduces the properties of water at both long and short length scales over a wide range of temperatures and is computationally efficient, comparable to the cost of FMT itself. We demonstrate our functional by applying it to systems composed of two hard rods, four hard rods arranged in a square, and hard spheres in water.

  9. The mathematical theory of signal processing and compression-designs

    NASA Astrophysics Data System (ADS)

    Feria, Erlan H.

    2006-05-01

    The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with signal source memory space compression, while processor coding deals with signal processor computational time compression. Their combination is named compression-designs and is referred to as Conde for short. A compelling and pedagogically appealing diagram will be discussed highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.

  10. A measurement theory of illusory conjunctions.

    PubMed

    Prinzmetal, William; Ivry, Richard B; Beck, Diane; Shimizu, Naomi

    2002-04-01

    Illusory conjunctions refer to the incorrect perceptual combination of correctly perceived features, such as color and shape. Research on the phenomenon has been hampered by the lack of a measurement theory that accounts for guessing features, as well as the incorrect combination of correctly perceived features. Recently, several investigators have suggested using multinomial models as a tool for measuring feature integration. The authors examined the adequacy of these models in 2 experiments by testing whether model parameters reflect changes in stimulus factors. In a third experiment, confidence ratings were used as a tool for testing the model. Multinomial models accurately reflected both variations in stimulus factors and observers' trial-by-trial confidence ratings.
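
    A minimal sketch of a multinomial (processing-tree) parameterization of conjunction reports in the spirit of the models discussed: one parameter for correct feature binding, one for binding errors, and guessing parameters for reports made without perceiving the features. The tree structure and parameter names are illustrative assumptions, not the authors' exact model.

        def ic_model(c: float, f: float, g_correct: float, g_conj: float) -> dict:
            """Illustrative multinomial-tree model of conjunction reports.
            c: P(correctly perceive AND bind both features)
            f: P(perceive both features but bind them incorrectly), given not c
            g_correct, g_conj: guessing probabilities for the correct pairing and for
            an illusory pairing of features present elsewhere, given a pure guess."""
            p_correct = c + (1 - c) * (1 - f) * g_correct
            p_illusory = (1 - c) * f + (1 - c) * (1 - f) * g_conj
            p_other = (1 - c) * (1 - f) * (1 - g_correct - g_conj)
            return {"correct": p_correct, "illusory_conjunction": p_illusory, "other_error": p_other}

        probs = ic_model(c=0.6, f=0.2, g_correct=0.25, g_conj=0.25)
        print(probs, "sum =", round(sum(probs.values()), 6))  # category probabilities sum to 1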

  11. The essential role of social theory in qualitative public health research.

    PubMed

    Willis, Karen; Daly, Jeanne; Kealy, Michelle; Small, Rhonda; Koutroulis, Glenda; Green, Julie; Gibbs, Lisa; Thomas, Samantha

    2007-10-01

    To define the role of social theory and examine how research studies using qualitative methods can use social theory to generalize their results beyond the setting of the study or to other social groups. The assumptions underlying public health research using qualitative methods derive from a range of social theories that include conflict theory, structural functionalism, symbolic interactionism, the sociology of knowledge and feminism. Depending on the research problem, these and other social theories provide conceptual tools and models for constructing a suitable research framework, and for collecting and analysing data. In combination with the substantive health literature, the theoretical literature provides the conceptual bridge that links the conclusions of the study to other social groups and settings. While descriptive studies using qualitative research methods can generate important insights into social experience, the use of social theory in the construction and conduct of research enables researchers to extrapolate their findings to settings and groups broader than the ones in which the research was conducted.

  12. Colonel Blotto Games and Lancaster's Equations: A Novel Military Modeling Combination

    NASA Technical Reports Server (NTRS)

    Collins, Andrew J.; Hester, Patrick T.

    2012-01-01

    Military strategists face a difficult task when engaged in a battle against an adversarial force. They have to predict both what tactics their opponent will employ and the outcomes of any resultant conflicts in order to make the best decision about their actions. Game theory has been the dominant technique used by analysts to investigate the possible actions that an enemy will employ. Traditional game theory can be augmented by use of Lanchester equations, a set of differential equations used to determine the outcome of a conflict. This paper demonstrates a novel combination of game theory and Lanchester equations using Colonel Blotto games. Colonel Blotto games, which are one of the oldest applications of game theory to the military domain, look at the allocation of troops and resources when fighting across multiple areas of operation. This paper demonstrates that employing Lanchester equations within a game overcomes some of the practical problems faced when applying game theory.
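
    A minimal sketch of the kind of combination described: each battlefield of a Colonel Blotto allocation is resolved with Lanchester's square law rather than a simple "larger force wins" rule. The number of battlefields, the allocations, and the effectiveness coefficients are illustrative and not taken from the paper.

        import math

        def lanchester_outcome(a0: float, b0: float, alpha: float = 1.0, beta: float = 1.0):
            """Lanchester square law: side A wins iff alpha*a0^2 > beta*b0^2.
            Returns (winner, surviving strength of the winner)."""
            if alpha * a0**2 > beta * b0**2:
                return "A", math.sqrt(a0**2 - (beta / alpha) * b0**2)
            if alpha * a0**2 < beta * b0**2:
                return "B", math.sqrt(b0**2 - (alpha / beta) * a0**2)
            return "draw", 0.0

        def blotto_payoff(alloc_a, alloc_b, alpha=1.0, beta=1.0):
            """Count battlefields won by each player when every battlefield
            is resolved with the square law instead of a majority rule."""
            wins = {"A": 0, "B": 0, "draw": 0}
            for a_i, b_i in zip(alloc_a, alloc_b):
                winner, _ = lanchester_outcome(a_i, b_i, alpha, beta)
                wins[winner] += 1
            return wins

        # Both players spread 100 units across three battlefields:
        print(blotto_payoff([50, 30, 20], [40, 40, 20]))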

  13. A Multidimensional Theory of Suicide.

    PubMed

    Leenaars, Antoon A; Dieserud, Gudrun; Wenckstern, Susanne; Dyregrov, Kari; Lester, David; Lyke, Jennifer

    2018-04-05

    Theory is the foundation of science; this is true in suicidology. Over decades of studies of suicide notes, Leenaars developed a multidimensional model of suicide, with international (cross-cultural) studies and independent verification. The aim was to corroborate Leenaars's theory with a psychological autopsy (PA) study, examining the age and sex of the decedent and the survivor's relationship to the deceased. A PA study in Norway, with 120 survivors/informants, was undertaken. Leenaars's theoretical-conceptual (protocol) analysis was undertaken of the survivors' narratives and in-depth interviews combined. Substantial interjudge reliability was noted (κ = .632). Overall, there was considerable confirmatory evidence of Leenaars's intrapsychic and interpersonal factors in the suicide survivors' narratives. Differences were found in the age of the decedent, but not in sex, nor in the survivor's closeness of the relationship. Older deceased people were perceived to exhibit more heightened unbearable intrapsychic pain associated with the suicide. Leenaars's theory has corroborative verification, through the decedents' suicide notes and the survivors' narratives. However, the multidimensional model needs further testing to develop a better evidence-based way of understanding suicide.

  14. RESISTANCE TO EXTINCTION AND RELAPSE IN COMBINED STIMULUS CONTEXTS

    PubMed Central

    Podlesnik, Christopher A; Bai, John Y. H; Elliffe, Douglas

    2012-01-01

    Reinforcing an alternative response in the same context as a target response reduces the rate of occurrence but increases the persistence of that target response. Applied researchers who use such techniques to decrease the rate of a target problem behavior risk inadvertently increasing the persistence of the same problem behavior. Behavioral momentum theory asserts that the increased persistence is a function of the alternative reinforcement enhancing the Pavlovian relation between the target stimulus context and reinforcement. A method showing promise for reducing the persistence-enhancing effects of alternative reinforcement is to train the alternative response in a separate stimulus context before combining with the target stimulus in extinction. The present study replicated previous findings using pigeons by showing that combining an “alternative” richer VI schedule (96 reinforcers/hr) with a “target” leaner VI schedule (24 reinforcers/hr) reduced resistance to extinction of target responding compared with concurrent training of the alternative and target responses (totaling 120 reinforcers/hr). We also found less relapse with a reinstatement procedure following extinction with separate-context training, supporting previous findings that training conditions similarly influence both resistance to extinction and relapse. Finally, combining the alternative stimulus context was less disruptive to target responding previously trained in the concurrent schedule, relative to combining with the target response trained alone. Overall, the present findings suggest the technique of combining stimulus contexts associated with alternative responses with those associated with target responses disrupts target responding. Furthermore, the effectiveness of this disruption is a function of training context of reinforcement for target responding, consistent with assertions of behavioral momentum theory. PMID:23008521
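
    For reference, the quantitative core of behavioral momentum theory invoked above is usually written in the following standard form (from Nevin and colleagues, quoted as background rather than taken from this study):

        \[
        \log\!\left(\frac{B_x}{B_0}\right) \;=\; \frac{-x}{r^{\,b}},
        \]

    where \(B_x\) is the response rate during disruption (e.g., extinction), \(B_0\) the baseline rate, x the magnitude of the disruptor, r the rate of reinforcement previously obtained in the stimulus context, and b a sensitivity parameter. The larger r is, the smaller the proportional decline, which is why adding alternative reinforcement to the target context increases the persistence of the target response.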

  15. Method for PE Pipes Fusion Jointing Based on TRIZ Contradictions Theory

    NASA Astrophysics Data System (ADS)

    Sun, Jianguang; Tan, Runhua; Gao, Jinyong; Wei, Zihui

    The core of the TRIZ theories is contradiction detection and solution. TRIZ provides various methods for contradiction solution, but they are not systematized. Combined with the technique-system conception, this paper summarizes an integrated solution method for contradictions based on the TRIZ contradiction theory. According to the method, a flowchart of the integrated solution method for contradictions is given. As a case study, a method for fusion jointing of PE pipes is analyzed.
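
    For readers unfamiliar with the mechanics of TRIZ contradiction handling, the sketch below shows one way a contradiction-matrix lookup can be represented in code. This is a hypothetical minimal representation, not the published 39x39 Altshuller matrix: the parameter pairs and principle lists are illustrative placeholders only.

    ```python
    # Minimal sketch of a TRIZ contradiction-matrix lookup, assuming a dictionary
    # representation; the parameter numbers follow the classical 39-parameter list,
    # but the principle lists below are illustrative placeholders, not the
    # published Altshuller matrix values.

    # (improving parameter, worsening parameter) -> suggested inventive principles
    CONTRADICTION_MATRIX = {
        (14, 1): [1, 8, 15, 40],    # e.g. Strength vs. Weight of moving object (illustrative)
        (32, 13): [2, 13, 27, 35],  # e.g. Ease of manufacture vs. Stability (illustrative)
    }

    def suggest_principles(improving: int, worsening: int) -> list[int]:
        """Return candidate inventive principles for a technical contradiction."""
        return CONTRADICTION_MATRIX.get((improving, worsening), [])

    if __name__ == "__main__":
        print(suggest_principles(14, 1))  # -> [1, 8, 15, 40]
    ```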

  16. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials.

    PubMed

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants (n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees (n = 184) were randomly allocated to one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research, and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the

  17. Theory-Based Interventions Combining Mental Simulation and Planning Techniques to Improve Physical Activity: Null Results from Two Randomized Controlled Trials

    PubMed Central

    Meslot, Carine; Gauchet, Aurélie; Allenet, Benoît; François, Olivier; Hagger, Martin S.

    2016-01-01

    Interventions to assist individuals in initiating and maintaining regular participation in physical activity are not always effective. Psychological and behavioral theories advocate the importance of both motivation and volition in interventions to change health behavior. Interventions adopting self-regulation strategies that foster motivational and volitional components may, therefore, have utility in promoting regular physical activity participation. We tested the efficacy of an intervention adopting motivational (mental simulation) and volitional (implementation intentions) components to promote regular physical activity in two studies. Study 1 adopted a cluster randomized design in which participants (n = 92) were allocated to one of three conditions: mental simulation plus implementation intention, implementation intention only, or control. Study 2 adopted a 2 (mental simulation vs. no mental simulation) × 2 (implementation intention vs. no implementation intention) randomized controlled design in which fitness center attendees (n = 184) were randomly allocated to one of four conditions: mental simulation only, implementation intention only, combined, or control. Physical activity behavior was measured by self-report (Study 1) or fitness center attendance (Study 2) at 4- (Studies 1 and 2) and 19- (Study 2 only) week follow-up periods. Findings revealed no statistically significant main or interactive effects of the mental simulation and implementation intention conditions on physical activity outcomes in either study. Findings are in contrast to previous research which has found pervasive effects for both intervention strategies. Findings are discussed in light of study limitations including the relatively small sample sizes, particularly for Study 1, deviations in the operationalization of the intervention components from previous research, and the lack of a prompt for a goal intention. Future research should focus on ensuring uniformity in the format of the

  18. Theory verification and numerical benchmarking on neoclassical toroidal viscosity

    NASA Astrophysics Data System (ADS)

    Wang, Z. R.; Park, J.-K.; Liu, Y. Q.; Logan, N. C.; Menard, J. E.

    2013-10-01

    Systematic verification and numerical benchmarking have been successfully carried out among three different approaches to neoclassical toroidal viscosity (NTV) theory and the corresponding codes: IPEC-PENT is developed based on the combined NTV theory but without geometric simplifications; MARS-K, originally calculating the kinetic energy, is upgraded to calculate the NTV torque based on the equivalence between kinetic energy and NTV torque; MARS-Q includes a smoothly connected NTV formula. The derivation and numerical results both indicate that the imaginary part of the kinetic energy calculated by MARS-K is equivalent to the NTV torque in IPEC-PENT. In the benchmark of precession resonance between MARS-Q and MARS-K/IPEC-PENT, the agreement and correlation between the connected NTV formula and the combined NTV theory are shown for the first time in different collisionality regimes. Additionally, both IPEC-PENT and MARS-K indicate the importance of the bounce harmonic resonance, which can greatly enhance the NTV torque when the E×B drift frequency reaches the bounce resonance condition. Since MARS-K also has the capability to calculate the plasma response including the kinetic effect self-consistently, self-consistent NTV torque calculations have also been tested. This work is supported by DOE Contract No. DE-AC02-09CH11466.

  19. Unsteady Thick Airfoil Aerodynamics: Experiments, Computation, and Theory

    NASA Technical Reports Server (NTRS)

    Strangfeld, C.; Rumsey, C. L.; Mueller-Vahl, H.; Greenblatt, D.; Nayeri, C. N.; Paschereit, C. O.

    2015-01-01

    An experimental, computational and theoretical investigation was carried out to study the aerodynamic loads acting on a relatively thick NACA 0018 airfoil when subjected to pitching and surging, individually and synchronously. Both pre-stall and post-stall angles of attack were considered. Experiments were carried out in a dedicated unsteady wind tunnel, with large surge amplitudes, and airfoil loads were estimated by means of unsteady surface-mounted pressure measurements. Theoretical predictions were based on Theodorsen's and Isaacs' results as well as on the relatively recent generalizations of van der Wall. Both two- and three-dimensional computations were performed on structured grids employing unsteady Reynolds-averaged Navier-Stokes (URANS). For pure surging at pre-stall angles of attack, the correspondence between experiments and theory was satisfactory; this served as a validation of Isaacs' theory. Discrepancies were traced to dynamic trailing-edge separation, even at low angles of attack. Excellent correspondence was found between experiments and theory for airfoil pitching as well as combined pitching and surging; the latter appears to be the first clear validation of van der Wall's theoretical results. Although qualitatively similar to experiment at low angles of attack, two-dimensional URANS computations yielded notable errors in the unsteady load effects of pitching, surging and their synchronous combination. The main reason is believed to be that the URANS equations do not resolve wake vorticity (explicitly modeled in the theory) or the resulting rolled-up unsteady flow structures because high values of eddy viscosity tend to "smear" the wake. At post-stall angles, three-dimensional computations illustrated the importance of modeling the tunnel side walls.

  20. Streamflow Prediction based on Chaos Theory

    NASA Astrophysics Data System (ADS)

    Li, X.; Wang, X.; Babovic, V. M.

    2015-12-01

    Chaos theory is a popular approach in hydrologic time series prediction. The local model (LM) based on this theory uses time-delay embedding to reconstruct the phase-space diagram. The method's efficacy depends on the embedding parameters, i.e., the embedding dimension, time lag, and number of nearest neighbors, so optimal estimation of these parameters is critical to applying the local model. However, these embedding parameters are conventionally estimated separately, using Average Mutual Information (AMI) and False Nearest Neighbors (FNN). This may lead to locally optimal choices and thus limit prediction accuracy. Considering these limitations, this paper applies a local model combined with simulated annealing (SA) to find the globally optimal embedding parameters, and compares it with another global optimization approach, the Genetic Algorithm (GA). The proposed hybrid methods are applied to daily and monthly streamflow time series for examination. The results show that global optimization helps the local model provide more accurate predictions than local optimization, and the LM combined with SA shows additional advantages in computational efficiency. The proposed scheme can also be applied to other fields such as prediction of hydro-climatic time series, error correction, etc.
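
    A minimal sketch of the phase-space reconstruction step described above, assuming a scalar streamflow series and Takens-style delay vectors; the embedding dimension m and lag tau are exactly the quantities the SA/GA search in the record above would optimize (the series and parameter values here are illustrative).

    ```python
    import numpy as np

    def delay_embed(x: np.ndarray, m: int, tau: int) -> np.ndarray:
        """Reconstruct the phase space of a scalar series x with embedding
        dimension m and time lag tau (Takens-style delay vectors)."""
        n = len(x) - (m - 1) * tau
        if n <= 0:
            raise ValueError("series too short for the chosen m and tau")
        return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

    # Example: embed a synthetic monthly-streamflow-like series with m=3, tau=2
    flow = np.sin(np.linspace(0, 20, 200)) + 0.1 * np.random.default_rng(0).standard_normal(200)
    vectors = delay_embed(flow, m=3, tau=2)
    print(vectors.shape)  # (196, 3)
    ```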

  1. Observational information for f(T) theories and dark torsion

    NASA Astrophysics Data System (ADS)

    Bengochea, Gabriel R.

    2011-01-01

    In the present work we analyze and compare the information coming from different observational data sets in the context of a class of f(T) theories. We perform a joint analysis with measurements of the most recent type Ia supernovae (SNe Ia), Baryon Acoustic Oscillations (BAO), Cosmic Microwave Background radiation (CMB), Gamma-Ray Burst data (GRBs) and Hubble parameter observations (OHD) to constrain the only new parameter these theories have. It is shown that when the new combined BAO/CMB parameter is used to set the constraints, the result differs from previous works. We also show that when we include Observational Hubble Data (OHD) the simpler ΛCDM model is excluded at the one-sigma level, driving the effective equation of state of these theories into the phantom regime. Also, by analyzing a tension criterion for SNe Ia and the other observational sets, we obtain more consistent and better-suited data sets for working with these theories.

  2. An Item Response Theory Model for Test Bias.

    ERIC Educational Resources Information Center

    Shealy, Robin; Stout, William

    This paper presents a conceptualization of test bias for standardized ability tests which is based on multidimensional, non-parametric, item response theory. An explanation of how individually-biased items can combine through a test score to produce test bias is provided. It is contended that bias, although expressed at the item level, should be…

  3. Modified free volume theory of self-diffusion and molecular theory of shear viscosity of liquid carbon dioxide.

    PubMed

    Nasrabad, Afshin Eskandari; Laghaei, Rozita; Eu, Byung Chan

    2005-04-28

    In previous work on the density fluctuation theory of transport coefficients of liquids, it was necessary to use empirical self-diffusion coefficients to calculate the transport coefficients (e.g., shear viscosity of carbon dioxide). In this work, the necessity of empirical input of the self-diffusion coefficients in the calculation of shear viscosity is removed, and the theory is thus made a self-contained molecular theory of transport coefficients of liquids, albeit it contains an empirical parameter in the subcritical regime. The required self-diffusion coefficients of liquid carbon dioxide are calculated by using the modified free volume theory for which the generic van der Waals equation of state and Monte Carlo simulations are combined to accurately compute the mean free volume by means of statistical mechanics. They have been computed as a function of density along four different isotherms and isobars. A Lennard-Jones site-site interaction potential was used to model the molecular carbon dioxide interaction. The density and temperature dependence of the theoretical self-diffusion coefficients are shown to be in excellent agreement with experimental data when the minimum critical free volume is identified with the molecular volume. The self-diffusion coefficients thus computed are then used to compute the density and temperature dependence of the shear viscosity of liquid carbon dioxide by employing the density fluctuation theory formula for shear viscosity as reported in an earlier paper (J. Chem. Phys. 2000, 112, 7118). The theoretical shear viscosity is shown to be robust and yields excellent density and temperature dependence for carbon dioxide. The pair correlation function appearing in the theory has been computed by Monte Carlo simulations.

  4. Elastic Buckling under Combined Stresses of Flat Plates with Integral Waffle-Like Stiffening

    NASA Technical Reports Server (NTRS)

    Dow, Norris F.; Levin, L. Ross; Troutman, John L.

    1953-01-01

    Theory and experiment were compared and found in good agreement for the elastic buckling under combined stresses of long flat plates with integral waffle-like stiffening in a variety of configurations. For such flat plates, 45° waffle stiffening was found to be the most effective of the configurations for the proportions considered over the widest range of combinations of compression and shear.

  5. Elastic Buckling Under Combined Stresses of Flat Plates with Integral Waffle-like Stiffening

    NASA Technical Reports Server (NTRS)

    Dow, Norris F; Levin, L Ross; Troutman, John L

    1954-01-01

    Theory and experiment were compared and found in good agreement for the elastic buckling under combined stresses of long flat plates with integral waffle-like stiffening in a variety of configurations. For such flat plates, 45 degree waffle stiffening was found to be the most effective of the configurations for the proportions considered over the widest range of combinations of compression and shear.

  6. Teaching Analytical Chemistry to Pharmacy Students: A Combined, Iterative Approach

    ERIC Educational Resources Information Center

    Masania, Jinit; Grootveld, Martin; Wilson, Philippe B.

    2018-01-01

    Analytical chemistry has often been a difficult subject to teach in a classroom or lecture-based context. Numerous strategies for overcoming the inherently practical-based difficulties have been suggested, each with differing pedagogical theories. Here, we present a combined approach to tackling the problem of teaching analytical chemistry, with…

  7. Combining Critical Reflection and Design Thinking to Develop Integrative Learners

    ERIC Educational Resources Information Center

    Welsh, M. Ann; Dehler, Gordon E.

    2013-01-01

    In this article, we argue for advancing grounded curricula, which explicitly link theory and pedagogy, and executing them in authentic and multidisciplinary settings as a means to facilitate student growth into integrative learners. We describe the development of a student-centered learning experience that combines elements of critical management…

  8. Following Human Footsteps: Proposal of a Decision Theory Based on Human Behavior

    NASA Technical Reports Server (NTRS)

    Mahmud, Faisal

    2011-01-01

    Human behavior is complex in nature, depending on circumstances and on decisions that vary from time to time as well as from place to place. The way a decision is made is either directly or indirectly related to the availability of options. These options, though they appear random in nature, have a solid directional structure for decision making. In this paper, a decision theory is proposed that is based on human behavior. The theory is structured with model sets that show all possible combinations for making a decision. A virtual, simulated environment is used to show the results of the proposed decision theory.

  9. Dalton's disputed nitric oxide experiments and the origins of his atomic theory.

    PubMed

    Usselman, Melvyn C; Leaist, Derek G; Watson, Katherine D

    2008-01-11

    In 1808 John Dalton published his first general account of chemical atomic theory, a cornerstone of modern chemistry. The theory originated in his earlier studies of the properties of atmospheric gases. In 1803 Dalton discovered that oxygen combined with either one or two volumes of nitric oxide in closed vessels over water and this pioneering observation of integral multiple proportions provided important experimental evidence for his incipient atomic ideas. Previous attempts to reproduce Dalton's experiments have been unsuccessful and some commentators have concluded the results were fraudulent. We report a successful reconstruction of Dalton's experiments and provide an analysis exonerating him of any scientific misconduct. But we conclude that Dalton, already thinking atomistically, adjusted experimental conditions to obtain the integral combining proportions.

  10. Assessing Construct Validity Using Multidimensional Item Response Theory.

    ERIC Educational Resources Information Center

    Ackerman, Terry A.

    The concept of a user-specified validity sector is discussed. The idea of the validity sector combines the work of M. D. Reckase (1986) and R. Shealy and W. Stout (1991). Reckase developed a methodology to represent an item in a multidimensional latent space as a vector. Item vectors are computed using multidimensional item response theory item…
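
    As a hedged illustration of the item-vector idea referenced above (Reckase-style multidimensional item response theory), the sketch below computes a compensatory two-dimensional 2PL response probability and the usual item-vector summaries; the discrimination and intercept values are illustrative, not drawn from the report.

    ```python
    import numpy as np

    def mirt_prob(theta: np.ndarray, a: np.ndarray, d: float) -> float:
        """Compensatory multidimensional 2PL response probability."""
        return 1.0 / (1.0 + np.exp(-(a @ theta + d)))

    def item_vector(a: np.ndarray, d: float):
        """Reckase-style item vector summaries: multidimensional discrimination
        (MDISC), multidimensional difficulty (MDIFF), and direction cosines."""
        mdisc = np.linalg.norm(a)
        return mdisc, -d / mdisc, a / mdisc

    a = np.array([1.2, 0.4])   # discriminations on two latent dimensions (illustrative)
    d = -0.5                   # intercept (illustrative)
    print(mirt_prob(np.array([0.5, -0.2]), a, d))
    print(item_vector(a, d))
    ```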

  11. Therapy for Childhood Sexual Abuse Survivors using Attachment and Family Systems Theory Orientations.

    PubMed

    Karakurt, Gunnur; Silver, Kristin E

    2014-01-01

    The aim of this paper is to understand the effects of childhood sexual abuse on a survivor's later life. For understanding and treating the emotional distress and interpersonal problems resulting from childhood sexual abuse, attachment theory provides a valuable framework. When this framework is combined with family systems theory, it can help therapists understand the family context where sexual abuse occurs and how this affects health and functioning throughout the lifespan. Case examples of female adult sexual abuse survivors are also explored, with insight from the intersection of systems and attachment theories.

  12. Honing Theory: A Complex Systems Framework for Creativity.

    PubMed

    Gabora, Liane

    2017-01-01

    This paper proposes a theory of creativity, referred to as honing theory, which posits that creativity fuels the process by which culture evolves through communal exchange amongst minds that are self-organizing, self-maintaining, and self-reproducing. According to honing theory, minds, like other self-organizing systems, modify their contents and adapt to their environments to minimize entropy. Creativity begins with detection of high psychological entropy material, which provokes uncertainty and is arousal-inducing. The creative process involves recursively considering this material from new contexts until it is sufficiently restructured that arousal dissipates. Restructuring involves neural synchrony and dynamic binding, and may be facilitated by temporarily shifting to a more associative mode of thought. A creative work may similarly induce restructuring in others, and thereby contribute to the cultural evolution of more nuanced worldviews. Since lines of cultural descent connecting creative outputs may exhibit little continuity, it is proposed that cultural evolution occurs at the level of self-organizing minds; outputs reflect their evolutionary state. Honing theory addresses challenges not addressed by other theories of creativity, such as the factors that guide restructuring, and in what sense creative works evolve. Evidence comes from empirical studies, an agent-based computational model of cultural evolution, and a model of concept combination.

  13. The use and limitations of attachment theory in child psychotherapy.

    PubMed

    Zilberstein, Karen

    2014-03-01

    Attachment theory and research has proliferated in recent years, spawning new ideas and applications to child therapy. Some of those interventions are creative and useful and rest on solid theory and research, whereas others derive from tenuous assumptions. As an important developmental construct, attachment plays a role in every therapy, but defining that role can be difficult. Therapists must recognize the significance of attachment in treatment but not at the expense of recognizing and treating other issues. This article provides an overview of attachment theory and attachment-based interventions and discusses how to apply those constructs to therapeutic work with children. It reviews attachment theory, assessment, and treatments, and discusses how attachment-focused interventions can be combined with other therapeutic needs and methods. It also considers limitations in the current clinical application of attachment and makes recommendations for further research. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  14. Investigation on the neutral and anionic BxAlyH2 (x + y = 7, 8, 9) clusters using density functional theory combined with photoelectron spectroscopy.

    PubMed

    Ding, Li-Ping; Shao, Peng; Lu, Cheng; Zhang, Fang-Hui; Ding, Lei; Yuan, Tao Li

    2016-08-17

    The structure and bonding nature of neutral and negatively charged BxAlyH2 (x + y = 7, 8, 9) clusters are investigated with the aid of previously published experimental photoelectron spectra combined with the present density functional theory calculations. The comparison between the experimental photoelectron spectra and theoretical simulated spectra helps to identify the ground state structures. The accuracy of the obtained ground state structures is further verified by calculating their adiabatic electron affinities and vertical detachment energies and comparing them against available experimental data. The results show that the structures of BxAlyH2 transform from three-dimensional to planar structures as the number of boron atoms increases. Moreover, boron atoms tend to bind together forming Bn units. The hydrogen atoms prefer to bind with boron atoms rather than aluminum atoms. The analyses of the molecular orbital on the ground state structures further support the abovementioned results.

  15. Applying Factor Analysis Combined with Kriging and Information Entropy Theory for Mapping and Evaluating the Stability of Groundwater Quality Variation in Taiwan

    PubMed Central

    Shyu, Guey-Shin; Cheng, Bai-You; Chiang, Chi-Ting; Yao, Pei-Hsuan; Chang, Tsun-Kuo

    2011-01-01

    In Taiwan many factors, whether geological parent materials, human activities, and climate change, can affect the groundwater quality and its stability. This work combines factor analysis and kriging with information entropy theory to interpret the stability of groundwater quality variation in Taiwan between 2005 and 2007. Groundwater quality demonstrated apparent differences between the northern and southern areas of Taiwan when divided by the Wu River. Approximately 52% of the monitoring wells in southern Taiwan suffered from progressing seawater intrusion, causing unstable groundwater quality. Industrial and livestock wastewaters also polluted 59.6% of the monitoring wells, resulting in elevated EC and TOC concentrations in the groundwater. In northern Taiwan, domestic wastewaters polluted city groundwater, resulting in higher NH3-N concentration and groundwater quality instability was apparent among 10.3% of the monitoring wells. The method proposed in this study for analyzing groundwater quality inspects common stability factors, identifies potential areas influenced by common factors, and assists in elevating and reinforcing information in support of an overall groundwater management strategy. PMID:21695030

  16. Teaching Theory X and Theory Y in Organizational Communication

    ERIC Educational Resources Information Center

    Noland, Carey

    2014-01-01

    The purpose of the activity described here is to integrate McGregor's Theory X and Theory Y into a group application: design a syllabus that embodies either Theory X or Theory Y tenets. Students should be able to differentiate between Theory X and Theory Y, create a syllabus based on Theory X or Theory Y tenets, evaluate the different syllabi…

  17. Effective theories of universal theories

    DOE PAGES

    Wells, James D.; Zhang, Zhengkang

    2016-01-20

    It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably S and T parameters) are generally speaking only applicable to a special class of new physics scenarios known as universal theories. The oblique parameters should not be associated with Wilson coefficients in a particular operator basis in the effective field theory (EFT) framework, unless restrictions have been imposed on the EFT so that it describes universal theories. Here, we work out these restrictions, and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly-adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the h^3, hff, hVV vertices, 3 parameters for hVV vertices absent in the Standard Model, and 1 four-fermion coupling of order y_f^2. Furthermore, all these parameters are defined in an unambiguous and basis-independent way, allowing for consistent constraints on the universal theories parameter space from precision electroweak and Higgs data.

  18. Effective theories of universal theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wells, James D.; Zhang, Zhengkang

    It is well-known but sometimes overlooked that constraints on the oblique parameters (most notably S and T parameters) are generally speaking only applicable to a special class of new physics scenarios known as universal theories. The oblique parameters should not be associated with Wilson coefficients in a particular operator basis in the effective field theory (EFT) framework, unless restrictions have been imposed on the EFT so that it describes universal theories. Here, we work out these restrictions, and present a detailed EFT analysis of universal theories. We find that at the dimension-6 level, universal theories are completely characterized by 16 parameters. They are conveniently chosen to be: 5 oblique parameters that agree with the commonly-adopted ones, 4 anomalous triple-gauge couplings, 3 rescaling factors for the h^3, hff, hVV vertices, 3 parameters for hVV vertices absent in the Standard Model, and 1 four-fermion coupling of order y_f^2. Furthermore, all these parameters are defined in an unambiguous and basis-independent way, allowing for consistent constraints on the universal theories parameter space from precision electroweak and Higgs data.

  19. Theories of Career Development. A Comparison of the Theories.

    ERIC Educational Resources Information Center

    Osipow, Samuel H.

    These seven theories of career development are examined in previous chapters: (1) Roe's personality theory, (2) Holland's career typology theory, (3) the Ginzberg, Ginsburg, Axelrod, and Herma Theory, (4) psychoanalytic conceptions, (5) Super's developmental self-concept theory, (6) other personality theories, and (7) social systems theories.…

  20. Theory of Gamma-Ray Burst Sources

    NASA Astrophysics Data System (ADS)

    Ramirez-Ruiz, Enrico

    In the sections which follow, we shall be concerned predominantly with the theory of γ-ray burst sources. If the concepts there proposed are indeed relevant to an understanding of the nature of these sources, then their existence becomes inextricably linked to the metabolic pathways through which gravity, spin, and energy can combine to form collimated, ultrarelativistic outflows. These threads are few and fragile, as we are still wrestling with trying to understand non-relativistic processes, most notably those associated with the electromagnetic field and gas dynamics. If we are to improve our picture-making we must make more and stronger ties of physical theory. But in reconstructing the creature, we must be guided by our eyes and their extensions. In this introductory section we have therefore attempted to summarise the observed properties of these ultra-energetic phenomena.

  1. Multiconfiguration Pair-Density Functional Theory.

    PubMed

    Li Manni, Giovanni; Carlson, Rebecca K; Luo, Sijie; Ma, Dongxia; Olsen, Jeppe; Truhlar, Donald G; Gagliardi, Laura

    2014-09-09

    We present a new theoretical framework, called Multiconfiguration Pair-Density Functional Theory (MC-PDFT), which combines multiconfigurational wave functions with a generalization of density functional theory (DFT). A multiconfigurational self-consistent-field (MCSCF) wave function with correct spin and space symmetry is used to compute the total electronic density, its gradient, the on-top pair density, and the kinetic and Coulomb contributions to the total electronic energy. We then use a functional of the total density, its gradient, and the on-top pair density to calculate the remaining part of the energy, which we call the on-top-density-functional energy in contrast to the exchange-correlation energy of Kohn-Sham DFT. Because the on-top pair density is an element of the two-particle density matrix, this goes beyond the Hohenberg-Kohn theorem that refers only to the one-particle density. To illustrate the theory, we obtain first approximations to the required new type of density functionals by translating conventional density functionals of the spin densities using a simple prescription, and we perform post-SCF density functional calculations using the total density, density gradient, and on-top pair density from the MCSCF calculations. Double counting of dynamic correlation or exchange does not occur because the MCSCF energy is not used. The theory is illustrated by applications to the bond energies and potential energy curves of H2, N2, F2, CaO, Cr2, and NiCl and the electronic excitation energies of Be, C, N, N(+), O, O(+), Sc(+), Mn, Co, Mo, Ru, N2, HCHO, C4H6, c-C5H6, and pyrazine. The method presented has a computational cost and scaling similar to MCSCF, but a quantitative accuracy, even with the present first approximations to the new types of density functionals, that is comparable to much more expensive multireference perturbation theory methods.
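
    As a hedged restatement of the energy decomposition described above (my notation; the record itself gives no formulas), the MC-PDFT energy is commonly written as

    ```latex
    E_{\mathrm{MC\text{-}PDFT}} \;=\; V_{nn}
      + \sum_{pq} h_{pq}\, D_{pq}
      + \tfrac{1}{2} \sum_{pqrs} g_{pqrs}\, D_{pq} D_{rs}
      + E_{\mathrm{ot}}\!\left[\rho,\, \nabla\rho,\, \Pi\right]
    ```

    where D is the MCSCF one-particle density matrix, ρ and Π are the total density and on-top pair density, and the on-top functional E_ot plays the role that the exchange-correlation functional plays in Kohn-Sham DFT; because the MCSCF energy itself is not reused, no double counting of correlation arises, consistent with the record above.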

  2. An Investigation of the Mathematical Models of Piaget's Psychological Theory of Cognitive Learning. Final Report.

    ERIC Educational Resources Information Center

    Kalechofsky, Robert

    This research paper proposes several mathematical models which help clarify Piaget's theory of cognition on the concrete and formal operational stages. Some modified lattice models were used for the concrete stage and a combined Boolean Algebra and group theory model was used for the formal stage. The researcher used experiments cited in the…

  3. CO(2) capture properties of alkaline earth metal oxides and hydroxides: A combined density functional theory and lattice phonon dynamics study.

    PubMed

    Duan, Yuhua; Sorescu, Dan C

    2010-08-21

    By combining density functional theory and lattice phonon dynamics, the thermodynamic properties of CO(2) absorption/desorption reactions with alkaline earth metal oxides MO and hydroxides M(OH)(2) (where M=Be,Mg,Ca,Sr,Ba) are analyzed. The heats of reaction and the chemical potential changes of these solids upon CO(2) capture reactions have been calculated and used to evaluate the energy costs. Relative to CaO, a widely used system in practical applications, MgO and Mg(OH)(2) systems were found to be better candidates for CO(2) sorbent applications due to their lower operating temperatures (600-700 K). In the presence of H(2)O, MgCO(3) can be regenerated into Mg(OH)(2) at low temperatures or into MgO at high temperatures. This transition temperature depends not only on the CO(2) pressure but also on the H(2)O pressure. Based on our calculated results and by comparing with available experimental data, we propose a general computational search methodology which can be used as a general scheme for screening a large number of solids for use as CO(2) sorbents.
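
    The screening idea in the record above rests on locating the temperature at which the capture reaction's free energy changes sign at a given CO2 partial pressure. The sketch below illustrates that turnover-temperature calculation under the usual ideal-gas relation; the enthalpy and entropy values are illustrative placeholders, not the paper's DFT-plus-phonon results.

    ```python
    import math
    from scipy.optimize import brentq

    R = 8.314      # J/(mol K)
    dH = -100e3    # J/mol, illustrative carbonation enthalpy (MO + CO2 -> MCO3)
    dS = -170.0    # J/(mol K), illustrative carbonation entropy

    def dG(T: float, p_co2: float = 1.0, p0: float = 1.0) -> float:
        """Reaction free energy per mole of CO2 captured at temperature T (K)
        and CO2 partial pressure p_co2 (bar): dG = dH - T*dS - R*T*ln(p/p0)."""
        return dH - T * dS - R * T * math.log(p_co2 / p0)

    # Turnover temperature: below it the sorbent captures CO2, above it regenerates.
    T_turnover = brentq(dG, 300.0, 1500.0, args=(1.0,))
    print(round(T_turnover, 1), "K")
    ```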

  4. Development of a Pressure-Dependent Constitutive Model with Combined Multilinear Kinematic and Isotropic Hardening

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wilson, Christopher D.

    2003-01-01

    The development of a pressure-dependent constitutive model with combined multilinear kinematic and isotropic hardening is presented. The constitutive model is developed using the ABAQUS user material subroutine (UMAT). First the pressure-dependent plasticity model is derived. Following this, the combined bilinear and combined multilinear hardening equations are developed for von Mises plasticity theory. The hardening rule equations are then modified to include pressure dependency. The method for implementing the new constitutive model into ABAQUS is given.
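
    To make the ingredients of such a model concrete, the sketch below evaluates a pressure-dependent yield function with combined hardening: the backstress supplies the kinematic (shifted) part, the current yield stress supplies the isotropic part, and a Drucker-Prager-like term supplies the pressure dependence. This is a hedged illustration of the general idea, not the ABAQUS UMAT formulation of the report; all numbers are made up.

    ```python
    import numpy as np

    def yield_function(stress: np.ndarray, backstress: np.ndarray,
                       sigma_y: float, alpha_p: float) -> float:
        """Pressure-dependent yield check with combined hardening (illustrative):
        f < 0 means elastic, f >= 0 means yielding."""
        p = np.trace(stress) / 3.0                   # mean (hydrostatic) stress, tension positive
        s = stress - p * np.eye(3)                   # deviatoric stress
        xi = s - backstress                          # shifted deviator (kinematic hardening)
        q = np.sqrt(1.5 * np.tensordot(xi, xi))      # effective (von Mises) stress of the shifted state
        return q + alpha_p * p - sigma_y             # Drucker-Prager-like pressure term + isotropic term

    stress = np.diag([250.0, 0.0, 0.0])              # uniaxial tension, MPa (illustrative)
    print(yield_function(stress, np.zeros((3, 3)), sigma_y=240.0, alpha_p=0.1))
    ```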

  5. [Traceability of Wine Varieties Using Near Infrared Spectroscopy Combined with Cyclic Voltammetry].

    PubMed

    Li, Meng-hua; Li, Jing-ming; Li, Jun-hui; Zhang, Lu-da; Zhao, Long-lian

    2015-06-01

    To achieve traceability of wine varieties, a method was proposed to fuse near-infrared (NIR) spectra and cyclic voltammograms (CV), which carry different information, using D-S evidence theory. NIR spectra and CV curves of three varieties of wine (cabernet sauvignon, merlot, cabernet gernischt) from seven different geographical origins were collected separately. Discriminant models were built using the PLS-DA method, and D-S evidence theory was then applied to integrate the two kinds of discrimination results. After integration by D-S evidence theory, the accuracy rate was 95.69% for cross-validation and 94.12% for the validation set in wine variety identification. When only wines from Yantai were considered, the accuracy rate was 99.46% for cross-validation and 100% for the validation set. All the traceability models after fusion achieved better classification results than either individual method. These results suggest that the proposed method of combining electrochemical information with spectral information via the D-S evidence combination rule improves the discrimination performance of the models and is a promising tool for discriminating different kinds of wines.
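
    A minimal sketch of the Dempster-Shafer combination step referenced above, assuming each classifier's output has already been converted to a basic probability assignment over the candidate varieties; the mass values are made up for the example.

    ```python
    from itertools import product

    def dempster_combine(m1: dict, m2: dict) -> dict:
        """Combine two basic probability assignments (frozenset -> mass) with
        Dempster's rule; the conflict mass K is renormalized away."""
        combined, conflict = {}, 0.0
        for (A, mA), (B, mB) in product(m1.items(), m2.items()):
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:
                conflict += mA * mB
        if conflict >= 1.0:
            raise ValueError("total conflict: sources cannot be combined")
        return {A: v / (1.0 - conflict) for A, v in combined.items()}

    # Illustrative masses from two classifiers (NIR spectra vs. cyclic voltammetry)
    # over three wine varieties; the numbers are made up for the example.
    varieties = ("cabernet_sauvignon", "merlot", "cabernet_gernischt")
    m_nir = {frozenset({varieties[0]}): 0.6, frozenset(varieties): 0.4}
    m_cv = {frozenset({varieties[0]}): 0.5, frozenset({varieties[1]}): 0.2,
            frozenset(varieties): 0.3}
    print(dempster_combine(m_nir, m_cv))
    ```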

  6. Combination of classical test theory (CTT) and item response theory (IRT) analysis to study the psychometric properties of the French version of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF).

    PubMed

    Bourion-Bédès, Stéphanie; Schwan, Raymund; Epstein, Jonathan; Laprevote, Vincent; Bédès, Alex; Bonnet, Jean-Louis; Baumann, Cédric

    2015-02-01

    The study aimed to examine the construct validity and reliability of the Quality of Life Enjoyment and Satisfaction Questionnaire-Short Form (Q-LES-Q-SF) according to both classical test and item response theories. The psychometric properties of the French version of this instrument were investigated in a cross-sectional, multicenter study. A total of 124 outpatients with a substance dependence diagnosis participated in the study. Psychometric evaluation included descriptive analysis, internal consistency, test-retest reliability, and validity. The dimensionality of the instrument was explored using a combination of the classical test, confirmatory factor analysis (CFA), and an item response theory analysis, the Person Separation Index (PSI), in a complementary manner. The results of the Q-LES-Q-SF revealed that the questionnaire was easy to administer and the acceptability was good. The internal consistency and the test-retest reliability were 0.9 and 0.88, respectively. All items were significantly correlated with the total score and the SF-12 used in the study. The CFA with one factor model was good, and for the unidimensional construct, the PSI was found to be 0.902. The French version of the Q-LES-Q-SF yielded valid and reliable clinical assessments of the quality of life for future research and clinical practice involving French substance abusers. In response to recent questioning regarding the unidimensionality or bidimensionality of the instrument and according to the underlying theoretical unidimensional construct used for its development, this study suggests the Q-LES-Q-SF as a one-dimension questionnaire in French QoL studies.
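
    On the classical-test-theory side of the analysis above, the internal-consistency figure of 0.9 is the kind of coefficient computed by Cronbach's alpha. A minimal sketch (with a simulated score matrix and an illustrative item count, not the study's data) is shown below; the IRT-side Person Separation Index is a separate, model-based quantity not reproduced here.

    ```python
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents, k_items) score matrix; the
        usual CTT internal-consistency coefficient, not the IRT-based PSI."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(100, 1))
    # 14 scored items (illustrative; the instrument's exact scoring is not reproduced here)
    scores = latent + 0.5 * rng.normal(size=(100, 14))
    print(round(cronbach_alpha(scores), 2))
    ```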

  7. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts always carry uncertainties, which may affect the results and lead to large variations. Uncertainties must therefore be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns; streamflow forecasting thus entails modeling the seasonality, periodicity, and correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure the uncertainties arising during the forecasting process. To apply entropy theory to streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can characterize the patterns of streamflow variation and identify its periodicity; that is, it permits extraction of significant information for understanding the streamflow process and predicting it. Application of entropy theory to streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. With information theory, we describe how these uncertainties are transported and aggregated through these processes.
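
    The record above measures several sources of uncertainty with entropy. As a hedged, much-simplified illustration of entropy as an uncertainty score (the paper's entropy-spectral forecasting machinery is considerably more involved), the sketch below bins two synthetic forecast-error samples on common bin edges and compares their Shannon entropies.

    ```python
    import numpy as np

    def shannon_entropy(samples: np.ndarray, bin_edges: np.ndarray) -> float:
        """Shannon entropy (bits) of an empirical distribution over fixed bins;
        used here as a simple score of how uncertain a forecast-error sample is."""
        counts, _ = np.histogram(samples, bins=bin_edges)
        p = counts / counts.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(42)
    edges = np.linspace(-4.0, 4.0, 81)        # common bins so the entropies are comparable
    sharp = rng.normal(0.0, 0.1, size=5000)   # errors of a confident forecast (synthetic)
    diffuse = rng.normal(0.0, 1.0, size=5000) # errors of a diffuse forecast (synthetic)
    print(shannon_entropy(sharp, edges), shannon_entropy(diffuse, edges))
    ```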

  8. Issues in Optical Diffraction Theory

    PubMed Central

    Mielenz, Klaus D.

    2009-01-01

    This paper focuses on unresolved or poorly documented issues pertaining to Fresnel’s scalar diffraction theory and its modifications. In Sec. 2 it is pointed out that all thermal sources used in practice are finite in size and errors can result from insufficient coherence of the optical field. A quarter-wave criterion is applied to show how such errors can be avoided by placing the source at a large distance from the aperture plane, and it is found that in many cases it may be necessary to use collimated light as on the source side of a Fraunhofer experiment. If these precautions are not taken the theory of partial coherence may have to be used for the computations. In Sec. 3 it is recalled that for near-zone computations the Kirchhoff or Rayleigh-Sommerfeld integrals are applicable, but fail to correctly describe the energy flux across the aperture plane because they are not continuously differentiable with respect to the assumed geometrical field on the source side. This is remedied by formulating an improved theory in which the field on either side of a semi-reflecting screen is expressed as the superposition of mutually incoherent components which propagate in the opposite directions of the incident and reflected light. These components are defined as linear combinations of the Rayleigh-Sommerfeld integrals, so that they are rigorous solutions of the wave equation as well as continuously differentiable in the aperture plane. Algorithms for using the new theory for computing the diffraction patterns of circular apertures and slits at arbitrary distances z from either side of the aperture (down to z = ± 0.0003 λ) are presented, and numerical examples of the results are given. These results show that the incident geometrical field is modulated by diffraction before it reaches the aperture plane while the reflected field is spilled into the dark space. At distances from the aperture which are large compared to the wavelength λ these field expressions are
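
    For reference alongside the record above, the first Rayleigh-Sommerfeld integral is commonly quoted (Goodman-style notation, not the paper's) in its r >> λ form as

    ```latex
    U(P_0) \;=\; \frac{1}{i\lambda} \iint_{\Sigma} U(P_1)\,
      \frac{e^{ikr}}{r}\, \frac{z}{r}\; \mathrm{d}S ,
    \qquad r = |P_0 - P_1|, \quad k = \frac{2\pi}{\lambda}
    ```

    where z/r = cos θ is the obliquity factor; the exact near-zone kernel needed for work like the above carries an additional correction of order 1/(kr), which matters at the sub-wavelength distances quoted in the record.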

  9. A Future of Communication Theory: Systems Theory.

    ERIC Educational Resources Information Center

    Lindsey, Georg N.

    Concepts of general systems theory, cybernetics and the like may provide the methodology for communication theory to move from a level of technology to a level of pure science. It was the purpose of this paper to (1) demonstrate the necessity of applying systems theory to the construction of communication theory, (2) review relevant systems…

  10. Theory comparison and numerical benchmarking on neoclassical toroidal viscosity torque

    NASA Astrophysics Data System (ADS)

    Wang, Zhirui; Park, Jong-Kyu; Liu, Yueqiang; Logan, Nikolas; Kim, Kimin; Menard, Jonathan E.

    2014-04-01

    Systematic comparison and numerical benchmarking have been successfully carried out among three different approaches of neoclassical toroidal viscosity (NTV) theory and the corresponding codes: IPEC-PENT is developed based on the combined NTV theory but without geometric simplifications [Park et al., Phys. Rev. Lett. 102, 065002 (2009)]; MARS-Q includes smoothly connected NTV formula [Shaing et al., Nucl. Fusion 50, 025022 (2010)] based on Shaing's analytic formulation in various collisionality regimes; MARS-K, originally computing the drift kinetic energy, is upgraded to compute the NTV torque based on the equivalence between drift kinetic energy and NTV torque [J.-K. Park, Phys. Plasma 18, 110702 (2011)]. The derivation and numerical results both indicate that the imaginary part of drift kinetic energy computed by MARS-K is equivalent to the NTV torque in IPEC-PENT. In the benchmark of precession resonance between MARS-Q and MARS-K/IPEC-PENT, the agreement and correlation between the connected NTV formula and the combined NTV theory in different collisionality regimes are shown for the first time. Additionally, both IPEC-PENT and MARS-K indicate the importance of the bounce harmonic resonance which can greatly enhance the NTV torque when E ×B drift frequency reaches the bounce resonance condition.

  11. Theory comparison and numerical benchmarking on neoclassical toroidal viscosity torque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhirui; Park, Jong-Kyu; Logan, Nikolas

    Systematic comparison and numerical benchmarking have been successfully carried out among three different approaches of neoclassical toroidal viscosity (NTV) theory and the corresponding codes: IPEC-PENT is developed based on the combined NTV theory but without geometric simplifications [Park et al., Phys. Rev. Lett. 102, 065002 (2009)]; MARS-Q includes smoothly connected NTV formula [Shaing et al., Nucl. Fusion 50, 025022 (2010)] based on Shaing's analytic formulation in various collisionality regimes; MARS-K, originally computing the drift kinetic energy, is upgraded to compute the NTV torque based on the equivalence between drift kinetic energy and NTV torque [J.-K. Park, Phys. Plasma 18, 110702 (2011)]. The derivation and numerical results both indicate that the imaginary part of drift kinetic energy computed by MARS-K is equivalent to the NTV torque in IPEC-PENT. In the benchmark of precession resonance between MARS-Q and MARS-K/IPEC-PENT, the agreement and correlation between the connected NTV formula and the combined NTV theory in different collisionality regimes are shown for the first time. Additionally, both IPEC-PENT and MARS-K indicate the importance of the bounce harmonic resonance which can greatly enhance the NTV torque when E×B drift frequency reaches the bounce resonance condition.

  12. On the interpretation of combined torsion and tension tests of thin-wall tubes

    NASA Technical Reports Server (NTRS)

    Prager, W

    1948-01-01

    General ways of testing thin-wall tubes under combined tension and torsion as a means of checking the various theories of plasticity are discussed. Suggestions also are given for the interpretation of the tests.
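
    As a hedged illustration of why the tension-torsion tube is such a convenient test case, the sketch below evaluates the nominal axial and shear stresses in a thin-wall tube and the von Mises equivalent stress that one candidate theory of plasticity (distortion energy) would compare against the yield stress; the loads and dimensions are made up.

    ```python
    import math

    def tube_stresses(axial_force: float, torque: float, r_mean: float, t: float):
        """Nominal stresses in a thin-wall tube under combined tension and torsion:
        axial stress N/(2*pi*R*t), shear from Bredt's formula T/(2*pi*R^2*t),
        and the von Mises equivalent stress sqrt(sigma^2 + 3*tau^2)."""
        sigma = axial_force / (2.0 * math.pi * r_mean * t)
        tau = torque / (2.0 * math.pi * r_mean**2 * t)
        return sigma, tau, math.sqrt(sigma**2 + 3.0 * tau**2)

    # Illustrative numbers: 20 kN tension and 150 N*m torque on a 25 mm mean-radius,
    # 1.5 mm wall tube (SI units; values are made up for the example).
    print(tube_stresses(20e3, 150.0, 0.025, 0.0015))
    ```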

  13. Foundations for a theory of gravitation theories

    NASA Technical Reports Server (NTRS)

    Thorne, K. S.; Lee, D. L.; Lightman, A. P.

    1972-01-01

    A foundation is laid for future analyses of gravitation theories. This foundation is applicable to any theory formulated in terms of geometric objects defined on a 4-dimensional spacetime manifold. The foundation consists of (1) a glossary of fundamental concepts; (2) a theorem that delineates the overlap between Lagrangian-based theories and metric theories; (3) a conjecture (due to Schiff) that the Weak Equivalence Principle implies the Einstein Equivalence Principle; and (4) a plausibility argument supporting this conjecture for the special case of relativistic, Lagrangian-based theories.

  14. Multifractality to Photonic Crystal & Self-Organization to Metamaterials through Anderson Localizations & Group/Gauge Theory

    NASA Astrophysics Data System (ADS)

    Hidajatullah-Maksoed, Widastra

    2015-04-01

    Arthur Cayley, for one, investigated such structures by creating the theory of permutation groups [F:\\Group_theory.htm]. Where the addressing of cell elements of the lattice Qmf uses a Cayley tree, the self-affine object Qmf is described by the combination of the finite groups of rotation and inversion and the infinite groups of translation and dilation [G. Corso & L. S. Lacena: ``Multifractal lattice and group theory'', Physica A: Statistical Mechanics and Its Applications, 2005, vol. 357, issue 1, pp. 64-70; http://www.sciencedirect.com/science/articel/pii/S0378437105005005]; hence multifractals can be related to group theory. Many grateful thanks to HE. Mr. Drs. P. SWANTORO & HE. Mr. Ir. SARWONO KUSUMAATMADJA.

  15. Situation-specific theories from the middle-range transitions theory.

    PubMed

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  16. Ko Displacement Theory for Structural Shape Predictions

    NASA Technical Reports Server (NTRS)

    Ko, William L.

    2010-01-01

    The development of the Ko displacement theory for predictions of structure deformed shapes was motivated in 2003 by the Helios flying wing, which had a 247-ft (75-m) wing span with wingtip deflections reaching 40 ft (12 m). The Helios flying wing failed in midair in June 2003, creating the need to develop new technology to predict in-flight deformed shapes of unmanned aircraft wings for visual display to the ground-based pilots. Strain sensors installed on a structure can only sense surface strains; they cannot sense the overall deformed shape of the structure. After the invention of the Ko displacement theory, predictions of structure deformed shapes could be achieved by feeding the measured surface strains into the Ko displacement transfer functions to calculate out-of-plane deflections and cross-sectional rotations at multiple locations, thereby mapping out the overall deformed shape of the structure. The new Ko displacement theory combined with a strain-sensing system thus created a revolutionary new structure-shape-sensing technology.
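
    The record above describes converting measured surface strains into deflections via Ko's displacement transfer functions. Those closed-form functions are not reproduced here; the sketch below only illustrates the underlying beam idea under stated assumptions (Euler-Bernoulli bending, curvature approximately equal to surface strain divided by the local half-depth, cantilever boundary conditions), with made-up strain-station data.

    ```python
    import numpy as np

    def deflection_from_surface_strain(x: np.ndarray, strain: np.ndarray,
                                       half_depth: np.ndarray) -> np.ndarray:
        """Minimal sketch of shape sensing from surface strains: bending curvature
        ~ strain / half-depth, integrated once for slope and again for deflection
        (cantilever: zero slope and deflection at the root). This is NOT Ko's
        closed-form displacement transfer functions."""
        curvature = strain / half_depth
        slope = np.concatenate(
            ([0.0], np.cumsum(0.5 * (curvature[1:] + curvature[:-1]) * np.diff(x))))
        deflection = np.concatenate(
            ([0.0], np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
        return deflection

    x = np.linspace(0.0, 10.0, 21)           # strain-station locations along a spar (m), illustrative
    half_depth = np.full_like(x, 0.15)       # spar half-depth c (m), illustrative
    strain = 1e-3 * (1.0 - x / x[-1])        # linearly decreasing surface strain, illustrative
    print(deflection_from_surface_strain(x, strain, half_depth)[-1])  # tip deflection (m)
    ```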

  17. Ordered rate constitutive theories for thermoviscoelastic solids with memory in Lagrangian description using Gibbs potential

    NASA Astrophysics Data System (ADS)

    Surana, K. S.; Reddy, J. N.; Nunez, Daniel

    2015-11-01

    This paper presents ordered rate constitutive theories of orders m and n, i.e., (m, n), for finite deformation of homogeneous, isotropic, compressible and incompressible thermoviscoelastic solids with memory in Lagrangian description, using the entropy inequality expressed in terms of the Gibbs potential Ψ as an alternative to deriving constitutive theories from the entropy inequality in terms of the Helmholtz free energy density Φ. The second Piola-Kirchhoff stress σ[0] and Green's strain tensor ε[0] are used as the conjugate pair. We consider Ψ, the heat vector q, the entropy density η, and rates of σ[0] and ε[0] up to orders m and n, i.e., σ[i]; i = 0, 1, ..., m and ε[j]; j = 0, 1, ..., n. We choose Ψ, ε[n], q and η as dependent variables in the constitutive theories, with ε[j]; j = 0, 1, ..., n - 1, σ[i]; i = 0, 1, ..., m, the temperature gradient g and the temperature θ as their argument tensors. The rationale for this choice is explained in the paper. The entropy inequality, the decomposition of σ[0] into equilibrium and deviatoric stresses, the conditions resulting from the entropy inequality, and the theory of generators and invariants are used in the derivations of ordered rate constitutive theories of orders m and n in the stress and strain tensors. Constitutive theories for the heat vector q (of up to orders m and n - 1) that are consistent (in terms of the argument tensors) with the constitutive theories for ε[n] (of up to orders m and n) are also derived. Many simplified forms of the rate theories of orders (m, n) are presented. Material coefficients are derived by considering Taylor series expansions of the coefficients in the linear combinations representing ε[n] and q, using the combined generators of the argument tensors about a known configuration Ω in the combined invariants of the argument tensors and temperature. It is shown that the rate constitutive theories of order one (m = 1, n = 1), when further simplified, result in constitutive

  18. Some Considerations Necessary for a Viable Theory of Human Memory.

    ERIC Educational Resources Information Center

    Sietsema, Douglas J.

    Empirical research is reviewed in the area of cognitive psychology pertaining to models of human memory. Research evidence and theoretical considerations are combined to develop guidelines for future theory development related to the human memory. The following theoretical constructs and variables are discussed: (1) storage versus process…

  19. Effective Mass Theory of 2D Excitons Revisited

    NASA Astrophysics Data System (ADS)

    Gonzalez, Joseph; Oleynik, Ivan

    Two-dimensional (2D) semiconducting materials possess an exceptionally unique set of electronic and excitonic properties due to the combined effects of quantum and dielectric confinement. Reliable determination of exciton binding energies from both first-principles many-body perturbation theory (GW/BSE) and experiment is very challenging due to the enormous computational expense as well as the tremendous technical difficulties in experiments. Very recently, effective mass theories of 2D excitons have been developed as an attractive alternative for inexpensive and accurate evaluation of exciton binding energies. In this presentation, we evaluate two effective mass theory approaches, by Velizhanin et al. and Olsen et al., in predicting exciton binding energies across a wide range of 2D materials. We specifically analyze the trends related to varying screening lengths and exciton effective masses. We also extend the effective mass theory of 2D excitons to include the effects of electron and hole mass anisotropies (mx ≠ my), the latter showing a substantial influence on exciton binding energies. The recent predictions that exciton binding energies are independent of the exciton effective mass and correlate linearly with the band gap of a given material are also critically reexamined.
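
    As a hedged point of reference for the effective-mass discussion above, the sketch below evaluates the strict-2D hydrogenic limit, in which the ground-state binding energy is four times the effective Rydberg; real 2D materials are screened nonlocally (Rytova-Keldysh), which the theories in the record account for and this limit deliberately ignores. The mass and dielectric values are illustrative.

    ```python
    # Strict-2D hydrogenic limit: Ry* = (mu/eps^2) * 13.6057 eV for reduced mass mu
    # (in units of the free-electron mass) and effective dielectric constant eps;
    # the ideal-2D ground-state binding energy is 4*Ry*.  Illustrative values only.
    RYDBERG_EV = 13.6057

    def binding_energy_2d(mu: float, eps: float) -> float:
        """Ideal strict-2D exciton ground-state binding energy in eV."""
        return 4.0 * (mu / eps**2) * RYDBERG_EV

    print(round(binding_energy_2d(mu=0.25, eps=4.0), 3), "eV")  # ~0.85 eV
    ```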

  20. Examining the Potential of Combining the Methods of Grounded Theory and Narrative Inquiry: A Comparative Analysis

    ERIC Educational Resources Information Center

    Lal, Shalini; Suto, Melinda; Ungar, Michael

    2012-01-01

    Increasingly, qualitative researchers are combining methods, processes, and principles from two or more methodologies over the course of a research study. Critics charge that researchers adopting combined approaches place too little attention on the historical, epistemological, and theoretical aspects of the research design. Rather than…

  1. General Theory of Aerodynamic Instability and the Mechanism of Flutter

    NASA Technical Reports Server (NTRS)

    Theodorsen, Theodore

    1979-01-01

    The aerodynamic forces on an oscillating airfoil or airfoil-aileron combination of three independent degrees of freedom were determined. The problem resolves itself into the solution of certain definite integrals, which were identified as Bessel functions of the first and second kind, and of zero and first order. The theory, based on potential flow and the Kutta condition, is fundamentally equivalent to the conventional wing section theory relating to the steady case. The air forces being known, the mechanism of aerodynamic instability was analyzed. An exact solution, involving potential flow and the adoption of the Kutta condition, was derived. The solution is of a simple form and is expressed by means of an auxiliary parameter k. The flutter velocity, treated as the unknown quantity, was determined as a function of a certain ratio of the frequencies in the separate degrees of freedom for any magnitudes and combinations of the airfoil-aileron parameters.
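
    The report above expresses the oscillatory-flow result through Bessel functions of the first and second kind of zero and first order, organized around the auxiliary reduced-frequency parameter k. The sketch below evaluates the equivalent, commonly quoted compact form of Theodorsen's lift-deficiency function in terms of Hankel functions of the second kind; treat it as a standard reference implementation rather than a transcription of the report's notation.

    ```python
    import numpy as np
    from scipy.special import hankel2

    def theodorsen_C(k: float) -> complex:
        """Theodorsen's lift-deficiency function C(k) = F + iG, written in the
        compact Hankel-function form C(k) = H1(k) / (H1(k) + i*H0(k)), where
        H0, H1 are Hankel functions of the second kind and k is the reduced frequency."""
        h1 = hankel2(1, k)
        h0 = hankel2(0, k)
        return h1 / (h1 + 1j * h0)

    for k in (0.05, 0.1, 0.5, 1.0):
        c = theodorsen_C(k)
        print(f"k={k:4.2f}  F={c.real:6.4f}  G={c.imag:7.4f}")
    ```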

  2. Constructing a new theory from old ideas and new evidence

    PubMed Central

    Rhodes, Marjorie; Wellman, Henry

    2014-01-01

    A central tenet of constructivist models of conceptual development is that children’s initial conceptual level constrains how they make sense of new evidence and thus whether exposure to evidence will prompt conceptual change. Yet, little experimental evidence directly examines this claim for the case of sustained, fundamental conceptual achievements. The present study combined scaling and experimental microgenetic methods to examine the processes underlying conceptual change in the context of an important conceptual achievement of early childhood—the development of a representational theory of mind. Results from 47 children (M age = 3.7 years) indicate that only children who were conceptually close to understanding false belief at the beginning of the study, and who were experimentally exposed to evidence of people acting on false beliefs, reliably developed representational theories of minds. Combined scaling and microgenetic data revealed how prior conceptual level interacts with experience, thereby providing critical experimental evidence for how conceptual change results from the interplay between conceptions and evidence. PMID:23489194

  3. Yang-Mills theory and the ABC conjecture

    NASA Astrophysics Data System (ADS)

    He, Yang-Hui; Hu, Zhi; Probst, Malte; Read, James

    2018-05-01

    We establish a precise correspondence between the ABC Conjecture and 𝒩 = 4 super-Yang-Mills theory. This is achieved by combining three ingredients: (i) Elkies’ method of mapping ABC-triples to elliptic curves in his demonstration that ABC implies Mordell/Faltings; (ii) an explicit pair of elliptic curve and associated Belyi map given by Khadjavi-Scharaschkin; and (iii) the fact that the bipartite brane-tiling/dimer model for a gauge theory with toric moduli space is a particular dessin d’enfant in the sense of Grothendieck. We explore this correspondence for the highest quality ABC-triples as well as large samples of random triples. The conjecture itself is mapped to a statement about the fundamental domain of the toroidal compactification of the string realization of 𝒩 = 4 SYM.
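
    The "quality" of an ABC triple referred to above is q(a, b, c) = log c / log rad(abc), where rad is the radical (the product of the distinct prime factors). A minimal sketch using plain trial division, adequate only for small illustrative triples, is given below; the second example is the highest-quality triple currently known.

      # Quality q(a,b,c) = log c / log rad(abc) of an ABC triple.
      from math import log

      def radical(n):
          """Product of the distinct prime factors of n."""
          rad, p = 1, 2
          while p * p <= n:
              if n % p == 0:
                  rad *= p
                  while n % p == 0:
                      n //= p
              p += 1
          return rad * n if n > 1 else rad

      def quality(a, b, c):
          assert a + b == c and a > 0 and b > 0
          return log(c) / log(radical(a * b * c))

      print(quality(1, 8, 9))                    # ~1.2263
      print(quality(2, 3**10 * 109, 23**5))      # ~1.6299, the highest known quality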

  4. Photovoltaic Properties of Two-Dimensional (CH3NH3)2Pb(SCN)2I2 Perovskite: A Combined Experimental and Density Functional Theory Study.

    PubMed

    Xiao, Zewen; Meng, Weiwei; Saparov, Bayrammurad; Duan, Hsin-Sheng; Wang, Changlei; Feng, Chunbao; Liao, Weiqiang; Ke, Weijun; Zhao, Dewei; Wang, Jianbo; Mitzi, David B; Yan, Yanfa

    2016-04-07

    We explore the photovoltaic-relevant properties of the 2D MA2Pb(SCN)2I2 (where MA = CH3NH3(+)) perovskite using a combination of materials synthesis, characterization, and density functional theory calculations, and determine electronic properties of MA2Pb(SCN)2I2 that are significantly different from those previously reported in the literature. The layered perovskite with mixed anions exhibits an indirect bandgap of ∼2.04 eV, with a slightly larger direct bandgap of ∼2.11 eV. The carriers (both electrons and holes) are also found to be confined within the 2D layers. Our results suggest that the 2D MA2Pb(SCN)2I2 perovskite may not be among the most promising absorbers for efficient single-junction solar cell applications; however, use as an absorber for the top cell of a tandem solar cell may still be a possibility if films are grown with the 2D layers aligned perpendicular to the substrates.

  5. Combined social cognitive and neurocognitive rehabilitation strategies in schizophrenia: neuropsychological and psychopathological influences on Theory of Mind improvement.

    PubMed

    Bechi, M; Bosia, M; Spangaro, M; Buonocore, M; Cocchi, F; Pigoni, A; Piantanida, M; Guglielmino, C; Bianchi, L; Smeraldi, E; Cavallaro, R

    2015-11-01

    Neurocognitive and social cognitive impairments represent important treatment targets in schizophrenia, as they are significant predictors of functional outcome. Different rehabilitative interventions have recently been developed, addressing both cognitive and psychosocial domains. Although promising, results are still heterogeneous and predictors of treatment outcome are not yet identified. In this study we evaluated the efficacy of two newly developed social cognitive interventions, respectively based on the use of videotaped material and comic strips, combined with domain-specific Cognitive Remediation Therapy (CRT). We also analysed possible predictors of training outcome, including basal neurocognitive performance, the degree of cognitive improvement after CRT and psychopathological variables. Seventy-five patients with schizophrenia treated with CRT were randomly assigned to a social cognitive training (SCT) group, a Theory of Mind Intervention (ToMI) group, or an active control group (ACG). ANOVAs showed that the SCT and ToMI groups improved significantly in ToM measures, whereas the ACG did not. We found no influence of neuropsychological measures or of improvement after CRT on changes in ToM. Both paranoid and non-paranoid subjects improved significantly after ToMI and SCT, without differences between groups, despite the better basal ToM performance found among paranoid patients. In the ACG, only non-paranoid patients showed an improvement in non-verbal ToM. Results showed that both ToMI and SCT are effective in improving ToM in schizophrenia, with no influence of neuropsychological domains. Our data also suggest that paranoid symptoms may discriminate between different types of ToM difficulties in schizophrenia.

  6. Comparing models of the combined-stimulation advantage for speech recognition.

    PubMed

    Micheyl, Christophe; Oxenham, Andrew J

    2012-05-01

    The "combined-stimulation advantage" refers to an improvement in speech recognition when cochlear-implant or vocoded stimulation is supplemented by low-frequency acoustic information. Previous studies have been interpreted as evidence for "super-additive" or "synergistic" effects in the combination of low-frequency and electric or vocoded speech information by human listeners. However, this conclusion was based on predictions of performance obtained using a suboptimal high-threshold model of information combination. The present study shows that a different model, based on Gaussian signal detection theory, can predict surprisingly large combined-stimulation advantages, even when performance with either information source alone is close to chance, without involving any synergistic interaction. A reanalysis of published data using this model reveals that previous results, which have been interpreted as evidence for super-additive effects in perception of combined speech stimuli, are actually consistent with a more parsimonious explanation, according to which the combined-stimulation advantage reflects an optimal combination of two independent sources of information. The present results do not rule out the possible existence of synergistic effects in combined stimulation; however, they emphasize the possibility that the combined-stimulation advantages observed in some studies can be explained simply by non-interactive combination of two information sources.

  7. Game theory as a conceptual framework for managing insect pests.

    PubMed

    Brown, Joel S; Staňková, Kateřina

    2017-06-01

    For over 100 years it has been recognized that insect pests evolve resistance to chemical pesticides. More recently, managers have advocated restrained use of pesticides, crop rotation, the use of multiple pesticides, and pesticide-free sanctuaries as resistance management practices. Game theory provides a conceptual framework for combining the resistance strategies of the insects and the control strategies of the pest manager into a unified conceptual and modelling framework. Game theory can contrast an ecologically enlightened application of pesticides with an evolutionarily enlightened one. In the former case the manager only considers ecological consequences, whereas the latter anticipates the evolutionary response of the pests. Broader applications of this game theory approach include antibiotic resistance, fisheries management and therapy resistance in cancer. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Regularized variational theories of fracture: A unified approach

    NASA Astrophysics Data System (ADS)

    Freddi, Francesco; Royer-Carfagni, Gianni

    2010-08-01

    The fracture pattern in stressed bodies is defined through the minimization of a two-field pseudo-spatial-dependent functional, with a structure similar to that proposed by Bourdin-Francfort-Marigo (2000) as a regularized approximation of a parent free-discontinuity problem, but now considered as an autonomous model per se. Here, this formulation is altered by combining it with structured deformation theory, to capture the fact that, when the material microstructure is loosened and damaged, peculiar inelastic (structured) deformations may occur in the representative volume element at the price of surface energy consumption. This approach unifies various theories of failure because, by simply varying the form of the class of admissible structured deformations, responses of different types can be captured, incorporating the idea of cleavage, deviatoric, combined cleavage-deviatoric and masonry-like fractures. Remarkably, this latter formulation rigorously avoids material overlap in the cracked zones. The model is numerically implemented using a standard finite-element discretization and adopts an alternate minimization algorithm, adding an inequality constraint to impose crack irreversibility (fixed crack model). Numerical experiments for some paradigmatic examples are presented and compared for various possible versions of the model.

  9. Testing the Applicability of Nernst-Planck Theory in Ion Channels: Comparisons with Brownian Dynamics Simulations

    PubMed Central

    Song, Chen; Corry, Ben

    2011-01-01

    The macroscopic Nernst-Planck (NP) theory has often been used for predicting ion channel currents in recent years, but the validity of this theory at the microscopic scale has not been tested. In this study we systematically tested the ability of the NP theory to accurately predict channel currents by combining and comparing the results with those of Brownian dynamics (BD) simulations. To thoroughly test the theory in a range of situations, calculations were made in a series of simplified cylindrical channels with radii ranging from 3 to 15 Å, in a more complex ‘catenary’ channel, and in a realistic model of the mechanosensitive channel MscS. The extensive tests indicate that the NP equation is applicable in narrow ion channels provided that accurate concentrations and potentials can be input as the currents obtained from the combination of BD and NP match well with those obtained directly from BD simulations, although some discrepancies are seen when the ion concentrations are not radially uniform. This finding opens a door to utilising the results of microscopic simulations in continuum theory, something that is likely to be useful in the investigation of a range of biophysical and nano-scale applications and should stimulate further studies in this direction. PMID:21731672

  10. Testing the applicability of Nernst-Planck theory in ion channels: comparisons with Brownian dynamics simulations.

    PubMed

    Song, Chen; Corry, Ben

    2011-01-01

    The macroscopic Nernst-Planck (NP) theory has often been used for predicting ion channel currents in recent years, but the validity of this theory at the microscopic scale has not been tested. In this study we systematically tested the ability of the NP theory to accurately predict channel currents by combining and comparing the results with those of Brownian dynamics (BD) simulations. To thoroughly test the theory in a range of situations, calculations were made in a series of simplified cylindrical channels with radii ranging from 3 to 15 Å, in a more complex 'catenary' channel, and in a realistic model of the mechanosensitive channel MscS. The extensive tests indicate that the NP equation is applicable in narrow ion channels provided that accurate concentrations and potentials can be input as the currents obtained from the combination of BD and NP match well with those obtained directly from BD simulations, although some discrepancies are seen when the ion concentrations are not radially uniform. This finding opens a door to utilising the results of microscopic simulations in continuum theory, something that is likely to be useful in the investigation of a range of biophysical and nano-scale applications and should stimulate further studies in this direction.
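
    For a channel of constant cross-section, the steady-state 1D Nernst-Planck equation integrates to a closed-form flux once boundary concentrations and a potential profile are supplied. The sketch below implements that textbook form with a hypothetical cylindrical pore, an assumed diffusion coefficient, and a linear potential drop; it is not the BD/NP machinery used in the study, where concentrations and potentials come from simulation.

      # Integrated steady-state 1D Nernst-Planck flux; geometry, diffusion coefficient
      # and potential profile below are illustrative assumptions.
      import numpy as np

      KT_OVER_E = 0.0257                      # thermal voltage at ~298 K (V)
      E_CHARGE = 1.602e-19                    # elementary charge (C)

      def np_current(z, D, area, L, c_left, c_right, phi, n=2001):
          """Current (A). phi(x) in volts; concentrations in ions/m^3."""
          x = np.linspace(0.0, L, n)
          w = np.exp(z * phi(x) / KT_OVER_E)  # Boltzmann weighting factor
          integral = np.sum(0.5 * (w[1:] + w[:-1]) * np.diff(x))
          flux = -D * (c_right * w[-1] - c_left * w[0]) / integral
          return z * E_CHARGE * area * flux

      # hypothetical cylindrical pore: radius 4 A, length 25 A, 100 mV drop
      L, radius = 25e-10, 4e-10
      conc = 0.15 * 6.022e26                  # 150 mM in ions/m^3
      I = np_current(z=+1, D=1.0e-9, area=np.pi * radius**2, L=L,
                     c_left=conc, c_right=conc,
                     phi=lambda x: -0.1 * (x / L))
      print(f"K+ current ~ {I * 1e12:.1f} pA")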

  11. Evolution of Online Discussion Forum Richness according to Channel Expansion Theory: A Longitudinal Panel Data Analysis

    ERIC Educational Resources Information Center

    Fernandez, Vicenc; Simo, Pep; Sallan, Jose M.; Enache, Mihaela

    2013-01-01

    The selection and use of communication media has been the center of attention for a great number of researchers in the area of organizational communication. The channel expansion theory combines elements of the main theories in this area; however, these investigations have a static cross-sectional design rather than a longitudinal analysis. With…

  12. Geographical Theories.

    ERIC Educational Resources Information Center

    Golledge, Reginald G.

    1996-01-01

    Discusses the origin of theories in geography and particularly the development of location theories. Considers the influence of economic theory on agricultural land use, industrial location, and geographic location theories. Explores a set of interrelated activities that show how the marketing process illustrates process theory. (MJP)

  13. The Analyst's "Use" of Theory or Theories: The Play of Theory.

    PubMed

    Cooper, Steven H

    2017-10-01

    Two clinical vignettes demonstrate a methodological approach that guides the analyst's attention to metaphors and surfaces that are the focus of different theories. Clinically, the use of different theories expands the metaphorical language with which the analyst tries to make contact with the patient's unconscious life. Metaphorical expressions may be said to relate to each other as the syntax of unconscious fantasy (Arlow 1979). The unconscious fantasy itself represents a metaphorical construction of childhood experience that has persisted, dynamically expressive and emergent into adult life. This persistence is evident in how, in some instances, long periods of an analysis focus on translating one or a few metaphors, chiefly because the manifest metaphorical expressions of a central theme regularly lead to better understanding of an unconscious fantasy. At times employing another model or theory assists in a level of self-reflection about clinical understanding and clinical decisions. The analyst's choice of theory or theories is unique to the analyst and is not prescriptive, except as illustrating a way to think about these issues. The use of multiple models in no way suggests or implies that theories may be integrated.

  14. Combining a dispersal model with network theory to assess habitat connectivity.

    PubMed

    Lookingbill, Todd R; Gardner, Robert H; Ferrari, Joseph R; Keller, Cherry E

    2010-03-01

    Assessing the potential for threatened species to persist and spread within fragmented landscapes requires the identification of core areas that can sustain resident populations and dispersal corridors that can link these core areas with isolated patches of remnant habitat. We developed a set of GIS tools, simulation methods, and network analysis procedures to assess potential landscape connectivity for the Delmarva fox squirrel (DFS; Sciurus niger cinereus), an endangered species inhabiting forested areas on the Delmarva Peninsula, USA. Information on the DFS's life history and dispersal characteristics, together with data on the composition and configuration of land cover on the peninsula, were used as input data for an individual-based model to simulate dispersal patterns of millions of squirrels. Simulation results were then assessed using methods from graph theory, which quantifies habitat attributes associated with local and global connectivity. Several bottlenecks to dispersal were identified that were not apparent from simple distance-based metrics, highlighting specific locations for landscape conservation, restoration, and/or squirrel translocations. Our approach links simulation models, network analysis, and available field data in an efficient and general manner, making these methods useful and appropriate for assessing the movement dynamics of threatened species within landscapes being altered by human and natural disturbances.
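
    The network-analysis step can be illustrated with a few lines of graph code: patches become nodes, simulated dispersal probabilities become edge weights, and high edge betweenness flags candidate bottlenecks. The toy patch graph below is hypothetical and merely stands in for the simulated Delmarva dispersal network described above.

      # Toy habitat-patch network; weights are assumed dispersal probabilities.
      import networkx as nx

      edges = [("A", "B", 0.60), ("B", "C", 0.50), ("A", "C", 0.55),
               ("C", "D", 0.05), ("D", "E", 0.60)]
      G = nx.Graph()
      G.add_weighted_edges_from(edges)

      # low dispersal probability = high traversal cost for shortest-path routing
      for u, v, data in G.edges(data=True):
          data["cost"] = 1.0 / data["weight"]

      # edges carrying many shortest paths are candidate dispersal bottlenecks
      btw = nx.edge_betweenness_centrality(G, weight="cost")
      for (u, v), score in sorted(btw.items(), key=lambda kv: -kv[1]):
          print(f"{u}-{v}: betweenness = {score:.2f}")

      G.remove_edge("C", "D")                 # losing the bottleneck link...
      print("components:", nx.number_connected_components(G))   # ...splits the network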

  15. Decidability of formal theories and hyperincursivity theory

    NASA Astrophysics Data System (ADS)

    Grappone, Arturo G.

    2000-05-01

    This paper shows the limits of the Proof Standard Theory (briefly, PST) and gives some ideas of how to build a proof anticipatory theory (briefly, PAT) that has no such limits. Also, this paper considers that Gödel's proof of the undecidability of Principia Mathematica formal theory is not valid for axiomatic theories that use a PAT to build their proofs because the (hyper)incursive functions are self-representable.

  16. Theory of Multiple Intelligences: Is It a Scientific Theory?

    ERIC Educational Resources Information Center

    Chen, Jie-Qi

    2004-01-01

    This essay discusses the status of multiple intelligences (MI) theory as a scientific theory by addressing three issues: the empirical evidence Gardner used to establish MI theory, the methodology he employed to validate MI theory, and the purpose or function of MI theory.

  17. Laminar fMRI and computational theories of brain function.

    PubMed

    Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J

    2017-11-02

    Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Physicians' intention to prescribe hydrocodone combination products after rescheduling: A theory of reasoned action approach.

    PubMed

    Fleming, Marc L; Driver, Larry; Sansgiry, Sujit S; Abughosh, Susan M; Wanat, Matthew; Sawant, Ruta V; Ferries, Erin; Reeve, Kathleen; Todd, Knox H

    The U.S. Drug Enforcement Administration (DEA) rescheduled hydrocodone combination products (HCPs) in an attempt to mitigate the prescription opioid epidemic. Many in the medical and pharmacy community expressed concerns about unintended consequences as a result of rescheduling. This study examined physicians' intentions to prescribe HCPs after rescheduling using the framework of the theory of reasoned action (TRA). A cover letter containing a link to the online questionnaire was sent to physicians of the Texas Medical Association who were likely to prescribe opioids. The questionnaire assessed physicians' intentions to prescribe HCPs after rescheduling. Predictor variables included attitude toward rescheduling, subjective norm toward HCP prescribing, and past prescribing behavior for schedule II prescriptions. All variables were measured on a 7-point, Likert-type scale. Intention to prescribe, as the dependent variable, was regressed on the TRA variables and respondent characteristics. A total of 1176 usable responses were obtained, yielding a response rate of 13.3%. Mean (M) age was 53.07 ± 11 years, and most respondents were male (70%) and Caucasian (75%). Physicians held a moderately positive intention to prescribe HCPs (M = 4.36 ± 2.08) and a moderately negative attitude towards rescheduling (M = 4.68 ± 1.51, reverse coded). Subjective norm was moderately low (M = 3.06 ± 1.78), as was past prescribing behavior (M = 2.43 ± 1.21). The linear regression analysis indicated that attitude (β = 0.10; P = 0.006), subjective norm (β = 0.35; P < 0.0001) and past prescribing behavior (β = 0.59; P < 0.0001) were significant predictors of intention to prescribe HCPs after rescheduling. TRA was shown to be a predictive model of physicians' intentions to prescribe HCPs after rescheduling. Overall, physicians held a moderately positive intention to prescribe HCPs. Past behavior concerning schedule II prescribing was found to be the most significant predictor
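
    The regression reported above, with intention regressed on attitude, subjective norm, and past behavior, amounts to an ordinary least-squares fit on standardized variables. The sketch below reproduces the structure of that analysis on simulated stand-in data; the generating coefficients are arbitrary and not the survey results.

      # Standardised (beta) regression of intention on the TRA predictors, simulated data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 1176
      attitude, subj_norm, past_rx = rng.normal(size=(3, n))
      intention = 0.1 * attitude + 0.35 * subj_norm + 0.6 * past_rx + rng.normal(size=n)

      def z(v):                               # z-score so coefficients are betas
          return (v - v.mean()) / v.std(ddof=1)

      X = np.column_stack([np.ones(n), z(attitude), z(subj_norm), z(past_rx)])
      betas, *_ = np.linalg.lstsq(X, z(intention), rcond=None)
      print("standardised betas (attitude, subjective norm, past behaviour):",
            np.round(betas[1:], 3))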

  19. Promoting Effective Teacher-Feedback: From Theory to Practice through a Multiple Component Trajectory for Professional Development

    ERIC Educational Resources Information Center

    Voerman, Lia; Meijer, Paulien C.; Korthagen, Fred; Simons, Robert Jan

    2015-01-01

    This study describes an evaluation of a theory-based trajectory for professional development called FeTiP (Feedback-Theory into Practice) that aims to have an observable effect on teacher classroom behavior. FeTiP is a multicomponent trajectory for professional development and combines several types of interventions. Its goal is to help teachers…

  20. Are trinuclear superhalogens promising candidates for building blocks of novel magnetic materials? A theoretical prospect from combined broken-symmetry density functional theory and ab initio study.

    PubMed

    Yu, Yang; Li, Chen; Yin, Bing; Li, Jian-Li; Huang, Yuan-He; Wen, Zhen-Yi; Jiang, Zhen-Yi

    2013-08-07

    The structures, relative stabilities, vertical electron detachment energies, and magnetic properties of a series of trinuclear clusters are explored via combined broken-symmetry density functional theory and ab initio study. Several exchange-correlation functionals are utilized to investigate the effects of different halogen elements and central atoms on the properties of the clusters. These clusters are shown to possess stronger superhalogen properties than previously reported dinuclear superhalogens. The calculated exchange coupling constants indicate the antiferromagnetic coupling between the transition metal ions. Spin density analysis demonstrates the importance of spin delocalization in determining the strengths of various couplings. Spin frustration is shown to occur in some of the trinuclear superhalogens. The coexistence of strong superhalogen properties and spin frustration implies the possibility of trinuclear superhalogens working as the building block of new materials of novel magnetic properties.
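
    Exchange coupling constants of the kind reported above are typically extracted from broken-symmetry (BS) and high-spin (HS) single-point energies with Yamaguchi's spin-projection formula, J = (E_BS − E_HS) / (⟨S²⟩_HS − ⟨S²⟩_BS) for H = −2J S_A·S_B. The sketch below applies that standard formula to placeholder numbers; it is not tied to the clusters of the study.

      # Yamaguchi broken-symmetry estimate of an exchange coupling constant.
      # Energies and <S^2> values are placeholders, not results of the paper.
      HARTREE_TO_CM1 = 219474.63

      def j_yamaguchi(e_bs, e_hs, s2_bs, s2_hs):
          """J (cm^-1) for H = -2 J S_A.S_B; negative J means antiferromagnetic coupling."""
          return (e_bs - e_hs) / (s2_hs - s2_bs) * HARTREE_TO_CM1

      # e.g. two high-spin centres: HS state with S = 5, BS determinant with M_S = 0
      print(j_yamaguchi(e_bs=-2104.123456, e_hs=-2104.121956,
                        s2_bs=4.98, s2_hs=30.02))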

  1. Rapprochement of Rotter's Social Learning Theory with Self-Esteem Constructs.

    ERIC Educational Resources Information Center

    Burke, Joy Patricia

    1983-01-01

    Offers rapprochement of Rotter's (1954) social learning theory with self-esteem and related constructs. Self constructs are defined and combined into a conceptual framework indicating the impact of their interrelations within a self-esteem system. An attribution model is used to clarify the impact of causal internalization on self-esteem.…

  2. Classification Consistency and Accuracy for Complex Assessments Using Item Response Theory

    ERIC Educational Resources Information Center

    Lee, Won-Chan

    2010-01-01

    In this article, procedures are described for estimating single-administration classification consistency and accuracy indices for complex assessments using item response theory (IRT). This IRT approach was applied to real test data comprising dichotomous and polytomous items. Several different IRT model combinations were considered. Comparisons…
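
    For dichotomous items, single-administration indices of this kind rest on the Lord-Wingersky recursion, which gives the conditional summed-score distribution at each ability level; conditional classification consistency is the squared probability mass in each category, averaged over the ability distribution. The sketch below illustrates this for a small 2PL test with an assumed cut score; the item parameters are made up and the polytomous and multi-category cases treated in the article are not covered.

      # Single-administration classification consistency for dichotomous 2PL items,
      # via the Lord-Wingersky recursion.  Item parameters and cut score are assumed.
      import numpy as np

      a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])      # discriminations
      b = np.array([-0.5, 0.0, 0.3, 0.8, -1.0])    # difficulties
      cut = 3                                       # pass if summed score >= cut

      def p_correct(theta):
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      def score_distribution(p):
          """Lord-Wingersky: P(summed score = x) given item probabilities p."""
          dist = np.array([1.0 - p[0], p[0]])
          for pi in p[1:]:
              new = np.zeros(len(dist) + 1)
              new[:-1] += dist * (1.0 - pi)
              new[1:] += dist * pi
              dist = new
          return dist

      # integrate over a standard-normal ability distribution on a quadrature grid
      thetas = np.linspace(-4, 4, 81)
      weights = np.exp(-0.5 * thetas**2)
      weights /= weights.sum()

      consistency = 0.0
      for t, w in zip(thetas, weights):
          dist = score_distribution(p_correct(t))
          p_pass = dist[cut:].sum()
          consistency += w * (p_pass**2 + (1.0 - p_pass)**2)   # two categories
      print(f"marginal classification consistency: {consistency:.3f}")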

  3. The theory of music, mood and movement to improve health outcomes.

    PubMed

    Murrock, Carolyn J; Higgins, Patricia A

    2009-10-01

    This paper presents a discussion of the development of a middle-range nursing theory of the effects of music on physical activity and improved health outcomes. Due to the high rate of physical inactivity and the associated negative health outcomes worldwide, nurses need new evidence-based theories and interventions to increase physical activity. The theory of music, mood and movement (MMM) was developed from physical activity guidelines and music theory using the principles of statement and theory synthesis. The concepts of music, physical activity and health outcomes were searched using the CINAHL, MEDLINE, ProQuest Nursing and Allied Health Source, PsycINFO and Cochrane Library databases covering the years 1975-2008. The theory of MMM was synthesized by combining the psychological and physiological responses of music to increase physical activity and improve health outcomes. It proposes that music alters mood, is a cue for movement, and makes physical activity more enjoyable leading to improved health outcomes of weight, blood pressure, blood sugar and cardiovascular risk factor management, and improved quality of life. As it was developed from the physical activity guidelines, the middle-range theory is prescriptive, produces testable hypotheses, and can guide nursing research and practice. The middle-range theory needs to be tested to determine its usefulness for nurses to develop physical activity programmes to improve health outcomes across various cultures.

  4. A new intuitionism: Meaning, memory, and development in Fuzzy-Trace Theory.

    PubMed

    Reyna, Valerie F

    2012-05-01

    Combining meaning, memory, and development, the perennially popular topic of intuition can be approached in a new way. Fuzzy-trace theory integrates these topics by distinguishing between meaning-based gist representations, which support fuzzy (yet advanced) intuition, and superficial verbatim representations of information, which support precise analysis. Here, I review the counterintuitive findings that led to the development of the theory and its most recent extensions to the neuroscience of risky decision making. These findings include memory interference (worse verbatim memory is associated with better reasoning); nonnumerical framing (framing effects increase when numbers are deleted from decision problems); developmental decreases in gray matter and increases in brain connectivity; developmental reversals in memory, judgment, and decision making (heuristics and biases based on gist increase from childhood to adulthood, challenging conceptions of rationality); and selective attention effects that provide critical tests comparing fuzzy-trace theory, expected utility theory, and its variants (e.g., prospect theory). Surprising implications for judgment and decision making in real life are also discussed, notably, that adaptive decision making relies mainly on gist-based intuition in law, medicine, and public health.

  5. Theories of Variable Mass Particles and Low Energy Nuclear Phenomena

    NASA Astrophysics Data System (ADS)

    Davidson, Mark

    2014-02-01

    Variable particle masses have sometimes been invoked to explain observed anomalies in low energy nuclear reactions (LENR). Such behavior has never been observed directly, and is not considered possible in theoretical nuclear physics. Nevertheless, there are covariant off-mass-shell theories of relativistic particle dynamics, based on works by Fock, Stueckelberg, Feynman, Greenberger, Horwitz, and others. We review some of these and we also consider virtual particles that arise in conventional Feynman diagrams in relativistic field theories. Effective Lagrangian models incorporating variable mass particle theories might be useful in describing anomalous nuclear reactions by combining mass shifts together with resonant tunneling and other effects. A detailed model for resonant fusion in a deuterium molecule with off-shell deuterons and electrons is presented as an example. Experimental means of observing such off-shell behavior directly, if it exists, is proposed and described. Brief explanations for elemental transmutation and formation of micro-craters are also given, and an alternative mechanism for the mass shift in the Widom-Larsen theory is presented. If variable mass theories were to find experimental support from LENR, then they would undoubtedly have important implications for the foundations of quantum mechanics, and practical applications may arise.

  6. A structural model of polyglutamine determined from a host-guest method combining experiments and landscape theory.

    PubMed

    Finke, John M; Cheung, Margaret S; Onuchic, José N

    2004-09-01

    Modeling the structure of natively disordered peptides has proved difficult due to the lack of structural information on these peptides. In this work, we use a novel application of the host-guest method, combining folding theory with experiments, to model the structure of natively disordered polyglutamine peptides. Initially, a minimalist molecular model (C(alpha)C(beta)) of CI2 is developed with a structurally based potential and captures many of the folding properties of CI2 determined from experiments. Next, polyglutamine "guest" inserts of increasing length are introduced into the CI2 "host" model, and the polyglutamine is modeled to match the resultant change in CI2 thermodynamic stability between simulations and experiments. The polyglutamine model that best mimics the experimental changes in CI2 thermodynamic stability has (1) a beta-strand dihedral preference and (2) an attractive energy between polyglutamine atoms that is 0.75 times the attractive energy of the CI2 host Go-contacts. When free-energy differences in the CI2 host-guest system are correctly modeled at varying lengths of polyglutamine guest inserts, the kinetic folding rates and structural perturbation of these CI2 insert mutants are also correctly captured in simulations without any additional parameter adjustment. In agreement with experiments, the residues showing structural perturbation are located in the immediate vicinity of the loop insert. The simulated polyglutamine loop insert predominantly adopts extended random coil conformations, a structural model consistent with low-resolution experimental methods. The agreement between simulated and experimental CI2 folding rates, CI2 structural perturbation, and polyglutamine insert structure shows that this host-guest method can select a physically realistic model for inserted polyglutamine. If other amyloid peptides can be inserted into stable protein hosts and the stabilities of these host-guest mutants determined, this novel host-guest method

  7. Predicting adherence to combination antiretroviral therapy for HIV in Tanzania: A test of an extended theory of planned behaviour model.

    PubMed

    Banas, Kasia; Lyimo, Ramsey A; Hospers, Harm J; van der Ven, Andre; de Bruin, Marijn

    2017-10-01

    Combination antiretroviral therapy (cART) for HIV is widely available in sub-Saharan Africa. Adherence is crucial to successful treatment. This study aimed to apply an extended theory of planned behaviour (TPB) model to predict objectively measured adherence to cART in Tanzania. Prospective observational study (n = 158) where patients completed questionnaires on demographics (Month 0), socio-cognitive variables including intentions (Month 1), and action planning and self-regulatory processes hypothesised to mediate the intention-behaviour relationship (Month 3), to predict adherence (Month 5). Taking adherence was measured objectively using the Medication Events Monitoring System (MEMS) caps. Model tests were conducted using regression and bootstrap mediation analyses. Perceived behavioural control (PBC) was positively associated with adherence intentions (β = .767, p < .001, R² = 57.5%). Intentions exercised only an indirect effect on adherence (B = 1.29 [0.297-3.15]) through self-regulatory processes (B = 1.10 [0.131-2.87]). Self-regulatory processes (β = .234, p = .010, R² = 14.7%) predicted better adherence. This observational study, using an objective behavioural measure, identified PBC as the main driver of adherence intentions. The effect of intentions on adherence was only indirect, through self-regulatory processes, which were the main predictor of objectively assessed adherence.
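
    The indirect-effect test reported above (intention acting on adherence through self-regulatory processes) is typically run as a percentile bootstrap of the product of regression coefficients. The sketch below shows that generic procedure on simulated stand-in data; variable names and effect sizes are placeholders, not the study's estimates.

      # Generic percentile-bootstrap test of an indirect effect a*b, simulated data.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 158
      intention = rng.normal(size=n)
      self_reg = 0.4 * intention + rng.normal(size=n)          # mediator
      adherence = 0.3 * self_reg + 0.05 * intention + rng.normal(size=n)

      def indirect(idx):
          x, m, y = intention[idx], self_reg[idx], adherence[idx]
          a = np.polyfit(x, m, 1)[0]                           # X -> M slope
          Xmat = np.column_stack([np.ones(len(idx)), m, x])
          b = np.linalg.lstsq(Xmat, y, rcond=None)[0][1]       # M -> Y slope, X held constant
          return a * b

      boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"indirect effect = {indirect(np.arange(n)):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")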

  8. Fast clustering algorithm for large ECG data sets based on CS theory in combination with PCA and K-NN methods.

    PubMed

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2014-01-01

    Long-term recording of Electrocardiogram (ECG) signals plays an important role in health care systems for diagnostic and treatment purposes of heart diseases. Clustering and classification of the collected data are essential for detecting concealed information of P-QRS-T waves in long-term ECG recordings. Currently used algorithms have their share of drawbacks: 1) clustering and classification cannot be done in real time; 2) they suffer from huge energy consumption and sampling load. These drawbacks motivated us to develop a novel optimized clustering algorithm that can easily scan large ECG datasets for establishing low-power long-term ECG recording. In this paper, we present an advanced K-means clustering algorithm based on Compressed Sensing (CS) theory as a random sampling procedure. Then, two dimensionality reduction methods, Principal Component Analysis (PCA) and Linear Correlation Coefficient (LCC), followed by sorting the data using the K-Nearest Neighbours (K-NN) and Probabilistic Neural Network (PNN) classifiers, are applied to the proposed algorithm. We show that our algorithm based on PCA features in combination with the K-NN classifier performs better than the other methods. The proposed algorithm outperforms existing algorithms by increasing classification accuracy by 11%. In addition, the proposed algorithm achieves classification accuracies for the K-NN and PNN classifiers and a Receiver Operating Characteristic (ROC) area of 99.98%, 99.83%, and 99.75%, respectively.
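
    The pipeline described above, compressive random measurements followed by dimensionality reduction and a K-NN classifier, can be sketched with scikit-learn on synthetic data. This shows only the structure of the approach; it is not the authors' ECG implementation, and the data, dimensions, and parameters are placeholders.

      # Compressed-sensing-style random projection -> PCA -> K-NN, on synthetic data.
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.pipeline import make_pipeline
      from sklearn.random_projection import GaussianRandomProjection
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier

      X, y = make_classification(n_samples=3000, n_features=360, n_informative=40,
                                 n_classes=3, random_state=0)
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

      clf = make_pipeline(GaussianRandomProjection(n_components=120, random_state=0),
                          PCA(n_components=20),
                          KNeighborsClassifier(n_neighbors=5))
      clf.fit(X_tr, y_tr)
      print(f"test accuracy: {clf.score(X_te, y_te):.3f}")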

  9. Linearized propulsion theory of flapping airfoils revisited

    NASA Astrophysics Data System (ADS)

    Fernandez-Feria, Ramon

    2016-11-01

    A vortical impulse theory is used to compute the thrust of a plunging and pitching airfoil in forward flight within the framework of linear potential flow theory. The result is significantly different from the classical one of Garrick that considered the leading-edge suction and the projection in the flight direction of the pressure force. By taking into account the complete vorticity distribution on the airfoil and the wake the mean thrust coefficient contains a new term that generalizes the leading-edge suction term and depends on Theodorsen function C (k) and on a new complex function C1 (k) of the reduced frequency k. The main qualitative difference with Garrick's theory is that the propulsive efficiency tends to zero as the reduced frequency increases to infinity (as 1 / k), in contrast to Garrick's efficiency that tends to a constant (1 / 2). Consequently, for pure pitching and combined pitching and plunging motions, the maximum of the propulsive efficiency is not reached as k -> ∞ like in Garrick's theory, but at a finite value of the reduced frequency that depends on the remaining non-dimensional parameters. The present analytical results are in good agreement with experimental data and numerical results for small amplitude oscillations. Supported by the Ministerio de Economia y Competitividad of Spain Grant No. DPI2013-40479-P.

  10. Environmentally sensitive theory of electronic and optical transitions in atomically thin semiconductors

    NASA Astrophysics Data System (ADS)

    Cho, Yeongsu; Berkelbach, Timothy C.

    2018-01-01

    We present an electrostatic theory of band-gap renormalization in atomically thin semiconductors that captures the strong sensitivity to the surrounding dielectric environment. In particular, our theory aims to correct known band gaps, such as that of the three-dimensional bulk crystal. Combining our quasiparticle band gaps with an effective-mass theory of excitons yields environmentally sensitive optical gaps as would be observed in absorption or photoluminescence. For an isolated monolayer of MoS2, the presented theory is in good agreement with ab initio results based on the GW approximation and the Bethe-Salpeter equation. We find that changes in the electronic band gap are almost exactly offset by changes in the exciton binding energy such that the energy of the first optical transition is nearly independent of the electrostatic environment, rationalizing experimental observations.

  11. Bounds on the power of proofs and advice in general physical theories.

    PubMed

    Lee, Ciarán M; Hoban, Matty J

    2016-06-01

    Quantum theory presents us with the tools for computational and communication advantages over classical theory. One approach to uncovering the source of these advantages is to determine how computation and communication power vary as quantum theory is replaced by other operationally defined theories from a broad framework of such theories. Such investigations may reveal some of the key physical features required for powerful computation and communication. In this paper, we investigate how simple physical principles bound the power of two different computational paradigms which combine computation and communication in a non-trivial fashion: computation with advice and interactive proof systems. We show that the existence of non-trivial dynamics in a theory implies a bound on the power of computation with advice. Moreover, we provide an explicit example of a theory with no non-trivial dynamics in which the power of computation with advice is unbounded. Finally, we show that the power of simple interactive proof systems in theories where local measurements suffice for tomography is non-trivially bounded. This result provides a proof that [Formula: see text] is contained in [Formula: see text], which does not make use of any uniquely quantum structure, such as the fact that observables correspond to self-adjoint operators, and thus may be of independent interest.

  12. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
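
    As a small illustration of the coding half of such a scheme, the sketch below implements a rate-1/2, constraint-length-3 convolutional encoder with generators (7, 5) in octal. The CPM mapping and the specific short-constraint-length code searches of the paper are not reproduced here.

      # Rate-1/2 convolutional encoder, constraint length 3, generators (7, 5) octal.
      def conv_encode(bits, gens=(0b111, 0b101), k=3):
          state = 0
          out = []
          for bit in bits:
              state = ((state << 1) | bit) & ((1 << k) - 1)   # shift new bit into register
              for g in gens:
                  out.append(bin(state & g).count("1") % 2)   # parity of the tapped bits
          return out

      message = [1, 0, 1, 1, 0, 0]             # trailing zeros flush the encoder
      print(conv_encode(message))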

  13. Distorting general relativity: gravity's rainbow and f(R) theories at work

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garattini, Remo, E-mail: Remo.Garattini@unibg.it

    2013-06-01

    We compute the Zero Point Energy in a spherically symmetric background combining the high energy distortion of Gravity's Rainbow with the modification induced by an f(R) theory. Here f(R) is a generic analytic function of the Ricci curvature scalar R in 4D and in 3D. The explicit calculation is performed for a Schwarzschild metric. Due to the spherically symmetric property of the Schwarzschild metric we can compare the effects of the modification induced by an f(R) theory in 4D and in 3D. We find that the final effect of the combined theory is to have finite quantities that shift the Zero Point Energy. In this context we set up a Sturm-Liouville problem with the cosmological constant considered as the associated eigenvalue. The eigenvalue equation is a reformulation of the Wheeler-DeWitt equation which is analyzed by means of a variational approach based on gaussian trial functionals. With the help of a canonical decomposition, we find that the relevant contribution to one loop is given by the graviton quantum fluctuations around the given background. A final discussion on the connection of our result with the observed cosmological constant is also reported.

  14. Theory X and Theory Y in the Organizational Structure.

    ERIC Educational Resources Information Center

    Barry, Thomas J.

    This document defines contrasting assumptions about the labor force--theory X and theory Y--and shows how they apply to the pyramid organizational structure, examines the assumptions of the two theories, and finally, based on a survey and individual interviews, proposes a merger of theories X and Y to produce theory Z. Organizational structures…

  15. Dexketoprofen/tramadol: randomised double-blind trial and confirmation of empirical theory of combination analgesics in acute pain.

    PubMed

    Moore, R Andrew; Gay-Escoda, C; Figueiredo, R; Tóth-Bagi, Z; Dietrich, T; Milleri, S; Torres-Lagares, D; Hill, C M; García-García, A; Coulthard, P; Wojtowicz, A; Matenko, D; Peñarrocha-Diago, M; Cuadripani, S; Pizà-Vallespir, B; Guerrero-Bayón, C; Bertolotti, M; Contini, M P; Scartoni, S; Nizzardo, A; Capriati, A; Maggi, C A

    2015-01-01

    Combination analgesics are effective in acute pain, and a theoretical framework predicts efficacy for combinations. The combination of dexketoprofen and tramadol is untested, but predicted to be highly effective. This was a randomised, double-blind, double-dummy, parallel-group, placebo-controlled, single-dose trial in patients with moderate or severe pain following third molar extraction. There were ten treatment arms, including dexketoprofen trometamol (12.5 mg and 25 mg) and tramadol hydrochloride (37.5 mg and 75 mg), given as four different fixed combinations and single components, with ibuprofen 400 mg as active control as well as a placebo control. The study objective was to evaluate the superior analgesic efficacy and safety of each combination and each single agent versus placebo. The primary outcome was the proportion of patients with at least 50 % max TOTPAR over six hours. 606 patients were randomised and provided at least one post-dose assessment. All combinations were significantly better than placebo. The highest percentage of responders (72%) was achieved in the dexketoprofen trometamol 25 mg plus tramadol hydrochloride 75 mg group (NNT 1.6, 95% confidence interval 1.3 to 2.1). Addition of tramadol to dexketoprofen resulted in greater peak pain relief and greater pain relief over the longer term, particularly at times longer than six hours (median duration of 8.1 h). Adverse events were unremarkable. Dexketoprofen trometamol 25 mg combined with tramadol hydrochloride 75 mg provided good analgesia with rapid onset and long duration in a model of moderate to severe pain. The results of the dose finding study are consistent with pre-trial calculations based on empirical formulae. EudraCT (2010-022798-32); Clinicaltrials.gov (NCT01307020).
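
    The number needed to treat quoted above follows directly from the responder proportions: NNT = 1 / (p_active − p_placebo), with the confidence interval obtained by inverting the interval for the absolute risk reduction. The sketch below uses an assumed placebo responder rate and assumed group sizes, since neither is restated in the abstract.

      # NNT from responder proportions; placebo rate and group sizes are assumptions.
      from math import sqrt

      def nnt(p_active, p_placebo, n_active, n_placebo, z=1.96):
          arr = p_active - p_placebo                      # absolute risk reduction
          se = sqrt(p_active * (1 - p_active) / n_active +
                    p_placebo * (1 - p_placebo) / n_placebo)
          lo, hi = arr - z * se, arr + z * se
          return 1 / arr, (1 / hi, 1 / lo)                # NNT and its 95% CI

      print(nnt(p_active=0.72, p_placebo=0.10, n_active=60, n_placebo=60))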

  16. Modeling shock waves in an ideal gas: combining the Burnett approximation and Holian's conjecture.

    PubMed

    He, Yi-Guang; Tang, Xiu-Zhang; Pu, Yi-Kang

    2008-07-01

    We model a shock wave in an ideal gas by combining the Burnett approximation and Holian's conjecture. We use the temperature in the direction of shock propagation rather than the average temperature in the Burnett transport coefficients. The shock wave profiles and shock thickness are compared with other theories. The results are found to agree better with the nonequilibrium molecular dynamics (NEMD) and direct simulation Monte Carlo (DSMC) data than the Burnett equations and the modified Navier-Stokes theory.

  17. Evaluation of a Theory of Instructional Sequences for Physics Instruction

    ERIC Educational Resources Information Center

    Wackermann, Rainer; Trendel, Georg; Fischer, Hans E.

    2010-01-01

    The background of the study is the theory of "basis models of teaching and learning", a comprehensive set of models of learning processes which includes, for example, learning through experience and problem-solving. The combined use of different models of learning processes has not been fully investigated and it is frequently not clear…

  18. Further Tests of Belief-Importance Theory

    PubMed Central

    Petrides, K. V.; Furnham, Adrian

    2015-01-01

    Belief-importance (belimp) theory hypothesizes that personality traits confer a propensity to perceive convergences or divergences between the belief that we can attain certain goals and the importance that we place on these goals. Belief and importance are conceptualized as two coordinates, together defining the belimp plane. We tested fundamental aspects of the theory using four different planes based on the life domains of appearance, family, financial security, and friendship as well as a global plane combining these four domains. The criteria were from the areas of personality (Big Five and trait emotional intelligence) and learning styles. Two hundred and fifty eight participants were allocated into the four quadrants of the belimp plane (Hubris, Motivation, Depression, and Apathy) according to their scores on four reliable instruments. Most hypotheses were supported by the data. Results are discussed with reference to the stability of the belimp classifications under different life domains and the relationship of the quadrants with the personality traits that are hypothesized to underpin them. PMID:25874374

  19. Vested Interest theory and disaster preparedness.

    PubMed

    Miller, Claude H; Adame, Bradley J; Moore, Scott D

    2013-01-01

    Three studies were designed to extend a combination of vested interest theory (VI) and the extended parallel process model of fear appeals (EPPM) to provide formative research for creating more effective disaster preparedness social action campaigns. The aim was to develop an effective VI scale for assessing individual awareness and 'vestedness' relevant to disaster preparedness. Typical preparedness behaviours are discussed with emphasis on earthquakes and tornados in particular. Brief overviews of VI and the EPPM are offered, and findings are presented from three studies (one dealing with earthquakes, and two with tornados) conducted to determine the factor structure of the key VI components involved, and to develop and test subscales derived from the two theories. The paper finishes with a discussion of future research needs and suggestions on how the new subscales may be applied in the design and execution of more effective disaster preparedness campaigns. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  20. Further tests of belief-importance theory.

    PubMed

    Petrides, K V; Furnham, Adrian

    2015-01-01

    Belief-importance (belimp) theory hypothesizes that personality traits confer a propensity to perceive convergences or divergences between the belief that we can attain certain goals and the importance that we place on these goals. Belief and importance are conceptualized as two coordinates, together defining the belimp plane. We tested fundamental aspects of the theory using four different planes based on the life domains of appearance, family, financial security, and friendship as well as a global plane combining these four domains. The criteria were from the areas of personality (Big Five and trait emotional intelligence) and learning styles. Two hundred and fifty eight participants were allocated into the four quadrants of the belimp plane (Hubris, Motivation, Depression, and Apathy) according to their scores on four reliable instruments. Most hypotheses were supported by the data. Results are discussed with reference to the stability of the belimp classifications under different life domains and the relationship of the quadrants with the personality traits that are hypothesized to underpin them.

  1. Orientations of nonlocal vibrational modes from combined experimental and theoretical sum frequency spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chase, Hilary M.; Chen, Shunli; Fu, Li

    2017-09-01

    Inferring molecular orientations from vibrational sum frequency generation (SFG) spectra is challenging in polarization combinations that result in low signal intensities, or when the local point group symmetry approximation fails. While combining experiments with density functional theory (DFT) could overcome this problem, the scope of the combined method has yet to be established. Here, we assess the feasibility of this combined approach for determining the distributions of molecular orientations of one monobasic ester, two epoxides, and three alcohols at the vapor/fused silica interface. We find that molecular orientations of nonlocal vibrational modes cannot be determined using polarization-resolved SFG measurements alone.

  2. Efficient Regressions via Optimally Combining Quantile Information*

    PubMed Central

    Zhao, Zhibiao; Xiao, Zhijie

    2014-01-01

    We develop a generally applicable framework for constructing efficient estimators of regression models via quantile regressions. The proposed method is based on optimally combining information over multiple quantiles and can be applied to a broad range of parametric and nonparametric settings. When combining information over a fixed number of quantiles, we derive an upper bound on the distance between the efficiency of the proposed estimator and the Fisher information. As the number of quantiles increases, this upper bound decreases and the asymptotic variance of the proposed estimator approaches the Cramér-Rao lower bound under appropriate conditions. In the case of non-regular statistical estimation, the proposed estimator leads to super-efficient estimation. We illustrate the proposed method for several widely used regression models. Both asymptotic theory and Monte Carlo experiments show the superior performance over existing methods. PMID:25484481
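
    The idea of pooling information across quantiles can be illustrated by fitting several quantile regressions and averaging the slope estimates. The paper derives optimal, efficiency-maximizing weights; the sketch below simply uses equal weights on simulated heavy-tailed data, where even naive pooling already competes well with OLS.

      # Combine slope estimates from several quantile regressions (equal weights here;
      # the paper derives optimal weights).  Simulated data, statsmodels QuantReg.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 500
      x = rng.normal(size=n)
      y = 1.0 + 2.0 * x + rng.standard_t(df=3, size=n)    # heavy-tailed noise

      X = sm.add_constant(x)
      taus = [0.1, 0.25, 0.5, 0.75, 0.9]
      slopes = [sm.QuantReg(y, X).fit(q=t).params[1] for t in taus]

      print("per-quantile slopes:", np.round(slopes, 3))
      print("combined (equal-weight) slope:", round(float(np.mean(slopes)), 3))
      print("OLS slope:", round(float(sm.OLS(y, X).fit().params[1]), 3))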

  3. Camera System MTF: combining optic with detector

    NASA Astrophysics Data System (ADS)

    Andersen, Torben B.; Granger, Zachary A.

    2017-08-01

    MTF is one of the most common metrics used to quantify the resolving power of an optical component. Extensive literature is dedicated to describing methods to calculate the Modulation Transfer Function (MTF) for stand-alone optical components such as a camera lens or telescope, and some literature addresses approaches to determine an MTF for the combination of an optic with a detector. The formulations pertaining to a combined electro-optical system MTF are mostly based on theory and on the assumption that the detector MTF is described only by the pixel pitch, which does not account for wavelength dependencies. When working with real hardware, detectors are often characterized by testing MTF at discrete wavelengths. This paper presents a method to simplify the calculation of a polychromatic system MTF when it is permissible to consider the detector MTF to be independent of wavelength.
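
    In the simplest wavelength-by-wavelength treatment, the system MTF is the product of the optics MTF and the detector MTF, with the detector modelled as a pixel-aperture sinc and the optics as a diffraction-limited circular pupil. The sketch below encodes those two standard closed forms with illustrative values; measured, wavelength-dependent detector curves would replace the sinc term in the approach the paper discusses.

      # Combined system MTF = (diffraction-limited optics MTF) x (pixel-aperture MTF).
      # The f-number, wavelength and pixel pitch are illustrative values.
      import numpy as np

      def mtf_optics(f, wavelength_mm, f_number):
          """Diffraction-limited MTF of a circular pupil; f in cycles/mm."""
          fc = 1.0 / (wavelength_mm * f_number)            # cutoff frequency
          nu = np.clip(f / fc, 0.0, 1.0)
          return (2.0 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1.0 - nu**2))

      def mtf_detector(f, pitch_mm):
          """Pixel-aperture MTF (100% fill factor): |sinc(f * pitch)|."""
          return np.abs(np.sinc(f * pitch_mm))             # np.sinc(x) = sin(pi x)/(pi x)

      f = np.linspace(0.0, 100.0, 6)                       # cycles/mm
      system = mtf_optics(f, wavelength_mm=0.55e-3, f_number=4.0) * mtf_detector(f, 0.005)
      print(np.round(system, 3))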

  4. When is a theory a theory? A case example.

    PubMed

    Alkin, Marvin C

    2017-08-01

    This discussion comments on the approximately 20-year history of writings on the prescriptive theory called Empowerment Evaluation. Doing so involves examining how "Empowerment Evaluation Theory" has been defined at various points in time (particularly 1996 and now in 2015). Defining a theory is different from judging the success of a theory. This latter topic has been addressed elsewhere by Michael Scriven, Michael Patton, and Brad Cousins. I am initially guided by the work of Robin Miller (2010), who has written on the issue of how to judge the success of a theory. In doing so, she provided potential standards for judging the adequacy of theories. My task is not judging the adequacy or success of the Empowerment Evaluation prescriptive theory in practice, but determining how well the theory is delineated. That is, to what extent do the writings qualify as a prescriptive theory. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation.

    PubMed

    Escaño, Mary Clare Sison; Arevalo, Ryan Lacdao; Gyenge, Elod; Kasai, Hideaki

    2014-09-03

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4(-) on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  6. Electrocatalysis of borohydride oxidation: a review of density functional theory approach combined with experimental validation

    NASA Astrophysics Data System (ADS)

    Sison Escaño, Mary Clare; Lacdao Arevalo, Ryan; Gyenge, Elod; Kasai, Hideaki

    2014-09-01

    The electrocatalysis of borohydride oxidation is a complex, up-to-eight-electron transfer process, which is essential for development of efficient direct borohydride fuel cells. Here we review the progress achieved by density functional theory (DFT) calculations in explaining the adsorption of BH4- on various catalyst surfaces, with implications for electrocatalyst screening and selection. Wherever possible, we correlate the theoretical predictions with experimental findings, in order to validate the proposed models and to identify potential directions for further advancements.

  7. Hypergame theory applied to cyber attack and defense

    NASA Astrophysics Data System (ADS)

    House, James Thomas; Cybenko, George

    2010-04-01

    This work concerns cyber attack and defense in the context of game theory--specifically hypergame theory. Hypergame theory extends classical game theory with the ability to deal with differences in players' expertise, differences in their understanding of game rules, misperceptions, and so forth. Each of these different sub-scenarios, or subgames, is associated with a probability--representing the likelihood that the given subgame is truly "in play" at a given moment. In order to form an optimal attack or defense policy, these probabilities must be learned if they are not known a priori. We present hidden Markov model and maximum entropy approaches for accurately learning these probabilities through multiple iterations of both normal and modified game play. We also give a widely-applicable approach for the analysis of cases where an opponent is aware that he is being studied, and intentionally plays to spoil the process of learning and thereby obfuscate his attributes. These are considered in the context of a generic, abstract cyber attack example. We demonstrate that machine learning efficacy can be heavily dependent on the goals and styles of participant behavior. To this end, detailed simulation results under various combinations of attacker and defender behaviors are presented and analyzed.

  8. Combining Nuclear Magnetic Resonance Spectroscopy and Density Functional Theory Calculations to Characterize Carvedilol Polymorphs.

    PubMed

    Rezende, Carlos A; San Gil, Rosane A S; Borré, Leandro B; Pires, José Ricardo; Vaiss, Viviane S; Resende, Jackson A L C; Leitão, Alexandre A; De Alencastro, Ricardo B; Leal, Katia Z

    2016-09-01

    (13)C and (15)N cross-polarization magic-angle spinning (CP MAS) experiments on carvedilol form II, form III, and the hydrate are reported. The GIPAW (gauge-including projector-augmented wave) method from DFT (density functional theory) calculations was used to simulate (13)C and (15)N chemical shifts. Very good agreement was found between the global experimental and calculated nuclear magnetic resonance (NMR) chemical shifts for the carvedilol polymorphs. This work aims at a comprehensive understanding of carvedilol crystalline forms employing solution and solid-state NMR as well as DFT calculations. Copyright © 2016. Published by Elsevier Inc.

  9. Connection dynamics of a gauge theory of gravity coupled with matter

    NASA Astrophysics Data System (ADS)

    Yang, Jian; Banerjee, Kinjal; Ma, Yongge

    2013-10-01

    We study the coupling of the gravitational action, which is a linear combination of the Hilbert-Palatini term and the quadratic torsion term, to the action of Dirac fermions. The system possesses local Poincare invariance and hence belongs to Poincare gauge theory (PGT) with matter. The complete Hamiltonian analysis of the theory is carried out without gauge fixing but under certain ansatz on the coupling parameters, which leads to a consistent connection dynamics with second-class constraints and torsion. After performing a partial gauge fixing, all second-class constraints can be solved, and a SU(2)-connection dynamical formalism of the theory can be obtained. Hence, the techniques of loop quantum gravity (LQG) can be employed to quantize this PGT with non-zero torsion. Moreover, the Barbero-Immirzi parameter in LQG acquires its physical meaning as the coupling parameter between the Hilbert-Palatini term and the quadratic torsion term in this gauge theory of gravity.

  10. Theory and Simulation of Multicomponent Osmotic Systems

    PubMed Central

    Karunaweera, Sadish; Gee, Moon Bae; Weerasinghe, Samantha; Smith, Paul E.

    2012-01-01

    Most cellular processes occur in systems containing a variety of components many of which are open to material exchange. However, computer simulations of biological systems are almost exclusively performed in systems closed to material exchange. In principle, the behavior of biomolecules in open and closed systems will be different. Here, we provide a rigorous framework for the analysis of experimental and simulation data concerning open and closed multicomponent systems using the Kirkwood-Buff (KB) theory of solutions. The results are illustrated using computer simulations for various concentrations of the solutes Gly, Gly2 and Gly3 in both open and closed systems, and in the absence or presence of NaCl as a cosolvent. In addition, KB theory is used to help rationalize the aggregation properties of the solutes. Here one observes that the picture of solute association described by the KB integrals, which are directly related to the solution thermodynamics, and that provided by more physical clustering approaches are different. It is argued that the combination of KB theory and simulation data provides a simple and powerful tool for the analysis of complex multicomponent open and closed systems. PMID:23329894
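
    The central quantity in a Kirkwood-Buff analysis is the KB integral over the radial distribution function, G_ij = 4*pi * integral of (g_ij(r) - 1) r^2 dr. A minimal numerical sketch, with a made-up model g(r) standing in for simulation output, is:

        # Minimal sketch (not the authors' analysis code): evaluate a Kirkwood-Buff
        # integral numerically from a radial distribution function. The toy g(r) below
        # is a placeholder, not simulation data.
        import numpy as np

        def kb_integral(r, g):
            """G_ij = 4*pi * integral of (g(r) - 1) r^2 dr, via the trapezoidal rule."""
            f = (g - 1.0) * r**2
            return 4.0 * np.pi * float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(r)))

        r = np.linspace(0.01, 3.0, 600)                                    # separation, nm
        g = 1.0 + 0.5 * np.exp(-(r - 0.45)**2 / 0.01) - np.exp(-r / 0.05)  # toy RDF
        print("G_ij =", kb_integral(r, g), "nm^3")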

  11. Abelian gauge symmetries in F-theory and dual theories

    NASA Astrophysics Data System (ADS)

    Song, Peng

    In this dissertation, we focus on important physical and mathematical aspects, especially abelian gauge symmetries, of F-theory compactifications and its dual formulations within type IIB and heterotic string theory. F-theory is a non-perturbative formulation of type IIB string theory which enjoys important dualities with other string theories such as M-theory and E8 x E8 heterotic string theory. One of the main strengths of F-theory is its geometrization of many physical problems in the dual string theories. In particular, its study requires a lot of mathematical tools such as advanced techniques in algebraic geometry. Thus, it has also received a lot of interest among mathematicians, and is a vivid area of research within both the physics and the mathematics community. Although F-theory has been a long-standing theory, abelian gauge symmetry in F-theory has been rarely studied, until recently. Within the mathematics community, in 2009, Grassi and Perduca first discovered the possibility of constructing elliptically fibered varieties with non-trivial toric Mordell-Weil group. In the physics community, in 2012, Morrison and Park first made a major advancement by constructing general F-theory compactifications with U(1) abelian gauge symmetry. They found that in such cases, the elliptically-fibered Calabi-Yau manifold that F-theory needs to be compactified on has its fiber being a generic elliptic curve in the blow-up of the weighted projective space P(1;1;2) at one point. Subsequently, Cvetic, Klevers and Piragua extended the work of Morrison and Park and constructed general F-theory compactifications with U(1) x U(1) abelian gauge symmetry. They found that in the U(1) x U(1) case, the elliptically-fibered Calabi-Yau manifold that F-theory needs to be compactified on has its fiber being a generic elliptic curve in the del Pezzo surface dP2. In chapter 2 of this dissertation, I bring this a step further by

  12. Further Development of Ko Displacement Theory for Deformed Shape Predictions of Nonuniform Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Fleischer, Van Tran

    2009-01-01

    The Ko displacement theory previously formulated for deformed shape predictions of nonuniform beam structures is further developed mathematically. The further-developed displacement equations are expressed explicitly in terms of geometrical parameters of the beam and bending strains at equally spaced strain-sensing stations along the multiplexed fiber-optic sensor line installed on the bottom surface of the beam. The bending strain data can then be input into the displacement equations for calculations of local slopes, deflections, and cross-sectional twist angles for generating the overall deformed shapes of the nonuniform beam. The further-developed displacement theory can also be applied to the deformed shape predictions of nonuniform two-point supported beams, nonuniform panels, nonuniform aircraft wings and fuselages, and so forth. The high degree of accuracy of the further-developed displacement theory for nonuniform beams is validated by finite-element analysis of various nonuniform beam structures. Such structures include tapered tubular beams, depth-tapered unswept and swept wing boxes, width-tapered wing boxes, and double-tapered wing boxes, all under combined bending and torsional loads. The Ko displacement theory, combined with the fiber-optic strain-sensing system, provides a powerful tool for in-flight deformed shape monitoring of unmanned aerospace vehicles by ground-based pilots to maintain safe flights.
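
    The underlying idea, not Ko's closed-form equations but a generic illustration of the strain-integration step, is that surface bending strain at distance c from the neutral axis gives curvature eps/c, which is integrated twice over the equally spaced stations to recover slope and deflection; the numbers below are made-up placeholders.

        # Hedged illustration only (a generic strain-to-deflection integration, not the
        # Ko closed-form displacement equations). Assumes a cantilever clamped at x = 0.
        import numpy as np

        def deflection_from_strain(x, eps, c):
            kappa = eps / c                            # curvature at each strain station
            slope = np.concatenate(([0.0], np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))
            w = np.concatenate(([0.0], np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))
            return slope, w

        x = np.linspace(0.0, 2.0, 21)                  # station positions (m), illustrative
        eps = 1e-3 * (1.0 - x / 2.0)                   # made-up linearly tapering strain
        print(deflection_from_strain(x, eps, c=0.05)[1][-1])   # tip deflection (m)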

  13. Arcing rates for High Voltage Solar Arrays - Theory, experiment, and predictions

    NASA Technical Reports Server (NTRS)

    Hastings, Daniel E.; Cho, Mengu; Kuninaka, Hitoshi

    1992-01-01

    All solar arrays have biased surfaces that can be exposed to the space environment. It has been observed that when the array bias is less than a few hundred volts negative, then the exposed conductive surfaces may undergo arcing in the space plasma. A theory for arcing on these high-voltage solar arrays is developed that ascribes the arcing to electric-field runaway at the interface of the plasma, conductor, and solar cell dielectric. Experiments were conducted in the laboratory for the High Voltage Solar Array experiment that will fly on the Japanese Space Flyer Unit (SFU) in 1994. The theory was compared in detail with the experiment and shown to give a reasonable explanation for the data. The combined theory and ground experiments were then used to develop predictions for the SFU flight.

  14. Analysing barriers to service improvement using a multi-level theory of innovation: the case of glaucoma outpatient clinics.

    PubMed

    Turner, Simon; Vasilakis, Christos; Utley, Martin; Foster, Paul; Kotecha, Aachal; Fulop, Naomi J

    2018-05-01

    The development and implementation of innovation by healthcare providers is understood as a multi-determinant and multi-level process. Theories at different analytical levels (i.e. micro and organisational) are needed to capture the processes that influence innovation by providers. This article combines a micro theory of innovation, actor-network theory, with organisational level processes using the 'resource based view of the firm'. It examines the influence of, and interplay between, innovation-seeking teams (micro) and underlying organisational capabilities (meso) during innovation processes. We used ethnographic methods to study service innovations in relation to ophthalmology services run by a specialist English NHS Trust at multiple locations. Operational research techniques were used to support the ethnographic methods by mapping the care process in the existing and redesigned clinics. Deficiencies in organisational capabilities for supporting innovation were identified, including manager-clinician relations and organisation-wide resources. The article concludes that actor-network theory can be combined with the resource-based view to highlight the influence of organisational capabilities on the management of innovation. Equally, actor-network theory helps to address the lack of theory in the resource-based view on the micro practices of implementing change. © 2018 The Authors. Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for SHIL.

  15. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.
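
    The "remarkably simple Monte Carlo realization" of a fission chain can be illustrated by a toy subcritical branching process; the fission probability and multiplicity distribution below are made-up placeholders, not the paper's model parameters.

        # Illustrative branching-process sketch (not the reported Monte Carlo code):
        # each neutron either induces a fission, producing a random number of new
        # neutrons, or leaks; leaked counts over many chains give a counting distribution.
        import random
        from collections import Counter

        def chain_leakage(p_fis, nu_dist, rng):
            """Number of leaked neutrons from one chain started by a single neutron."""
            leaked, stack = 0, [None]
            while stack:
                stack.pop()
                if rng.random() < p_fis:                       # neutron induces a fission
                    n_new = rng.choices(range(len(nu_dist)), weights=nu_dist)[0]
                    stack.extend([None] * n_new)
                else:                                          # neutron leaks
                    leaked += 1
            return leaked

        rng = random.Random(1)
        nu_dist = [0.03, 0.16, 0.33, 0.30, 0.15, 0.03]         # made-up multiplicity distribution
        counts = Counter(chain_leakage(0.3, nu_dist, rng) for _ in range(20000))
        print(sorted(counts.items())[:8])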

  16. Simple Z2 lattice gauge theories at finite fermion density

    NASA Astrophysics Data System (ADS)

    Prosko, Christian; Lee, Shu-Ping; Maciejko, Joseph

    2017-11-01

    Lattice gauge theories are a powerful language to theoretically describe a variety of strongly correlated systems, including frustrated magnets, high-Tc superconductors, and topological phases. However, in many cases gauge fields couple to gapless matter degrees of freedom, and such theories become notoriously difficult to analyze quantitatively. In this paper we study several examples of Z2 lattice gauge theories with gapless fermions at finite density, in one and two spatial dimensions, that are either exactly soluble or whose solution reduces to that of a known problem. We consider complex fermions (spinless and spinful) as well as Majorana fermions and study both theories where Gauss' law is strictly imposed and those where all background charge sectors are kept in the physical Hilbert space. We use a combination of duality mappings and the Z2 slave-spin representation to map our gauge theories to models of gauge-invariant fermions that are either free, or with on-site interactions of the Hubbard or Falicov-Kimball type that are amenable to further analysis. In 1D, the phase diagrams of these theories include free-fermion metals, insulators, and superconductors, Luttinger liquids, and correlated insulators. In 2D, we find a variety of gapped and gapless phases, the latter including uniform and spatially modulated flux phases featuring emergent Dirac fermions, some violating Luttinger's theorem.

  17. Buckling Analysis of Angle-ply Composite and Sandwich Plates by Combination of Geometric Stiffness Matrix

    NASA Astrophysics Data System (ADS)

    Zhen, Wu; Wanji, Chen

    2007-05-01

    The buckling response of angle-ply laminated composite and sandwich plates is analyzed in this paper using the global-local higher-order theory in combination with a geometric stiffness matrix. This global-local theory completely fulfills the free surface conditions and the displacement and stress continuity conditions at interfaces. Moreover, the number of unknowns in this theory is independent of the number of layers in the laminate. Based on this global-local theory, a three-noded triangular element satisfying C1 continuity conditions has also been proposed. The bending part of this element is constructed from the concept of the DKT element. In order to improve the accuracy of the analysis, a method of modified geometric stiffness matrix has been introduced. Numerical results show that the present theory not only computes accurately the buckling response of general laminated composite plates but also predicts the critical buckling loads of soft-core sandwiches. However, the global higher-order theories as well as first-order theories might encounter some difficulties and overestimate the critical buckling loads for soft-core sandwich plates.
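
    Behind any geometric-stiffness formulation sits the linearized eigenvalue buckling problem (K + lambda*Kg) v = 0, where the smallest positive lambda is the critical load factor; the 2x2 matrices below are made-up stand-ins for the assembled plate stiffnesses, not output of the element described above.

        # Minimal sketch of the eigenvalue buckling problem behind a geometric-stiffness
        # formulation; matrices are illustrative placeholders for a 2-DOF discretization.
        import numpy as np
        from scipy.linalg import eigh

        K  = np.array([[ 4.0, -1.0],
                       [-1.0,  3.0]])            # elastic stiffness (illustrative)
        Kg = np.array([[-0.020,  0.005],
                       [ 0.005, -0.015]])        # geometric stiffness at unit load (illustrative)

        # (K + lam*Kg) v = 0  <=>  K v = lam * (-Kg) v, with -Kg positive definite here.
        lam = eigh(K, -Kg, eigvals_only=True)
        print("critical load factor:", lam[lam > 0].min())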

  18. A new approach for modular robot system behavioral modeling: Base on Petri net and category theory

    NASA Astrophysics Data System (ADS)

    Zhang, Yun; Wei, Hongxing; Yang, Bo

    2018-04-01

    To design a modular robot system, Petri nets and category theory are combined, and the simulation capability of Petri nets is discussed. Using category theory, a method for describing the category of components within the dynamic characteristics of the system is deduced. Moreover, a modular robot system is analyzed, which provides a verifiable description of the dynamic characteristics of the system.
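
    The simulation capability of a Petri net rests on the token-firing rule: a transition fires when every input place holds a token, consuming input tokens and producing output tokens. A toy firing sketch, with hypothetical place and transition names rather than the authors' model, is:

        # Toy Petri-net firing sketch (illustrative only, not the authors' robot model).
        def fire(marking, transition):
            """Fire a transition (inputs, outputs) if enabled; return the new marking."""
            ins, outs = transition
            if all(marking.get(p, 0) >= 1 for p in ins):
                new = dict(marking)
                for p in ins:
                    new[p] -= 1
                for p in outs:
                    new[p] = new.get(p, 0) + 1
                return new
            return None                           # transition not enabled

        marking = {"module_idle": 1, "task_queued": 1}          # hypothetical places
        t_start = (["module_idle", "task_queued"], ["module_busy"])
        print(fire(marking, t_start))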

  19. The force distribution probability function for simple fluids by density functional theory.

    PubMed

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula is P(F) ∝ exp(-AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere fluids at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.
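
    Under the usual three-dimensional convention that P is the density over the force vector and W the distribution of the force magnitude (the paper's own definitions and normalization may differ), the quoted Gaussian form can be written as

        P(\mathbf{F}) = \left(\frac{A}{\pi}\right)^{3/2} e^{-A F^{2}},
        \qquad
        W(F) = 4\pi F^{2}\, P(\mathbf{F}),
        \qquad
        \int_{0}^{\infty} W(F)\, \mathrm{d}F = 1 .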

  20. Identity theory and personality theory: mutual relevance.

    PubMed

    Stryker, Sheldon

    2007-12-01

    Some personality psychologists have found a structural symbolic interactionist frame and identity theory relevant to their work. This frame and theory, developed in sociology, are first reviewed. Emphasized in the review are a multiple identity conception of self, identities as internalized expectations derived from roles embedded in organized networks of social interaction, and a view of social structures as facilitators in bringing people into networks or constraints in keeping them out. Subsequently, attention turns to a discussion of the mutual relevance of structural symbolic interactionism/identity theory and personality theory, looking to extensions of the current literature on these topics.

  1. The theory of music, mood and movement to improve health outcomes

    PubMed Central

    Murrock, Carolyn J.; Higgins, Patricia A.

    2013-01-01

    Aim This paper presents a discussion of the development of a middle-range nursing theory of the effects of music on physical activity and improved health outcomes. Background Due to the high rate of physical inactivity and the associated negative health outcomes worldwide, nurses need new evidence-based theories and interventions to increase physical activity. Data sources The theory of music, mood and movement (MMM) was developed from physical activity guidelines and music theory using the principles of statement and theory synthesis. The concepts of music, physical activity and health outcomes were searched using the CINAHL, MEDLINE, ProQuest Nursing and Allied Health Source, PsycINFO and Cochrane Library databases covering the years 1975–2008. Discussion The theory of MMM was synthesized by combining the psychological and physiological responses of music to increase physical activity and improve health outcomes. It proposes that music alters mood, is a cue for movement, and makes physical activity more enjoyable leading to improved health outcomes of weight, blood pressure, blood sugar and cardiovascular risk factor management, and improved quality of life. Conclusion As it was developed from the physical activity guidelines, the middle-range theory is prescriptive, produces testable hypotheses, and can guide nursing research and practice. The middle-range theory needs to be tested to determine its usefulness for nurses to develop physical activity programmes to improve health outcomes across various cultures. PMID:20568327

  2. Combination of poroelasticity theory and constant strain rate test in modelling land subsidence due to groundwater extraction

    NASA Astrophysics Data System (ADS)

    Pham, Tien Hung; Rühaak, Wolfram; Sass, Ingo

    2017-04-01

    Extensive groundwater extraction leads to a drawdown of the groundwater table. Consequently, soil effective stress increases and can cause land subsidence. Analysis of land subsidence generally requires a numerical model based on poroelasticity theory, which was first proposed by Biot (1941). In their review of regional land subsidence accompanying groundwater extraction, Galloway and Burbey (2011) stated that more research and application are needed in the coupling of stress-dependent land subsidence processes. In the geotechnical field, the constant rate of strain (CRS) test was first introduced in 1969 (Smith and Wahls 1969) and was standardized in 1982 through designation D4186-82 by the American Society for Testing and Materials. From the readings of CRS tests, the stress-dependent parameters of a poroelasticity model can be calculated. So far, no research has linked poroelasticity theory with CRS tests in modelling land subsidence due to groundwater extraction. One-dimensional CRS tests using a conventional compression cell and three-dimensional CRS tests using a Rowe cell were performed. The tests were also modelled using the finite element method with mixed elements. A back-analysis technique is used to find suitable values of the hydraulic conductivity and bulk modulus that depend on the stress or void ratio. Finally, the obtained results are used in land subsidence models. Biot, M. A. (1941). "General theory of three-dimensional consolidation." Journal of Applied Physics 12(2): 155-164. Galloway, D. L. and T. J. Burbey (2011). "Review: Regional land subsidence accompanying groundwater extraction." Hydrogeology Journal 19(8): 1459-1486. Smith, R. E. and H. E. Wahls (1969). "Consolidation under constant rates of strain." Journal of Soil Mechanics & Foundations Div.

  3. [From the cell theory to the neuron theory].

    PubMed

    Tixier-Vidal, Andrée

    2010-01-01

    The relationship between the cell theory, formulated by Schwann (1839) and by Virchow (1855), on the one hand, and the neuron theory, as formulated by Waldeyer (1891) and by Cajal (1906), on the other hand, is discussed from a historical point of view. Both are the result of technical and conceptual progress. Both had to fight against the dominant dogma before being accepted. The cell theory opposed the school of Bichat, the vitalist philosophy, and the positivist philosophy of Auguste Comte. The neuron theory, which is clearly based on the cell theory, was mostly concerned with the mode of interneuronal communication; it opposed the concept of contiguity to Golgi's concept of continuity. At present, the cell theory remains central in every field of biology. By contrast, the neuron theory, which until the middle of the XXth century opened the study of the nervous system to a necessary reductionist approach, is no longer central to recent developments in neuroscience. © Société de Biologie, 2011.

  4. Understanding Superfluid ^3He by Determining β-Coefficients of Ginzburg-Landau Theory

    NASA Astrophysics Data System (ADS)

    Choi, H.; Davis, J. P.; Pollanen, J.; Halperin, W. P.

    2007-03-01

    The Ginzburg-Landau (GL) theory is a phenomenological theory that is used to characterize thermodynamic properties of a system near a phase transition. The free energy is expressed as an expansion in the order parameter; for superfluid ^3He there is one second-order term and there are five fourth-order terms. Since the GL theory is a phenomenological theory, one can determine the coefficients of these terms empirically; however, existing experiments are unable to determine all five fourth-order coefficients, the β's. To date, only four different combinations of β's are known [1]. In the case of superfluid ^3He, using quasiclassical theory, the coefficients have been calculated [2]. We used the calculation as a guide to construct a model to define all five β's independently. The model provides us with a full understanding of the GL theory for ^3He, which is useful in understanding the various superfluid phases of both bulk ^3He and disordered ^3He in aerogel. [1] H. Choi et al., J. Low Temp. Phys., submitted; http://arxiv.org/abs/cond-mat/0606786. [2] J.A. Sauls and J.W. Serene, Phys. Rev. B 24, 183 (1981).

  5. An Introduction to Music Therapy: Theory and Practice. Third Edition

    ERIC Educational Resources Information Center

    Davis, William B.; Gfeller, Kate E.; Thaut, Michael H.

    2008-01-01

    "An Introduction to Music Therapy: Theory and Practice, Third Edition," provides a comprehensive overview of the practice of music therapy for the 21st century. It looks at where we have been, where we are today, and where we might be in the future. Combining sound pedagogy with recent research findings, this new edition has been updated and…

  6. Theory of positive disintegration as a model of adolescent development.

    PubMed

    Laycraft, Krystyna

    2011-01-01

    This article introduces a conceptual model of adolescent development based on the theory of positive disintegration combined with the theory of self-organization. Dabrowski's theory of positive disintegration, which was created almost half a century ago, still attracts psychologists' and educators' attention, and is extensively applied in studies of gifted and talented people. Positive disintegration is the mental development described by the process of transition from lower to higher levels of mental life and stimulated by tension, inner conflict, and anxiety. This process can be modeled by a sequence of patterns of organization (attractors) as a developmental potential (a control parameter) changes. Three levels of disintegration (unilevel disintegration, spontaneous multilevel disintegration, and organized multilevel disintegration) are analyzed in detail, and it is proposed that they represent the behaviour of the early, middle, and late periods of adolescence. In the discussion, recent research on adolescent brain development is included.

  7. CuPc/Au(1 1 0): Determination of the azimuthal alignment by a combination of angle-resolved photoemission and density functional theory

    PubMed Central

    Lüftner, Daniel; Milko, Matus; Huppmann, Sophia; Scholz, Markus; Ngyuen, Nam; Wießner, Michael; Schöll, Achim; Reinert, Friedrich; Puschnig, Peter

    2014-01-01

    Here we report on a combined experimental and theoretical study of the structural and electronic properties of a monolayer of copper phthalocyanine (CuPc) on the Au(1 1 0) surface. Low-energy electron diffraction reveals a commensurate overlayer unit cell containing one adsorbate species. The azimuthal alignment of the CuPc molecule is revealed by comparing experimental constant-binding-energy (k_x, k_y) maps from angle-resolved photoelectron spectroscopy with theoretical momentum maps of the free molecule's highest occupied molecular orbital (HOMO). This structural information is confirmed by total energy calculations within the framework of van-der-Waals-corrected density functional theory. The electronic structure is further analyzed by computing the molecule-projected density of states, using both a semi-local and a hybrid exchange-correlation functional. In agreement with experiment, the HOMO is located about 1.2 eV below the Fermi level, while there is no significant charge transfer into the molecule and the CuPc LUMO remains unoccupied on the Au(1 1 0) surface. PMID:25284953

  8. Theorising big IT programmes in healthcare: strong structuration theory meets actor-network theory.

    PubMed

    Greenhalgh, Trisha; Stones, Rob

    2010-05-01

    The UK National Health Service is grappling with various large and controversial IT programmes. We sought to develop a sharper theoretical perspective on the question "What happens - at macro-, meso- and micro-level - when government tries to modernise a health service with the help of big IT?" Using examples from data fragments at the micro-level of clinical work, we considered how structuration theory and actor-network theory (ANT) might be combined to inform empirical investigation. Giddens (1984) argued that social structures and human agency are recursively linked and co-evolve. ANT studies the relationships that link people and technologies in dynamic networks. It considers how discourses become inscribed in data structures and decision models of software, making certain network relations irreversible. Stones' (2005) strong structuration theory (SST) is a refinement of Giddens' work, systematically concerned with empirical research. It views human agents as linked in dynamic networks of position-practices. A quadripartite approach considers [a] external social structures (conditions for action); [b] internal social structures (agents' capabilities and what they 'know' about the social world); [c] active agency and actions; and [d] outcomes as they feed back on the position-practice network. In contrast to early structuration theory and ANT, SST insists on disciplined conceptual methodology and linking this with empirical evidence. In this paper, we adapt SST for the study of technology programmes, integrating elements from material interactionism and ANT. We argue, for example, that the position-practice network can be a socio-technical one in which technologies in conjunction with humans can be studied as 'actants'. Human agents, with their complex socio-cultural frames, are required to instantiate technology in social practices. Structurally relevant properties inscribed and embedded in technological artefacts constrain and enable human agency. The fortunes

  9. Thermodynamics in variable speed of light theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Racker, Juan; Facultad de Ciencias Astronomicas y Geofisicas, Universidad Nacional de La Plata, Paseo del Bosque S/N; Sisterna, Pablo

    2009-10-15

    The perfect fluid is studied in the context of a covariant variable speed of light theory proposed by J. Magueijo. On the one hand, the modified first law of thermodynamics is obtained, together with a recipe for obtaining equations of state. On the other hand, the Newtonian limit is performed to obtain the nonrelativistic hydrostatic equilibrium equation for the theory. The results obtained are used to determine the time variation of the radius of Mercury induced by the variability of the speed of light (c), and the scalar contribution to the luminosity of white dwarfs. Using a bound for the change of that radius and combining it with an upper limit for the variation of the fine structure constant, a bound on the time variation of c is set. An independent bound is obtained from luminosity estimates for Stein 2015B.

  10. Estimation and Psychometric Analysis of Component Profile Scores via Multivariate Generalizability Theory

    ERIC Educational Resources Information Center

    Grochowalski, Joseph H.

    2015-01-01

    Component Universe Score Profile analysis (CUSP) is introduced in this paper as a psychometric alternative to multivariate profile analysis. The theoretical foundations of CUSP analysis are reviewed, which include multivariate generalizability theory and constrained principal components analysis. Because CUSP is a combination of generalizability…

  11. Nuclear Quantum Effects in Water and Aqueous Systems: Experiment, Theory, and Current Challenges

    DOE PAGES

    Ceriotti, Michele; Fang, Wei; Kusalik, Peter G.; ...

    2016-04-06

    Nuclear quantum effects influence the structure and dynamics of hydrogen-bonded systems, such as water, and impact their observed properties with widely varying magnitudes. This review highlights the recent significant developments in the experiment, theory, and simulation of nuclear quantum effects in water. Novel experimental techniques, such as deep inelastic neutron scattering, now provide a detailed view of the role of nuclear quantum effects in water's properties. These have been combined with theoretical developments such as the introduction of the competing quantum effects principle, which allows the subtle interplay of water's quantum effects and their manifestation in experimental observables to be explained. We discuss how this principle has recently been used to explain the apparent dichotomy in water's isotope effects, which can range from very large to almost nonexistent depending on the property and conditions. We then review the latest major developments in simulation algorithms and theory that have enabled the efficient inclusion of nuclear quantum effects in molecular simulations, permitting their combination with on-the-fly evaluation of the potential energy surface using electronic structure theory. Finally, we identify current challenges and future opportunities in the area.

  12. Using Nonlinear Programming in International Trade Theory: The Factor-Proportions Model

    ERIC Educational Resources Information Center

    Gilbert, John

    2004-01-01

    Students at all levels benefit from a multi-faceted approach to learning abstract material. The most commonly used technique in teaching the pure theory of international trade is a combination of geometry and algebraic derivations. Numerical simulation can provide a valuable third support to these approaches. The author describes a simple…

  13. Combining p-values in replicated single-case experiments with multivariate outcome.

    PubMed

    Solmi, Francesca; Onghena, Patrick

    2014-01-01

    Interest in combining probabilities has a long history in the global statistical community. The first steps in this direction were taken by Ronald Fisher, who introduced the idea of combining p-values of independent tests to provide a global decision rule when multiple aspects of a given problem were of interest. An interesting approach to this idea of combining p-values is the one based on permutation theory. The methods belonging to this particular approach exploit the permutation distributions of the tests to be combined, and use a simple function to combine probabilities. Combining p-values finds a very interesting application in the analysis of replicated single-case experiments. In this field the focus, when comparing different treatment effects, is more articulated than when just looking at the means of the different populations. Moreover, it is often of interest to combine the results obtained on the single patients in order to get more global information about the phenomenon under study. This paper gives an overview of how the concept of combining p-values was conceived, and how it can be easily handled via permutation techniques. Finally, the method of combining p-values is applied to a simulated replicated single-case experiment, and a numerical illustration is presented.
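
    Fisher's combining function referred to above is T = -2 * sum(ln p_i); the sketch below uses the classical chi-square reference distribution for independent continuous p-values rather than the permutation machinery discussed in the paper, and the p-values are illustrative placeholders.

        # Minimal sketch of Fisher's method for combining independent p-values.
        import math
        from scipy.stats import chi2

        def fisher_combine(pvalues):
            """Return Fisher's statistic and the combined p-value (chi-square, df = 2k)."""
            T = -2.0 * sum(math.log(p) for p in pvalues)
            return T, chi2.sf(T, df=2 * len(pvalues))

        print(fisher_combine([0.04, 0.20, 0.07]))     # illustrative per-case p-values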

  14. Psychoacoustic entropy theory and its implications for performance practice

    NASA Astrophysics Data System (ADS)

    Strohman, Gregory J.

    This dissertation attempts to motivate, derive and imply potential uses for a generalized perceptual theory of musical harmony called psychoacoustic entropy theory. This theory treats the human auditory system as a physical system which takes acoustic measurements. As a result, the human auditory system is subject to all the appropriate uncertainties and limitations of other physical measurement systems. This is the theoretic basis for defining psychoacoustic entropy. Psychoacoustic entropy is a numerical quantity which indexes the degree to which the human auditory system perceives instantaneous disorder within a sound pressure wave. Chapter one explains the importance of harmonic analysis as a tool for performance practice. It also outlines the critical limitations of many of the most influential historical approaches to modeling harmonic stability, particularly when compared to available scientific research in psychoacoustics. Rather than analyze a musical excerpt, psychoacoustic entropy is calculated directly from sound pressure waves themselves. This frames psychoacoustic entropy theory in the most general possible terms as a theory of musical harmony, enabling it to be invoked for any perceivable sound. Chapter two provides and examines many widely accepted mathematical models of the acoustics and psychoacoustics of these sound pressure waves. Chapter three introduces entropy as a precise way of measuring perceived uncertainty in sound pressure waves. Entropy is used, in combination with the acoustic and psychoacoustic models introduced in chapter two, to motivate the mathematical formulation of psychoacoustic entropy theory. Chapter four shows how to use psychoacoustic entropy theory to analyze certain types of musical harmonies, while chapter five applies the analytical tools developed in chapter four to two short musical excerpts to influence their interpretation. Almost every form of harmonic analysis invokes some degree of mathematical reasoning
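
    As a stand-in illustration of "indexing instantaneous disorder in a sound pressure wave" (not the dissertation's specific psychoacoustic-entropy formulation), one can compute the Shannon entropy of the normalized power spectrum of a short frame; the signals below are synthetic examples.

        # Spectral-entropy sketch: a harmonically simple dyad scores lower than noise.
        import numpy as np

        def spectral_entropy(frame):
            power = np.abs(np.fft.rfft(frame))**2
            p = power / power.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        t = np.linspace(0, 0.05, 2205, endpoint=False)              # 50 ms at 44.1 kHz
        dyad = np.sin(2*np.pi*220*t) + np.sin(2*np.pi*330*t)        # perfect fifth
        noise = np.random.default_rng(0).standard_normal(t.size)
        print(spectral_entropy(dyad), spectral_entropy(noise))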

  15. A new intuitionism: Meaning, memory, and development in Fuzzy-Trace Theory

    PubMed Central

    Reyna, Valerie F.

    2014-01-01

    Combining meaning, memory, and development, the perennially popular topic of intuition can be approached in a new way. Fuzzy-trace theory integrates these topics by distinguishing between meaning-based gist representations, which support fuzzy (yet advanced) intuition, and superficial verbatim representations of information, which support precise analysis. Here, I review the counterintuitive findings that led to the development of the theory and its most recent extensions to the neuroscience of risky decision making. These findings include memory interference (worse verbatim memory is associated with better reasoning); nonnumerical framing (framing effects increase when numbers are deleted from decision problems); developmental decreases in gray matter and increases in brain connectivity; developmental reversals in memory, judgment, and decision making (heuristics and biases based on gist increase from childhood to adulthood, challenging conceptions of rationality); and selective attention effects that provide critical tests comparing fuzzy-trace theory, expected utility theory, and its variants (e.g., prospect theory). Surprising implications for judgment and decision making in real life are also discussed, notably, that adaptive decision making relies mainly on gist-based intuition in law, medicine, and public health. PMID:25530822

  16. Noncommutative Field Theories and (super)string Field Theories

    NASA Astrophysics Data System (ADS)

    Aref'eva, I. Ya.; Belov, D. M.; Giryavets, A. A.; Koshelev, A. S.; Medvedev, P. B.

    2002-11-01

    In these lecture notes we explain and discuss some ideas concerning noncommutative geometry in general, as well as noncommutative field theories and string field theories. We consider noncommutative quantum field theories, emphasizing the issue of their renormalizability and the UV/IR mixing. Sen's conjectures on open string tachyon condensation and their application to D-brane physics have led to wide investigations of the covariant string field theory proposed by Witten about 15 years ago. We review the main ingredients of cubic (super)string field theories using various formulations: functional, operator, conformal and the half-string formalisms. The main technical tools that are used to study the conjectured D-brane decay into the closed string vacuum through tachyon condensation are presented. We also describe methods which are used to study the cubic open string field theory around the tachyon vacuum: construction of the sliver state, "comma" and matrix representations of vertices.

  17. General Second-Order Scalar-Tensor Theory and Self-Tuning

    NASA Astrophysics Data System (ADS)

    Charmousis, Christos; Copeland, Edmund J.; Padilla, Antonio; Saffin, Paul M.

    2012-02-01

    Starting from the most general scalar-tensor theory with second-order field equations in four dimensions, we establish the unique action that will allow for the existence of a consistent self-tuning mechanism on Friedmann-Lemaître-Robertson-Walker backgrounds, and show how it can be understood as a combination of just four base Lagrangians with an intriguing geometric structure dependent on the Ricci scalar, the Einstein tensor, the double dual of the Riemann tensor, and the Gauss-Bonnet combination. Spacetime curvature can be screened from the net cosmological constant at any given moment because we allow the scalar field to break Poincaré invariance on the self-tuning vacua, thereby evading the Weinberg no-go theorem. We show how the four arbitrary functions of the scalar field combine in an elegant way opening up the possibility of obtaining nontrivial cosmological solutions.
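
    For reference, the four base Lagrangians are usually quoted in the follow-up literature in the form below, with arbitrary potentials V_i(phi); index conventions vary between papers, so this is a hedged restatement rather than the paper's exact notation.

        \mathcal{L}_{\mathrm{john}}   = \sqrt{-g}\; V_{\mathrm{john}}(\phi)\, G^{\mu\nu}\nabla_\mu\phi\,\nabla_\nu\phi
        \mathcal{L}_{\mathrm{paul}}   = \sqrt{-g}\; V_{\mathrm{paul}}(\phi)\, P^{\mu\nu\alpha\beta}\nabla_\mu\phi\,\nabla_\alpha\phi\,\nabla_\nu\nabla_\beta\phi
        \mathcal{L}_{\mathrm{george}} = \sqrt{-g}\; V_{\mathrm{george}}(\phi)\, R
        \mathcal{L}_{\mathrm{ringo}}  = \sqrt{-g}\; V_{\mathrm{ringo}}(\phi)\, \hat{G},
        \qquad \hat{G} = R^{\mu\nu\alpha\beta}R_{\mu\nu\alpha\beta} - 4R^{\mu\nu}R_{\mu\nu} + R^{2},

    where P^{\mu\nu\alpha\beta} is the double dual of the Riemann tensor and \hat{G} the Gauss-Bonnet combination named in the abstract.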

  18. Intra-band gap in Lamb modes propagating in a periodic solid structure

    NASA Astrophysics Data System (ADS)

    Pierre, J.; Rénier, M.; Bonello, B.; Hladky-Hennion, A.-C.

    2012-05-01

    A laser ultrasonic technique is used to measure the dispersion of Lamb waves at a few MHz, propagating in phononic crystals made of dissymmetric air inclusions drilled throughout silicon plates. It is shown that the specific shape of the inclusions is at the origin of the intra-band gap that opens within the second Brillouin zone, at the crossing of both flexural and dilatational zero-order modes. The magnitude of the intra-band gap is measured as a function of the dissymmetry rate of the inclusions. Experimental data and the computed dispersion curves are in very good agreement.

  19. Circularly polarized luminescence of syndiotactic polystyrene

    NASA Astrophysics Data System (ADS)

    Rizzo, Paola; Abbate, Sergio; Longhi, Giovanna; Guerra, Gaetano

    2017-11-01

    Syndiotactic polystyrene (s-PS) films, when crystallized from the amorphous state by temporary sorption of non-racemic guest molecules (like carvone), not only exhibit unusually high optical activity, both in the UV-visible and infrared ranges, but also present circularly polarized luminescence (CPL) with high dissymmetry ratios (g = ΔI/I values in the range 0.02-0.03). Experimental evidence supports a supramolecular chiral optical response, extrinsic to the site of photon absorption and emission and possibly associated with a helical morphology of the s-PS crystallites, rather than the usual molecular circular dichroism.

  20. Multiconfiguration Pair-Density Functional Theory Is as Accurate as CASPT2 for Electronic Excitation.

    PubMed

    Hoyer, Chad E; Ghosh, Soumen; Truhlar, Donald G; Gagliardi, Laura

    2016-02-04

    A correct description of electronically excited states is critical to the interpretation of visible-ultraviolet spectra, photochemical reactions, and excited-state charge-transfer processes in chemical systems. We have recently proposed a theory called multiconfiguration pair-density functional theory (MC-PDFT), which is based on a combination of multiconfiguration wave function theory and a new kind of density functional called an on-top density functional. Here, we show that MC-PDFT with a first-generation on-top density functional performs as well as CASPT2 for an organic chemistry database including valence, Rydberg, and charge-transfer excitations. The results are very encouraging for practical applications.

  1. On a new semi-discrete integrable combination of Burgers and Sharma-Tasso-Olver equation

    NASA Astrophysics Data System (ADS)

    Zhao, Hai-qiong

    2017-02-01

    In this paper, a new semi-discrete integrable combination of Burgers and Sharma-Tasso-Olver equation is investigated. The underlying integrable structures like the Lax pair, the infinite number of conservation laws, the Darboux-Bäcklund transformation, and the solutions are presented in the explicit form. The theory of the semi-discrete equation including integrable properties yields the corresponding theory of the continuous counterpart in the continuous limit. Finally, numerical experiments are provided to demonstrate the effectiveness of the developed integrable semi-discretization algorithms.

  2. 'Theory of Mind' I: a theory of knowledge?

    PubMed

    Plastow, Michael

    2012-06-01

    'Theory of mind' is a cognitive notion introduced by Simon Baron-Cohen and colleagues to explain certain deficits in autistic disorders. It has, however, been extended beyond this, and applied more broadly. It proposes a means of knowing the mind of others, and suggests that this means fails in autism. The epistemological basis of 'theory of mind' will be examined critically, not just in terms of its endeavour as a theory of knowledge, but also in regard to the principles that underlie it. The proponents of 'theory of mind' eschew the rich field of psychological and phenomenological research, privileging only the biological sciences into which they endeavour to place their theorizations. In doing this, they fail to recognize the epistemological problems involved. This leads to the theory remaining hamstrung by the very Cartesian ontological problems that it seeks to avoid. For some, 'theory of mind' is but an artefact of the cognitive approach that it employs. It is argued that these difficulties are compounded by the failure of 'theory of mind' to take account of the place of language in the interpersonal encounters it attempts to describe.

  3. Determinations of the combined effect of toxic substances in predictions of atmospheric pollution (in Russian)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gusey, M.I.; Gil'denskiol'd, R.S.; Baikov, B.K.

    There have recently been several investigations of the combined effect of several pollutants present simultaneously in the atmosphere. As a rule, the combined effect of toxic substances in the atmosphere at liminal and subliminal concentrations is in accordance with the principle of simple summation. There is a definite gap between theory and practice in the establishment of standards for atmospheric pollutants. 17 references, 1 table.
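
    The summation principle referred to here is commonly written as the requirement that each concentration, divided by its maximum permissible concentration (MPC), sums to no more than unity; this is a standard hedged restatement, not a formula quoted from the abstract:

        \frac{C_1}{\mathrm{MPC}_1} + \frac{C_2}{\mathrm{MPC}_2} + \cdots + \frac{C_n}{\mathrm{MPC}_n} \le 1 .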

  4. A succession of theories: purging redundancy from disturbance theory.

    PubMed

    Pulsford, Stephanie A; Lindenmayer, David B; Driscoll, Don A

    2016-02-01

    The topics of succession and post-disturbance ecosystem recovery have a long and convoluted history. There is extensive redundancy within this body of theory, which has resulted in confusion, and the links among theories have not been adequately drawn. This review aims to distil the unique ideas from the array of theory related to ecosystem change in response to disturbance. This will help to reduce redundancy, and improve communication and understanding between researchers. We first outline the broad range of concepts that have developed over the past century to describe community change in response to disturbance. The body of work spans overlapping succession concepts presented by Clements in 1916, Egler in 1954, and Connell and Slatyer in 1977. Other theories describing community change include state and transition models, biological legacy theory, and the application of functional traits to predict responses to disturbance. Second, we identify areas of overlap of these theories, in addition to highlighting the conceptual and taxonomic limitations of each. In aligning each of these theories with one another, the limited scope and relative inflexibility of some theories becomes apparent, and redundancy becomes explicit. We identify a set of unique concepts to describe the range of mechanisms driving ecosystem responses to disturbance. We present a schematic model of our proposed synthesis which brings together the range of unique mechanisms that were identified in our review. The model describes five main mechanisms of transition away from a post-disturbance community: (i) pulse events with rapid state shifts; (ii) stochastic community drift; (iii) facilitation; (iv) competition; and (v) the influence of the initial composition of a post-disturbance community. In addition, stabilising processes such as biological legacies, inhibition or continuing disturbance may prevent a transition between community types. Integrating these six mechanisms with the functional

  5. Mean-field theory of differential rotation in density stratified turbulent convection

    NASA Astrophysics Data System (ADS)

    Rogachevskii, I.

    2018-04-01

    A mean-field theory of differential rotation in density-stratified turbulent convection has been developed. This theory is based on the combined effects of the turbulent heat flux and the anisotropy of turbulent convection on the Reynolds stress. A coupled system of dynamical budget equations, consisting of the equations for the Reynolds stress, the entropy fluctuations, and the turbulent heat flux, has been solved. To close the system of these equations, the spectral approach, which is valid for large Reynolds and Péclet numbers, has been applied. The adopted model of the background turbulent convection takes into account an increase of the turbulence anisotropy and a decrease of the turbulent correlation time with the rotation rate. This theory yields a radial profile of the differential rotation that is in agreement with that of the solar differential rotation.

  6. Combined Effects of Nonylphenol and Bisphenol A on the Human Prostate Epithelial Cell Line RWPE-1

    PubMed Central

    Gan, Weidong; Zhou, Ming; Xiang, Zou; Han, Xiaodong; Li, Dongmei

    2015-01-01

    The xenoestrogens nonylphenol (NP) and bisphenol A (BPA) are regarded as endocrine disrupting chemicals (EDCs) which have widespread occurrence in our daily life. In the present study, the purpose was to analyze the combined effects of NP and BPA on the human prostate epithelial cell line RWPE-1 using two mathematical models based on the Loewe additivity (LA) theory and the Bliss independence (BI) theory. RWPE-1 cells were treated with NP (0.01–100 µM) and BPA (1–5000 µM) in either a single or a combined format. A cell viability assay and lactate dehydrogenase (LDH) leakage rate assay were employed as endpoints. As predicted by the two models and based on the cell viability assay, significant synergism between NP and BPA were observed. However, based on the LDH assay, the trends were reversed. Given that environmental contaminants are frequently encountered simultaneously, these data indicated that there were potential interactions between NP and BPA, and the combined effects of the chemical mixture might be stronger than the additive values of individual chemicals combined, which should be taken into consideration for the risk assessment of EDCs. PMID:25874684
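
    The two reference models named in the abstract can be summarized in a short sketch: Bliss independence predicts a combined fractional effect E = E_A + E_B - E_A*E_B, while the Loewe combination index is the sum of each dose divided by the dose of that agent alone producing the same effect. The numbers below are made-up placeholders, not the study's fitted endpoints.

        # Hedged sketch of the Bliss and Loewe reference models (illustrative values only).
        def bliss_expected(e_a, e_b):
            """Expected combined fractional effect if the two agents act independently."""
            return e_a + e_b - e_a * e_b

        def loewe_index(dose_a, dose_b, iso_a, iso_b):
            """Combination index: <1 synergy, =1 additivity, >1 antagonism (Loewe)."""
            return dose_a / iso_a + dose_b / iso_b

        print(bliss_expected(0.25, 0.30))                  # placeholder single-agent effects
        print(loewe_index(10, 50, iso_a=40, iso_b=120))    # placeholder (iso)effective doses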

  7. GY SAMPLING THEORY AND GEOSTATISTICS: ALTERNATE MODELS OF VARIABILITY IN CONTINUOUS MEDIA

    EPA Science Inventory



    In the sampling theory developed by Pierre Gy, sample variability is modeled as the sum of a set of seven discrete error components. The variogram used in geostatistics provides an alternate model in which several of Gy's error components are combined in a continuous mode...

  8. Evaluation of a Theory of Instructional Sequences for Physics Instruction

    NASA Astrophysics Data System (ADS)

    Wackermann, Rainer; Trendel, Georg; Fischer, Hans E.

    2010-05-01

    The background of the study is the theory of basis models of teaching and learning, a comprehensive set of models of learning processes which includes, for example, learning through experience and problem-solving. The combined use of different models of learning processes has not been fully investigated and it is frequently not clear under what circumstances a particular model should be used by teachers. In contrast, the theory under investigation here gives guidelines for choosing a particular model and provides instructional sequences for each model. The aim is to investigate the implementation of the theory applied to physics instruction and to show if possible effects for the students may be attributed to the use of the theory. Therefore, a theory-oriented education programme for 18 physics teachers was developed and implemented in the 2005/06 school year. The main features of the intervention consisted of coaching physics lessons and video analysis according to the theory. The study follows a pre-treatment-post design with non-equivalent control group. Findings of repeated-measures ANOVAs show large effects for teachers' subjective beliefs, large effects for classroom actions, and small to medium effects for student outcomes such as perceived instructional quality and student emotions. The teachers/classes that applied the theory especially well according to video analysis showed the larger effects. The results showed that differentiating between different models of learning processes improves physics instruction. Effects can be followed through to student outcomes. The education programme effect was clearer for classroom actions and students' outcomes than for teachers' beliefs.

  9. Life Origination Hydrate Theory (LOH-Theory) and Mitosis and Replication Hydrate Theory (MRH-Theory): three-dimensional PC validation

    NASA Astrophysics Data System (ADS)

    Kadyshevich, E. A.; Dzyabchenko, A. V.; Ostrovskii, V. E.

    2014-04-01

    Size compatibility of the CH4-hydrate structure II and multi-component DNA fragments is confirmed by three-dimensional simulation; this is a validation of the Life Origination Hydrate Theory (LOH-Theory).

  10. Theoretical frameworks for testing relativistic gravity. IV - A compendium of metric theories of gravity and their post-Newtonian limits.

    NASA Technical Reports Server (NTRS)

    Ni, W.-T.

    1972-01-01

    Metric theories of gravity are compiled and classified according to the types of gravitational fields they contain, and the modes of interaction among those fields. The gravitation theories considered are classified as (1) general relativity, (2) scalar-tensor theories, (3) conformally flat theories, and (4) stratified theories with conformally flat space slices. The post-Newtonian limit of each theory is constructed and its Parametrized Post-Newtonian (PPN) values are obtained by comparing it with Will's version of the formalism. Results obtained here, when combined with experimental data and with recent work by Nordtvedt and Will and by Ni, show that, of all theories thus far examined by our group, the only currently viable ones are general relativity, the Bergmann-Wagoner scalar-tensor theory and its special cases (Nordtvedt; Brans-Dicke-Jordan), and a recent, new vector-tensor theory by Nordtvedt, Hellings, and Will.

  11. Grounded theory.

    PubMed

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  12. [Zhu Lian's cognition on theory and method of acupuncture and moxibustion under background of western medicine].

    PubMed

    Li, Su-yun; Zhang, Li-jian; Liu, Bing

    2014-11-01

    Taking new acupuncture and moxibustion as the object of study, and based on the basic composition of acupuncture-moxibustion theory, ZHU Lian's views on the theory and methods of acupuncture and moxibustion under the background of western medicine are discussed from three aspects: meridian-acupoint theory, acupuncture-moxibustion method theory, and acupuncture-moxibustion treatment theory. ZHU Lian believed that the distribution of the 14 meridians was approximately identical to that of the nerves, so the meaning of acupoints could be understood with modern neuroanatomical knowledge and the function of acupuncture could be explained from the angle of neurophysiology. Clinical diagnosis and treatment methods could be established using modern disease classification. ZHU Lian's understanding, which differed from traditional acupuncture-moxibustion theory and methods, was combined with the physiological and anatomical knowledge of the time and drew on Pavlov's theory of higher nervous activity, so she was the first to put forward the view that acupuncture therapy cannot work without the involvement of the cerebral cortex.

  13. Combined Optimal Control System for excavator electric drive

    NASA Astrophysics Data System (ADS)

    Kurochkin, N. S.; Kochetkov, V. P.; Platonova, E. V.; Glushkin, E. Y.; Dulesov, A. S.

    2018-03-01

    The article presents a synthesis of combined optimal control algorithms for the AC drive of the excavator rotation mechanism. The synthesis consists of regulating the external coordinates based on the theory of optimal systems and correcting the internal coordinates of the electric drive using the "technical optimum" method. The research shows the advantage of optimal combined control systems for the electric rotary drive over classical systems of subordinate regulation. The paper presents a method for selecting the optimality criterion coefficients so that the coordinates of the control object remain within their range of permissible values. The system can be tuned by choosing the optimality criterion coefficients, which allows one to select the required characteristics of the drive: the dynamic moment (M) and the transient process time (tpp). Due to the use of combined optimal control systems, it was possible to significantly reduce the maximum value of the dynamic moment (M) and, at the same time, the transient time (tpp).
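
    One common way to pose the regulation of the external coordinates as an optimal control problem is a discrete-time LQR design, in which the weighting matrices trade transient time against dynamic moment. The sketch below uses a toy double-integrator model of the rotation axis, an illustrative assumption and not the excavator drive model studied in the paper.

        # Discrete LQR sketch (illustrative model and weights, not the paper's design).
        import numpy as np
        from scipy.linalg import solve_discrete_are

        A = np.array([[1.0, 0.1],
                      [0.0, 1.0]])               # toy double-integrator rotation model
        B = np.array([[0.0],
                      [0.1]])
        Q = np.diag([10.0, 1.0])                 # state weights: position vs. speed error
        R = np.array([[0.5]])                    # control weight, limits the dynamic moment

        P = solve_discrete_are(A, B, Q, R)
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal state-feedback gain
        print(K)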

  14. Comparison of Attachment theory and Cognitive-Motivational Structure theory.

    PubMed

    Malerstein, A J

    2005-01-01

    Attachment theory and Cognitive-Motivational Structure (CMS) theory are similar in most respects. They differ primarily in their proposal of when, during development, one's sense of the self and of the outside world is formed. I propose that the theories supplement each other after about age seven years--when Attachment theory's predictions of social function become unreliable, CMS theory comes into play.

  15. Communication: Extended multi-state complete active space second-order perturbation theory: Energy and nuclear gradients

    NASA Astrophysics Data System (ADS)

    Shiozaki, Toru; Győrffy, Werner; Celani, Paolo; Werner, Hans-Joachim

    2011-08-01

    The extended multireference quasi-degenerate perturbation theory, proposed by Granovsky [J. Chem. Phys. 134, 214113 (2011)], is combined with internally contracted multi-state complete active space second-order perturbation theory (XMS-CASPT2). The first-order wavefunction is expanded in terms of the union of internally contracted basis functions generated from all the reference functions, which guarantees invariance of the theory with respect to unitary rotations of the reference functions. The method yields improved potentials in the vicinity of avoided crossings and conical intersections. The theory for computing nuclear energy gradients for MS-CASPT2 and XMS-CASPT2 is also presented and the first implementation of these gradient methods is reported. A number of illustrative applications of the new methods are presented.

  16. Fermentation, phlogiston and matter theory: chemistry and natural philosophy in Georg Ernst Stahl's Zymotechnia Fundamentalis.

    PubMed

    Chang, Ku-Ming Kevin

    2002-01-01

    This paper examines Georg Ernst Stahl's first book, the Zymotechnia Fundamentalis, in the context of contemporary natural philosophy and the author's career. I argue that the Zymotechnia was a mechanical theory of fermentation written consciously against the influential "fermentational program" of Joan Baptista van Helmont and especially Thomas Willis. Stahl's theory of fermentation introduced his first conception of phlogiston, which was in part a corpuscular transformation of the Paracelsian sulphur principle. Meanwhile, some assumptions underlying this theory, such as the composition of matter, the absolute passivity of matter and the "passions" of sulphur, reveal the combined scholastic and mechanistic character of Stahl's natural philosophy. In the conclusion I show that Stahl's theory of fermentation undermined the old fermentational program and paved the way for his dualist vitalism.

  17. Potential Theory for Directed Networks

    PubMed Central

    Zhang, Qian-Ming; Lü, Linyuan; Wang, Wen-Qiang; Zhou, Tao

    2013-01-01

    Uncovering factors underlying network formation is a long-standing challenge for data mining and network analysis. In particular, the microscopic organizing principles of directed networks are less understood than those of undirected networks. This article proposes a hypothesis named potential theory, which assumes that every directed link corresponds to a decrease of a unit potential and that subgraphs with definable potential values for all nodes are preferred. Combining the potential theory with the clustering and homophily mechanisms, it is deduced that the Bi-fan structure consisting of 4 nodes and 4 directed links is the most favored local structure in directed networks. Our hypothesis receives strongly positive support from extensive experiments on 15 directed networks drawn from disparate fields, as indicated by the most accurate and robust performance of the Bi-fan predictor within the link prediction framework. In summary, our main contribution is twofold: (i) we propose a new mechanism for the local organization of directed networks; (ii) we design the corresponding link prediction algorithm, which not only tests our hypothesis but also finds direct applications in missing link prediction and friendship recommendation. PMID:23408979
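
    As a concrete illustration of the Bi-fan idea, the sketch below scores a candidate directed link by counting how many Bi-fan motifs (two sources jointly pointing at two targets) it would complete. This is a hypothetical scoring rule written for this summary, not the authors' exact predictor.

      # Minimal sketch of a Bi-fan-based link score; bifan_score() and the toy edge
      # set are assumptions for illustration only.
      from itertools import product

      def bifan_score(edges, u, v):
          """Score the candidate directed link u->v by how many Bi-fan motifs
          (u, u2 -> v, v2 with the other three links present) it would complete."""
          succ, pred = {}, {}
          for a, b in edges:
              succ.setdefault(a, set()).add(b)
              pred.setdefault(b, set()).add(a)
          score = 0
          for u2, v2 in product(pred.get(v, ()), succ.get(u, ())):
              if u2 != u and v2 != v and v2 in succ.get(u2, ()):
                  score += 1
          return score

      edges = {("a", "c"), ("a", "d"), ("b", "c")}
      print(bifan_score(edges, "b", "d"))   # 1: adding b->d closes one Bi-fan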

  18. Intrinsic Instability of Cs2In(I)M(III)X6 (M = Bi, Sb; X = Halogen) Double Perovskites: A Combined Density Functional Theory and Experimental Study.

    PubMed

    Xiao, Zewen; Du, Ke-Zhao; Meng, Weiwei; Wang, Jianbo; Mitzi, David B; Yan, Yanfa

    2017-05-03

    Recently, there has been substantial interest in developing double-B-cation halide perovskites, which hold the potential to overcome the toxicity and instability issues inherent within emerging lead halide-based solar absorber materials. Among all double perovskites investigated, In(I)-based Cs2InBiCl6 and Cs2InSbCl6 have been proposed as promising thin-film photovoltaic absorber candidates, with computational examination predicting suitable materials properties, including direct bandgap and small effective masses for both electrons and holes. In this study, we report the intrinsic instability of Cs2In(I)M(III)X6 (M = Bi, Sb; X = halogen) double perovskites by a combination of density functional theory and experimental study. Our results suggest that the In(I)-based double perovskites are unstable against oxidation into In(III)-based compounds. Further, the results show the need to consider reduction-oxidation (redox) chemistry when predicting stability of new prospective electronic materials, especially when less common oxidation states are involved.

  19. Not just numeracy and literacy: Theory of mind development and school readiness among low-income children.

    PubMed

    Cavadel, Elizabeth Woodburn; Frye, Douglas A

    2017-12-01

    The current study investigated the role of theory of mind development in school readiness among 120 low-income preschool and kindergarten children. A short-term longitudinal design was used to examine relations among theory of mind, the understanding of teaching, and learning behaviors and their collective role in children's literacy and numeracy skills at school entry. Results replicate differences in theory of mind development among low-income children as compared to typically studied, higher-income samples. Theory of mind and the combination of several sociocognitive variables successfully predicted concurrent relations with academic outcomes. Children's understanding of teaching predicted changes in literacy scores over time. Results are discussed in the context of what is known about theory of mind and sociocognitive development in school readiness. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. [Comparative research into the process of forming the theory of constitution in ancient western medicine and that of four trigrams constitution in Korean medicine and contents of two theories of constitution].

    PubMed

    Park, Joo-Hong

    2009-06-01

    A comparative study, in terms of medical history, of the formation and content of the Theory of Constitution in ancient western medicine and the Theory of Four Trigrams (Sasang) Constitution in Korean medicine shows that both theories were formed by an interaction between philosophy and medicine, followed by a combination of the two on a philosophical basis. The Theory of Constitution in ancient western medicine began with the Theory of Four Elements presented by Empedocles, followed by the Theory of Four Humors of Hippocrates and the Theory of Four Temperaments of Galenos. After the Middle Ages there was no significant advance in the theory until modern times; it then developed into the constitution-type theory of Kretschmer and others after the 19th century and into the scientific, genetics-based theory of constitution presented by Garrod and others early in the 20th century. The Theory of Four Trigrams Constitution began with the constitutional ideas of the Huangdi Neijing, developed through the influences of existing medicine in the so-called beginning, restoration and revival periods and through the Dongeuisoosebowon SaSangChoBonGwon based on the original Four Trigrams philosophy of Lee Je-ma found in GyeokChiGo, DongMuYuGo and other works, and ultimately took its final form as the Theory of Four Trigrams Constitution in the Dongeuisoosebowon. Recently, much research has been conducted on making the diagnosis of Four Trigrams Constitution objective and reproducible.

  1. Predictions of new ABO3 perovskite compounds by combining machine learning and density functional theory

    NASA Astrophysics Data System (ADS)

    Balachandran, Prasanna V.; Emery, Antoine A.; Gubernatis, James E.; Lookman, Turab; Wolverton, Chris; Zunger, Alex

    2018-04-01

    We apply machine learning (ML) methods to a database of 390 experimentally reported ABO3 compounds to construct two statistical models that predict possible new perovskite materials and possible new cubic perovskites. The first ML model classified the 390 compounds into 254 perovskites and 136 that are not perovskites with a 90% average cross-validation (CV) accuracy; the second ML model further classified the perovskites into 22 known cubic perovskites and 232 known noncubic perovskites with a 94% average CV accuracy. We find that the most effective chemical descriptors affecting our classification include largely geometric constructs such as the A and B Shannon ionic radii, the tolerance and octahedral factors, the A-O and B-O bond length, and the A and B Villars' Mendeleev numbers. We then construct an additional list of 625 ABO3 compounds assembled from charge conserving combinations of A and B atoms absent from our list of known compounds. Then, using the two ML models constructed on the known compounds, we predict that 235 of the 625 exist in a perovskite structure with a confidence greater than 50% and among them that 20 exist in the cubic structure (albeit, the latter with only ~50% confidence). We find that the new perovskites are most likely to occur when the A and B atoms are a lanthanide or actinide, when the A atom is an alkali, alkali earth, or late transition metal atom, or when the B atom is a p-block atom. We also compare the ML findings with the density functional theory calculations and convex hull analyses in the Open Quantum Materials Database (OQMD), which predicts the T = 0 K ground-state stability of all the ABO3 compounds. We find that OQMD predicts 186 of 254 of the perovskites in the experimental database to be thermodynamically stable within 100 meV/atom of the convex hull and predicts 87 of the 235 ML-predicted perovskite compounds to be thermodynamically stable within 100 meV/atom of the convex hull, including 6 of these to
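
    A minimal sketch of the kind of descriptor-based classification described above is given below, assuming toy ionic radii and labels; the descriptors (Goldschmidt tolerance factor and octahedral factor) are among those named in the abstract, but the training data and model settings here are placeholders, not the authors' pipeline.

      # Illustrative only: ionic radii, labels and the query compound are made up.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      R_O = 1.40  # Shannon ionic radius of O2- in angstrom

      def descriptors(r_A, r_B):
          t = (r_A + R_O) / (np.sqrt(2.0) * (r_B + R_O))  # tolerance factor
          mu = r_B / R_O                                   # octahedral factor
          return [t, mu, r_A, r_B]

      # toy training set: (r_A, r_B, is_perovskite) -- illustrative values only
      train = [(1.61, 0.605, 1), (1.36, 0.64, 1), (1.18, 0.76, 0), (1.03, 0.72, 0)]
      X = np.array([descriptors(rA, rB) for rA, rB, _ in train])
      y = np.array([label for _, _, label in train])

      clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
      print(clf.predict([descriptors(1.50, 0.62)]))  # predicted label for a hypothetical (r_A, r_B) pair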

  2. Ensemble method: Community detection based on game theory

    NASA Astrophysics Data System (ADS)

    Zhang, Xia; Xia, Zhengyou; Xu, Shengwu; Wang, J. D.

    2014-08-01

    Timely and cost-effective analytics over social networks has emerged as a key ingredient for success in many businesses and government endeavors. Community detection is an active research area relevant to the analysis of online social networks. The choice of a particular community detection algorithm is crucial if the aim is to unveil the community structure of a network, and it can affect the outcome of experiments because different algorithms have different advantages and depend on tuning specific parameters. In this paper, we propose a community division model based on game theory, which effectively combines the advantages of previous algorithms to obtain a better community classification. Experiments on standard datasets verify that our game-theoretic community detection model is valid and performs better.
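
    The sketch below shows one way community detection can be phrased as a game, assuming a simple utility (neighbours sharing a label minus a penalty for non-neighbour members); nodes greedily switch labels until no one wants to deviate. It illustrates the general idea only and is not the model proposed in the paper.

      # Illustrative only: the utility function, penalty and toy graph are assumptions.
      def game_communities(adj, penalty=0.1, max_rounds=50):
          """Each node repeatedly switches to the label maximizing its utility:
          neighbours sharing the label minus a penalty for non-neighbour members."""
          label = {v: v for v in adj}                    # start from singleton communities
          for _ in range(max_rounds):
              changed = False
              for v in adj:
                  sizes = {}
                  for u, lab in label.items():
                      if u != v:
                          sizes[lab] = sizes.get(lab, 0) + 1
                  candidates = {label[n] for n in adj[v]} | {label[v]}

                  def utility(lab):
                      shared = sum(1 for n in adj[v] if label[n] == lab)
                      return shared - penalty * (sizes.get(lab, 0) - shared)

                  best = max(candidates, key=utility)
                  if best != label[v]:
                      label[v], changed = best, True
              if not changed:                            # no node wants to deviate: equilibrium
                  break
          return label

      adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
      print(game_communities(adj))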

  3. String theory, gauge theory and quantum gravity. Proceedings. Trieste Spring School and Workshop on String Theory, Gauge Theory and Quantum Gravity, Trieste (Italy), 11 - 22 Apr 1994.

    NASA Astrophysics Data System (ADS)

    1995-04-01

    The following topics were dealt with: string theory, gauge theory, quantum gravity, quantum geometry, black hole physics and information loss, second quantisation of the Wilson loop, 2D Yang-Mills theory, topological field theories, equivariant cohomology, superstring theory and fermion masses, supergravity, topological gravity, waves in string cosmology, superstring theories, 4D space-time.

  4. Solar-System Tests of Gravitational Theories

    NASA Technical Reports Server (NTRS)

    Shapiro, Irwin I.

    2005-01-01

    This research is aimed at testing gravitational theory, primarily on an interplanetary scale and using mainly observations of objects in the solar system. Our goal is either to detect departures from the standard model (general relativity) - if any exist within the level of sensitivity of our data - or to support this model by placing tighter bounds on any departure from it. For this project, we have analyzed a combination of observational data with our model of the solar system, including planetary radar ranging, lunar laser ranging, and spacecraft tracking, as well as pulsar timing and pulsar VLBI measurements.

  5. Modeling Composite Assessment Data Using Item Response Theory

    PubMed Central

    Ueckert, Sebastian

    2018-01-01

    Composite assessments aim to combine different aspects of a disease in a single score and are utilized in a variety of therapeutic areas. The data arising from these evaluations are inherently discrete with distinct statistical properties. This tutorial presents the framework of the item response theory (IRT) for the analysis of this data type in a pharmacometric context. The article considers both conceptual (terms and assumptions) and practical questions (modeling software, data requirements, and model building). PMID:29493119
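
    As a small, self-contained illustration of the IRT idea (not code from the tutorial), the sketch below evaluates a two-parameter logistic (2PL) model and locates the maximum-likelihood latent-trait value for one subject; all item parameters and responses are made up.

      # Illustrative only: item discriminations, difficulties and responses are placeholders.
      import numpy as np

      def p_endorse(theta, a, b):
          """2PL item characteristic curve: discrimination a, difficulty b."""
          return 1.0 / (1.0 + np.exp(-a * (theta - b)))

      def log_likelihood(theta, responses, a, b):
          p = p_endorse(theta, a, b)
          return np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

      a = np.array([1.2, 0.8, 1.5])        # item discriminations
      b = np.array([-0.5, 0.0, 1.0])       # item difficulties
      responses = np.array([1, 1, 0])      # one subject's item scores
      thetas = np.linspace(-4, 4, 401)
      ll = [log_likelihood(t, responses, a, b) for t in thetas]
      print("ML estimate of theta:", thetas[int(np.argmax(ll))])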

  6. Prototypes reflect normative perceptions: implications for the development of reasoned action theory.

    PubMed

    Hennessy, Michael; Bleakley, Amy; Ellithorpe, Morgan

    2018-03-01

    The reasoned action approach is one of the most successful behavioral theories in the history of social psychology. This study outlines the theoretical principles of reasoned action and considers when it is appropriate to augment it with a new variable. To demonstrate, we use survey data collected from 4- to 17-year-old U.S. adolescents to test how the 'prototype' variables fit into the reasoned action approach. Through confirmatory factor analysis, we find that the prototype measures are normative pressure measures, and when treated as a separate theoretical construct, prototype identity is not completely mediated by the proximal predictors of behavioral intention. We discuss the assumptions of the two theories and finally consider the distinction between augmenting a specific theory versus combining measures derived from different theoretical perspectives.

  7. The use of ecological momentary assessment to test appraisal theories of emotion.

    PubMed

    Tong, Eddie M W; Bishop, George D; Enkelmann, Hwee Chong; Why, Yong Peng; Diong, Siew Maan; Khader, Majeed; Ang, Jansen

    2005-12-01

    Although appraisal theories have received strong empirical support, there are methodological concerns about the research, including biased recall, heuristic responding, ethical issues, and weak and unrealistic induction of emotions in laboratories. To provide a more ecologically valid test of appraisal theories, the authors used ecological momentary assessment, in which the emotions and appraisals of Singaporean police officers were measured online over the course of an ordinary workday. The research focused on happiness. Support was obtained for predictions, demonstrating the generalizability of appraisal theories to a nonlaboratory setting and circumventing the shortcomings of previously used methodologies. Also, evidence was obtained that happiness was reported primarily in association with a specific combination of 3 relevant appraisals: high pleasantness, high perceived control, and low moral violation. Copyright (c) 2005 APA, all rights reserved.

  8. Prototypes Reflect Normative Perceptions: Implications for the Development of Reasoned Action Theory

    PubMed Central

    Hennessy, Michael; Bleakley, Amy; Ellithorpe, Morgan

    2017-01-01

    The reasoned action approach is one of the most successful behavioral theories in the history of social psychology. This study outlines the theoretical principles of reasoned action and considers when it is appropriate to augment it with a new variable. To demonstrate, we use survey data collected from 4–17-year-old U.S. adolescents to test how the “prototype” variables fit into the reasoned action approach. Through confirmatory factor analysis, we find that the prototype measures are normative pressure measures, and when treated as a separate theoretical construct, prototype identity is not completely mediated by the proximal predictors of behavioral intention. We discuss the assumptions of the two theories and finally consider the distinction between augmenting a specific theory versus combining measures derived from different theoretical perspectives. PMID:28612624

  9. [Analysis on traditional Chinese medicine prescriptions treating cancer-related anorexia syndrome based on grey system theory combined with multivariate analysis method and discovery of new prescriptions].

    PubMed

    Chen, Song-Lin; Chen, Cong; Zhu, Hui; Li, Jing; Pang, Yan

    2016-01-01

    Cancer-related anorexia syndrome (CACS) is one of the main causes of death at present, and a syndrome that seriously harms patients' quality of life, treatment effect and survival time. In current clinical research there are few reports on empirical traditional Chinese medicine (TCM) prescriptions and patent prescriptions for treating CACS, and prescription rules are rarely analyzed systematically; because the hidden rules have not been mined, it is hard to arrive at innovative discoveries and knowledge for clinical medication. In this paper, the grey screening method combined with multivariate statistical methods was used to build a "CACS prescriptions database". Based on the database, 359 prescriptions were selected, the frequency of herbs in the prescriptions was determined, and commonly combined drugs were evolved into 4 new prescriptions for different syndromes. TCM prescriptions for the treatment of CACS give priority to benefiting qi and strengthening the spleen, and also lay emphasis on replenishing kidney essence, dispersing stagnated liver-qi and dispersing lung-qi. Moreover, the interdependence and mutual promotion of yin and yang should be taken into account to reflect TCM's holism and its theory of treatment based on syndrome differentiation. The grey screening method, as a valuable TCM research-supporting method, can be used to analyze prescription rules subjectively and objectively, and the new prescriptions can provide a reference for the clinical use of TCM in treating CACS and for drug development. Copyright© by the Chinese Pharmaceutical Association.
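
    The frequency and combination counting at the core of such prescription-rule mining can be illustrated in a few lines; the herb names and prescriptions below are placeholders, and the grey screening and multivariate steps of the paper are not reproduced.

      # Illustrative only: toy prescriptions with placeholder herb names.
      from collections import Counter
      from itertools import combinations

      prescriptions = [
          ["Dangshen", "Baizhu", "Fuling", "Gancao"],
          ["Dangshen", "Baizhu", "Chenpi"],
          ["Huangqi", "Baizhu", "Fuling"],
      ]

      herb_freq = Counter(h for p in prescriptions for h in p)
      pair_freq = Counter(frozenset(c)
                          for p in prescriptions
                          for c in combinations(sorted(set(p)), 2))

      print(herb_freq.most_common(3))   # most frequently used herbs
      print(pair_freq.most_common(3))   # most common herb pairs (candidate core combinations)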

  10. [Construction of research system for processing mechanism of traditional Chinese medicine based on chemical composition transformation combined with intestinal absorption barrier].

    PubMed

    Sun, E; Xu, Feng-Juan; Zhang, Zhen-Hai; Wei, Ying-Jie; Tan, Xiao-Bin; Cheng, Xu-Dong; Jia, Xiao-Bin

    2014-02-01

    Based on many years of work on the processing mechanism of Epimedium and on integrated multidisciplinary theory and technology, this paper initially constructs a research system for the processing mechanism of traditional Chinese medicine based on chemical composition transformation combined with the intestinal absorption barrier, forming an innovative research mode of "chemical composition changes-biological transformation-metabolism in vitro and in vivo-intestinal absorption-pharmacokinetics combined with pharmacodynamics-pharmacodynamic mechanism". Combined with specific examples of the processing mechanism of Epimedium and other Chinese herbal medicines, the paper also discusses the academic ideas, research methods and key technologies of this research system, which will help to systematically reveal the modern scientific connotation of traditional Chinese medicine processing and enrich the theory of Chinese herbal medicine processing.

  11. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    PubMed

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org ). Theory mapping provides both precision and synthesis, which helps to resolve arguments, prevent redundancies, assess the theoretical contribution of papers, and evaluate the likelihood of surprising effects.

  12. Veto player theory and reform making in Western Europe.

    PubMed

    Angelova, Mariyana; Bäck, Hanna; Müller, Wolfgang C; Strobl, Daniel

    2018-05-01

    Veto player theory generates predictions about governments' capacity for policy change. Due to the difficulty of identifying significant laws needed to change the policy status quo, evidence about governments' ability to change policy has been mostly provided for a limited number of reforms and single-country studies. To evaluate the predictive power of veto player theory for policy making across time, policy areas and countries, a dataset was gathered that incorporates about 5,600 important government reform measures in the areas of social, labour, economic and taxation policy undertaken in 13 Western European countries from the mid-1980s until the mid-2000s. Veto player theory is applied in a combined model with other central theoretical expectations on policy change derived from political economy (crisis-driven policy change) and partisan theory (ideology-driven policy change). Robust support is found that governments introduce more reform measures when economic conditions are poor and when the government is positioned further away from the policy status quo. No empirical support is found for predictions of veto player theory in its pure form, where no differentiation between government types is made. However, the findings provide support for the veto player theory in the special case of minimal winning cabinets, where the support of all government parties is sufficient (in contrast to minority cabinets) and necessary (in contrast to oversized cabinets) for policy change. In particular, it is found that in minimal winning cabinets the ideological distance between the extreme government parties significantly decreases the government's ability to introduce reforms. These findings improve our understanding of reform making in parliamentary democracies and highlight important issues and open questions for future applications and tests of the veto player theory.

  13. Veto player theory and reform making in Western Europe

    PubMed Central

    ANGELOVA, MARIYANA; BÄCK, HANNA; MÜLLER, WOLFGANG C.

    2017-01-01

    Abstract Veto player theory generates predictions about governments’ capacity for policy change. Due to the difficulty of identifying significant laws needed to change the policy status quo, evidence about governments’ ability to change policy has been mostly provided for a limited number of reforms and single‐country studies. To evaluate the predictive power of veto player theory for policy making across time, policy areas and countries, a dataset was gathered that incorporates about 5,600 important government reform measures in the areas of social, labour, economic and taxation policy undertaken in 13 Western European countries from the mid‐1980s until the mid‐2000s. Veto player theory is applied in a combined model with other central theoretical expectations on policy change derived from political economy (crisis‐driven policy change) and partisan theory (ideology‐driven policy change). Robust support is found that governments introduce more reform measures when economic conditions are poor and when the government is positioned further away from the policy status quo. No empirical support is found for predictions of veto player theory in its pure form, where no differentiation between government types is made. However, the findings provide support for the veto player theory in the special case of minimal winning cabinets, where the support of all government parties is sufficient (in contrast to minority cabinets) and necessary (in contrast to oversized cabinets) for policy change. In particular, it is found that in minimal winning cabinets the ideological distance between the extreme government parties significantly decreases the government's ability to introduce reforms. These findings improve our understanding of reform making in parliamentary democracies and highlight important issues and open questions for future applications and tests of the veto player theory. PMID:29695891

  14. An empirical test of Rogers' original and revised theory of correlates in adolescents.

    PubMed

    Yarcheski, A; Mahon, N E

    1991-12-01

    The purpose of this study was to examine Rogers' original and revised theory of correlates in adolescents. The correlates were measured by Perceived Field Motion, Human Field Rhythms, Creativity, Sentience, Fast Tempo, and Waking Periods. The original theory was tested with data obtained from samples of early (n = 116), middle (n = 116), and late (n = 116) adolescents. The revised theory was tested in a fourth selectively combined sample of adolescents, aged 12 to 21 (n = 89). Data were collected in classroom settings. Although the findings did not support either theory, they did indicate that: (1) four of the six correlates studied performed as correlates when examined in three discrete phases of adolescence, as determined by chronological age, (2) the means of the individual correlates increased slightly in frequency levels developmentally, and (3) the correlates emerged at different frequency levels when examined in adolescents, aged 12 to 21.

  15. Galaxy power spectrum in redshift space: Combining perturbation theory with the halo model

    NASA Astrophysics Data System (ADS)

    Okumura, Teppei; Hand, Nick; Seljak, Uroš; Vlah, Zvonimir; Desjacques, Vincent

    2015-11-01

    Theoretical modeling of the redshift-space power spectrum of galaxies is crucially important to correctly extract cosmological information from galaxy redshift surveys. The task is complicated by the nonlinear biasing and redshift space distortion (RSD) effects, which change with halo mass, and by the wide distribution of halo masses and their occupations by galaxies. One of the main modeling challenges is the existence of satellite galaxies that have both radial distribution inside the halos and large virial velocities inside halos, a phenomenon known as the Finger-of-God (FoG) effect. We present a model for the redshift-space power spectrum of galaxies in which we decompose a given galaxy sample into central and satellite galaxies and relate different contributions to the power spectrum to 1-halo and 2-halo terms in a halo model. Our primary goal is to ensure that any parameters that we introduce have physically meaningful values, and are not just fitting parameters. For the lowest order 2-halo terms we use the previously developed RSD modeling of halos in the context of distribution function and perturbation theory approach. This term needs to be multiplied by the effect of radial distances and velocities of satellites inside the halo. To this one needs to add the 1-halo terms, which are nonperturbative. We show that the real space 1-halo terms can be modeled as almost constant, with the finite extent of the satellites inside the halo inducing a small k2R2 term over the range of scales of interest, where R is related to the size of the halo given by its halo mass. We adopt a similar model for FoG in redshift space, ensuring that FoG velocity dispersion is related to the halo mass. For FoG k2 type expansions do not work over the range of scales of interest and FoG resummation must be used instead. We test several simple damping functions to model the velocity dispersion FoG effect. Applying the formalism to mock galaxies modeled after the "CMASS" sample of the
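
    A schematic of how such a decomposition might be assembled numerically is sketched below, assuming a toy 2-halo spectrum, an almost-constant 1-halo term with a small k2R2 correction, and two candidate FoG damping functions; every ingredient is illustrative rather than taken from the paper.

      # Illustrative only: p2h(k), the 1-halo constant, R, sigma_v and the damping
      # forms are toy inputs, not the calibrated model of the paper.
      import numpy as np

      def fog_gaussian(k, mu, sigma_v):
          return np.exp(-(k * mu * sigma_v) ** 2)          # one simple damping choice

      def fog_lorentzian(k, mu, sigma_v):
          return 1.0 / (1.0 + (k * mu * sigma_v) ** 2)     # another commonly tested form

      def pk_redshift(k, mu, p2h, p1h_const, R, sigma_v, damping=fog_gaussian):
          two_halo = damping(k, mu, sigma_v) * p2h(k)
          one_halo = p1h_const * (1.0 + (k * R) ** 2)      # nearly constant 1-halo term with a small k^2 R^2 correction (sign illustrative)
          return two_halo + one_halo

      k = np.linspace(0.01, 0.3, 30)
      p2h = lambda kk: 1.0e4 * (kk / 0.1) ** -1.5          # toy 2-halo spectrum
      print(pk_redshift(k, mu=0.5, p2h=p2h, p1h_const=500.0, R=1.5, sigma_v=4.0))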

  16. Galaxy power spectrum in redshift space: Combining perturbation theory with the halo model

    DOE PAGES

    Okumura, Teppei; Hand, Nick; Seljak, Uros; ...

    2015-11-19

    Theoretical modeling of the redshift-space power spectrum of galaxies is crucially important to correctly extract cosmological information from galaxy redshift surveys. The task is complicated by the nonlinear biasing and redshift space distortion (RSD) effects, which change with halo mass, and by the wide distribution of halo masses and their occupations by galaxies. One of the main modeling challenges is the existence of satellite galaxies that have both radial distribution inside the halos and large virial velocities inside halos, a phenomenon known as the Finger-of-God (FoG) effect. We present a model for the redshift-space power spectrum of galaxies in which we decompose a given galaxy sample into central and satellite galaxies and relate different contributions to the power spectrum to 1-halo and 2-halo terms in a halo model. Our primary goal is to ensure that any parameters that we introduce have physically meaningful values, and are not just fitting parameters. For the lowest order 2-halo terms we use the previously developed RSD modeling of halos in the context of distribution function and perturbation theory approach. This term needs to be multiplied by the effect of radial distances and velocities of satellites inside the halo. To this one needs to add the 1-halo terms, which are nonperturbative. We show that the real space 1-halo terms can be modeled as almost constant, with the finite extent of the satellites inside the halo inducing a small k²R² term over the range of scales of interest, where R is related to the size of the halo given by its halo mass. Furthermore, we adopt a similar model for FoG in redshift space, ensuring that FoG velocity dispersion is related to the halo mass. For FoG k² type expansions do not work over the range of scales of interest and FoG resummation must be used instead. We test several simple damping functions to model the velocity dispersion FoG effect. Applying the formalism to mock galaxies modeled after the

  17. Expanding resource theory and feminist-informed theory to explain intimate partner violence perpetration by court-ordered men.

    PubMed

    Basile, Kathleen C; Hall, Jeffrey E; Walters, Mikel L

    2013-07-01

    This study tested resource and feminist-informed theories to explain physical, sexual, psychological, and stalking intimate partner violence (IPV) perpetrated by court-mandated men. Data were obtained from 340 men arrested for physical assault of a partner before their court-ordered treatment. Using path analysis, findings provided partial support for each model. Ineffective arguing and substance-use problems were moderators of resources and perpetration. Dominance mediated early exposures and perpetration in the feminist-informed model. In both models, predictors of stalking were different than those for other types of perpetration. Future studies should replicate this research and determine the utility of combining models.

  18. Changing theories of change: strategic shifting in implicit theory endorsement.

    PubMed

    Leith, Scott A; Ward, Cindy L P; Giacomin, Miranda; Landau, Enoch S; Ehrlinger, Joyce; Wilson, Anne E

    2014-10-01

    People differ in their implicit theories about the malleability of characteristics such as intelligence and personality. These relatively chronic theories can be experimentally altered, and can be affected by parent or teacher feedback. Little is known about whether people might selectively shift their implicit beliefs in response to salient situational goals. We predicted that, when motivated to reach a desired conclusion, people might subtly shift their implicit theories of change and stability to garner supporting evidence for their desired position. Any motivated context in which a particular lay theory would help people to reach a preferred directional conclusion could elicit shifts in theory endorsement. We examine a variety of motivated situational contexts across 7 studies, finding that people's theories of change shifted in line with goals to protect self and liked others and to cast aspersions on disliked others. Studies 1-3 demonstrate how people regulate their implicit theories to manage self-view by more strongly endorsing an incremental theory after threatening performance feedback or memories of failure. Studies 4-6 revealed that people regulate the implicit theories they hold about favored and reviled political candidates, endorsing an incremental theory to forgive preferred candidates for past gaffes but leaning toward an entity theory to ensure past failings "stick" to opponents. Finally, in Study 7, people who were most threatened by a previously convicted child sex offender (i.e., parents reading about the offender moving to their neighborhood) gravitated most to the entity view that others do not change. Although chronic implicit theories are undoubtedly meaningful, this research reveals a previously unexplored source of fluidity by highlighting the active role people play in managing their implicit theories in response to goals. 2014 APA, all rights reserved

  19. Toda theories as contractions of affine Toda theories

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, A.; Khorrami, M.; Shariati, A.

    1996-02-01

    Using a contraction procedure, we obtain Toda theories and their structures, from affine Toda theories and their corresponding structures. By structures, we mean the equation of motion, the classical Lax pair, the boundary term for half line theories, and the quantum transfer matrix. The Lax pair and the transfer matrix so obtained, depend nontrivially on the spectral parameter.

  20. Postbuckling behaviors of nanorods including the effects of nonlocal elasticity theory and surface stress

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thongyothee, Chawis, E-mail: chawist@hotmail.com; Chucheepsakul, Somchai

    2013-12-28

    This paper is concerned with postbuckling behaviors of nanorods subjected to an end concentrated load. One end of the nanorod is clamped while the other end is fixed to a support that can slide in the slot. The governing equation is developed from static equilibrium and geometrical conditions by using the exact curvature corresponding to the elastica theory. The nonlocal elasticity, the effect of surface stress, and their combined effects are taken into account in Euler–Bernoulli beam theory. Differential equations in this problem can be solved numerically by using the shooting-optimization technique for the postbuckling loads and the buckled configurations. The results show that nanorods with the nonlocal elasticity effect undergo increasingly large deformation while the effect of surface stress in combination with nonlocal elasticity decreases the deflection of nanorods under the same postbuckling load.
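
    The shooting idea can be illustrated on a toy elastica-type boundary value problem; the sketch below is not the nanorod model of the paper (it contains no nonlocal-elasticity or surface-stress terms), and the load parameter and root bracket are assumptions chosen so that a nontrivial postbuckled solution exists.

      # Illustrative shooting method for theta'' = -alpha*sin(theta), theta(0)=0,
      # with the moment-free condition theta'(1)=0 enforced by adjusting theta'(0).
      import numpy as np
      from scipy.integrate import solve_ivp
      from scipy.optimize import brentq

      ALPHA = 4.0   # dimensionless load parameter (above the linear buckling load)

      def end_slope(m):
          """Integrate from s=0 with unknown initial slope m and return theta'(1)."""
          sol = solve_ivp(lambda s, y: [y[1], -ALPHA * np.sin(y[0])],
                          (0.0, 1.0), [0.0, m], rtol=1e-8, atol=1e-10)
          return sol.y[1, -1]

      m_star = brentq(end_slope, 0.1, 5.0)   # nontrivial root -> postbuckled shape
      print("postbuckled initial slope:", m_star)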

  1. Generalizability Theory and Classical Test Theory

    ERIC Educational Resources Information Center

    Brennan, Robert L.

    2011-01-01

    Broadly conceived, reliability involves quantifying the consistencies and inconsistencies in observed scores. Generalizability theory, or G theory, is particularly well suited to addressing such matters in that it enables an investigator to quantify and distinguish the sources of inconsistencies in observed scores that arise, or could arise, over…

  2. Combining extrapolation with ghost interaction correction in range-separated ensemble density functional theory for excited states

    NASA Astrophysics Data System (ADS)

    Alam, Md. Mehboob; Deur, Killian; Knecht, Stefan; Fromager, Emmanuel

    2017-11-01

    The extrapolation technique of Savin [J. Chem. Phys. 140, 18A509 (2014)], which was initially applied to range-separated ground-state-density-functional Hamiltonians, is adapted in this work to ghost-interaction-corrected (GIC) range-separated ensemble density-functional theory (eDFT) for excited states. While standard extrapolations rely on energies that decay as μ⁻² in the large range-separation-parameter μ limit, we show analytically that (approximate) range-separated GIC ensemble energies converge more rapidly (as μ⁻³) towards their pure wavefunction theory values (μ → +∞ limit), thus requiring a different extrapolation correction. The purpose of such a correction is to further improve on the convergence and, consequently, to obtain more accurate excitation energies for a finite (and, in practice, relatively small) μ value. As a proof of concept, we apply the extrapolation method to He and small molecular systems (viz., H2, HeH+, and LiH), thus considering different types of excitations such as Rydberg, charge transfer, and double excitations. Potential energy profiles of the first three and four singlet Σ+ excitation energies in HeH+ and H2, respectively, are studied with a particular focus on avoided crossings for the latter. Finally, the extraction of individual state energies from the ensemble energy is discussed in the context of range-separated eDFT, as a perspective.
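
    The two-point extrapolation implied by the quoted μ⁻³ decay is simple to state: if E(μ) ≈ E∞ + Aμ⁻³ at large μ, two ensemble energies at μ1 < μ2 determine E∞ by eliminating A. The sketch below is just this algebra; the numerical values are placeholders, not results from the paper.

      # Minimal sketch of the large-mu extrapolation, assuming E(mu) ~ E_inf + A/mu**3.
      def extrapolate_mu3(mu1, e1, mu2, e2):
          return (mu2**3 * e2 - mu1**3 * e1) / (mu2**3 - mu1**3)

      print(extrapolate_mu3(0.5, -2.8500, 1.0, -2.9030))   # estimate of the mu -> infinity limit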

  3. Toward a Unified Sub-symbolic Computational Theory of Cognition

    PubMed Central

    Butz, Martin V.

    2016-01-01

    This paper proposes how various disciplinary theories of cognition may be combined into a unifying, sub-symbolic, computational theory of cognition. The following theories are considered for integration: psychological theories, including the theory of event coding, event segmentation theory, the theory of anticipatory behavioral control, and concept development; artificial intelligence and machine learning theories, including reinforcement learning and generative artificial neural networks; and theories from theoretical and computational neuroscience, including predictive coding and free energy-based inference. In the light of such a potential unification, it is discussed how abstract cognitive, conceptualized knowledge and understanding may be learned from actively gathered sensorimotor experiences. The unification rests on the free energy-based inference principle, which essentially implies that the brain builds a predictive, generative model of its environment. Neural activity-oriented inference causes the continuous adaptation of the currently active predictive encodings. Neural structure-oriented inference causes the longer term adaptation of the developing generative model as a whole. Finally, active inference strives for maintaining internal homeostasis, causing goal-directed motor behavior. To learn abstract, hierarchical encodings, however, it is proposed that free energy-based inference needs to be enhanced with structural priors, which bias cognitive development toward the formation of particular, behaviorally suitable encoding structures. As a result, it is hypothesized how abstract concepts can develop from, and thus how they are structured by and grounded in, sensorimotor experiences. Moreover, it is sketched-out how symbol-like thought can be generated by a temporarily active set of predictive encodings, which constitute a distributed neural attractor in the form of an interactive free-energy minimum. The activated, interactive network attractor

  4. Ab Initio Theory of Nuclear Magnetic Resonance Shifts in Metals

    NASA Astrophysics Data System (ADS)

    D'Avezac, Mayeul; Marzari, Nicola; Mauri, Francesco

    2005-03-01

    A comprehensive approach for the first-principles determination of all-electron NMR shifts in metallic systems is presented. Our formulation is based on a combination of density-functional perturbation theory and all-electron wavefunction reconstruction, starting from periodic-boundary calculations in the pseudopotential approximation. The orbital contribution to the NMR shift (the chemical shift) is obtained by combining the gauge-including projector augmented-wave approach (GIPAW), originally developed for the case of insulators [C. J. Pickard and F. Mauri, Phys. Rev. B 63, 245101 (2001)], with the extension of linear-response theory to the case of metallic systems [S. de Gironcoli, Phys. Rev. B 51, 6773 (1995)]. The spin contribution (the Knight shift) is obtained as a response to a finite uniform magnetic field, and through reconstructing the hyperfine interaction between the electron-spin density and the nuclear spins with the projector augmented-wave method (PAW) [C. G. Van de Walle and P. E. Blöchl, Phys. Rev. B 47, 4244 (1993)]. Our method is validated with applications to the case of the homogeneous electron gas and of simple metals. (Work supported by MURI grant DAAD 19-03-1-0169 and MIT-France)

  5. Tidal Forces: A Different Theory

    NASA Astrophysics Data System (ADS)

    Masters, Roy

    2010-10-01

    We revisit the theories describing the moon raising the tides by virtue of pull gravity combined with the moon's centripetal angular momentum. We show that if gravity is considered as the attractive interaction between individual bodies, then the moon would have fallen to earth eons ago. Isaac Newton's laws of motion cannot work with pull gravity. However, they do with gravity as a property of the universe as Einstein said with a huge energy bonus. In other words, the moon-Earth system becomes the first observable vacuum gravity energy machine, meaning that it not only produces energy, but provides also escape momentum for the moon's centripetal motion at 4cm per year.

  6. Kinetic theory for strongly coupled Coulomb systems

    NASA Astrophysics Data System (ADS)

    Dufty, James; Wrighton, Jeffrey

    2018-01-01

    The calculation of dynamical properties for matter under extreme conditions is a challenging task. The popular Kubo-Greenwood model exploits elements from equilibrium density-functional theory (DFT) that allow a detailed treatment of electron correlations, but its origin is largely phenomenological; traditional kinetic theories have a more secure foundation but are limited to weak ion-electron interactions. The objective here is to show how a combination of the two evolves naturally from the short-time limit for the generator of the effective single-electron dynamics governing time correlation functions without such limitations. This provides a theoretical context for the current DFT-related approach, the Kubo-Greenwood model, while showing the nature of its corrections. The method is to calculate the short-time dynamics in the single-electron subspace for a given configuration of the ions. This differs from the usual kinetic theory approach in which an average over the ions is performed as well. In this way the effective ion-electron interaction includes strong Coulomb coupling and is shown to be determined from DFT. The correlation functions have the form of the random-phase approximation for an inhomogeneous system but with renormalized ion-electron and electron-electron potentials. The dynamic structure function, density response function, and electrical conductivity are calculated as examples. The static local field corrections in the dielectric function are identified in this way. The current analysis is limited to semiclassical electrons (quantum statistical potentials), so important quantum conditions are excluded. However, a quantization of the kinetic theory is identified for broader application while awaiting its detailed derivation.

  7. Blogs, Tweets, and Protests: Learning Movement Theory through Online Case Studies

    ERIC Educational Resources Information Center

    Muñoz, José A.; Culton, Kenneth R.

    2016-01-01

    This article takes the practical inquiry model as an approach to designing a course on social movements that combines self-directed investigation and group discussion as an avenue for deep learning. For the purpose of developing a case study, a guided approach is provided that allows the students to explore theory on their own and make connections…

  8. Practice Makes Perfect: Engaging Student-Citizens in Politics through Theory and Practice

    ERIC Educational Resources Information Center

    Csajko, Karen; Lindaman, Kara

    2011-01-01

    We study one aspect of the relationship between theory and politics, in order to begin to address this issue of political science education--specifically focusing on whether participation in the election process as voting monitors, combined with political science education, can help students better understand politics as democratic engagement. In…

  9. G-theory: The generator of M-theory and supersymmetry

    NASA Astrophysics Data System (ADS)

    Sepehri, Alireza; Pincak, Richard

    2018-04-01

    In string theory with ten dimensions, all Dp-branes are constructed from D0-branes whose action has two-dimensional brackets of Lie 2-algebra. Also, in M-theory, with 11 dimensions, all Mp-branes are built from M0-branes whose action contains three-dimensional brackets of Lie 3-algebra. In these theories, the reason for difference between bosons and fermions is unclear and especially in M-theory there is not any stable object like stable M3-branes on which our universe would be formed on it and for this reason it cannot help us to explain cosmological events. For this reason, we construct G-theory with M dimensions whose branes are formed from G0-branes with N-dimensional brackets. In this theory, we assume that at the beginning there is nothing. Then, two energies, which differ in their signs only, emerge and produce 2M degrees of freedom. Each two degrees of freedom create a new dimension and then M dimensions emerge. M-N of these degrees of freedom are removed by symmetrically compacting half of M-N dimensions to produce Lie-N-algebra. In fact, each dimension produces a degree of freedom. Consequently, by compacting M-N dimensions from M dimensions, N dimensions and N degrees of freedom is emerged. These N degrees of freedoms produce Lie-N-algebra. During this compactification, some dimensions take extra i and are different from other dimensions, which are known as time coordinates. By this compactification, two types of branes, Gp and anti-Gp-branes, are produced and rank of tensor fields which live on them changes from zero to dimension of brane. The number of time coordinates, which are produced by negative energy in anti-Gp-branes, is more sensible to number of times in Gp-branes. These branes are compactified anti-symmetrically and then fermionic superpartners of bosonic fields emerge and supersymmetry is born. Some of gauge fields play the role of graviton and gravitino and produce the supergravity. The question may arise that what is the physical reason

  10. Sculpting the Immunological Response against Viral Disease: Statistical Mechanics and Network Theory

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Deem, Michael

    2007-03-01

    The twin challenges of immunodominance and heterologous immunity have hampered discovery of an effective vaccine against all four dengue viruses. Here we develop a generalized NK, or spin glass, theory of T cell original antigenic sin and immunodominance. The theory we develop predicts dengue vaccine clinical trial data well. From the insights that we gain by this theory, we propose two new ideas for design of epitope-based T cell vaccines against dengue. The H5N1 strain of avian influenza first appeared in Hong Kong in 1997. Since then, it has spread to at least eight other Asian countries, Romania, and Russia, and it is widely expected to enter the rest of Europe through migratory birds. Various countries around the world have started to create stockpiles of avian influenza vaccines. However, since the avian influenza is mutating, how many and which strains should be stockpiled? Here we use a combination of statistical physics and network theory to simulate the bird flu transmission and evolution. From the insights that we gain by the theory, we propose new strategies to improve the vaccine efficacy.

  11. Displacement Theories for In-Flight Deformed Shape Predictions of Aerospace Structures

    NASA Technical Reports Server (NTRS)

    Ko, William L.; Richards, W. L.; Tran, Van t.

    2007-01-01

    Displacement theories are developed for a variety of structures with the goal of providing real-time shape predictions for aerospace vehicles during flight. These theories are initially developed for a cantilever beam to predict the deformed shapes of the Helios flying wing. The main structural configuration of the Helios wing is a cantilever wing tubular spar subjected to bending, torsion, and combined bending and torsion loading. The displacement equations that are formulated are expressed in terms of strains measured at multiple sensing stations equally spaced on the surface of the wing spar. Displacement theories for other structures, such as tapered cantilever beams, two-point supported beams, wing boxes, and plates also are developed. The accuracy of the displacement theories is successfully validated by finite-element analysis and classical beam theory using input-strains generated by finite-element analysis. The displacement equations and associated strain-sensing system (such as fiber optic sensors) create a powerful means for in-flight deformation monitoring of aerospace structures. This method serves multiple purposes for structural shape sensing, loads monitoring, and structural health monitoring. Ultimately, the calculated displacement data can be visually displayed to the ground-based pilot or used as input to the control system to actively control the shape of structures during flight.
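
    The underlying idea of strain-based shape sensing can be sketched generically: surface bending strain at a known distance c from the neutral axis gives the curvature, and two numerical integrations with clamped-root conditions recover slope and deflection. The sketch below is this generic textbook construction, not the specific Ko displacement equations, and the strain profile is invented.

      # Illustrative only: the strain profile, span and half-depth c are placeholders.
      import numpy as np

      def deflection_from_strain(x, eps, c):
          kappa = eps / c                                   # curvature along the span
          slope = np.concatenate(([0.0],
                   np.cumsum(0.5 * (kappa[1:] + kappa[:-1]) * np.diff(x))))   # first integration, zero slope at root
          defl = np.concatenate(([0.0],
                  np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))    # second integration, zero deflection at root
          return defl

      x = np.linspace(0.0, 10.0, 11)                        # strain-sensing stations
      eps = 1e-4 * (1.0 - x / 10.0)                         # toy linearly decreasing strain
      print(deflection_from_strain(x, eps, c=0.05))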

  12. Search Algorithms as a Framework for the Optimization of Drug Combinations

    PubMed Central

    Coquin, Laurence; Schofield, Jennifer; Feala, Jacob D.; Reed, John C.; McCulloch, Andrew D.; Paternostro, Giovanni

    2008-01-01

    Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but presently they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms—originally developed for digital communication—modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs using only one-third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6–9 interventions in 80–90% of tests, compared with 15–30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution. PMID:19112483
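
    A minimal sketch of the search-driven idea follows, using a plain greedy local search over discrete dose levels and a synthetic assay function as a stand-in for the biological experiments; it is not the communication-theory algorithm used by the authors.

      # Illustrative only: assay() is a synthetic response surface, and the greedy
      # neighbourhood search is a generic stand-in for the paper's algorithms.
      import itertools
      import random

      DRUGS, LEVELS = 4, 3                      # four drugs, dose levels 0..2

      def assay(combo):                          # synthetic noisy response (placeholder)
          target = (2, 0, 1, 2)
          return -sum((a - b) ** 2 for a, b in zip(combo, target)) + random.gauss(0, 0.1)

      def greedy_search(start, rounds=10):
          best, best_y = tuple(start), assay(start)
          for _ in range(rounds):
              improved = False
              for i, delta in itertools.product(range(DRUGS), (-1, 1)):
                  cand = list(best)
                  cand[i] += delta
                  if 0 <= cand[i] < LEVELS:
                      y = assay(cand)
                      if y > best_y:
                          best, best_y, improved = tuple(cand), y, True
              if not improved:
                  break
          return best, best_y

      print(greedy_search([0, 0, 0, 0]))   # best dose combination found and its response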

  13. Adaptive Disturbance Tracking Theory with State Estimation and State Feedback for Region II Control of Large Wind Turbines

    NASA Technical Reports Server (NTRS)

    Balas, Mark J.; Thapa Magar, Kaman S.; Frost, Susan A.

    2013-01-01

    A theory called Adaptive Disturbance Tracking Control (ADTC) is introduced and used to track the Tip Speed Ratio (TSR) of a 5 MW Horizontal Axis Wind Turbine (HAWT). Since ADTC theory requires wind speed information, a wind disturbance generator model is combined with a lower-order plant model to estimate the wind speed as well as partial states of the wind turbine. In this paper, we present a proof of stability and convergence of ADTC theory with the lower-order estimator and show that the state feedback can be adaptive.

  14. R A Fisher, design theory, and the Indian connection.

    PubMed

    Rau, A R P

    2009-09-01

    Design Theory, a branch of mathematics, was born out of the experimental statistics research of the population geneticist R A Fisher and of Indian mathematical statisticians in the 1930s. The field combines elements of combinatorics, finite projective geometries, Latin squares, and a variety of further mathematical structures, brought together in surprising ways. This essay will present these structures and ideas as well as how the field came together, in itself an interesting story.

  15. Minimum-domain impulse theory for unsteady aerodynamic force

    NASA Astrophysics Data System (ADS)

    Kang, L. L.; Liu, L. Q.; Su, W. D.; Wu, J. Z.

    2018-01-01

    We extend the impulse theory for unsteady aerodynamics from its classic global form to finite-domain formulation then to minimum-domain form and from incompressible to compressible flows. For incompressible flow, the minimum-domain impulse theory raises the finding of Li and Lu ["Force and power of flapping plates in a fluid," J. Fluid Mech. 712, 598-613 (2012)] to a theorem: The entire force with discrete wake is completely determined by only the time rate of impulse of those vortical structures still connecting to the body, along with the Lamb-vector integral thereof that captures the contribution of all the rest disconnected vortical structures. For compressible flows, we find that the global form in terms of the curl of momentum ∇ × (ρu), obtained by Huang [Unsteady Vortical Aerodynamics (Shanghai Jiaotong University Press, 1994)], can be generalized to having an arbitrary finite domain, but the formula is cumbersome and in general ∇ × (ρu) no longer has discrete structures and hence no minimum-domain theory exists. Nevertheless, as the measure of transverse process only, the unsteady field of vorticity ω or ρω may still have a discrete wake. This leads to a minimum-domain compressible vorticity-moment theory in terms of ρω (but it is beyond the classic concept of impulse). These new findings and applications have been confirmed by our numerical experiments. The results not only open an avenue to combine the theory with computation-experiment in wide applications but also reveal a physical truth that it is no longer necessary to account for all wake vortical structures in computing the force and moment.
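
    For orientation, the classic global form referred to above can be written, for incompressible flow of density ρ around a body in fluid at rest at infinity, as follows (recalled here from standard vortex-dynamics references, with n the spatial dimension; the finite-domain and minimum-domain formulas of the paper add the Lamb-vector and boundary contributions described in the abstract):

      \[
        \mathbf{F} \;=\; -\,\rho\,\frac{\mathrm{d}\mathbf{I}}{\mathrm{d}t},
        \qquad
        \mathbf{I} \;=\; \frac{1}{n-1}\int_{V_\infty} \mathbf{x}\times\boldsymbol{\omega}\,\mathrm{d}V .
      \]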

  16. Bi-Factor Multidimensional Item Response Theory Modeling for Subscores Estimation, Reliability, and Classification

    ERIC Educational Resources Information Center

    Md Desa, Zairul Nor Deana

    2012-01-01

    In recent years, there has been increasing interest in estimating and improving subscore reliability. In this study, the multidimensional item response theory (MIRT) and the bi-factor model were combined to estimate subscores, to obtain subscores reliability, and subscores classification. Both the compensatory and partially compensatory MIRT…

  17. Constraints on Einstein-aether theory after GW170817

    NASA Astrophysics Data System (ADS)

    Oost, Jacob; Mukohyama, Shinji; Wang, Anzhong

    2018-06-01

    In this paper, we carry out a systematic analysis of the theoretical and observational constraints on the dimensionless coupling constants ci (i = 1, 2, 3, 4) of the Einstein-aether theory, taking into account the events GW170817 and GRB 170817A. The combination of these events restricts the deviation of the speed cT of the spin-2 graviton from the speed of light to the range -3 × 10⁻¹⁵ < cT - 1 < 7 × 10⁻¹⁶, which in the Einstein-aether theory implies |c13| ≤ 10⁻¹⁵, with cij ≡ ci + cj. The rest of the constraints are divided into two groups: those on the (c1, c14)-plane and those on the (c2, c14)-plane, except the strong-field constraints. The latter depend on the sensitivities σæ of neutron stars, which are not known at present in the new ranges of the parameters found in this paper.

  18. A cellular automaton model for evacuation flow using game theory

    NASA Astrophysics Data System (ADS)

    Guan, Junbiao; Wang, Kaihua; Chen, Fangyue

    2016-11-01

    Game theory serves as a good tool to explore crowd dynamic conflicts during evacuation processes. The purpose of this study is to simulate the complicated interaction behavior among the conflicting pedestrians in an evacuation flow. Two types of pedestrians, namely, defectors and cooperators, are considered, and two important factors including fear index and cost coefficient are taken into account. By combining the snowdrift game theory with a cellular automaton (CA) model, it is shown that the increase of fear index and cost coefficient will lengthen the evacuation time, which is more apparent for large values of cost coefficient. Meanwhile, it is found that the defectors to cooperators ratio could always tend to consistent states despite different values of parameters, largely owing to self-organization effects.
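
    One way such a conflict step might look in code is sketched below: competing pedestrians play a snowdrift game parameterized by a cost coefficient r, and a fear index scales how strongly the payoffs bias who obtains the contested cell. The payoff table and selection rule are illustrative assumptions, not the authors' exact model.

      # Illustrative only: the snowdrift payoff parameterization and the exponential
      # weighting by the fear index are assumptions made for this sketch.
      import math
      import random

      def snowdrift_payoff(me, other, r):
          """Strategies are 'C' (cooperator) or 'D' (defector); r is the cost coefficient."""
          table = {("C", "C"): 1.0, ("C", "D"): 1.0 - r,
                   ("D", "C"): 1.0 + r, ("D", "D"): 0.0}
          return table[(me, other)]

      def resolve_conflict(strategies, r, fear):
          """Return the index of the pedestrian who wins the contested cell, chosen with
          probability proportional to exp(fear * average payoff against the others)."""
          weights = []
          for i, s in enumerate(strategies):
              others = [t for j, t in enumerate(strategies) if j != i]
              avg = sum(snowdrift_payoff(s, t, r) for t in others) / len(others)
              weights.append(math.exp(fear * avg))
          pick = random.uniform(0.0, sum(weights))
          acc = 0.0
          for i, w in enumerate(weights):
              acc += w
              if pick <= acc:
                  return i
          return len(weights) - 1

      print(resolve_conflict(["D", "C", "C"], r=0.6, fear=1.5))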

  19. Game theory.

    PubMed

    Dufwenberg, Martin

    2011-03-01

    Game theory is a toolkit for examining situations where decision makers influence each other. I discuss the nature of game-theoretic analysis, the history of game theory, why game theory is useful for understanding human psychology, and why game theory has played a key role in the recent explosion of interest in the field of behavioral economics. WIREs Cogn Sci 2011 2 167-173 DOI: 10.1002/wcs.119 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.

  20. Systemic Thinking in Career Development Theory: Contributions of the Systems Theory Framework

    ERIC Educational Resources Information Center

    McMahon, Mary; Patton, Wendy

    2018-01-01

    This article considers systemic thinking in relation to the Systems Theory Framework (STF) and to career theory. An overview of systems theory and its applications is followed by a discussion of career theory to provide a context for the subsequent description of STF. The contributions of STF to career theory and to theory integration are…

  1. A company I can trust? Organizational lay theories moderate stereotype threat for women.

    PubMed

    Emerson, Katherine T U; Murphy, Mary C

    2015-02-01

    Women remain under-represented in the leadership of corporate America. According to stereotype threat theory, this under-representation may persist because women are concerned about being stereotyped in business settings. Three studies investigated whether an entity (fixed), compared with an incremental (malleable), organizational lay theory is threatening for women evaluating a consulting company. Men and women viewed a company mission statement or website containing an entity or incremental theory. Results revealed that women--more so than men--trusted the entity company less than the incremental company. Furthermore, only women's mistrust of the entity company was driven by their expectations about being stereotyped by its management. Notably, when combined with high or low representations of female employees, only organizational lay theories predicted trust. Finally, people's--particularly women's--mistrust of the entity company led them to disengage more before interacting with a representative. Implications for women's experiences and outcomes in workplace settings are discussed. © 2014 by the Society for Personality and Social Psychology, Inc.

  2. Theory for Explaining and Comparing the Dynamics of Education in Transitional Processes

    ERIC Educational Resources Information Center

    van der Walt, Johannes L.

    2016-01-01

    Countries all over the world find themselves in the throes of revolution, change, transition or transformation. Because of the complexities of these momentous events, it is no simple matter to describe and evaluate them. This paper suggests that comparative educationists apply a combination of three theories as a lens through which such national…

  3. Person-Environment Fit Theory and Organizations: Commensurate Dimensions, Time Perspectives, and Mechanisms.

    ERIC Educational Resources Information Center

    Caplan, Robert D.

    1987-01-01

    Describes person-environment (PE) theory, pertinent studies and experiments in improving PE fit, advocating research on role of past, present, and anticipated PE fit on well-being and employee behavior; outcomes when PE fit is changed by altering P, E, or some combination; and considering the agent of change. Emphasizes systemic properties of…

  4. A conceptual care model for individualized care approach in cardiac rehabilitation--combining both illness representation and self-efficacy.

    PubMed

    Lau-Walker, Margaret

    2006-02-01

    This paper analyses the two prominent psychological theories of patient response--illness representation and self-efficacy--and explores the possibility of developing a conceptual individualized care model that makes use of both theories. Analysis of the literature established common themes that were used as the basis for a conceptual framework intended to assist in the joint application of these theories to therapeutic settings. Both theories emphasize personal experience, pre-construction of self, and individual response to illness and treatment, and both hold that patients' beliefs are more influential in their recovery than the severity of the illness. Where the theories are most divergent is in their application to therapeutic interventions, which reflects the different sources of influence that each theory emphasizes. Based on their similarities and differences, it is possible to integrate the two theories into a conceptual care model. The Interactive Care Model combines both theories of patient response and provides an explicit framework for further research into the design of effective therapeutic interventions in rehabilitation care.

  5. Combining item response theory with multiple imputation to equate health assessment questionnaires.

    PubMed

    Gu, Chenyang; Gutman, Roee

    2017-09-01

    The assessment of patients' functional status across the continuum of care requires a common patient assessment tool. However, assessment tools that are used in various health care settings differ and cannot be easily contrasted. For example, the Functional Independence Measure (FIM) is used to evaluate the functional status of patients who stay in inpatient rehabilitation facilities, the Minimum Data Set (MDS) is collected for all patients who stay in skilled nursing facilities, and the Outcome and Assessment Information Set (OASIS) is collected if they choose home health care provided by home health agencies. All three instruments or questionnaires include functional status items, but the specific items, rating scales, and instructions for scoring different activities vary between the different settings. We consider equating different health assessment questionnaires as a missing data problem, and propose a variant of predictive mean matching method that relies on Item Response Theory (IRT) models to impute unmeasured item responses. Using real data sets, we simulated missing measurements and compared our proposed approach to existing methods for missing data imputation. We show that, for all of the estimands considered, and in most of the experimental conditions that were examined, the proposed approach provides valid inferences, and generally has better coverages, relatively smaller biases, and shorter interval estimates. The proposed method is further illustrated using a real data set. © 2016, The International Biometric Society.
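
    As a rough illustration of the general idea, not the authors' IRT-based variant, the sketch below imputes a missing item response by predictive mean matching: fit a predictive model on complete cases, find the donors whose predicted values are closest to the recipient's prediction, and copy an observed value from a randomly chosen donor. The variable names and the simple linear predictor are hypothetical stand-ins for the IRT-model predictions used in the paper.

      import numpy as np

      rng = np.random.default_rng(1)

      def pmm_impute(x_obs, y_obs, x_mis, k=5):
          """Predictive mean matching with a simple linear predictor.

          x_obs, y_obs : covariate and outcome for complete cases
          x_mis        : covariate for cases whose outcome is missing
          k            : number of nearest donors to sample from
          """
          # Fit y ≈ a + b*x on the complete cases (stand-in for an IRT prediction).
          X = np.column_stack([np.ones_like(x_obs), x_obs])
          beta, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
          pred_obs = X @ beta
          pred_mis = np.column_stack([np.ones_like(x_mis), x_mis]) @ beta

          imputed = []
          for p in pred_mis:
              donors = np.argsort(np.abs(pred_obs - p))[:k]   # k closest predicted means
              imputed.append(y_obs[rng.choice(donors)])       # copy an observed donor value
          return np.array(imputed)

      # Hypothetical example: a summary score x predicts an ordinal item response y (0-4).
      x_obs = rng.normal(size=200)
      y_obs = np.clip(np.round(2 + 1.2 * x_obs + rng.normal(scale=0.7, size=200)), 0, 4)
      x_mis = rng.normal(size=5)
      print(pmm_impute(x_obs, y_obs, x_mis))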

  6. "Fathers" and "sons" of theories in cell physiology: the membrane theory.

    PubMed

    Matveev, V V; Wheatley, D N

    2005-12-16

    The last 50 years in the history of the life sciences are remarkable for a new and important feature that looks like a great threat to their future. The profound specialization that dominates quickly developing fields of science is causing a crisis of the scientific method. The essence of the method is the unity of two elements: the experimental data and the theory that explains them. Classically, the "fathers" of science were the creators of new ideas and theories. They were the true experts on their own theories. Only they had the right to say: "I am the theory". In other words, they were the carriers of theories, of theoretical knowledge. The fathers provided the necessary logical integrity to their theories, since theories in biology still cannot be based on strict mathematical proofs. This is not true of the sons. As a result of massive specialization, modern experts operate in very narrow, confined spaces. They formulate particular rules far removed from the level of theory. The main theories of science are known to them only at the textbook level. Nowadays, nobody can say: "I am the theory". With whom, then, is it possible to discuss science today on a broader theoretical level? How can a classical theory--for example, the membrane theory--be changed or even disproved under these conditions? How can the "sons", with their narrow education, catch sight of the membrane theory's defects? As a result, "global" theories have few critics and little control. Due to specialization, we have lost the ability to work at the experimental level of biology within the correct or appropriate theoretical context. The scientific method in its classic form is now being rapidly eroded. A good case can be made for the "Membrane Theory", to which we will largely refer throughout this article.

  7. Critical Theory: Implications for School Leadership Theory and Practice.

    ERIC Educational Resources Information Center

    Peca, Kathy

    The school leader's behaviors are inspired by theories, and theories are intrinsic to practice. This paper provides an overview of an emerging perspective in educational administration, critical theory. The paper first highlights the philosophies of Immanuel Kant, Fichte, Hegel, Marx, and the Frankfurt School. It then discusses critical theory…

  8. NORMATIVE ACCOUNTING THEORY AND THE THEORY OF DECISION,

    DTIC Science & Technology

    The paper discusses an approach to the construction of normative accounting theory with respect to both methodology and substance. The method of postulation and deduction is outlined, with particular emphasis on its role in the social sciences in general and in accounting in particular. It is suggested that a formal link must be established between the (normative) theory of decision and accounting, and that rigorous (economic) theories of the…

  9. Using category theory to assess the relationship between consciousness and integrated information theory.

    PubMed

    Tsuchiya, Naotsugu; Taguchi, Shigeru; Saigo, Hayato

    2016-06-01

    One of the most mysterious phenomena in science is the nature of conscious experience. Due to its subjective nature, a reductionist approach is having a hard time in addressing some fundamental questions about consciousness. These questions are squarely and quantitatively tackled by a recently developed theoretical framework, called integrated information theory (IIT) of consciousness. In particular, IIT proposes that a maximally irreducible conceptual structure (MICS) is identical to conscious experience. However, there has been no principled way to assess the claimed identity. Here, we propose to apply a mathematical formalism, category theory, to assess the proposed identity and suggest that it is important to consider if there exists a proper translation between the domain of conscious experience and that of the MICS. If such translation exists, we postulate that questions in one domain can be answered in the other domain; very difficult questions in the domain of consciousness can be resolved in the domain of mathematics. We claim that it is possible to empirically test if such a functor exists, by using a combination of neuroscientific and computational approaches. Our general, principled and empirical framework allows us to assess the relationship between the domain of consciousness and the domain of mathematical structures, including those suggested by IIT. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  10. Administrative Attribution Theory.

    ERIC Educational Resources Information Center

    Frasher, James M.; Frasher, Ramona S.

    1981-01-01

    Hypothesizes that the growing body of empirical data concerning attribution theory offers insight into the administrative process. To stimulate research to test this hypothesis, presents previous relevant research and a theory entitled Administrative Attribution Theory. Research questions applying the theory to educational administration are…

  11. A combination strategy for tracking the serial criminal

    NASA Astrophysics Data System (ADS)

    He, Chuan; Zhang, Yuan-Biao; Wan, Jiadi; Yu, Wenjing

    2010-08-01

    We build a Geographic Profiling Model to generate the criminal's geographical profile by combining two complementary strategies: the Spatial Distribution Strategy and the Probability Distance Strategy. In the first strategy, we designate the mean of all the known crime sites as the anchor point and build a Standard Deviational Ellipse Model that considers the effect of the landscape. In the second strategy, we take factors such as the buffer zone and distance-decay theory into consideration and calculate the probability of the offender's residence lying in a certain area by using Bayes' theorem and the Rossmo algorithm. We then combine the results of the two strategies to obtain three search areas suited to different operational conditions of the police for tracking a serial criminal. Applying the model to the case of the English serial killer Peter Sutcliffe, the calculation results show that the model can be used effectively to track a serial criminal.
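
    For the distance-decay part of the second strategy, the Rossmo score assigns each grid cell a value summed over crime sites, with a buffer zone of radius B around each site inside which the score rises rather than falls. The sketch below is a minimal version of that score with hypothetical crime coordinates and illustrative parameter values f, g, and B; the Bayesian combination with the standard deviational ellipse is not reproduced.

      import numpy as np

      def rossmo_score(grid_x, grid_y, crimes, B=2.0, f=1.2, g=1.2):
          """Rossmo distance-decay score on a grid (unnormalized, Manhattan metric).

          Outside the buffer (d > B) the score decays as 1/d**f; inside it,
          it grows as B**(g-f) / (2B - d)**g, reflecting the buffer-zone idea.
          """
          score = np.zeros_like(grid_x, dtype=float)
          for cx, cy in crimes:
              d = np.abs(grid_x - cx) + np.abs(grid_y - cy)    # Manhattan distance
              d = np.maximum(d, 1e-6)                          # avoid division by zero
              outside = d > B
              score += np.where(outside,
                                1.0 / d ** f,
                                B ** (g - f) / (2 * B - d) ** g)
          return score

      # Hypothetical crime sites on a 50 x 50 search grid.
      gx, gy = np.meshgrid(np.arange(50.0), np.arange(50.0))
      crimes = [(12, 30), (18, 26), (15, 35), (22, 31)]
      score = rossmo_score(gx, gy, crimes)
      peak = np.unravel_index(np.argmax(score), score.shape)
      print("highest-scoring cell (row, col):", peak)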

  12. Contrasting Ohlsson's Resubsumption Theory with Chi's Categorical Shift Theory

    ERIC Educational Resources Information Center

    Chi, Michelene T. H.; Brem, Sarah K.

    2009-01-01

    Ohlsson's proposal of resubsumption as the dominant process in conceptual, or nonmonotonic, change presents a worthy challenge to more established theories, such as Chi's theory of ontological shift. The two approaches differ primarily in that Ohlsson's theory emphasizes a process of learning in which narrower, more specific concepts are subsumed…

  13. Self-consistent field theory of tethered polymers: one dimensional, three dimensional, strong stretching theories and the effects of excluded-volume-only interactions.

    PubMed

    Suo, Tongchuan; Whitmore, Mark D

    2014-11-28

    We examine end-tethered polymers in good solvents, using one- and three-dimensional self-consistent field theory, and strong stretching theories. We also discuss different tethering scenarios, namely, mobile tethers, fixed but random ones, and fixed but ordered ones, and the effects and important limitations of including only binary interactions (excluded volume terms). We find that there is a "mushroom" regime in which the layer thickness is independent of the tethering density, σ, for systems with ordered tethers, but we argue that there is no such plateau for mobile or disordered anchors, nor is there one in the 1D theory. In the other limit of brushes, all approaches predict that the layer thickness scales linearly with N. However, the σ^(1/3) scaling is a result of keeping only excluded volume interactions: when the full potential is included, the dependence is faster and more complicated than σ^(1/3). In fact, there does not appear to be any regime in which the layer thickness scales in the combination Nσ^(1/3). We also compare the results for two different solvents with each other, and with earlier Θ solvent results.

  14. Theory of transformation thermal convection for creeping flow in porous media: Cloaking, concentrating, and camouflage

    NASA Astrophysics Data System (ADS)

    Dai, Gaole; Shang, Jin; Huang, Jiping

    2018-02-01

    Heat can transfer via thermal conduction, thermal radiation, and thermal convection. All the existing theories of transformation thermotics and optics can treat thermal conduction and thermal radiation, respectively. Unfortunately, thermal convection has seldom been touched in transformation theories due to the lack of a suitable theory, thus limiting applications associated with heat transfer through fluids (liquid or gas). Here, we develop a theory of transformation thermal convection by considering the convection-diffusion equation, the equation of continuity, and the Darcy law. By introducing porous media, we get a set of equations keeping their forms under coordinate transformation. As model applications, the theory helps to show the effects of cloaking, concentrating, and camouflage. Our finite-element simulations confirm the theoretical findings. This work offers a transformation theory for thermal convection, thus revealing novel behaviors associated with potential applications; it not only provides different hints on how to control heat transfer by combining thermal conduction, thermal convection, and thermal radiation, but also benefits mass diffusion and other related fields that contain a set of equations and need to transform velocities at the same time.

  15. Automating Access Control Logics in Simple Type Theory with LEO-II

    NASA Astrophysics Data System (ADS)

    Benzmüller, Christoph

    Garg and Abadi recently proved that prominent access control logics can be translated in a sound and complete way into modal logic S4. We have previously outlined how normal multimodal logics, including monomodal logics K and S4, can be embedded in simple type theory and we have demonstrated that the higher-order theorem prover LEO-II can automate reasoning in and about them. In this paper we combine these results and describe a sound (and complete) embedding of different access control logics in simple type theory. Employing this framework we show that the off-the-shelf theorem prover LEO-II can be applied to automate reasoning in and about prominent access control logics.

  16. The arcing rate for a High Voltage Solar Array - Theory, experiment and predictions

    NASA Technical Reports Server (NTRS)

    Hastings, Daniel E.; Cho, Mengu; Kuninaka, Hitoshi

    1992-01-01

    All solar arrays have biased surfaces that can be exposed to the space environment. It has been observed that when the array bias is more negative than a few hundred volts, the exposed conductive surfaces may undergo arcing in the space plasma. A theory for arcing on these high voltage solar arrays is developed which ascribes the arcing to electric field runaway at the interface of the plasma, the conductor, and the solar cell dielectric. Experiments were conducted in the laboratory for the High Voltage Solar Array (HVSA) experiment which will fly on the Japanese Space Flyer Unit (SFU) in 1994. The theory was compared in detail to the experiment and shown to give a reasonable explanation for the data. The combined theory and ground experiments were then used to develop predictions for the SFU flight.

  17. How to make thermodynamic perturbation theory to be suitable for low temperature?

    NASA Astrophysics Data System (ADS)

    Zhou, Shiqi

    2009-02-01

    Low-temperature unsuitability is a problem that has plagued thermodynamic perturbation theory (TPT) for years. The present investigation indicates that the low-temperature predicament can be overcome by employing as reference system a nonhard sphere potential which incorporates part of the attractive ingredient of the potential function of interest. In combination with a recently proposed TPT [S. Zhou, J. Chem. Phys. 125, 144518 (2006)] based on a λ expansion (λ being a coupling parameter), the new perturbation strategy is employed to make predictions for several model potentials. It is shown that the new perturbation strategy can very accurately predict various thermodynamic properties even if the potential range is extremely short and hence the temperature of interest is very low, where current theoretical formalisms seriously deteriorate or critically fail to predict even the existence of the critical point. Extensive comparison with existing liquid state theories and available computer simulation data discloses the superiority of the present TPT over two Ornstein-Zernike-type integral equation theories, i.e., the hierarchical reference theory and the self-consistent Ornstein-Zernike approximation.
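
    The λ expansion underlying TPT writes the free energy of the full system, U = U0 + λU1, as a cumulant series around the reference: βΔA = λβ⟨U1⟩0 − (λ²β²/2) Var0(U1) + … . The sketch below illustrates those first two terms for a toy one-dimensional system (harmonic reference plus a quartic perturbation) by sampling the reference distribution; it is only a generic illustration of the λ expansion, not the nonhard sphere reference strategy proposed in the paper, and all parameter values are hypothetical.

      import numpy as np

      rng = np.random.default_rng(2)

      # Toy 1D model: reference U0 = 0.5*k*x^2, perturbation U1 = eps*x^4, beta = 1/kT.
      k, eps, beta, lam = 1.0, 0.02, 1.0, 1.0

      # Sample the reference Boltzmann distribution exactly (a Gaussian for a harmonic well).
      x = rng.normal(scale=np.sqrt(1.0 / (beta * k)), size=200_000)
      u1 = eps * x ** 4

      # Cumulant (lambda) expansion of the excess free energy around the reference:
      #   beta*dA = lam*beta*<U1>_0 - 0.5*(lam*beta)**2 * Var_0(U1) + ...
      dA1 = lam * u1.mean()
      dA2 = dA1 - 0.5 * beta * lam ** 2 * u1.var()

      # "Exact" result for the toy model from the ratio of partition functions.
      grid = np.linspace(-10, 10, 20001)
      dx = grid[1] - grid[0]
      z0 = np.exp(-beta * 0.5 * k * grid ** 2).sum() * dx
      z = np.exp(-beta * (0.5 * k * grid ** 2 + lam * eps * grid ** 4)).sum() * dx
      exact = -np.log(z / z0) / beta

      print(f"first order: {dA1:.4f}  second order: {dA2:.4f}  exact: {exact:.4f}")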

  18. How to make thermodynamic perturbation theory to be suitable for low temperature?

    PubMed

    Zhou, Shiqi

    2009-02-07

    Low-temperature unsuitability is a problem that has plagued thermodynamic perturbation theory (TPT) for years. The present investigation indicates that the low-temperature predicament can be overcome by employing as reference system a nonhard sphere potential which incorporates part of the attractive ingredient of the potential function of interest. In combination with a recently proposed TPT [S. Zhou, J. Chem. Phys. 125, 144518 (2006)] based on a lambda expansion (lambda being a coupling parameter), the new perturbation strategy is employed to make predictions for several model potentials. It is shown that the new perturbation strategy can very accurately predict various thermodynamic properties even if the potential range is extremely short and hence the temperature of interest is very low, where current theoretical formalisms seriously deteriorate or critically fail to predict even the existence of the critical point. Extensive comparison with existing liquid state theories and available computer simulation data discloses the superiority of the present TPT over two Ornstein-Zernike-type integral equation theories, i.e., the hierarchical reference theory and the self-consistent Ornstein-Zernike approximation.

  19. Polyvagal Theory and developmental psychopathology: emotion dysregulation and conduct problems from preschool to adolescence.

    PubMed

    Beauchaine, Theodore P; Gatzke-Kopp, Lisa; Mead, Hilary K

    2007-02-01

    In science, theories lend coherence to vast amounts of descriptive information. However, current diagnostic approaches in psychopathology are primarily atheoretical, emphasizing description over etiological mechanisms. We describe the importance of Polyvagal Theory toward understanding the etiology of emotion dysregulation, a hallmark of psychopathology. When combined with theories of social reinforcement and motivation, Polyvagal Theory specifies etiological mechanisms through which distinct patterns of psychopathology emerge. In this paper, we summarize three studies evaluating autonomic nervous system functioning in children with conduct problems, ages 4-18. At all age ranges, these children exhibit attenuated sympathetic nervous system responses to reward, suggesting deficiencies in approach motivation. By middle school, this reward insensitivity is met with inadequate vagal modulation of cardiac output, suggesting additional deficiencies in emotion regulation. We propose a biosocial developmental model of conduct problems in which inherited impulsivity is amplified through social reinforcement of emotional lability. Implications for early intervention are discussed.

  20. Theory of L -edge spectroscopy of strongly correlated systems

    NASA Astrophysics Data System (ADS)

    Lüder, Johann; Schött, Johan; Brena, Barbara; Haverkort, Maurits W.; Thunström, Patrik; Eriksson, Olle; Sanyal, Biplab; Di Marco, Igor; Kvashnin, Yaroslav O.

    2017-12-01

    X-ray absorption spectroscopy measured at the L edge of transition metals (TMs) is a powerful element-selective tool providing direct information about the correlation effects in the 3d states. The theoretical modeling of the 2p → 3d excitation processes remains challenging for contemporary ab initio electronic structure techniques, due to strong core-hole and multiplet effects influencing the spectra. In this work, we present a realization of the method combining density-functional theory with multiplet ligand field theory, proposed in Haverkort et al. [Phys. Rev. B 85, 165113 (2012), 10.1103/PhysRevB.85.165113]. In this approach, a single-impurity Anderson model (SIAM) is constructed, with almost all parameters obtained from first principles, and then solved to obtain the spectra. In our implementation, we adopt the language of dynamical mean-field theory and utilize the local density of states and the hybridization function, projected onto the TM 3d states, in order to construct the SIAM. The developed computational scheme is applied to calculate the L-edge spectra for several TM monoxides. Very good agreement between theory and experiment is found for all studied systems. The effects of core-hole relaxation and hybridization discretization, possible extensions of the method, as well as its limitations are discussed.

  1. Informal Theory: The Ignored Link in Theory-to-Practice

    ERIC Educational Resources Information Center

    Love, Patrick

    2012-01-01

    Applying theory to practice in student affairs is dominated by the assumption that formal theory is directly applied to practice. Among the problems with this assumption is that many practitioners believe they must choose between their lived experiences and formal theory, and that graduate students are taught that their experience "does not…

  2. Extension of lattice cluster theory to strongly interacting, self-assembling polymeric systems.

    PubMed

    Freed, Karl F

    2009-02-14

    A new extension of the lattice cluster theory is developed to describe the influence of monomer structure and local correlations on the free energy of strongly interacting and self-assembling polymer systems. This extension combines a systematic high dimension (1/d) and high temperature expansion (that is appropriate for weakly interacting systems) with a direct treatment of strong interactions. The general theory is illustrated for a binary polymer blend whose two components contain "sticky" donor and acceptor groups, respectively. The free energy is determined as an explicit function of the donor-acceptor contact probabilities that depend, in turn, on the local structure and both the strong and weak interactions.

  3. Melting slope of MgO from molecular dynamics and density functional theory

    NASA Astrophysics Data System (ADS)

    Tangney, Paul; Scandolo, Sandro

    2009-09-01

    We combine density functional theory (DFT) with molecular dynamics simulations based on an accurate atomistic force field to calculate the pressure derivative of the melting temperature of magnesium oxide at ambient pressure—a quantity for which a serious disagreement between theory and experiment has existed for almost 15 years. We find reasonable agreement with previous DFT results and with a very recent experimental determination of the slope. We pay particular attention to areas of possible weakness in theoretical calculations and conclude that the long-standing discrepancy with experiment could only be explained by a dramatic failure of existing density functionals or by flaws in the original experiment.
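
    One standard route to the quantity discussed here, independent of the specific simulations in the paper, is the Clausius-Clapeyron relation dTm/dP = ΔVm/ΔSm = Tm ΔVm/ΔHm evaluated at coexistence. The snippet below simply evaluates that relation with illustrative, hypothetical numbers for the melting volume and enthalpy changes, to show the unit bookkeeping; it does not reproduce the paper's results.

      def melting_slope(T_m, dV_m, dH_m):
          """Clausius-Clapeyron slope dT_m/dP = T_m * dV_m / dH_m.

          T_m  : melting temperature [K]
          dV_m : volume change on melting [m^3 / mol]
          dH_m : enthalpy of fusion [J / mol]
          returns the slope in K / Pa
          """
          return T_m * dV_m / dH_m

      # Illustrative numbers only (assumed, not taken from the paper):
      T_m = 3100.0          # K, approximate ambient-pressure melting point of MgO
      dV_m = 1.0e-6         # m^3/mol, assumed volume change on melting
      dH_m = 77.0e3         # J/mol, assumed enthalpy of fusion
      slope = melting_slope(T_m, dV_m, dH_m)
      print(f"dTm/dP ≈ {slope * 1e9:.0f} K/GPa")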

  4. KI-Aikido for Handicapped Students at Leeward Community College: Theory and Practice.

    ERIC Educational Resources Information Center

    MacGugan, Kirk

    In an effort to provide physical education instruction for handicapped students, Leeward Community College implemented, on a pilot basis, a non-credit course in KI-Aikido, an oriental martial art which combines theory and exercise toward the goal of controlling the body through the power of the mind. The course, offered to both handicapped and…

  5. Thermal field theory and generalized light front quantization

    NASA Astrophysics Data System (ADS)

    Weldon, H. Arthur

    2003-04-01

    The dependence of thermal field theory on the surface of quantization and on the velocity of the heat bath is investigated by working in general coordinates that are arbitrary linear combinations of the Minkowski coordinates. In the general coordinates the metric tensor ḡ_μν is nondiagonal. The Kubo-Martin-Schwinger condition requires periodicity in thermal correlation functions when the temporal variable changes by an amount -i/(T ḡ_00). Light-front quantization fails since ḡ_00 = 0; however, various related quantizations are possible.

  6. A re-examination of the biphasic theory of skeletal muscle growth.

    PubMed Central

    Levine, A S; Hegarty, P V

    1977-01-01

    Because of the importance of fibre diameter measurements it was decided to re-evaluate the biphasic theory of skeletal muscle growth and development. This theory proposes an initial monophasic distribution of muscle fibres which changes to a biphasic distribution during development. The theory is based on observations made on certain muscles in mice, where two distinct populations of fibre diameters (20 and 40 μm) contribute to the biphasic distribution. In the present investigation, cross sections of frozen biceps brachii of mice in rigor mortis were examined. The rigor state was used to avoid complications produced by thaw-rigor contraction. The diameters of the outermost and innermost fibres were found to be significantly different. However, if the outer and inner fibres were combined to form one group, no significant difference between this group and other random groups was found. The distributions of all groups were monophasic. The diameters of isolated fibres from mice and rats also displayed a monophasic distribution. This evidence leads to the conclusion that the biphasic theory of muscle growth is untenable. Some of the variables which may occur in fibre size and shape are discussed. PMID:858691

  7. [Neurosis and genetic theory of etiology and pathogenesis of ulcer disease].

    PubMed

    Kolotilova, M L; Ivanov, L N

    2014-01-01

    Based on an analysis of the literature and our own research, we have developed an original concept of the etiology and pathogenesis of peptic ulcer disease. The analysis of the literature shows that none of the existing theories of the pathogenesis of peptic ulcer disease covers the full diversity of the involved functions and their shifts which lead to the development of ulcers in the stomach and the duodenum. Our neurogenic-genetic theory of the etiology and pathogenesis of gastric and duodenal ulcer best explains the cause-and-effect relationships in patients with peptic ulcer, allowing for the predominance, in one case or another, of neurotic factors or genetic factors. It is clear, however, that only the combination of the neurogenic factor with a genetically modified reactivity of the gastroduodenal system (the presence of a target organ) causes the chronicity of the ulcers. Relating peptic ulcer disease to psychosomatic pathologies allows us to develop effective treatment schemes, including drugs with psychocorrective action. On the basis of our theory, the role of Helicobacter pylori infection is treated as a pathogenetic factor in the development of peptic ulcer disease.

  8. Drawing Out Theory: Art and the Teaching of Political Theory.

    ERIC Educational Resources Information Center

    Miller, Char R.

    2000-01-01

    Discusses how to use art in introductory political theory courses. Provides examples of incorporating art to teach political theory, such as examining Machiavelli's "The Prince" and Michelangelo's "David" to understand Florentine (Florence, Italy) political theory. (CMK)

  9. Beyond naïve cue combination: salience and social cues in early word learning.

    PubMed

    Yurovsky, Daniel; Frank, Michael C

    2017-03-01

    Children learn their earliest words through social interaction, but it is unknown how much they rely on social information. Some theories argue that word learning is fundamentally social from its outset, with even the youngest infants understanding intentions and using them to infer a social partner's target of reference. In contrast, other theories argue that early word learning is largely a perceptual process in which young children map words onto salient objects. One way of unifying these accounts is to model word learning as weighted cue combination, in which children attend to many potential cues to reference, but only gradually learn the correct weight to assign each cue. We tested four predictions of this kind of naïve cue combination account, using an eye-tracking paradigm that combines social word teaching and two-alternative forced-choice testing. None of the predictions were supported. We thus propose an alternative unifying account: children are sensitive to social information early, but their ability to gather and deploy this information is constrained by domain-general cognitive processes. Developmental changes in children's use of social cues emerge not from learning the predictive power of social cues, but from the gradual development of attention, memory, and speed of information processing. © 2015 John Wiley & Sons Ltd.
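
    As a point of comparison, the naive cue-combination account the authors test can be caricatured as a weighted vote over candidate referents, with cue weights learned gradually. The sketch below shows that caricature with hypothetical cue names (salience and gaze) and arbitrary weights; it illustrates the account under discussion, not the authors' eye-tracking analysis.

      import numpy as np

      def combine_cues(cue_scores, weights):
          """Weighted cue combination: P(referent) ∝ Σ_c w_c * score_c(referent)."""
          total = sum(w * np.asarray(s, dtype=float) for s, w in zip(cue_scores, weights))
          return total / total.sum()

      # Two candidate objects; hypothetical cue scores in [0, 1].
      salience = [0.9, 0.1]    # object 0 is perceptually salient
      gaze     = [0.2, 0.8]    # the speaker looks at object 1

      # A "young learner" weighting salience heavily vs. an "older learner" trusting gaze.
      print(combine_cues([salience, gaze], weights=[0.8, 0.2]))   # -> favours object 0
      print(combine_cues([salience, gaze], weights=[0.2, 0.8]))   # -> favours object 1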

  10. Beyond Naïve Cue Combination: Salience and Social Cues in Early Word Learning

    PubMed Central

    Yurovsky, Daniel

    2015-01-01

    Children learn their earliest words through social interaction, but it is unknown how much they rely on social information. Some theories argue that word learning is fundamentally social from its outset, with even the youngest infants understanding intentions and using them to infer a social partner’s target of reference. In contrast, other theories argue that early word learning is largely a perceptual process in which young children map words onto salient objects. One way of unifying these accounts is to model word learning as weighted cue-combination, in which children attend to many potential cues to reference, but only gradually learn the correct weight to assign each cue. We tested four predictions of this kind of naïve cue-combination account, using an eye-tracking paradigm that combines social word-teaching and two-alternative forced-choice testing. None of the predictions were supported. We thus propose an alternative unifying account: children are sensitive to social information early, but their ability to gather and deploy this information is constrained by domain-general cognitive processes. Developmental changes in children’s use of social cues emerge not from learning the predictive power of social cues, but from the gradual development of attention, memory, and speed of information processing. PMID:26575408

  11. The ethics of wildlife research: a nine R theory.

    PubMed

    Curzer, Howard J; Wallace, Mark C; Perry, Gad; Muhlberger, Peter J; Perry, Dan

    2013-01-01

    The commonsense ethical constraints on laboratory animal research known as the three Rs are widely accepted, but no constraints tailored to research on animals in the wild are available. In this article, we begin to fill that gap. We sketch a set of commonsense ethical constraints on ecosystem research parallel to the constraints that govern laboratory animal research. Then we combine the animal and ecosystem constraints into a single theory to govern research on animals in the wild.

  12. Theories and Modes

    ERIC Educational Resources Information Center

    Apsche, Jack A.

    2005-01-01

    In his work on the Theory of Modes, Beck (1996) suggested that there were flaws with his cognitive theory. He suggested that though there are shortcomings to his cognitive theory, there were not similar shortcomings to the practice of Cognitive Therapy. The author suggests that if there are shortcomings to cognitive theory the same shortcomings…

  13. Learning Theories, Career Development Theories, and Their Applications at Two-Year Colleges.

    ERIC Educational Resources Information Center

    Haag-Mutter, Priscilla

    Trait-factor theory, developmental/self-concept theory, personality theory, and behavioral theory are some of the major theories of career development. The first three (trait-factor, developmental/self-concept, and personality) have ties to the gestalt school because of the emphasis on the individual's relationship to the environment. Anne Roe's…

  14. [Consanguinity between meridian theory and Bianque's pulse theory].

    PubMed

    Huang, Longxiang

    2015-05-01

    The integral meridian theory is composed of five parts: meridian course, syndrome, diagnostic method, treating principle, and treatment, and its core is the meridian syndrome. Multiple lines of evidence show that the meridian syndrome induced by pathological change in a meridian, and the death syndrome of the pulse penetrating or attaching to the syndrome, both originate from Bianque's facial color and pulse diagnosis. Regarding the pulse syndrome, there were many different interpretations based on the theory of yin-yang in the four seasons before the Han Dynasty. The emergence of the Biaoben diagnostic method within Bianque's pulse method, and its extensive clinical application, promoted a new theoretical interpretation in which the connection of meridians interprets the pulse syndrome directly. Moreover, along with the new development of the blood-pulse theory of Bianque's medicine, meridian theory was likewise revolutionized, its theoretical paradigm turning from a "tree" type to a "ring" type. In other words, Bianque's medicine not only gave birth to meridian theory but also determined its final development.

  15. Theory of Ion and Water Transport in Reverse-Osmosis Membranes

    NASA Astrophysics Data System (ADS)

    Oren, Y. S.; Biesheuvel, P. M.

    2018-02-01

    We present a theory for ion and water transport through reverse-osmosis (RO) membranes based on a Maxwell-Stefan framework combined with hydrodynamic theory for the reduced motion of particles in thin pores. We take into account all driving forces and frictions both on the fluid (water) and on the ions, including ion-fluid friction and ion-wall friction. By including the acid-base characteristics of the carbonic acid system, the boric acid system, H3O+/OH-, and the membrane charge, we locally determine the pH, the effective charge of the membrane, and the dissociation degree of carbonic acid and boric acid. We present calculation results for an experiment with fixed feed concentration, where effluent composition is a self-consistent function of fluxes through the membrane. A comparison with experimental results from the literature for fluid flow vs pressure, and for salt and boron rejection, shows that our theory agrees very well with the available data. Our model is based on realistic assumptions for the effective size of the ions and makes use of a typical pore size of a commercial RO membrane.

  16. Graph Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanfilippo, Antonio P.

    2005-12-27

    Graph theory is a branch of discrete combinatorial mathematics that studies the properties of graphs. The theory was pioneered by the Swiss mathematician Leonhard Euler in the 18th century, commenced its formal development during the second half of the 19th century, and has witnessed substantial growth during the last seventy years, with applications in areas as diverse as engineering, computer science, physics, sociology, chemistry and biology. Graph theory has also had a strong impact in computational linguistics by providing the foundations for the theory of feature structures that has emerged as one of the most widely used frameworks for the representation of grammar formalisms.

  17. New fundamental evidence of non-classical structure in the combination of natural concepts.

    PubMed

    Aerts, D; Sozzo, S; Veloz, T

    2016-01-13

    We recently performed cognitive experiments on conjunctions and negations of two concepts with the aim of investigating the combination problem of concepts. Our experiments confirmed the deviations (conceptual vagueness, underextension, overextension etc.) from the rules of classical (fuzzy) logic and probability theory observed by several scholars in concept theory, while our data were successfully modelled in a quantum-theoretic framework developed by ourselves. In this paper, we isolate a new, very stable and systematic pattern of violation of classicality that occurs in concept combinations. In addition, the strength and regularity of this non-classical effect leads us to believe that it occurs at a more fundamental level than the deviations observed up to now. It is our opinion that we have identified a deep non-classical mechanism determining not only how concepts are combined but, rather, how they are formed. We show that this effect can be faithfully modelled in a two-sector Fock space structure, and that it can be exactly explained by assuming that human thought is the superposition of two processes, a 'logical reasoning', guided by 'logic', and a 'conceptual reasoning', guided by 'emergence', and that the latter generally prevails over the former. All these findings provide new fundamental support to our quantum-theoretic approach to human cognition. © 2015 The Author(s).

  18. The Q theory of investment, the capital asset pricing model, and asset valuation: a synthesis.

    PubMed

    McDonald, John F

    2004-05-01

    The paper combines Tobin's Q theory of real investment with the capital asset pricing model to produce a new and relatively simple procedure for the valuation of real assets using the income approach. Applications of the new method are provided.
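
    In outline, such a synthesis works by (i) pricing the asset's expected income with a CAPM-derived discount rate and (ii) comparing the resulting value with replacement cost via Tobin's q. The sketch below shows that two-step logic with hypothetical numbers and a level-perpetuity income stream; it is an illustration of the general idea, not the paper's actual valuation procedure.

      def capm_rate(risk_free, beta, market_return):
          """CAPM required return: r = r_f + beta * (E[r_m] - r_f)."""
          return risk_free + beta * (market_return - risk_free)

      def asset_value_perpetuity(expected_income, discount_rate):
          """Income approach with a level perpetuity: V = income / r."""
          return expected_income / discount_rate

      # Hypothetical inputs.
      r = capm_rate(risk_free=0.03, beta=1.2, market_return=0.08)   # 9% required return
      value = asset_value_perpetuity(expected_income=90_000.0, discount_rate=r)
      replacement_cost = 850_000.0
      q = value / replacement_cost                                  # Tobin's q

      print(f"required return {r:.1%}, value {value:,.0f}, q = {q:.2f}")
      print("invest (q > 1)" if q > 1 else "do not invest (q <= 1)")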

  19. Organizational Diagnosis: Six Places To Look for Trouble With or Without a Theory

    ERIC Educational Resources Information Center

    Weisbord, Marvin R.

    1976-01-01

    This article presents a practice theory for diagnosing organizations--that is, a combination of many ideas in a relatively simple framework that can be applied in various settings. It brings together organization/environment, sociotechnical, and formal/informal systems concepts, and proposes six broad categories for looking at an organization.…

  20. Teaching About Theory-Laden Observation to Secondary Students Through Manipulated Lab Inquiry Experience

    NASA Astrophysics Data System (ADS)

    Lau, Kwok-chi; Chan, Shi-lun

    2013-10-01

    This study seeks to develop and evaluate a modified lab inquiry approach to teaching about nature of science (NOS) to secondary students. Different from the extended, open-ended inquiry, this approach makes use of shorter lab inquiry activities in which one or several specific NOS aspects are manipulated deliberately so that students are compelled to experience and then reflect on these NOS aspects. In this study, to let students experience theory-laden observation, they were provided with different "theories" in order to bias their observations in the lab inquiry. Then, in the post-lab discussion, the teacher guided students to reflect on their own experience and explicitly taught about theory-ladenness. This study employs a quasi-experimental pretest-posttest design using the historical approach as the control group. The results show that the manipulated lab inquiry approach was much more effective than the historical approach in fostering students' theory-laden views, and it was even more effective when the two approaches were combined. Besides, the study also sought to examine the practical epistemological beliefs of students concerning theory-ladenness, but limited evidence could be found.

  1. Gravitational waves in Einstein-æther and generalized TeVeS theory after GW170817

    NASA Astrophysics Data System (ADS)

    Gong, Yungui; Hou, Shaoqi; Liang, Dicong; Papantonopoulos, Eleftherios

    2018-04-01

    In this work we discuss the polarization contents of Einstein-æther theory and the generalized tensor-vector-scalar (TeVeS) theory, as both theories have a normalized timelike vector field. We derive the linearized equations of motion around the flat spacetime background using the gauge-invariant variables to easily separate physical degrees of freedom. We find the plane wave solutions and identify the polarizations by examining the geodesic deviation equations. We find that there are five polarizations in Einstein-æther theory and six polarizations in the generalized TeVeS theory. In particular, the transverse breathing mode is mixed with the pure longitudinal mode. We also discuss the experimental tests of the extra polarizations in Einstein-æther theory using pulsar timing arrays combined with the gravitational-wave speed bound derived from the observations on GW 170817 and GRB 170817A. It turns out that it might be difficult to use pulsar timing arrays to distinguish different polarizations in Einstein-æther theory. The same speed bound also forces one of the propagating modes in the generalized TeVeS theory to travel much faster than the speed of light. Since the strong coupling problem does not exist in some parameter subspaces, the generalized TeVeS theory is excluded in these parameter subspaces.

  2. Information theory applications for biological sequence analysis.

    PubMed

    Vinga, Susana

    2014-05-01

    Information theory (IT) addresses the analysis of communication systems and has been widely applied in molecular biology. In particular, alignment-free sequence analysis and comparison greatly benefited from concepts derived from IT, such as entropy and mutual information. This review covers several aspects of IT applications, ranging from genome global analysis and comparison, including block-entropy estimation and resolution-free metrics based on iterative maps, to local analysis, comprising the classification of motifs, prediction of transcription factor binding sites and sequence characterization based on linguistic complexity and entropic profiles. IT has also been applied to high-level correlations that combine DNA, RNA or protein features with sequence-independent properties, such as gene mapping and phenotype analysis, and has also provided models based on communication systems theory to describe information transmission channels at the cell level and also during evolutionary processes. While not exhaustive, this review attempts to categorize existing methods and to indicate their relation with broader transversal topics such as genomic signatures, data compression and complexity, time series analysis and phylogenetic classification, providing a resource for future developments in this promising area.
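
    As a small illustration of one of the IT quantities mentioned (block entropy), the sketch below computes the Shannon entropy of the k-mer (block) distribution of a nucleotide sequence. The sequences and block lengths are hypothetical, and the normalization and finite-sample corrections used in real alignment-free methods are omitted.

      from collections import Counter
      from math import log2

      def block_entropy(seq, k):
          """Shannon entropy (bits) of the distribution of overlapping k-mers in seq."""
          counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
          total = sum(counts.values())
          return -sum((n / total) * log2(n / total) for n in counts.values())

      # Hypothetical sequences: a repetitive one and a more mixed one.
      seq_repeat = "ATATATATATATATATATAT"
      seq_mixed = "ATGCGTACGTTAGCCGATCA"
      for k in (1, 2, 3):
          print(k, round(block_entropy(seq_repeat, k), 3), round(block_entropy(seq_mixed, k), 3))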

  3. Design of crusher liner based on time - varying uncertainty theory

    NASA Astrophysics Data System (ADS)

    Tang, J. C.; Shi, B. Q.; Yu, H. J.; Wang, R. J.; Zhang, W. Y.

    2017-09-01

    This article puts forward a time-dependent design method for the liner that accounts for load fluctuation, based on time-varying uncertainty theory. In this method, a time-varying uncertainty design model of the liner is constructed by introducing parameters that affect the wear rate, the volatility, and the drift rate. From a design example, the time-varying design outline of the moving-cone liner is obtained. Based on the theory of minimum wear, the gap curve of the wear-resistant cavity is designed, and the optimized cavity is obtained by combining the thickness of the cone with the cavity gap. Taking the PYGB1821 multi-cylinder hydraulic cone crusher as an example, it is shown that the service life of the new liner is improved by more than 14.3%.

  4. Auditory processing theories of language disorders: past, present, and future.

    PubMed

    Miller, Carol A

    2011-07-01

    The purpose of this article is to provide information that will assist readers in understanding and interpreting research literature on the role of auditory processing in communication disorders. A narrative review was used to summarize and synthesize the literature on auditory processing deficits in children with auditory processing disorder (APD), specific language impairment (SLI), and dyslexia. The history of auditory processing theories of these 3 disorders is described, points of convergence and controversy within and among the different branches of research literature are considered, and the influence of research on practice is discussed. The theoretical and clinical contributions of neurophysiological methods are also reviewed, and suggested approaches for critical reading of the research literature are provided. Research on the role of auditory processing in communication disorders springs from a variety of theoretical perspectives and assumptions, and this variety, combined with controversies over the interpretation of research results, makes it difficult to draw clinical implications from the literature. Neurophysiological research methods are a promising route to better understanding of auditory processing. Progress in theory development and its clinical application is most likely to be made when researchers from different disciplines and theoretical perspectives communicate clearly and combine the strengths of their approaches.

  5. Between Academic Theory and Folk Wisdom: Local Discourse on Differential Educational Attainment in Fiji.

    ERIC Educational Resources Information Center

    White, Carmen M.

    2001-01-01

    In the multiethnic South Pacific nation of Fiji--a former British colony--the impact of Western theoretical hegemony on educational discourse is evident. Results of extensive fieldwork show how themes of achievement motivation, differential valuation of education, and cultural deficit theory combine with surviving colonial discourse and…

  6. Management Design Theories

    NASA Astrophysics Data System (ADS)

    Pries-Heje, Jan; Baskerville, Richard L.

    This paper elaborates a design science approach for management planning anchored to the concept of a management design theory. Unlike the notions of design theories arising from information systems, management design theories can appear as a system of technological rules, much as a system of hypotheses or propositions can embody scientific theories. The paper illustrates this form of management design theories with three grounded cases. These grounded cases include a software process improvement study, a user involvement study, and an organizational change study. Collectively these studies demonstrate how design theories founded on technological rules can not only improve the design of information systems, but that these concepts have great practical value for improving the framing of strategic organizational design decisions about such systems. Each case is either grounded in an empirical sense, that is to say, actual practice, or it is grounded in practices described extensively in the practical literature. Such design theories will help managers more easily approach complex, strategic decisions.

  7. Simulations of nanocrystals under pressure: combining electronic enthalpy and linear-scaling density-functional theory.

    PubMed

    Corsini, Niccolò R C; Greco, Andrea; Hine, Nicholas D M; Molteni, Carla; Haynes, Peter D

    2013-08-28

    We present an implementation in a linear-scaling density-functional theory code of an electronic enthalpy method, which has been found to be natural and efficient for the ab initio calculation of finite systems under hydrostatic pressure. Based on a definition of the system volume as that enclosed within an electronic density isosurface [M. Cococcioni, F. Mauri, G. Ceder, and N. Marzari, Phys. Rev. Lett. 94, 145501 (2005)], it supports both geometry optimizations and molecular dynamics simulations. We introduce an approach for calibrating the parameters defining the volume in the context of geometry optimizations and discuss their significance. Results in good agreement with simulations using explicit solvents are obtained, validating our approach. Size-dependent pressure-induced structural transformations and variations in the energy gap of hydrogenated silicon nanocrystals are investigated, including one comparable in size to recent experiments. A detailed analysis of the polyamorphic transformations reveals three types of amorphous structures and their persistence on depressurization is assessed.
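
    The central bookkeeping in an electronic enthalpy calculation is the replacement of the energy by E + P·V[n], where V[n] is the volume enclosed by an isosurface of the electronic density n(r). The sketch below shows a naive way to estimate such a volume from a density sampled on a grid, by counting voxels above the isovalue; the Gaussian "density", isovalue, and pressure are hypothetical, and the smoothed isosurface definition and self-consistency of the actual method are not reproduced.

      import numpy as np

      def isosurface_volume(density, voxel_volume, isovalue):
          """Volume enclosed by the density isosurface, estimated by voxel counting."""
          return np.count_nonzero(density > isovalue) * voxel_volume

      def electronic_enthalpy(energy, pressure, volume):
          """Electronic enthalpy functional value: H = E + P * V."""
          return energy + pressure * volume

      # Hypothetical density on a cubic grid (atomic-like Gaussian blob), atomic units.
      n_pts, box = 80, 20.0                        # grid points per side, box length (bohr)
      h = box / n_pts
      axis = np.linspace(-box / 2, box / 2, n_pts)
      X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
      density = np.exp(-(X**2 + Y**2 + Z**2) / 4.0)

      V = isosurface_volume(density, voxel_volume=h**3, isovalue=0.05)
      print("enclosed volume (bohr^3):", round(V, 1))
      print("enthalpy at P = 1e-3 a.u. with assumed E = -10 Ha:", electronic_enthalpy(-10.0, 1e-3, V))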

  8. Simulations of nanocrystals under pressure: Combining electronic enthalpy and linear-scaling density-functional theory

    NASA Astrophysics Data System (ADS)

    Corsini, Niccolò R. C.; Greco, Andrea; Hine, Nicholas D. M.; Molteni, Carla; Haynes, Peter D.

    2013-08-01

    We present an implementation in a linear-scaling density-functional theory code of an electronic enthalpy method, which has been found to be natural and efficient for the ab initio calculation of finite systems under hydrostatic pressure. Based on a definition of the system volume as that enclosed within an electronic density isosurface [M. Cococcioni, F. Mauri, G. Ceder, and N. Marzari, Phys. Rev. Lett. 94, 145501 (2005)], 10.1103/PhysRevLett.94.145501, it supports both geometry optimizations and molecular dynamics simulations. We introduce an approach for calibrating the parameters defining the volume in the context of geometry optimizations and discuss their significance. Results in good agreement with simulations using explicit solvents are obtained, validating our approach. Size-dependent pressure-induced structural transformations and variations in the energy gap of hydrogenated silicon nanocrystals are investigated, including one comparable in size to recent experiments. A detailed analysis of the polyamorphic transformations reveals three types of amorphous structures and their persistence on depressurization is assessed.

  9. Constructor theory of probability

    PubMed Central

    2016-01-01

    Unitary quantum theory, having no Born Rule, is non-probabilistic. Hence the notorious problem of reconciling it with the unpredictability and appearance of stochasticity in quantum measurements. Generalizing and improving upon the so-called ‘decision-theoretic approach’, I shall recast that problem in the recently proposed constructor theory of information—where quantum theory is represented as one of a class of superinformation theories, which are local, non-probabilistic theories conforming to certain constructor-theoretic conditions. I prove that the unpredictability of measurement outcomes (to which constructor theory gives an exact meaning) necessarily arises in superinformation theories. Then I explain how the appearance of stochasticity in (finitely many) repeated measurements can arise under superinformation theories. And I establish sufficient conditions for a superinformation theory to inform decisions (made under it) as if it were probabilistic, via a Deutsch–Wallace-type argument—thus defining a class of decision-supporting superinformation theories. This broadens the domain of applicability of that argument to cover constructor-theory compliant theories. In addition, in this version some of the argument's assumptions, previously construed as merely decision-theoretic, follow from physical properties expressed by constructor-theoretic principles. PMID:27616914

  10. Polyvagal Theory and Developmental Psychopathology: Emotion Dysregulation and Conduct Problems from Preschool to Adolescence

    PubMed Central

    Beauchaine, Theodore P.; Gatzke-Kopp, Lisa; Mead, Hilary K.

    2007-01-01

    In science, theories lend coherence to vast amounts of descriptive information. However, current diagnostic approaches in psychopathology are primarily atheoretical, emphasizing description over etiological mechanisms. We describe the importance of Polyvagal Theory toward understanding the etiology of emotion dysregulation, a hallmark of psychopathology. When combined with theories of social reinforcement and motivation, Polyvagal Theory specifies etiological mechanisms through which distinct patterns of psychopathology emerge. In this paper, we summarize three studies evaluating autonomic nervous system functioning in children with conduct problems, ages 4-18. At all age ranges, these children exhibit attenuated sympathetic nervous system responses to reward, suggesting deficiencies in approach motivation. By middle school, this reward insensitivity is met with inadequate vagal modulation of cardiac output, suggesting additional deficiencies in emotion regulation. We propose a biosocial developmental model of conduct problems in which inherited impulsivity is amplified through social reinforcement of emotional lability. Implications for early intervention are discussed. PMID:17045726

  11. A General Theory for the Fusion of Data.

    DTIC Science & Technology

    1987-11-01

    [Abstract not recoverable from the source scan; surviving fragments mention the problem of modeling the real world, a basic internodal analysis, and rather complicated expressions for combining conditional objects.]

  12. Testing gravity with EG: mapping theory onto observations

    NASA Astrophysics Data System (ADS)

    Leonard, C. Danielle; Ferreira, Pedro G.; Heymans, Catherine

    2015-12-01

    We present a complete derivation of the observationally motivated definition of the modified gravity statistic EG. Using this expression, we investigate how variations to theory and survey parameters may introduce uncertainty in the general relativistic prediction of EG. We forecast errors on EG for measurements using two combinations of upcoming surveys, and find that theoretical uncertainties may dominate for a futuristic measurement. Finally, we compute predictions of EG under modifications to general relativity in the quasistatic regime, and comment on the pros and cons of using EG to test gravity with future surveys.
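
    For orientation, the general relativistic prediction that such measurements are compared against is usually written as EG(z) = Ωm,0 / f(z), with f the linear growth rate, often approximated in GR by f(z) ≈ Ωm(z)^0.55. The sketch below evaluates that prediction for a flat ΛCDM background with hypothetical parameter values; it does not reproduce the paper's full observationally motivated definition or its error forecasts.

      def omega_m(z, omega_m0):
          """Matter density parameter at redshift z for flat LambdaCDM."""
          a = 1.0 / (1.0 + z)
          return omega_m0 / (omega_m0 + (1.0 - omega_m0) * a ** 3)

      def eg_gr_prediction(z, omega_m0=0.3, gamma=0.55):
          """GR prediction E_G(z) = Omega_m0 / f(z), with f ≈ Omega_m(z)**gamma."""
          f = omega_m(z, omega_m0) ** gamma
          return omega_m0 / f

      # Hypothetical cosmological parameters and redshifts.
      for z in (0.3, 0.6, 1.0):
          print(z, round(eg_gr_prediction(z), 3))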

  13. N-string vertices in string field theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordes, J.; Abdurrahman, A.; Anton, F.

    1994-03-15

    We give the general form of the vertex corresponding to the interaction of an arbitrary number of strings. The technique employed relies on the "comma" representation of string field theory, where string fields and interactions are represented as matrices and operations between them, such as multiplication and trace. The general formulation presented here shows that the interaction vertex of N strings, for any arbitrary N, is given as a function of particular combinations of matrices corresponding to the change of representation between the full-string and the half-string degrees of freedom.
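
    Schematically, in the matrix language of the comma representation the star product becomes matrix multiplication and integration becomes a trace, so an N-string coupling takes the form (a hedged illustration only, with Φ^(i) denoting the half-string matrix form of the i-th string field; the precise change-of-representation matrices are the content of the paper itself)

        V_N\bigl(\Phi^{(1)},\ldots,\Phi^{(N)}\bigr) \sim \mathrm{Tr}\bigl(\Phi^{(1)}\,\Phi^{(2)}\cdots\Phi^{(N)}\bigr).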

  14. Combining Vision with Voice: A Learning and Implementation Structure Promoting Teachers' Internalization of Practices Based on Self-Determination Theory

    ERIC Educational Resources Information Center

    Assor, Avi; Kaplan, Haya; Feinberg, Ofra; Tal, Karen

    2009-01-01

    We propose that self-determination theory's conceptualization of internalization may help school reformers overcome the recurrent problem of "the predictable failure of educational reform" (Sarason, 1993). Accordingly, we present a detailed learning and implementation structure to promote teachers' internalization and application of ideas and…

  15. Trichotomous Processes in Early Memory Development, Aging, and Neurocognitive Impairment: A Unified Theory

    ERIC Educational Resources Information Center

    Brainerd, C. J.; Reyna, V. F.; Howe, M. L.

    2009-01-01

    One of the most extensively investigated topics in the adult memory literature, dual memory processes, has had virtually no impact on the study of early memory development. The authors remove the key obstacles to such research by formulating a trichotomous theory of recall that combines the traditional dual processes of recollection and…

  16. Undergraduate healthcare ethics education, moral resilience, and the role of ethical theories.

    PubMed

    Monteverde, Settimio

    2014-06-01

    This article combines foundational and empirical aspects of healthcare education and develops a framework for teaching ethical theories inspired by pragmatist learning theory and recent work on the concept of moral resilience. It describes an exemplary implementation and presents data from student evaluation. After a pilot implementation in a regular ethics module, the feasibility and acceptance of the novel framework by students were evaluated. In addition to the regular online module evaluation, specific questions referring to the teaching of ethical theories were added using simple (yes/no) and Likert rating answer formats. At the Bern University of Applied Sciences, a total of 93 students from 2 parallel sub-cohorts of the bachelor's program in nursing science were sent the online survey link after having been exposed to the same modular contents. A total of 62% of all students participated in the survey. The survey was voluntary and anonymous. Students were free to write their name and additional comments. Students consider ethical theories-as taught within the proposed framework-as practically applicable, useful, and transferable into practice. Teaching ethical theories within the proposed framework overcomes the shortcomings described by current research. Students do not consider the mutually exclusive character of ethical theories as an insurmountable problem. The proposed framework is likely to promote the effectiveness of healthcare ethics education. Inspired by pragmatist learning theory, it enables students to consider ethical theories as educative playgrounds that help them to "frame" and "name" the ethical issues they encounter in daily practice, which is seen as an expression of moral resilience. Since it does not advocate a single ethical theory, but is open to the diversity of traditions that shape ethical thinking, it promotes a culturally sensitive, ethically reflected healthcare practice. © The Author(s) 2013.

  17. Explaining Michigan: Developing an Ex Post Theory of a Quality Improvement Program

    PubMed Central

    Dixon-Woods, Mary; Bosk, Charles L; Aveling, Emma Louise; Goeschel, Christine A; Pronovost, Peter J

    2011-01-01

    Context: Understanding how and why programs work—not simply whether they work—is crucial. Good theory is indispensable to advancing the science of improvement. We argue for the usefulness of ex post theorization of programs. Methods: We propose an approach, located within the broad family of theory-oriented methods, for developing ex post theories of interventional programs. We use this approach to develop an ex post theory of the Michigan Intensive Care Unit (ICU) project, which attracted international attention by successfully reducing rates of central venous catheter bloodstream infections (CVC-BSIs). The procedure used to develop the ex post theory was (1) identify program leaders’ initial theory of change and learning from running the program; (2) enhance this with new information in the form of theoretical contributions from social scientists; (3) synthesize prior and new information to produce an updated theory. Findings: The Michigan project achieved its effects by (1) generating isomorphic pressures for ICUs to join the program and conform to its requirements; (2) creating a densely networked community with strong horizontal links that exerted normative pressures on members; (3) reframing CVC-BSIs as a social problem and addressing it through a professional movement combining “grassroots” features with a vertically integrating program structure; (4) using several interventions that functioned in different ways to shape a culture of commitment to doing better in practice; (5) harnessing data on infection rates as a disciplinary force; and (6) using “hard edges.” Conclusions: Updating program theory in the light of experience from program implementation is essential to improving programs’ generalizability and transferability, although it is not a substitute for concurrent evaluative fieldwork. Future iterations of programs based on the Michigan project, and improvement science more generally, may benefit from the updated theory presented here.

  18. Explaining Michigan: developing an ex post theory of a quality improvement program.

    PubMed

    Dixon-Woods, Mary; Bosk, Charles L; Aveling, Emma Louise; Goeschel, Christine A; Pronovost, Peter J

    2011-06-01

    Understanding how and why programs work-not simply whether they work-is crucial. Good theory is indispensable to advancing the science of improvement. We argue for the usefulness of ex post theorization of programs. We propose an approach, located within the broad family of theory-oriented methods, for developing ex post theories of interventional programs. We use this approach to develop an ex post theory of the Michigan Intensive Care Unit (ICU) project, which attracted international attention by successfully reducing rates of central venous catheter bloodstream infections (CVC-BSIs). The procedure used to develop the ex post theory was (1) identify program leaders' initial theory of change and learning from running the program; (2) enhance this with new information in the form of theoretical contributions from social scientists; (3) synthesize prior and new information to produce an updated theory. The Michigan project achieved its effects by (1) generating isomorphic pressures for ICUs to join the program and conform to its requirements; (2) creating a densely networked community with strong horizontal links that exerted normative pressures on members; (3) reframing CVC-BSIs as a social problem and addressing it through a professional movement combining "grassroots" features with a vertically integrating program structure; (4) using several interventions that functioned in different ways to shape a culture of commitment to doing better in practice; (5) harnessing data on infection rates as a disciplinary force; and (6) using "hard edges." Updating program theory in the light of experience from program implementation is essential to improving programs' generalizability and transferability, although it is not a substitute for concurrent evaluative fieldwork. Future iterations of programs based on the Michigan project, and improvement science more generally, may benefit from the updated theory presented here. © 2011 Milbank Memorial Fund. Published by Wiley

  19. Generalized probability theories: what determines the structure of quantum theory?

    NASA Astrophysics Data System (ADS)

    Janotta, Peter; Hinrichsen, Haye

    2014-08-01

    The framework of generalized probabilistic theories is a powerful tool for studying the foundations of quantum physics. It provides the basis for a variety of recent findings that significantly improve our understanding of the rich physical structure of quantum theory. This review paper tries to present the framework and recent results to a broader readership in an accessible manner. To achieve this, we follow a constructive approach. Starting from a few basic physically motivated assumptions we show how a given set of observations can be manifested in an operational theory. Furthermore, we characterize consistency conditions limiting the range of possible extensions. In this framework classical and quantum theory appear as special cases, and the aim is to understand what distinguishes quantum mechanics as the fundamental theory realized in nature. It turns out that non-classical features of single systems can equivalently result from higher-dimensional classical theories that have been restricted. Entanglement and non-locality, however, are shown to be genuine non-classical features.
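
    As a minimal sketch of the operational framework reviewed above (standard generalized-probabilistic-theory notation, not anything specific to this paper): a preparation corresponds to a state ω in a convex set Ω, a measurement outcome to an effect e, i.e. an affine map from Ω to [0, 1], and the theory assigns the outcome probability

        p(e \mid \omega) = e(\omega), \qquad 0 \le e(\omega) \le 1.

    Classical probability theory is recovered when Ω is a simplex, and quantum theory when Ω is the set of density operators with e(ω) = Tr(Eω) for a POVM element E; the question the review addresses is which further principles single out the latter.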

  20. Religion, evolution, and mental health: attachment theory and ETAS theory.

    PubMed

    Flannelly, Kevin J; Galek, Kathleen

    2010-09-01

    This article reviews the historical origins of Attachment Theory and Evolutionary Threat Assessment Systems Theory (ETAS Theory), their evolutionary basis and their application in research on religion and mental health. Attachment Theory has been most commonly applied to religion and mental health in research on God as an attachment figure, which has shown that secure attachment to God is positively associated with psychological well-being. Its broader application to religion and mental health is comprehensively discussed by Kirkpatrick (2005). ETAS Theory explains why certain religious beliefs--including beliefs about God and life-after-death--should have an adverse association, an advantageous association, or no association at all with mental health. Moreover, it makes specific predictions to this effect, which have been confirmed, in part. The authors advocate the application of ETAS Theory in research on religion and mental health because it explains how religious and other beliefs related to the dangerousness of the world can directly affect psychiatric symptoms through their effects on specific brain structures.

  1. Nucleon Polarisabilities and Effective Field Theories

    NASA Astrophysics Data System (ADS)

    Griesshammer, Harald W.

    2017-09-01

    Low-energy Compton scattering probes the nucleon's two-photon response to electric and magnetic fields at fixed photon frequency and multipolarity. It tests the symmetries and strengths of the interactions between constituents, and with photons. For convenience, this energy-dependent information is often compressed into the two scalar dipole polarisabilities αE1 and βM1 at zero photon energy. These are fundamental quantities, and important for the proton charge radius puzzle and the Lamb shift of muonic hydrogen. Combined with emerging lattice QCD computations, they provide stringent tests for our understanding of hadron structure. Extractions of the proton and neutron polarisabilities from all published elastic data below 300 MeV in Chiral Effective Field Theory with explicit Δ(1232) are now available. This talk emphasises χEFT as a natural bridge between lattice QCD and ongoing or approved efforts at HIγS, MAMI and MAX-lab. Chiral lattice extrapolations from mπ > 200 MeV to the physical point compare well to lattice computations. Combining χEFT with high-intensity experiments with polarised targets and polarised beams will extract not only scalar polarisabilities, but in particular the four so-far poorly explored spin-polarisabilities. These parametrise the stiffness of the spin in external electro-magnetic fields (nucleonic bi-refringence/Faraday effect). New chiral predictions for proton, deuteron and 3He observables show intriguing sensitivities to spin and neutron polarisabilities. Data consistency and a model-independent quantification of residual theory uncertainties by Bayesian analysis are also discussed. Proton-neutron differences explore the interplay between chiral symmetry breaking and short-distance physics. Finally, I address their impact on the neutron-proton mass difference, big-bang nucleosynthesis, and their relevance for anthropic arguments. Supported in part by DOE DE-SC0015393 and George Washington University.
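
    For reference (a standard definition rather than a result of the talk; the overall 4π factor and signs are convention dependent and vary between groups), the scalar dipole polarisabilities parametrise the second-order response of the nucleon to quasi-static electromagnetic fields through an effective interaction of the form

        H_{\mathrm{eff}}^{(2)} = -2\pi\left(\alpha_{E1}\,\vec{E}^{\,2} + \beta_{M1}\,\vec{H}^{\,2}\right),

    with αE1 and βM1 conventionally quoted in units of 10^-4 fm^3.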

  2. Theories of autism.

    PubMed

    Levy, Florence

    2007-11-01

    The purpose of the present paper was to review psychological theories of autism, and to integrate these theories with neurobiological findings. Cognitive, theory of mind, language and coherence theories were identified, and briefly reviewed. Psychological theories were found not to account for the rigid/repetitive behaviours universally described in autistic subjects, and underlying neurobiological systems were identified. When the developing brain encounters constrained connectivity, it evolves an abnormal organization, the features of which may be best explained by a developmental failure of neural connectivity, where high local connectivity develops in tandem with low long-range connectivity, resulting in constricted repetitive behaviours.

  3. Higgs compositeness in Sp(2N) gauge theories - Determining the low-energy constants with lattice calculations

    NASA Astrophysics Data System (ADS)

    Bennett, Ed; Ki Hong, Deog; Lee, Jong-Wan; David Lin, C.-J.; Lucini, Biagio; Piai, Maurizio; Vadacchino, Davide

    2018-03-01

    As a first step towards a quantitative understanding of the SU(4)/Sp(4) composite Higgs model through lattice calculations, we discuss the low energy effective field theory resulting from the SU(4) → Sp(4) global symmetry breaking pattern. We then consider an Sp(4) gauge theory with two Dirac fermion flavours in the fundamental representation on a lattice, which provides a concrete example of the microscopic realisation of the SU(4)/Sp(4) composite Higgs model. For this system, we outline a programme of numerical simulations aiming at the determination of the low-energy constants of the effective field theory and we test the method on the quenched theory. We also report early results from dynamical simulations, focussing on the phase structure of the lattice theory and a calculation of the lowest-lying meson spectrum at coarse lattice spacing. Combined contributions of B. Lucini (e-mail: b.lucini@swansea.ac.uk) and J.-W. Lee (e-mail: wlee823@pusan.ac.kr).

  4. Tin Oxide Crystals Exposed by Low-Energy {110} Facets for Enhanced Electrochemical Heavy Metal Ions Sensing: X-ray Absorption Fine Structure Experimental Combined with Density-Functional Theory Evidence.

    PubMed

    Jin, Zhen; Yang, Meng; Chen, Shao-Hua; Liu, Jin-Huai; Li, Qun-Xiang; Huang, Xing-Jiu

    2017-02-21

    Herein, we reveal that the electrochemical behavior in the detection of heavy metal ions (HMIs) largely relies on the exposed facets of SnO2 nanoparticles. Compared to the high-energy {221} facet, the low-energy {110} facet of SnO2 possessed better electrochemical performance. Adsorption/desorption tests, density-functional theory (DFT) calculations, and X-ray absorption fine structure (XAFS) studies showed that the lower barrier energy for surface diffusion on the {110} facet was critical for the superior electrochemical properties: it favors ion diffusion on the electrode and thus leads to the enhanced electrochemical performance. Through the combination of experiments and theoretical calculations, a reliable interpretation of the mechanism for the electroanalysis of HMIs with nanomaterials exposing different crystal facets has been provided. Furthermore, this work provides deep insight into the key factors that improve electrochemical performance in HMI detection, so as to design high-performance electrochemical sensors.

  5. Functional renormalization group and Kohn-Sham scheme in density functional theory

    NASA Astrophysics Data System (ADS)

    Liang, Haozhao; Niu, Yifei; Hatsuda, Tetsuo

    2018-04-01

    Deriving an accurate energy density functional is one of the central problems in condensed matter physics, nuclear physics, and quantum chemistry. We propose a novel method to deduce the energy density functional by combining the idea of the functional renormalization group with the Kohn-Sham scheme in density functional theory. The key idea is to solve the renormalization group flow for the effective action decomposed into a mean-field part and a correlation part. We also propose a simple practical method to quantify the uncertainty associated with the truncation of the correlation part. Taking the φ⁴ theory in zero dimensions as a benchmark, we demonstrate that our method converges extremely fast to the exact result even in the strongly coupled regime.
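
    The zero-dimensional φ⁴ "theory" used as a benchmark reduces to a single ordinary integral, so the exact result that any truncation scheme must reproduce can be obtained by brute-force quadrature. The sketch below is illustrative only; the action S(φ) = m²φ²/2 + λφ⁴/24 and the couplings are assumptions of this sketch, not the conventions of the paper.

        # Illustrative zero-dimensional "phi^4 theory": the path integral collapses
        # to one ordinary integral, evaluated here by brute-force summation.
        # The action S = m2*phi**2/2 + lam*phi**4/24 and the couplings below are
        # assumptions for this sketch, not the conventions used in the paper.
        import numpy as np

        def moments(m2=1.0, lam=1.0, n_grid=200001, cutoff=12.0):
            """Return (Z, <phi^2>) for the Boltzmann weight exp(-S(phi))."""
            phi, dphi = np.linspace(-cutoff, cutoff, n_grid, retstep=True)
            weight = np.exp(-(0.5 * m2 * phi**2 + lam * phi**4 / 24.0))
            Z = weight.sum() * dphi                    # "partition function"
            phi2 = (phi**2 * weight).sum() * dphi / Z  # exact <phi^2>
            return Z, phi2

        if __name__ == "__main__":
            for lam in (0.1, 1.0, 10.0):               # weak to strong coupling
                Z, phi2 = moments(lam=lam)
                print(f"lambda = {lam:5.1f}:  Z = {Z:.6f}   <phi^2> = {phi2:.6f}")

    "Strong coupling" here simply means large λ, for which a perturbative expansion in λ fails even though the integral itself stays trivial to evaluate, which is what makes this model a clean benchmark for approximate flow equations.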

  6. Formalizing nursing knowledge: from theories and models to ontologies.

    PubMed

    Peace, Jane; Brennan, Patricia Flatley

    2009-01-01

    Knowledge representation in nursing is poised to address the depth of nursing knowledge about the specific phenomena of importance to nursing. Nursing theories and models may provide a starting point for making this knowledge explicit in representations. We combined knowledge building methods from nursing and ontology design methods from biomedical informatics to create a nursing representation of family health history. Our experience provides an example of how knowledge representations may be created to facilitate electronic support for nursing practice and knowledge development.

  7. Group Theory and Crystal Field Theory: A Simple and Rigorous Derivation of the Spectroscopic Terms Generated by the t2g² Electronic Configuration in a Strong Octahedral Field

    ERIC Educational Resources Information Center

    Morpurgo, Simone

    2007-01-01

    The principles of symmetry and group theory are applied to the zero-order wavefunctions associated with the strong-field t2g² configuration, and the symmetry-adapted linear combinations (SALCs) associated with the generated energy terms are derived. This approach will enable students to better understand the use of…

  8. Joint density-functional theory and its application to systems in solution

    NASA Astrophysics Data System (ADS)

    Petrosyan, Sahak A.

    The physics of solvation, the interaction of water with solutes, plays a central role in chemistry and biochemistry, and it is essential for the very existence of life. Despite the central importance of water and the advent of the quantum theory early in the twentieth century, the link between the fundamental laws of physics and the observable properties of water remains poorly understood to this day. The central goal of this thesis is to develop a new formalism and framework to make the study of systems (solutes or surfaces) in contact with liquid water as practical and accurate as standard electronic structure calculations without the need for explicit averaging over large ensembles of configurations of water molecules. The thesis introduces a new form of density functional theory for the ab initio description of electronic systems in contact with a molecular liquid environment. This theory rigorously joins an electron density-functional for the electrons of a solute with a classical density-functional theory for the liquid into a single variational principle for the free energy of the combined system. Using the new form of density-functional theory for the ab initio description of electronic systems in contact with a molecular liquid environment, the thesis then presents the first detailed study of the impact of a solvent on the surface chemistry of Cr2O3, the passivating layer of stainless steel alloys. In comparison to a vacuum, we predict that the presence of water has little impact on the adsorption of chloride ions to the oxygen-terminated surface but has a dramatic effect on the binding of hydrogen to that surface. A key ingredient of a successful joint density functional theory is a good approximate functional for describing the solvent. We explore how the simplest examples of the best known class of approximate forms for the classical density functional fail when applied directly to water. The thesis then presents a computationally efficient density

  9. A theory of physician-hospital integration: contending institutional and market logics in the health care field.

    PubMed

    Rundall, Thomas G; Shortell, Stephen M; Alexander, Jeffrey A

    2004-01-01

    This article proposes a theory of physician-hospital integration. The theory is developed by building on three streams of scholarship: "new" institutionalism, "old" institutionalism, and the theory of economic markets. The theory uses several key concepts from these theoretical frameworks, including the notions of environmental demands for legitimacy, market demands for efficiency, and agency. To enhance the predictive power of the theory, two new concepts are introduced: directionality of influence between institutional and market forces at the macro-societal level, and degree of separation of institutional and market domains at the local level. Using these concepts, a number of hypotheses are proposed regarding the ideal types of physician-hospital arrangements that are likely to emerge under different combinations of directionality of influence and institutional and market domain separation. Moreover, the theory generates hypotheses regarding organizational dynamics associated with physician-hospital integration, including the conditions associated with high and low prevalence of physician-hospital integration, the extent to which the integrated organization is physician-centric or hospital-centric, and whether physician-hospital integration is likely to be based on loose contractual arrangements or tight, ownership-based arrangements.

  10. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions

    PubMed Central

    2014-01-01

    Background: The Medical Research Council’s framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. Methods: We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Results: Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. Conclusions: Incorporating a ToC approach into the MRC framework holds promise for

  11. Theory of Change: a theory-driven approach to enhance the Medical Research Council's framework for complex interventions.

    PubMed

    De Silva, Mary J; Breuer, Erica; Lee, Lucy; Asher, Laura; Chowdhary, Neerja; Lund, Crick; Patel, Vikram

    2014-07-05

    The Medical Research Council's framework for complex interventions has been criticized for not including theory-driven approaches to evaluation. Although the framework does include broad guidance on the use of theory, it contains little practical guidance for implementers and there have been calls to develop a more comprehensive approach. A prospective, theory-driven process of intervention design and evaluation is required to develop complex healthcare interventions which are more likely to be effective, sustainable and scalable. We propose a theory-driven approach to the design and evaluation of complex interventions by adapting and integrating a programmatic design and evaluation tool, Theory of Change (ToC), into the MRC framework for complex interventions. We provide a guide to what ToC is, how to construct one, and how to integrate its use into research projects seeking to design, implement and evaluate complex interventions using the MRC framework. We test this approach by using ToC within two randomized controlled trials and one non-randomized evaluation of complex interventions. Our application of ToC in three research projects has shown that ToC can strengthen key stages of the MRC framework. It can aid the development of interventions by providing a framework for enhanced stakeholder engagement and by explicitly designing an intervention that is embedded in the local context. For the feasibility and piloting stage, ToC enables the systematic identification of knowledge gaps to generate research questions that strengthen intervention design. ToC may improve the evaluation of interventions by providing a comprehensive set of indicators to evaluate all stages of the causal pathway through which an intervention achieves impact, combining evaluations of intervention effectiveness with detailed process evaluations into one theoretical framework. Incorporating a ToC approach into the MRC framework holds promise for improving the design and evaluation of complex

  12. Mixing methodology, nursing theory and research design for a practice model of district nursing advocacy.

    PubMed

    Reed, Frances M; Fitzgerald, Les; Rae, Melanie

    2016-01-01

    To highlight philosophical and theoretical considerations for planning a mixed methods research design that can inform a practice model to guide rural district nursing end of life care. Conceptual models of nursing in the community are general and lack guidance for rural district nursing care. A combination of pragmatism and nurse agency theory can provide a framework for ethical considerations in mixed methods research in the private world of rural district end of life care. Reflection on experience gathered in a two-stage qualitative research phase, involving rural district nurses who use advocacy successfully, can inform a quantitative phase for testing and complementing the data. Ongoing data analysis and integration result in generalisable inferences to achieve the research objective. Mixed methods research that creatively combines philosophical and theoretical elements to guide design in the particular ethical situation of community end of life care can be used to explore an emerging field of interest and test the findings for evidence to guide quality nursing practice. Combining philosophy and nursing theory to guide mixed methods research design increases the opportunity for sound research outcomes that can inform a nursing model of care.

  13. Time-Dependent Density Functional Theory for Open Systems and Its Applications.

    PubMed

    Chen, Shuguang; Kwok, YanHo; Chen, GuanHua

    2018-02-20

    Photovoltaic devices, electrochemical cells, catalysis processes, light emitting diodes, scanning tunneling microscopes, molecular electronics, and related devices have one thing in common: open quantum systems where energy and matter are not conserved. Traditionally, quantum chemistry is confined to isolated and closed systems, while quantum dissipation theory studies open quantum systems. The key quantity in quantum dissipation theory is the reduced system density matrix. As the reduced system density matrix is an O(M! × M!) matrix, where M is the number of particles of the system of interest, quantum dissipation theory can only be employed to simulate systems of a few particles or degrees of freedom. It is thus important to combine quantum chemistry and quantum dissipation theory so that realistic open quantum systems can be simulated from first principles. We have developed a first-principles method to simulate the dynamics of open electronic systems, the time-dependent density functional theory for open systems (TDDFT-OS). Instead of the reduced system density matrix, the key quantity is the reduced single-electron density matrix, which is an N × N matrix where N is the number of atomic basis functions of the system of interest. As the dimension of the key quantity is drastically reduced, TDDFT-OS can thus be used to simulate the dynamics of realistic open electronic systems, and efficient numerical algorithms have been developed. As an application, we apply the method to study how quantum interference develops in a molecular transistor in the time domain. We include electron-phonon interaction in our simulation and show that quantum interference in the given system is robust against nuclear vibration not only in the steady state but also in the transient dynamics. As another application, by combining TDDFT-OS with Ehrenfest dynamics, we study current-induced dissociation of water molecules under scanning tunneling microscopy and follow its time dependent
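
    Schematically (a generic open-system equation of motion written here for orientation, with ℏ = 1; the construction and efficient evaluation of the dissipation terms Q_α are the technical content of the TDDFT-OS formalism itself), the reduced single-electron density matrix σ(t) of the device region evolves as

        i\,\frac{\partial \sigma(t)}{\partial t} = \bigl[\,h[\sigma](t),\,\sigma(t)\,\bigr] - i\sum_{\alpha} Q_{\alpha}(t),

    where h[σ] is the Kohn-Sham Fock matrix of the device region and the sum runs over the leads α that exchange electrons and energy with it.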

  14. Multiscale System Theory

    DTIC Science & Technology

    1990-02-21

    LIDS-P-1953, Multiscale System Theory. Albert Benveniste, IRISA-INRIA, Campus de Beaulieu, 35042 Rennes Cedex, France; Ramine Nikoukhah, INRIA... the development of a corresponding system theory and a theory of stochastic processes and their estimation. The research presented in this and several

  15. On the theory of dielectric spectroscopy of protein solutions

    NASA Astrophysics Data System (ADS)

    Matyushov, Dmitry V.

    2012-08-01

    We present a theory of the dielectric response of solutions containing large solutes, of the nanometer size, in a molecular solvent. It combines the molecular dipole moment of the solute with the polarization of a large subensemble of solvent molecules at the solute-solvent interface. The goal of the theory is two-fold: (i) to formulate the problem of the dielectric response avoiding the reliance on the cavity-field susceptibility of dielectric theories and (ii) to separate the non-additive polarization of the interface, jointly produced by the external field of the laboratory experiment and the solute, from specific solute-solvent interactions contributing to the dielectric signal. The theory is applied to experimentally reported frequency-dependent dielectric spectra of lysozyme in solution. The analysis of the data in the broad range of frequencies up to 700 GHz shows that the cavity-field susceptibility, critical for the theory formulation, is consistent with the prediction of Maxwell’s electrostatics in the frequency range of 10-200 GHz, but deviates from it outside this range. In particular, it becomes much smaller than the Maxwell result, and shifts to negative values, at small frequencies. The latter observation implies a dia-electric response, or negative dielectrophoresis, of hydrated lysozyme. It also implies that the effective protein dipole recorded by dielectric spectroscopy is much smaller than the value calculated from the protein’s charge distribution. We suggest an empirical equation that describes both the increment of the static dielectric constant and the decrement of the Debye water peak with increasing protein concentration. It gives fair agreement with broad-band dispersion and loss spectra of protein solutions, but misses the δ-dispersion region.
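
    As a point of reference for the increment/decrement behaviour described above (a generic Debye-type decomposition, not the paper's own empirical equation; the parameters are schematic), the solution spectrum is often summarised as

        \varepsilon(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon_{p}}{1 + i\omega\tau_{p}} + \frac{\Delta\varepsilon_{w}}{1 + i\omega\tau_{w}},

    where increasing protein concentration raises the slow protein increment Δε_p while depressing the amplitude Δε_w of the bulk-water Debye peak.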

  16. Plasticity - Theory and finite element applications.

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H. S.

    1972-01-01

    A unified presentation is given of the development and distinctions associated with various incremental solution procedures used to solve the equations governing the nonlinear behavior of structures, and this is discussed within the framework of the finite-element method. Although the primary emphasis here is on material nonlinearities, consideration is also given to geometric nonlinearities acting separately or in combination with nonlinear material behavior. The methods discussed here are applicable to a broad spectrum of structures, ranging from simple beams to general three-dimensional bodies. The finite-element analysis methods for material nonlinearity are general in the sense that any of the available plasticity theories can be incorporated to treat strain hardening or ideally plastic behavior.
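
    As a concrete, deliberately minimal illustration of the incremental treatment of material nonlinearity discussed above (not the article's own formulation), the sketch below drives a one-dimensional, elastic, ideally plastic material point through a strain history using an elastic-predictor/plastic-corrector update; the modulus E, yield stress sigma_y and strain path are assumed values.

        # Minimal 1D return-mapping sketch for an elastic, ideally plastic material
        # point (strain-driven). Illustrative of incremental plasticity only, not
        # the article's own algorithm; E, sigma_y and the strain path are assumed
        # values (stress units: MPa).
        def return_map_1d(strain_increments, E=200e3, sigma_y=250.0):
            sigma, eps_p = 0.0, 0.0
            history = []
            for d_eps in strain_increments:
                sigma_trial = sigma + E * d_eps               # elastic predictor
                f_trial = abs(sigma_trial) - sigma_y          # yield function
                if f_trial <= 0.0:
                    sigma = sigma_trial                       # increment stays elastic
                else:
                    sign = 1.0 if sigma_trial > 0.0 else -1.0
                    d_gamma = f_trial / E                     # plastic corrector
                    sigma = sigma_trial - E * d_gamma * sign  # return to the yield surface
                    eps_p += d_gamma * sign                   # accumulate plastic strain
                history.append((sigma, eps_p))
            return history

        if __name__ == "__main__":
            path = [0.0005] * 5 + [-0.0005] * 10              # load up, then reverse
            for step, (s, ep) in enumerate(return_map_1d(path), start=1):
                print(f"step {step:2d}: sigma = {s:7.1f} MPa, eps_p = {ep:+.5f}")

    In a finite-element setting the same state update would be performed at every integration point within each load increment, with the tangent stiffness assembled from the resulting material state.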

  17. Models and theories of prescribing decisions: A review and suggested a new model.

    PubMed

    Murshid, Mohsen Ali; Mohaidin, Zurina

    2017-01-01

    To date, research on the prescribing decisions of physicians lacks sound theoretical foundations. Drug prescribing by doctors is a complex phenomenon influenced by various factors, and most existing studies in the area of drug prescription explain physicians' decision-making through exploratory rather than theoretical approaches. This review therefore attempts to suggest a conceptual model that explains the theoretical linkages between marketing efforts, the patient, the pharmacist, and the physician's decision to prescribe drugs. The paper follows an inclusive review approach and applies previous theoretical models of prescribing behaviour to identify the relational factors. More specifically, it identifies and uses several valuable perspectives, such as persuasion theory (the elaboration likelihood model), the stimulus-response marketing model, agency theory, the theory of planned behaviour, and social power theory, in developing an innovative conceptual paradigm. Based on the combination of existing methods and previous models, the paper suggests a new conceptual model of the physician decision-making process. This model has the potential for use in further research.

  18. A Method for Co-Designing Theory-Based Behaviour Change Systems for Health Promotion.

    PubMed

    Janols, Rebecka; Lindgren, Helena

    2017-01-01

    A methodology was defined and developed for designing theory-based behaviour change systems for health promotion that can be tailored to the individual. Theories from two research fields were combined with a participatory action research methodology. Two case studies applying the methodology were conducted. During and between group sessions the participants created material and designs following the behaviour change strategy themes, which were discussed, analysed and transformed into a design of a behaviour change system. Theories in behavioural change and persuasive technology guided the data collection, data analyses, and the design of a behaviour change system. The methodology places strong emphasis on the target group's participation in the design process. The different aspects brought forward relate to behaviour change strategies defined in the literature on persuasive technology, and their dynamics are associated with needs and motivation as defined in the literature on behaviour change. It was concluded that the methodology aids the integration of theories into a participatory action research design process, and aids the analysis and motivation of design choices.

  19. Conformal field theories from deformations of theories with Wn symmetry

    NASA Astrophysics Data System (ADS)

    Babaro, Juan Pablo; Giribet, Gaston; Ranjbar, Arash

    2016-10-01

    We construct a set of nonrational conformal field theories that consist of deformations of Toda field theory for sl(n). In addition to preserving conformal invariance, the theories may still exhibit a remnant infinite-dimensional affine symmetry. The case n = 3 is used to illustrate this phenomenon, together with further deformations that yield enhanced Kac-Moody symmetry algebras. For generic n we compute N-point correlation functions on the Riemann sphere and show that these can be expressed in terms of sl(n) Toda field theory ((N-2)n+2)-point correlation functions.

  20. Toward a theory of organisms: Three founding principles in search of a useful integration

    PubMed Central

    SOTO, ANA M.; LONGO, GIUSEPPE; MIQUEL, PAUL-ANTOINE; MONTEVIL, MAËL; MOSSIO, MATTEO; PERRET, NICOLE; POCHEVILLE, ARNAUD; SONNENSCHEIN, CARLOS

    2016-01-01

    Organisms, be they uni- or multi-cellular, are agents capable of creating their own norms; they are continuously harmonizing their ability to create novelty and stability, that is, they combine plasticity with robustness. Here we articulate the three principles for a theory of organisms proposed in this issue, namely: the default state of proliferation with variation and motility, the principle of variation and the principle of organization. These principles profoundly change both biological observables and their determination with respect to the theoretical framework of physical theories. This radical change opens up the possibility of anchoring mathematical modeling in biologically proper principles. PMID:27498204