Science.gov

Sample records for le modele standard

  1. Insights: Simple Models for Teaching Equilibrium and Le Chatelier's Principle.

    ERIC Educational Resources Information Center

    Russell, Joan M.

    1988-01-01

    Presents three models that have been effective for teaching chemical equilibrium and Le Chatelier's principle: (1) the liquid transfer model, (2) the fish model, and (3) the teeter-totter model. Explains each model and its relation to Le Chatelier's principle. (MVL)

  3. Finding the Higgs boson of the standard model in the channel ZH → e+e-b$\bar{b}$ with the D0 detector at the Tevatron; Recherche du boson de Higgs du modèle standard dans le canal ZH → e+e-b$\bar{b}$ avec le detecteur DØ aupres du Tevatron

    SciTech Connect

    Calpas, Betty Constante

    2010-06-11

    This thesis is organized around three main themes: the first presents the theoretical and experimental framework, as well as the physics objects used in the analysis; the second covers the various service tasks I performed on the calorimeter; and the third is the search for the Higgs boson in the channel ZH → e+e-b$\bar{b}$. The thesis has the following structure: Chapter 1 is an introduction to the standard model of particle physics and the Higgs mechanism; Chapter 2 is an overview of the Tevatron accelerator complex at Fermilab and of the DØ detector; Chapter 3 is an introduction to the physics objects used in this thesis; Chapter 4 presents the study of corrections to the energy measured in the calorimeter; Chapter 5 describes the certification of electrons in the calorimeter; Chapter 6 describes the certification of electrons in the intercryostat region of the calorimeter; Chapter 7 details the analysis of the search for Higgs production in the channel ZH → e+e-b$\bar{b}$; and Chapter 8 presents the final results, upper limits on the Higgs boson production cross section over a range of low masses.

  4. The standard cosmological model

    NASA Astrophysics Data System (ADS)

    Scott, D.

    2006-06-01

    The Standard Model of Particle Physics (SMPP) is an enormously successful description of high-energy physics, driving ever more precise measurements to find "physics beyond the standard model", as well as providing motivation for developing more fundamental ideas that might explain the values of its parameters. Simultaneously, a description of the entire three-dimensional structure of the present-day Universe is being built up painstakingly. Most of the structure is stochastic in nature, being merely the result of the particular realization of the "initial conditions" within our observable Universe patch. However, governing this structure is the Standard Model of Cosmology (SMC), which appears to require only about a dozen parameters. Cosmologists are now determining the values of these quantities with increasing precision to search for "physics beyond the standard model", as well as trying to develop an understanding of the more fundamental ideas that might explain the values of its parameters. Although it is natural to see analogies between the two Standard Models, some intrinsic differences also exist, which are discussed here. Nevertheless, a truly fundamental theory will have to explain both the SMPP and SMC, and this must include an appreciation of which elements are deterministic and which are accidental. Considering different levels of stochasticity within cosmology may make it easier to accept that physical parameters in general might have a nondeterministic aspect.

  5. Beyond the Standard Model

    SciTech Connect

    Peskin, M.E.

    1997-05-01

    These lectures constitute a short course in "Beyond the Standard Model" for students of experimental particle physics. The author discusses the general ideas which guide the construction of models of physics beyond the Standard Model. The central principle, the one which most directly motivates the search for new physics, is the search for the mechanism of the spontaneous symmetry breaking observed in the theory of weak interactions. To illustrate models of weak-interaction symmetry breaking, the author gives a detailed discussion of the idea of supersymmetry and that of new strong interactions at the TeV energy scale. He discusses experiments that will probe the details of these models at future pp and e+e- colliders.

  6. Beyond the Standard Model

    SciTech Connect

    Lykken, Joseph D.; /Fermilab

    2010-05-01

    'BSM physics' is a phrase used in several ways. It can refer to physical phenomena established experimentally but not accommodated by the Standard Model, in particular dark matter and neutrino oscillations (technically also anything that has to do with gravity, since gravity is not part of the Standard Model). 'Beyond the Standard Model' can also refer to possible deeper explanations of phenomena that are accommodated by the Standard Model but only with ad hoc parameterizations, such as Yukawa couplings and the strong CP angle. More generally, BSM can be taken to refer to any possible extension of the Standard Model, whether or not the extension solves any particular set of puzzles left unresolved in the SM. In this general sense one sees reference to the BSM 'theory space' of all possible SM extensions, this being a parameter space of coupling constants for new interactions, new charges or other quantum numbers, and parameters describing possible new degrees of freedom or new symmetries. Despite decades of model-building it seems unlikely that we have mapped out most of, or even the most interesting parts of, this theory space. Indeed, we do not even know the dimensionality of this parameter space, or what fraction of it is already ruled out by experiment. Since Nature is only implementing at most one point in this BSM theory space (at least in our neighborhood of space and time), it might seem an impossible task to map back from a finite number of experimental discoveries and measurements to a unique BSM explanation. Fortunately for theorists, the inevitable limitations of experiments themselves, in terms of resolutions, rates, and energy scales, mean that in practice there are only a finite number of BSM model 'equivalence classes' competing at any given time to explain any given set of results. BSM phenomenology is a two-way street: not only do experimental results test or constrain BSM models, they also suggest - to those who get close enough to listen

  7. Standardissimo. Les limitations théoriques du Modèle Standard. Quelles réponses y apporter?

    NASA Astrophysics Data System (ADS)

    Renard, F. M.

    We present the status of the Standard Model of the strong, weak and electromagnetic interactions. After a brief description of its three sectors, the gauge sector (radiation), the fermion sector (matter) and the scalar sector (mass generation), we emphasize the large number of free parameters and the arbitrary choices that had to be made in constructing the model. We point out the unresolved technical problems and list the fundamental questions that remain unanswered. We then review the ideas and methods proposed to answer these questions. They essentially follow three different routes. The first consists of requiring more symmetry (extensions of the model, Left-Right symmetry, Grand Unification, Supersymmetry, ...). The second contains the various alternatives to the Standard Model involving modifications of certain sectors (for example the scalar sector in the Technicolor model) or, more radically, the hypothesis of a substructure of the leptons, quarks and even of the W and Z bosons themselves. A final route seeks to justify the particular features of the Standard Model and to relate its free parameters on the basis of principles of internal consistency of the model. The observable consequences of these various approaches are mentioned in each case.

  8. The standard model

    SciTech Connect

    Marciano, W.J.

    1994-03-01

    In these lectures, my aim is to provide a survey of the standard model with emphasis on its renormalizability and electroweak radiative corrections. Since this is a school, I will try to be somewhat pedagogical by providing examples of loop calculations. In that way, I hope to illustrate some of the commonly employed tools of particle physics. With those goals in mind, I have organized my presentations as follows: In Section 2, renormalization is discussed from an applied perspective. The technique of dimensional regularization is described and used to define running couplings and masses. The utility of the renormalization group for computing leading logs is illustrated for the muon anomalous magnetic moment. In Section 3 electroweak radiative corrections are discussed. Standard model predictions are surveyed and used to constrain the top quark mass. The S, T, and U parameters are introduced and employed to probe for "new physics". The effect of Z′ bosons on low energy phenomenology is described. In Section 4, a detailed illustration of electroweak radiative corrections is given for atomic parity violation. Finally, in Section 5, I conclude with an outlook for the future.
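
    The lecture summary above mentions using dimensional regularization to define running couplings and using the renormalization group to resum leading logarithms. As a rough companion, the minimal sketch below (not taken from the lectures) evaluates the closed-form one-loop solution of d(alpha)/d(ln mu) = (b/2π) alpha² for a generic coupling; the coefficient b, the reference scale and the numerical values are illustrative assumptions, since the actual coefficient depends on the particle content of the theory.

        # Hedged illustration, not from the lecture itself: one-loop renormalization-group
        # running of a generic coupling alpha(mu), the kind of object used to resum leading logs.
        # The beta-function form and all numbers below are illustrative assumptions.
        import math

        def run_coupling_one_loop(alpha0, mu0, mu, b):
            """Closed-form solution of d(alpha)/d(ln mu) = (b / (2*pi)) * alpha**2."""
            log_ratio = math.log(mu / mu0)
            return alpha0 / (1.0 - (b / (2.0 * math.pi)) * alpha0 * log_ratio)

        # Example: evolve an illustrative coupling alpha(91 GeV) = 0.1 up to 1000 GeV.
        print(run_coupling_one_loop(0.1, 91.0, 1000.0, b=1.0))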

  9. The Supersymmetric Standard Model

    NASA Astrophysics Data System (ADS)

    Fayet, Pierre

    2016-10-01

    The Standard Model may be included within a supersymmetric theory, postulating new sparticles that differ by half-a-unit of spin from their standard model partners, and by a new quantum number called R-parity. The lightest one, usually a neutralino, is expected to be stable and a possible candidate for dark matter. The electroweak breaking requires two doublets, leading to several charged and neutral Brout-Englert-Higgs bosons. This also leads to gauge/Higgs unification by providing extra spin-0 partners for the spin-1 W± and Z. It offers the possibility to view, up to a mixing angle, the new 125 GeV boson as the spin-0 partner of the Z under two supersymmetry transformations, i.e. as a Z that would be deprived of its spin. Supersymmetry then relates two existing particles of different spins, in spite of their different gauge symmetry properties, through supersymmetry transformations acting on physical fields in a non-polynomial way. We also discuss how the compactification of extra dimensions, relying on R-parity and other discrete symmetries, may determine both the supersymmetry-breaking and grand-unification scales.

  10. MODeLeR: A Virtual Constructivist Learning Environment and Methodology for Object-Oriented Design

    ERIC Educational Resources Information Center

    Coffey, John W.; Koonce, Robert

    2008-01-01

    This article contains a description of the organization and method of use of an active learning environment named MODeLeR (Multimedia Object Design Learning Resource), a tool designed to facilitate the learning of concepts pertaining to object modeling with the Unified Modeling Language (UML). MODeLeR was created to provide an authentic,…

  12. Beyond Standard Model Physics

    SciTech Connect

    Bellantoni, L.

    2009-11-01

    There are many recent results from searches for fundamental new physics using the Tevatron, the SLAC B-factory and HERA. This talk quickly reviewed searches for pair-produced stop, for gauge-mediated SUSY breaking, for Higgs bosons in the MSSM and NMSSM models, for leptoquarks, and for v-hadrons. There is a SUSY model which accommodates the recent astrophysical experimental results that suggest that dark matter annihilation is occurring in the center of our galaxy, and a relevant experimental result. Finally, model-independent searches at D0, CDF, and H1 are discussed.

  13. Phenomenology beyond the standard model

    SciTech Connect

    Lykken, Joseph D.; /Fermilab

    2005-03-01

    An elementary review of models and phenomenology for physics beyond the Standard Model (excluding supersymmetry). The emphasis is on LHC physics. Based upon a talk given at the ''Physics at LHC'' conference, Vienna, 13-17 July 2004.

  14. Reference and Standard Atmosphere Models

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Roberts, Barry C.; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    This paper describes the development of standard and reference atmosphere models along with the history of their origin and use since the mid 19th century. The first "Standard Atmospheres" were established by international agreement in the 1920's. Later some countries, notably the United States, also developed and published "Standard Atmospheres". The term "Reference Atmospheres" is used to identify atmosphere models for specific geographical locations. Range Reference Atmosphere Models developed first during the 1960's are examples of these descriptions of the atmosphere. This paper discusses the various models, scopes, applications and limitations relative to use in aerospace industry activities.

  15. CoLeMo: A Collaborative Learning Environment for UML Modelling

    ERIC Educational Resources Information Center

    Chen, Weiqin; Pedersen, Roger Heggernes; Pettersen, Oystein

    2006-01-01

    This paper presents the design, implementation, and evaluation of a distributed collaborative UML modelling environment, CoLeMo. CoLeMo is designed for students studying UML modelling. It can also be used as a platform for collaborative design of software. We conducted formative evaluations and a summative evaluation to improve the environment and…

  17. Colorado Model Content Standards: Science

    ERIC Educational Resources Information Center

    Colorado Department of Education, 2007

    2007-01-01

    The Colorado Model Content Standards for Science specify what all students should know and be able to do in science as a result of their school studies. Specific expectations are given for students completing grades K-2, 3-5, 6-8, and 9-12. Five standards outline the essential level of science knowledge and skills needed by Colorado citizens to…

  18. CIM - A Manufacturing Paradigm (Le CIM - Un Nouveau Modele Industriel),

    DTIC Science & Technology

    1986-07-01

    ...as most companies have done, have refined the model of the "Industrial Revolution". We live in the age of the specialist. However, the model of sp... program will serve as an umbrella under which specific projects are planned, financed, managed, and implemented. Well-defined corporate goals must be... assets. Through an integration of financing strategies an enterprise can focus on capital investment in shared, value-added assets such as databases

  19. The New Minimal Standard Model

    SciTech Connect

    Davoudiasl, Hooman; Kitano, Ryuichiro; Li, Tianjun; Murayama, Hitoshi

    2005-01-13

    We construct the New Minimal Standard Model that incorporates the new discoveries of physics beyond the Minimal Standard Model (MSM): Dark Energy, non-baryonic Dark Matter, neutrino masses, as well as baryon asymmetry and cosmic inflation, adopting the principle of minimal particle content and the most general renormalizable Lagrangian. We base the model purely on empirical facts rather than aesthetics. We need only six new degrees of freedom beyond the MSM. It is free from excessive flavor-changing effects, CP violation, too-rapid proton decay, problems with electroweak precision data, and unwanted cosmological relics. Any model of physics beyond the MSM should be measured against the phenomenological success of this model.

  20. Standard model of knowledge representation

    NASA Astrophysics Data System (ADS)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is at the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between these various methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, it can express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
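
    Since the abstract characterizes the proposed standard model of knowledge representation as an input-processing-output structure, a minimal hypothetical sketch of such a structure follows; the class and attribute names are invented for illustration and do not come from the paper.

        # Hypothetical sketch only: a minimal input-processing-output knowledge unit of the
        # kind the abstract describes.  Names and the toy rule are invented for illustration.
        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class KnowledgeUnit:
            inputs: dict            # observed facts about the objective world
            processing: Callable    # rule or procedure that transforms the inputs

            def output(self) -> dict:
                """Apply the processing step to the inputs to produce derived knowledge."""
                return self.processing(self.inputs)

        unit = KnowledgeUnit(
            inputs={"temperature_C": 25},
            processing=lambda d: {"state": "liquid" if 0 < d["temperature_C"] < 100 else "other"},
        )
        print(unit.output())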

  1. Model Standards Advance the Profession

    ERIC Educational Resources Information Center

    Journal of Staff Development, 2011

    2011-01-01

    Leadership by teachers is essential to serving the needs of students, schools, and the teaching profession. To that end, the Teacher Leadership Exploratory Consortium has developed Teacher Leader Model Standards to codify, promote, and support teacher leadership as a vehicle to transform schools for the needs of the 21st century. The Teacher…

  3. The standard model and beyond

    SciTech Connect

    Marciano, W.J.

    1989-05-01

    In these lectures, my aim is to present a status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows. I survey the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also commented on. In addition, I have included an appendix on dimensional regularization and a simple example which employs that technique. I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, extra Z' bosons, and compositeness are discussed. An overview of the physics of tau decays is also included. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, implications for grand unified theories (GUTs), extra Z' gauge bosons, and atomic parity violation. The potential for further experimental progress is also commented on. Finally, I depart from the narrowest version of the standard model and discuss effects of neutrino masses, mixings, and electromagnetic moments. 32 refs., 3 figs., 5 tabs

  4. Modular modelling with Physiome standards.

    PubMed

    Cooling, Michael T; Nickerson, David P; Nielsen, Poul M F; Hunter, Peter J

    2016-12-01

    The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to

  5. Consistency Across Standards or Standards in a New Business Model

    NASA Technical Reports Server (NTRS)

    Russo, Dane M.

    2010-01-01

    Presentation topics include: standards in a changing business model, how the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the balance needed between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides cover NASA Procedural Requirements 8705.2B, which identifies human rating standards and requirements; draft health and medical standards for human rating; what has been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of government and non-governmental human factors standards.

  6. Neutrinos beyond the Standard Model

    SciTech Connect

    Valle, J.W.F.

    1989-08-01

    I review some basic aspects of neutrino physics beyond the Standard Model such as neutrino mixing and neutrino non-orthogonality, universality and CP violation in the lepton sector, total lepton number and lepton flavor violation, etc. These may lead to neutrino decays and oscillations, exotic weak decay processes, neutrinoless double β decay, etc. Particle physics models are discussed where some of these processes can be sizable even in the absence of measurable neutrino masses. These may also substantially affect the propagation properties of solar and astrophysical neutrinos. 39 refs., 4 figs.

  7. Beyond the Standard Model IV

    NASA Astrophysics Data System (ADS)

    Gunion, John F.; Han, Tao; Ohnemus, James

    1995-08-01

    The Table of Contents for the book is as follows: * Preface * Organizing and Advisory Committees * PLENARY SESSIONS * Looking Beyond the Standard Model from LEP1 and LEP2 * Virtual Effects of Physics Beyond the Standard Model * Extended Gauge Sectors * CLEO's Views Beyond the Standard Model * On Estimating Perturbative Coefficients in Quantum Field Theory and Statistical Physics * Perturbative Corrections to Inclusive Heavy Hadron Decay * Some Recent Developments in Sphalerons * Searching for New Matter Particles at Future Colliders * Issues in Dynamical Supersymmetry Breaking * Present Status of Fermilab Collider Accelerator Upgrades * The Extraordinary Scientific Opportunities from Upgrading Fermilab's Luminosity ≥ 10³³ cm⁻² sec⁻¹ * Applications of Effective Lagrangians * Collider Phenomenology for Strongly Interacting Electroweak Sector * Physics of Self-Interacting Electroweak Bosons * Particle Physics at a TeV-Scale e+e- Linear Collider * Physics at γγ and eγ Colliders * Challenges for Non-Minimal Higgs Searchers at Future Colliders * Physics Potential and Development of μ+μ- Colliders * Beyond Standard Quantum Chromodynamics * Extracting Predictions from Supergravity/Superstrings for the Effective Theory Below the Planck Scale * Non-Universal SUSY Breaking, Hierarchy and Squark Degeneracy * Supersymmetric Phenomenology in the Light of Grand Unification * A Survey of Phenomenological Constraints on Supergravity Models * Precision Tests of the MSSM * The Search for Supersymmetry * Neutrino Physics * Neutrino Mass: Oscillations and Hot Dark Matter * Dark Matter and Large-Scale Structure * Electroweak Baryogenesis * Progress in Searches for Non-Baryonic Dark Matter * Big Bang Nucleosynthesis * Flavor Tests of Quark-Lepton * Where are We Coming from? What are We? Where are We Going? * Summary, Perspectives * PARALLEL SESSIONS * SUSY Phenomenology I * Is Rb Telling us that Superpartners will soon be Discovered? * Dark Matter in Constrained Minimal

  8. Issues in the standard model

    SciTech Connect

    Gaillard, M.K.

    1983-04-01

    Focussing on the standard electroweak model, we examine physics issues which may be addressed with the help of intense beams of strange particles. I have collected a miscellany of issues, starting with some philosophical remarks on how things stand and where we should go from here. I will then focus on a case study: the decay K⁺ → π⁺ + nothing observable, which provides a nice illustration of the type of physics that can be probed through rare decays. Other topics I will mention are CP violation in K-decays, hyperon and anti-hyperon physics, and a few random comments on other relevant phenomena.

  9. Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Steele, Martin J.

    2016-01-01

    This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (M&S), while ensuring acceptance criteria are defined by the program/project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which M&S may be developed, accepted, and used in support of NASA activities. As the M&S disciplines employed and application areas involved are broad, the common aspects of M&S across all NASA activities are addressed. The discipline-specific details of a given M&S should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with M&S-influenced decisions by ensuring the complete communication of the credibility of M&S results.

  10. Minimal standard heterotic string models

    NASA Astrophysics Data System (ADS)

    Faraggi, A. E.; Manno, E.; Timirgaziu, C.

    2007-04-01

    Three generation heterotic string vacua in the free fermionic formulation gave rise to models with solely the MSSM states in the observable standard model charged sector. The relation of these models to Z2×Z2 orbifold compactifications dictates that they produce three pairs of untwisted Higgs multiplets. The reduction to one pair relies on the analysis of supersymmetric flat directions, which give a superheavy mass to the dispensable Higgs states. We explore the removal of the extra Higgs representations by using the free fermion boundary conditions, and hence we work directly at the string level, rather than in the effective low energy field theory. We present a general mechanism that achieves this reduction by using asymmetric boundary conditions between the left- and right-moving internal fermions. We incorporate this mechanism in explicit string models containing three twisted generations and a single untwisted Higgs doublet pair. We further demonstrate that an additional effect of the asymmetric boundary conditions is to substantially reduce the supersymmetric moduli space.

  11. From Interactive Open Learner Modelling to Intelligent Mentoring: STyLE-OLM and Beyond

    ERIC Educational Resources Information Center

    Dimitrova, Vania; Brna, Paul

    2016-01-01

    STyLE-OLM (Dimitrova 2003 "International Journal of Artificial Intelligence in Education," 13, 35-78) presented a framework for interactive open learner modelling which entails the development of the means by which learners can "inspect," "discuss" and "alter" the learner model that has been jointly…

  13. Conductivite dans le modele de Hubbard bi-dimensionnel a faible couplage

    NASA Astrophysics Data System (ADS)

    Bergeron, Dominic

    The two-dimensional (2D) Hubbard model is often considered the minimal model for copper-oxide-based high-critical-temperature superconductors (HTSC). On a square lattice, this model exhibits the phases common to all HTSC: the antiferromagnetic phase, the superconducting phase and the so-called pseudogap phase. It has no exact solution; however, several approximate methods allow its properties to be studied numerically. Optical and transport properties are well known in HTSC and are therefore good candidates for validating a theoretical model and helping to better understand the physics of these materials. This thesis deals with the calculation of these properties for the 2D Hubbard model at weak to intermediate coupling. The calculation method used is the two-particle self-consistent approach (ACDP), which is non-perturbative and includes the effect of spin and charge fluctuations at all wavelengths. The complete derivation of the expression for the conductivity in the ACDP approach is presented. This expression contains what are called vertex corrections, which account for correlations between quasiparticles. To make the numerical evaluation of these corrections possible, algorithms using, among other things, fast Fourier transforms and cubic splines are developed. The calculations are done for the square lattice with nearest-neighbour hopping around the antiferromagnetic critical point. At dopings lower than the critical point, the optical conductivity shows a bump in the mid-infrared at low temperature, as observed in several HTSC. In the resistivity as a function of temperature, insulating behaviour is found in the pseudogap when vertex corrections are neglected, and metallic behaviour when they are taken into account. Near the critical point, the resistivity is linear in T at low temperature and becomes

  14. Establishing the isolated standard model

    NASA Astrophysics Data System (ADS)

    Wells, James D.; Zhang, Zhengkang; Zhao, Yue

    2017-07-01

    The goal of this article is to initiate a discussion on what it takes to claim "there is no new physics at the weak scale," namely that the Standard Model (SM) is "isolated." The lack of discovery of beyond the SM (BSM) physics suggests that this may be the case. But to truly establish this statement requires proving all "connected" BSM theories are false, which presents a significant challenge. We propose a general approach to quantitatively assess the current status and future prospects of establishing the isolated SM (ISM), which we give a reasonable definition of. We consider broad elements of BSM theories, and show many examples where current experimental results are not sufficient to verify the ISM. In some cases, there is a clear roadmap for the future experimental program, which we outline, while in other cases, further efforts—both theoretical and experimental—are needed in order to robustly claim the establishment of the ISM in the absence of new physics discoveries.

  15. Experiments beyond the standard model

    SciTech Connect

    Perl, M.L.

    1984-09-01

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology, I call these Experimental Needs. 92 references.

  16. Le modele de Hubbard bidimensionnel a faible couplage: Thermodynamique et phenomenes critiques

    NASA Astrophysics Data System (ADS)

    Roy, Sebastien

    A systematic study of the two-dimensional Hubbard model at weak coupling using the two-particle self-consistent (ACDP) theory across the temperature-doping-interaction-hopping diagram highlights the influence of magnetic fluctuations on the thermodynamic properties of the electronic system on a lattice. The renormalized classical regime at finite temperature near zero doping is marked by the size of the spin correlation length compared with the thermal de Broglie length and is characterized by a drastic increase of the spin correlation length. This exponential growth at zero doping signals the presence of a peak in the specific heat as a function of temperature at low temperature. A crossover temperature is then associated with the temperature at which the spin correlation length equals the thermal de Broglie length. It is at this characteristic temperature, where the opening of the pseudogap in the spectral weight is observed, that the maximum of the specific-heat peak is located. The presence of this peak has consequences for the evolution of the chemical potential with doping when thermodynamic consistency is respected. The constraints imposed by the laws of thermodynamics make the evolution of the chemical potential with doping non-trivial. Among other things, it is shown that the chemical potential is proportional to the double occupancy, which is related to the local moment. Furthermore, a derivation of the scaling function of the zero-frequency spin susceptibility in the vicinity of a critical point unambiguously demonstrates the presence of a quantum critical point in doping for a given value of the interaction. This critical point, associated with a magnetic phase transition as a function of doping at zero temperature, induces non-trivial behaviour in the physical properties of the system at finite temperature. The quantitative ACDP approach makes it possible to

  17. Theories, Models, and Standard Systems of Measurement.

    ERIC Educational Resources Information Center

    Aftanas, Marion S.

    1988-01-01

    A meta-theoretical framework that begins with the standard system of measurement is outlined. Identification of different standard systems and elements of the measurement process provide a focus for comparisons between measurement theories and models. (SLD)

  18. Higgs bosons in standard model extensions

    NASA Astrophysics Data System (ADS)

    Gurskaya, A. V.; Dolgopolov, M. V.; Rykova, E. N.

    2017-09-01

    Several possibilities for extending the scalar sector of the Standard Model are considered. The conditions for calculating the Higgs boson masses in the Next-to-Minimal Supersymmetric Standard Model are discussed. Probable limits on the mass parameters of the Higgs bosons are analyzed. The role of the minimum conditions as a physical criterion in a model with an extended scalar sector is defined.

  19. SCaLeM: A Framework for Characterizing and Analyzing Execution Models

    SciTech Connect

    Chavarría-Miranda, Daniel; Manzano Franco, Joseph B.; Krishnamoorthy, Sriram; Vishnu, Abhinav; Barker, Kevin J.; Hoisie, Adolfy

    2014-10-13

    As scalable parallel systems evolve towards more complex nodes with many-core architectures and larger trans-petascale & upcoming exascale deployments, there is a need to understand, characterize and quantify the underlying execution models being used on such systems. Execution models are a conceptual layer between applications & algorithms and the underlying parallel hardware and systems software on which those applications run. This paper presents the SCaLeM (Synchronization, Concurrency, Locality, Memory) framework for characterizing and analyzing execution models. SCaLeM consists of three basic elements: attributes, compositions and mapping of these compositions to abstract parallel systems. The fundamental Synchronization, Concurrency, Locality and Memory attributes are used to characterize each execution model, while the combinations of those attributes in the form of compositions are used to describe the primitive operations of the execution model. The mapping of the execution model's primitive operations, described by compositions, to an underlying abstract parallel system can be evaluated quantitatively to determine its effectiveness. Finally, SCaLeM also enables the representation and analysis of applications in terms of execution models, for the purpose of evaluating the effectiveness of such mapping.
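
    To make the three SCaLeM elements concrete, the hypothetical sketch below encodes attributes, compositions and a toy mapping onto an abstract parallel system as plain data structures; all names and values are invented for illustration and are not taken from the paper.

        # Hypothetical sketch only: attributes, compositions and mapping as data structures.
        # Component names, attribute values and the mapping rule are invented illustrations.
        from dataclasses import dataclass, field

        ATTRIBUTES = ("synchronization", "concurrency", "locality", "memory")  # the four fundamental attributes

        @dataclass
        class Composition:
            """A primitive operation of an execution model, described by the attributes it combines."""
            name: str
            attributes: dict    # e.g. {"synchronization": "one_sided", "memory": "partitioned"}

        @dataclass
        class ExecutionModel:
            name: str
            compositions: list = field(default_factory=list)

            def map_to(self, system: dict) -> dict:
                """Toy check: does the abstract system provide what each composition needs?"""
                return {c.name: all(system.get(k) == v for k, v in c.attributes.items())
                        for c in self.compositions}

        model = ExecutionModel("toy-PGAS",
                               [Composition("remote_put", {"memory": "partitioned",
                                                           "synchronization": "one_sided"})])
        print(model.map_to({"memory": "partitioned", "synchronization": "one_sided",
                            "locality": "explicit"}))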

  20. Modeling in the Common Core State Standards

    ERIC Educational Resources Information Center

    Tam, Kai Chung

    2011-01-01

    The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…

  2. A standard satellite control reference model

    NASA Technical Reports Server (NTRS)

    Golden, Constance

    1994-01-01

    This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.

  3. Beyond the supersymmetric standard model

    SciTech Connect

    Hall, L.J.

    1988-02-01

    The possibility of baryon number violation at the weak scale and an alternative primordial nucleosynthesis scheme arising from the decay of gravitinos are discussed. The minimal low energy supergravity model is defined and a few of its features are described. Renormalization group scaling and flavor physics are mentioned.

  4. Electroweak baryogenesis and the standard model

    SciTech Connect

    Huet, P.

    1994-06-15

    Electroweak baryogenesis is addressed within the context of the standard model of particle physics. Although the minimal standard model has the means of fulfilling the three Sakharov conditions, it falls short of explaining the making of the baryon asymmetry of the universe. In particular, it is demonstrated that the phase of the CKM mixing matrix is an insufficient source of CP violation. The shortcomings of the standard model could be bypassed by enlarging the symmetry breaking sector and adding a new source of CP violation.

  5. Less minimal supersymmetric standard model

    SciTech Connect

    de Gouvea, Andre; Friedland, Alexander; Murayama, Hitoshi

    1998-03-28

    Most of the phenomenological studies of supersymmetry have been carried out using the so-called minimal supergravity scenario, where one assumes a universal scalar mass, gaugino mass, and trilinear coupling at M_GUT. Even though this is a useful simplifying assumption for phenomenological analyses, it is rather too restrictive to accommodate a large variety of phenomenological possibilities. It predicts, among other things, that the lightest supersymmetric particle (LSP) is an almost pure B-ino, and that the μ-parameter is larger than the masses of the SU(2)_L and U(1)_Y gauginos. We extend the minimal supergravity framework by introducing one extra parameter: the Fayet-Iliopoulos D-term for the hypercharge U(1), D_Y. Allowing for this extra parameter, we find a much more diverse phenomenology, where the LSP is the tau sneutrino, the stau, or a neutralino with a large higgsino content. We discuss the relevance of the different possibilities to collider signatures. The same type of extension can be done to models with the gauge mediation of supersymmetry breaking. We argue that it is not wise to impose cosmological constraints on the parameter space.

  6. An alternative to the standard model

    SciTech Connect

    Baek, Seungwon; Ko, Pyungwon; Park, Wan-Il

    2014-06-24

    We present an extension of the standard model to a dark sector with an unbroken local dark U(1)_X symmetry. Including various singlet portal interactions provided by the standard model Higgs, right-handed neutrinos and kinetic mixing, we show that the model can address most phenomenological issues (inflation, neutrino mass and mixing, baryon number asymmetry, dark matter, direct/indirect dark matter searches, some small-scale puzzles of the standard collisionless cold dark matter, vacuum stability of the standard model Higgs potential, dark radiation) and be regarded as an alternative to the standard model. The Higgs signal strength is equal to one as in the standard model for the unbroken U(1)_X case with a scalar dark matter, but it could be less than one independent of decay channels if the dark matter is a dark sector fermion or if U(1)_X is spontaneously broken, because of a mixing with a new neutral scalar boson in the models.

  7. Higgs couplings in noncommutative Standard Model

    NASA Astrophysics Data System (ADS)

    Batebi, S.; Haghighat, M.; Tizchang, S.; Akafzade, H.

    2015-06-01

    We consider the Higgs and Yukawa parts of the Noncommutative Standard Model (NCSM). We explore the NC-action to give all Feynman rules for couplings of the Higgs boson to electroweak gauge fields and fermions.

  8. Standard Model Higgs searches at the Tevatron

    SciTech Connect

    Herner, Kenneth; /Michigan U.

    2010-04-01

    We report results of searches for the Standard Model Higgs Boson at the Fermilab Tevatron using up to 5.4 fb⁻¹ of data taken with the CDF and D0 detectors. There is no significant excess in the mass range of interest and the experiments set upper limits on the Higgs boson production cross section, including an exclusion of the Standard Model Higgs in the mass range 162-166 GeV.

  9. The Higgs boson in the Standard Model

    NASA Astrophysics Data System (ADS)

    Djouadi, Abdelhak; Grazzini, Massimiliano

    2016-10-01

    The major goal of the Large Hadron Collider is to probe the electroweak symmetry breaking mechanism and the generation of the elementary particle masses. In the Standard Model this mechanism leads to the existence of a scalar Higgs boson with unique properties. We review the physics of the Standard Model Higgs boson, discuss its main search channels at hadron colliders and the corresponding theoretical predictions. We also summarize the strategies to study its basic properties.

  10. Deriving fractional rate of degradation of logistic-exponential (LE) model to evaluate early in vitro fermentation.

    PubMed

    Wang, M; Sun, X Z; Tang, S X; Tan, Z L; Pacheco, D

    2013-06-01

    Water-soluble components of feedstuffs are mainly utilized during the early phase of microbial fermentation, which could be deemed an important determinant of gas production behavior in vitro. Many studies proposed that the fractional rate of degradation (FRD) estimated by fitting gas production curves to mathematical models might be used to characterize the early incubation for in vitro systems. In this study, the mathematical concept of FRD was developed on the basis of the Logistic-Exponential (LE) model, with initial gas volume being zero (LE0). The FRD of the LE0 model exhibits a continuous increase from its initial value (FRD_0) toward a final asymptotic value (FRD_F) with longer incubation time. The relationships between the FRD and gas production at incubation times 2, 4, 6, 8, 12 and 24 h were compared for four models: LE0, Generalization of the Mitscherlich (GM), c-th order Michaelis-Menten (MM) and Exponential with a discrete LAG (EXPLAG). A total of 94 in vitro gas curves from four subsets with a wide range of feedstuffs from different laboratories and incubation periods were used for model testing. Results indicated that compared with the GM, MM and EXPLAG models, the FRD of the LE0 model consistently had stronger correlations with gas production across the four subsets, especially at incubation times 2, 4, 6, 8 and 12 h. Thus, the LE0 model was deemed to provide a better representation of the early fermentation rates. Furthermore, FRD_0 also exhibited strong correlations (P < 0.05) with gas production at early incubation times 2, 4, 6 and 8 h across all four subsets. In summary, the FRD of the LE0 model provides an alternative to quantify the rate of early stage incubation, and its initial value could be an important starting parameter of rate.
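
    The abstract does not reproduce the LE0 equations, so the sketch below only illustrates the general notion of a fractional rate of degradation, evaluated numerically as FRD(t) = (dV/dt)/(V_F - V(t)) for an assumed sigmoidal gas curve with zero initial volume; the functional form and parameter values are assumptions for illustration, not the paper's fitted model.

        # Hedged sketch, not the paper's own code: compute FRD(t) = (dV/dt) / (V_F - V(t))
        # for an assumed gas-production curve with V(0) = 0.  Curve shape and parameters
        # are illustrative stand-ins for the LE0 model.
        import numpy as np

        V_F = 60.0   # assumed final asymptotic gas volume (mL)
        k = 0.12     # assumed rate parameter (1/h)

        def gas_volume(t):
            """Assumed sigmoidal curve with zero initial gas volume (illustrative only)."""
            return V_F * (1.0 - np.exp(-k * t)) / (1.0 + np.exp(-k * (t - 4.0)))

        t = np.linspace(0.0, 24.0, 241)
        V = gas_volume(t)
        FRD = np.gradient(V, t) / (V_F - V)

        for hour in (2, 4, 6, 8, 12, 24):
            i = np.argmin(np.abs(t - hour))
            print(f"t = {hour:2d} h: V = {V[i]:5.1f} mL, FRD = {FRD[i]:.4f} 1/h")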

  11. Produce Live News Broadcasts Using Standard AV Equipment: A Success Story from the Le Center High School in Minnesota.

    ERIC Educational Resources Information Center

    Rostad, John

    1997-01-01

    Describes the production of news broadcasts on video by a high school class in Le Center, Minnesota. Topics include software for Apple computers, equipment used, student responsibilities, class curriculum, group work, communication among the production crew, administrative and staff support, and future improvements. (LRW)

  12. Exploring the Standard Model of Particles

    ERIC Educational Resources Information Center

    Johansson, K. E.; Watkins, P. M.

    2013-01-01

    With the recent discovery of a new particle at the CERN Large Hadron Collider (LHC) the Higgs boson could be about to be discovered. This paper provides a brief summary of the standard model of particle physics and the importance of the Higgs boson and field in that model for non-specialists. The role of Feynman diagrams in making predictions for…

  14. Exploring the Standard Model at the LHC

    NASA Astrophysics Data System (ADS)

    Vachon, Brigitte

    The ATLAS and CMS collaborations have performed studies of a wide range of Standard Model processes using data collected at the Large Hadron Collider at center-of-mass energies of 7, 8 and 13 TeV. These measurements are used to explore the Standard Model in a new kinematic regime, perform precision tests of the model, determine some of its fundamental parameters, constrain the proton parton distribution functions, and study new rare processes observed for the first time. Examples of recent Standard Model measurements performed by the ATLAS and CMS collaborations are summarized in this report. The measurements presented span a wide range of event final states including jets, photons, W/Z bosons, top quarks, and Higgs bosons.

  15. Estimating standard errors in feature network models.

    PubMed

    Frank, Laurence E; Heiser, Willem J

    2007-05-01

    Feature network models are graphical structures that represent proximity data in a discrete space while using the same formalism that is the basis of least squares methods employed in multidimensional scaling. Existing methods to derive a network model from empirical data only give the best-fitting network and yield no standard errors for the parameter estimates. The additivity properties of networks make it possible to consider the model as a univariate (multiple) linear regression problem with positivity restrictions on the parameters. In the present study, both theoretical and empirical standard errors are obtained for the constrained regression parameters of a network model with known features. The performance of both types of standard error is evaluated using Monte Carlo techniques.
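
    As a rough illustration of the constrained-regression view described above, the sketch below fits nonnegative feature weights with nonnegative least squares and obtains empirical standard errors by bootstrapping over object pairs; the synthetic design matrix and the bootstrap are assumptions for illustration, not the authors' estimator (the paper also derives theoretical standard errors).

        # Hedged sketch, not the authors' method: nonnegative least squares fit of feature
        # weights plus bootstrap standard errors.  The data below are synthetic.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n_pairs, n_features = 40, 5
        X = rng.integers(0, 2, size=(n_pairs, n_features)).astype(float)  # feature incidence per object pair
        true_w = np.array([0.5, 1.0, 0.0, 2.0, 0.8])                      # nonnegative feature weights
        y = X @ true_w + rng.normal(0.0, 0.2, n_pairs)                    # observed dissimilarities

        w_hat, _ = nnls(X, y)                                             # constrained point estimates

        boot = np.empty((500, n_features))
        for b in range(500):
            idx = rng.integers(0, n_pairs, n_pairs)                       # resample pairs with replacement
            boot[b], _ = nnls(X[idx], y[idx])
        se = boot.std(axis=0, ddof=1)

        print("estimates:    ", np.round(w_hat, 3))
        print("bootstrap SEs:", np.round(se, 3))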

  16. Development of NASA's Models and Simulations Standard

    NASA Technical Reports Server (NTRS)

    Bertch, William J.; Zang, Thomas A.; Steele, Martin J.

    2008-01-01

    From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.

  18. Standard controlled vocabulary for climate models

    NASA Astrophysics Data System (ADS)

    Moine, Marie-Pierre; Pascoe, Charlotte; Guilyardi, Eric; Ford, Rupert

    2010-05-01

    The scope of climate modeling has grown tremendously in the last 10 years, resulting in a large variety of climate models, increasing complexity with more physical or chemical components and huge volumes of data sets (simulation outputs). While significant efforts to standardise the associated metadata (i.e. data describing data and models) have already been made in recent projects (e.g. CF standard names for CMIP3), detailed standards documentation of the models and experiments that created this data is still lacking. The EU METAFOR Project (http://metaforclimate.eu) is specifically addressing this issue by creating new metadata schemas in cooperation with existing standards in Earth System Modeling (Curator, GridSpec, CF convention, NumSim, etc.). Descriptions of climate simulations, of the data they produce, and of the numerical models used to perform these simulations are all within the scope of METAFOR and these descriptions are assembled in a common information model (the CIM). Of particular note is the metadata for numerical models that is found in the CIM. This paper presents the controlled vocabulary (CV) that has been collected by METAFOR to describe (in a common manner) the components of the numerical models developed by the different modeling centres. This vocabulary is used in the model part of the web-based questionnaire that METAFOR has developed in support of the upcoming IPCC exercise (the CMIP5 questionnaire). The methods to (1) establish standards for this vocabulary via interactions with climate scientists, (2) utilise the vocabulary in the web-based questionnaire and (3) process the vocabulary for ingestion in the Earth System Grid (ESG) portal, are described. Governance aspects of this new controlled vocabulary are also addressed.
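
    As a simple illustration of how a controlled vocabulary constrains model descriptions, the hypothetical sketch below checks one component record against a small set of allowed terms; the component name, properties and allowed values are invented and are not the actual METAFOR/CIM vocabulary.

        # Hypothetical illustration only: validating a model-component description against
        # a controlled vocabulary.  All terms below are invented, not the METAFOR CV.
        CONTROLLED_VOCAB = {
            "AtmosphereDynamicalCore": {
                "HorizontalDiscretization": {"spectral", "finite_difference", "finite_volume"},
                "TimeSteppingScheme": {"leapfrog", "semi_implicit", "runge_kutta"},
            },
        }

        def validate(component, properties):
            """Return the (property, value) pairs that are not in the controlled vocabulary."""
            allowed = CONTROLLED_VOCAB.get(component, {})
            return [(k, v) for k, v in properties.items() if v not in allowed.get(k, set())]

        record = {"HorizontalDiscretization": "spectral", "TimeSteppingScheme": "adams_bashforth"}
        print(validate("AtmosphereDynamicalCore", record))  # -> [('TimeSteppingScheme', 'adams_bashforth')]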

  19. Toward a midisuperspace quantization of Lemaître-Tolman-Bondi collapse models

    SciTech Connect

    Vaz, Cenalo; Witten, Louis; Singh, T. P.

    2001-05-15

    Lemaître-Tolman-Bondi models of spherical dust collapse have been used and continue to be used extensively to study various stellar collapse scenarios. It is by now well known that these models lead to the formation of black holes and naked singularities from regular initial data. The final outcome of the collapse, particularly in the event of naked singularity formation, depends very heavily on quantum effects during the final stages. These quantum effects cannot generally be treated semiclassically as quantum fluctuations of the gravitational field are expected to dominate before the final state is reached. We present a canonical reduction of Lemaître-Tolman-Bondi space-times describing the marginally bound collapse of inhomogeneous dust, in which the physical radius R, the proper time of the collapsing dust τ, and the mass function F are the canonical coordinates R(r), τ(r) and F(r) on the phase space. Dirac's constraint quantization leads to a simple functional (Wheeler-DeWitt) equation. The equation is solved and the solution can be employed to study some of the effects of quantum gravity during gravitational collapse with different initial conditions.
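
    For orientation (standard background on these space-times rather than content of the record), the marginally bound Lemaître-Tolman-Bondi line element, written in the same variables, is

      \[
        ds^2 = -d\tau^2 + R'(\tau,r)^2\, dr^2 + R(\tau,r)^2\, d\Omega^2 ,
        \qquad
        \dot R^2 = \frac{F(r)}{R} ,
      \]

    so the physical radius R, the dust proper time τ and the mass function F are precisely the quantities promoted to canonical coordinates in the reduction described above.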

  20. Le Bon Samaritain: A Community-Based Care Model Supported by Technology.

    PubMed

    Gay, Valerie; Leijdekkers, Peter; Gill, Asif; Felix Navarro, Karla

    2015-01-01

    The effective care and well-being of a community is a challenging task, especially in an emergency situation. Traditional technology-based silos between health and emergency services are challenged by the changing needs of the community, which could benefit from integrated health and safety services. Low-cost smart-home automation solutions, wearable devices and Cloud technology make it feasible for communities to interact with each other, and with health and emergency services, in a timely manner. This paper proposes a new community-based care model, supported by technology, that aims at reducing healthcare and emergency services costs while allowing the community to become resilient in response to health and emergency situations. We looked at models of care in different industries and identified the type of technology that can support the suggested new model of care. Two prototypes were developed to validate the adequacy of the technology. The result is a new community-based model of care called 'Le Bon Samaritain'. It relies on a network of people called 'Bons Samaritains' willing to help and deal with the basic care and safety aspects of their community. Their role is to make sure that people in their community receive and understand the messages from emergency and health services. The new care model is integrated with existing emergency warning, community and health services. The Le Bon Samaritain model is scalable, community-based and can help people feel safer, less isolated and more integrated in their community. It could be the key to reducing healthcare costs, increasing resilience and driving the change towards a more integrated emergency and care system.

  1. Models of the Primordial Standard Clock

    NASA Astrophysics Data System (ADS)

    Chen, Xingang; Namjoo, Mohammad Hossein; Wang, Yi

    2015-02-01

    Oscillating massive fields in the primordial universe can be used as Standard Clocks. The ticks of these oscillations induce features in the density perturbations, which directly record the time evolution of the scale factor of the primordial universe and thus, if detected, provide direct evidence for the inflation scenario or its alternatives. In this paper, we construct a full inflationary model of the primordial Standard Clock and study its predictions for the density perturbations. This model provides a full realization of several key features proposed previously. We compare the theoretical predictions from inflation and alternative scenarios with the Planck 2013 temperature data on the Cosmic Microwave Background (CMB), and identify a statistically marginal but interesting candidate. We discuss how future CMB temperature and polarization data, non-Gaussianity analysis and Large Scale Structure data may be used to further test or constrain the Standard Clock signals.

  2. Observational challenges for the standard FLRW model

    NASA Astrophysics Data System (ADS)

    Buchert, Thomas; Coley, Alan A.; Kleinert, Hagen; Roukema, Boudewijn F.; Wiltshire, David L.

    2016-02-01

    In this paper, we summarize some of the main observational challenges for the standard Friedmann-Lemaître-Robertson-Walker (FLRW) cosmological model and describe how results recently presented in the parallel session “Large-scale Structure and Statistics” (DE3) at the “Fourteenth Marcel Grossman Meeting on General Relativity” are related to these challenges.

  3. Inclusive Standard Model Higgs searches with ATLAS

    SciTech Connect

    Polci, Francesco

    2008-11-23

    The update of the discovery potential for a Standard Model Higgs boson through the inclusive searches H → γγ, H → ZZ* and H → WW with the ATLAS detector is reported. The analyses are based on the most recent available simulations of the signal and backgrounds, as well as of the detector response.

  4. Inflation in the standard cosmological model

    NASA Astrophysics Data System (ADS)

    Uzan, Jean-Philippe

    2015-12-01

    The inflationary paradigm is now part of the standard cosmological model as a description of its primordial phase. While its original motivation was to solve the standard problems of the hot big bang model, it was soon understood that it offers a natural theory for the origin of the large-scale structure of the universe. Most models rely on a slow-rolling scalar field and enjoy very generic predictions. Besides, all the matter of the universe is produced by the decay of the inflaton field at the end of inflation, during a phase of reheating. These predictions can be (and are) tested from their imprint on the large-scale structure and in particular the cosmic microwave background. Inflation stands as a window in physics where both general relativity and quantum field theory are at work and which can be observationally studied. It connects cosmology with high-energy physics. Today most models are constructed within extensions of the standard model, such as supersymmetry or string theory. Inflation also disrupts our vision of the universe, in particular with the ideas of chaotic inflation and eternal inflation that tend to promote the image of a very inhomogeneous universe with fractal structure on large scales. This idea is also at the heart of further speculations, such as the multiverse. This introduction summarizes the connections between inflation and the hot big bang model and details the basics of its dynamics and predictions.
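
    As a reminder of the slow-roll machinery referred to above (textbook relations, not specific to this article), the potential slow-roll parameters and the leading predictions for the scalar spectral index and tensor-to-scalar ratio are

      \[
        \epsilon_V = \frac{M_{\rm Pl}^2}{2}\left(\frac{V'}{V}\right)^{\!2} ,
        \qquad
        \eta_V = M_{\rm Pl}^2\,\frac{V''}{V} ,
        \qquad
        n_s - 1 \simeq 2\eta_V - 6\epsilon_V ,
        \qquad
        r \simeq 16\,\epsilon_V .
      \]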

  5. Some Standard model problems and possible solutions

    NASA Astrophysics Data System (ADS)

    Barranco, J.

    2016-10-01

    Three problems of the standard model of elementary particles are studied from a phenomenological approach. (i) It is shown that the Dirac or Majorana nature of the neutrino can be studied by looking for differences in ν-electron scattering if the polarization of the neutrino is considered. (ii) The absolute scale of the neutrino mass can be set if a four-zero mass matrix texture is considered for the leptons. It is found that mν3 ≈ 0.05 eV. (iii) It is shown that it is possible, within a certain class of two-Higgs-model extensions of the standard model, to have a cancellation of the quadratic divergences to the mass of the physical Higgs boson.
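
    The quoted scale for mν3 is the one set by the atmospheric mass-squared splitting; as a quick check using standard oscillation numbers (not taken from the paper),

      \[
        m_{\nu_3} \simeq \sqrt{\Delta m^2_{\rm atm}}
                  \approx \sqrt{2.5\times 10^{-3}\ \mathrm{eV}^2}
                  \approx 0.05\ \mathrm{eV} .
      \]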

  6. Collective Political Violence in Easton’s Political Systems Model (La Violence Politique Collective dans le Modele de Systeme Politique d’Easton)

    DTIC Science & Technology

    2011-09-01

    © Her Majesty the Queen in Right of Canada, as represented by the Minister of National Defence, 2011. ... the system's capacity for self-correction. Resorting to violence can, to a certain extent, allow a regime to restore the critical level ... Easton's Model; A Critique of Easton's Model.

  7. Electroweak standard model with very special relativity

    NASA Astrophysics Data System (ADS)

    Alfaro, Jorge; González, Pablo; Ávila, Ricardo

    2015-05-01

    The very special relativity electroweak Standard Model (VSR EW SM) is a theory with SU(2)L × U(1)R symmetry, with the same number of leptons and gauge fields as in the usual Weinberg-Salam model. No new particles are introduced. The model is renormalizable and unitarity is preserved. However, photons obtain mass and the massive bosons obtain different masses for different polarizations. Besides, neutrino masses are generated. A VSR-invariant term will produce neutrino oscillations and new processes are allowed. In particular, we compute the rate of the decays μ → e + γ. All these processes, which are forbidden in the electroweak Standard Model, put stringent bounds on the parameters of our model and measure the violation of Lorentz invariance. We investigate the canonical quantization of this nonlocal model. Second quantization is carried out, and we obtain a well-defined particle content. Additionally, we do a counting of the degrees of freedom associated with the gauge bosons involved in this work, after spontaneous symmetry breaking has been realized. Violations of Lorentz invariance have been predicted by several theories of quantum gravity [J. Alfaro, H. Morales-Tecotl, and L. F. Urrutia, Phys. Rev. Lett. 84, 2318 (2000); Phys. Rev. D 65, 103509 (2002)]. It is a remarkable possibility that the low-energy effects of Lorentz violation induced by quantum gravity could be contained in the nonlocal terms of the VSR EW SM.

  8. Temperature dependence of standard model CP violation.

    PubMed

    Brauner, Tomáš; Taanila, Olli; Tranberg, Anders; Vuorinen, Aleksi

    2012-01-27

    We analyze the temperature dependence of CP violation effects in the standard model by determining the effective action of its bosonic fields, obtained after integrating out the fermions from the theory and performing a covariant gradient expansion. We find nonvanishing CP violating terms starting at the sixth order of the expansion, albeit only in the C-odd-P-even sector, with coefficients that depend on quark masses, Cabibbo-Kobayashi-Maskawa matrix elements, temperature and the magnitude of the Higgs field. The CP violating effects are observed to decrease rapidly with temperature, which has important implications for the generation of a matter-antimatter asymmetry in the early Universe. Our results suggest that the cold electroweak baryogenesis scenario may be viable within the standard model, provided the electroweak transition temperature is at most of order 1 GeV.
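
    For orientation (general Standard Model background rather than a result of this paper), the quark-mass and CKM dependence of such CP-odd coefficients is conventionally traced back to the Jarlskog invariant,

      \[
        J = \mathrm{Im}\!\left(V_{us}\,V_{cb}\,V_{ub}^{*}\,V_{cs}^{*}\right) \approx 3\times 10^{-5} ,
      \]

    whose smallness underlies the often-quoted difficulty of generating the observed matter-antimatter asymmetry with Standard Model CP violation alone at high temperatures.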

  9. The Standard Model of Nuclear Physics

    NASA Astrophysics Data System (ADS)

    Detmold, William

    2015-04-01

    At its core, nuclear physics, which describes the properties and interactions of hadrons, such as protons and neutrons, and atomic nuclei, arises from the Standard Model of particle physics. However, the complexities of nuclei result in severe computational difficulties that have historically prevented the calculation of central quantities in nuclear physics directly from this underlying theory. The availability of petascale (and prospect of exascale) high performance computing is changing this situation by enabling us to extend the numerical techniques of lattice Quantum Chromodynamics (LQCD), applied successfully in particle physics, to the more intricate dynamics of nuclear physics. In this talk, I will discuss this revolution and the emerging understanding of hadrons and nuclei within the Standard Model.

  10. The standard model coupled to quantum gravitodynamics

    NASA Astrophysics Data System (ADS)

    Aldabe, Fermin

    2017-01-01

    We show that the renormalizable SO(4)× U(1)× SU(2)× SU(3) Yang-Mills coupled to matter and the Higgs field fits all the experimentally observed differential cross sections known in nature. This extended Standard Model reproduces the experimental gravitational differential cross sections without resorting to the graviton field and instead by exchanging SO(4) gauge fields. By construction, each SO(4) generator in quantum gravitodynamics does not commute with the Dirac gamma matrices. This produces additional interactions absent to non-Abelian gauge fields in the Standard Model. The contributions from these new terms yield differential cross sections consistent with the Newtonian and post-Newtonian interactions derived from General Relativity. Dimensional analysis of the Lagrangian shows that all its terms have total dimensionality four or less and therefore that all physical quantities in the theory renormalize by finite amounts. These properties make QGD the only renormalizable four-dimensional theory describing gravitational interactions.

  11. SU(5) heterotic Standard Model bundles

    NASA Astrophysics Data System (ADS)

    Andreas, Björn; Hoffmann, Norbert

    2012-04-01

    We construct a class of stable SU(5) bundles on an elliptically fibered Calabi-Yau threefold with two sections, a variant of the ordinary Weierstrass fibration, which admits a free involution. The bundles are invariant under the involution, solve the topological constraint imposed by the heterotic anomaly equation and give three generations of Standard Model fermions after symmetry breaking by Wilson lines of the intermediate SU(5) GUT-group to the Standard Model gauge group. Among the solutions we find some which can be perturbed to solutions of the Strominger system. Thus these solutions provide a step toward the construction of phenomenologically realistic heterotic flux compactifications via non-Kähler deformations of Calabi-Yau geometries with bundles. This particular class of solutions involves a rank two hidden sector bundle and does not require background fivebranes for anomaly cancellation.

  12. Search for the fourth standard model family

    SciTech Connect

    Sahin, M.; Sultansoy, S.; Turkoz, S.

    2011-03-01

    Existence of the fourth family follows from the basics of the standard model (SM) and the actual mass spectrum of the third family fermions. We discuss possible manifestations of the fourth SM family at existing and future colliders. The LHC and Tevatron potentials to discover the fourth SM family have been compared. The scenario with dominance of the anomalous decay modes of the fourth-family quarks has been considered in detail.

  13. Renormalization Group in the Standard Model

    SciTech Connect

    Kielanowski, P.; Juarez W, S. R.

    2007-11-27

    We discuss two applications of the renormalization group method in the Standard Model. In the first one we present some theorems about the running of the Cabibbo-Kobayashi-Maskawa matrix and show that the evolution depends on one function of energy only. In the second one we discuss the properties of the running of the Higgs potential and derive the limits for the Higgs mass.

  14. Beyond the standard model in many directions

    SciTech Connect

    Chris Quigg

    2004-04-28

    These four lectures constitute a gentle introduction to what may lie beyond the standard model of quarks and leptons interacting through SU(3)c ⊗ SU(2)L ⊗ U(1)Y gauge bosons, prepared for an audience of graduate students in experimental particle physics. In the first lecture, I introduce a novel graphical representation of the particles and interactions, the double simplex, to elicit questions that motivate our interest in physics beyond the standard model, without recourse to equations and formalism. Lecture 2 is devoted to a short review of the current status of the standard model, especially the electroweak theory, which serves as the point of departure for our explorations. The third lecture is concerned with unified theories of the strong, weak, and electromagnetic interactions. In the fourth lecture, I survey some attempts to extend and complete the electroweak theory, emphasizing some of the promise and challenges of supersymmetry. A short concluding section looks forward.

  15. Indoorgml - a Standard for Indoor Spatial Modeling

    NASA Astrophysics Data System (ADS)

    Li, Ki-Joune

    2016-06-01

    With the recent progress of mobile devices and indoor positioning technologies, it has become possible to provide location-based services in indoor space as well as outdoor space, either seamlessly across indoor and outdoor spaces or independently for indoor space only. However, we cannot simply apply spatial models developed for outdoor space to indoor space, due to their differences. For example, coordinate reference systems are employed to indicate a specific position in outdoor space, while a location in indoor space is rather specified by a cell identifier such as a room number. Unlike outdoor space, the distance between two points in indoor space is not determined by the length of the straight line between them but by the constraints given by indoor components such as walls, stairs, and doors. For this reason, we need to establish a new framework for indoor space, from the fundamental theoretical basis and indoor spatial data models to information systems to store, manage, and analyse indoor spatial data. In order to provide this framework, an international standard, called IndoorGML, has been developed and published by the OGC (Open Geospatial Consortium). This standard is based on a cellular notion of space, which considers an indoor space as a set of non-overlapping cells. It consists of two types of modules: a core module and extension modules. While the core module consists of four basic conceptual and implementation modeling components (a geometric model for cells, topology between cells, a semantic model of cells, and a multi-layered space model), extension modules may be defined on top of the core module to support a particular application area. As the first version of the standard, we provide an extension for indoor navigation.
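
    To make the cellular notion of space concrete, here is a minimal sketch in plain Python (an illustration of the idea only, not the IndoorGML XML encoding; the cell names are invented) in which cells are nodes, doors are edges, and distance is a matter of graph connectivity rather than straight lines.

      # Minimal illustration of a cell-based indoor space model: cells (rooms,
      # corridors, stairwells) are nodes and traversable connections (doors) are
      # edges, so distances follow connectivity, not straight lines.
      from collections import deque

      cells = {"R101": "room", "R102": "room", "C1": "corridor", "S1": "stairwell"}
      doors = [("R101", "C1"), ("R102", "C1"), ("C1", "S1")]

      adjacency = {c: set() for c in cells}
      for a, b in doors:
          adjacency[a].add(b)
          adjacency[b].add(a)

      def hop_distance(start, goal):
          """Number of cell-to-cell transitions needed to reach goal from start."""
          seen, queue = {start}, deque([(start, 0)])
          while queue:
              cell, dist = queue.popleft()
              if cell == goal:
                  return dist
              for nxt in adjacency[cell]:
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append((nxt, dist + 1))
          return None  # unreachable

      print(hop_distance("R101", "S1"))  # -> 2 (R101 -> C1 -> S1)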

  16. Beyond standard model calculations with Sherpa

    DOE PAGES

    Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; ...

    2015-03-24

    We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.
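
    UFO model descriptions are themselves Python modules listing particles, parameters, Lorentz structures and vertices. The fragment below only conveys the general shape of such a description; it is not verbatim UFO or Sherpa syntax, the classes are defined locally so the sketch is self-contained, and the Z' example is invented.

      # Schematic of a UFO-style model fragment (illustrative only). In a real UFO
      # model the Particle/Vertex classes come from the format's object library,
      # and Sherpa turns such vertex lists into C++ code for the Comix generator.
      class Particle:
          def __init__(self, name, pdg_code, mass, spin):
              self.name, self.pdg_code, self.mass, self.spin = name, pdg_code, mass, spin

      class Vertex:
          def __init__(self, name, particles, coupling):
              self.name, self.particles, self.coupling = name, particles, coupling

      mu     = Particle(name="mu-", pdg_code=13, mass="MMU", spin=2)       # fermion
      zprime = Particle(name="Zp", pdg_code=9900032, mass="MZp", spin=3)   # hypothetical Z'

      # A hypothetical Z'-mu-mu interaction an event generator would pick up at runtime
      V_Zp_mumu = Vertex(name="V_Zp_mumu", particles=[mu, mu, zprime], coupling="gZp")
      print(V_Zp_mumu.name, [p.name for p in V_Zp_mumu.particles])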

  17. Extended spin symmetry and the standard model

    SciTech Connect

    Besprosvany, J.; Romero, R.

    2010-12-23

    We review unification ideas and explain the spin-extended model in this context. Its consideration is also motivated by the standard-model puzzles. With the aim of constructing a common description of discrete degrees of freedom, as spin and gauge quantum numbers, the model departs from q-bits and generalized Hilbert spaces. Physical requirements reduce the space to one that is represented by matrices. The classification of the representations is performed through Clifford algebras, with its generators associated with Lorentz and scalar symmetries. We study a reduced space with up to two spinor elements within a matrix direct product. At given dimension, the demand that Lorentz symmetry be maintained, determines the scalar symmetries, which connect to vector-and-chiral gauge-interacting fields; we review the standard-model information in each dimension. We obtain fermions and bosons, with matter fields in the fundamental representation, radiation fields in the adjoint, and scalar particles with the Higgs quantum numbers. We relate the fields' representation in such spaces to the quantum-field-theory one, and the Lagrangian. The model provides a coupling-constant definition.

  18. Standard model EFT and extended scalar sectors

    DOE PAGES

    Dawson, Sally; Murphy, Christopher W.

    2017-07-31

    One of the simplest extensions of the Standard Model is the inclusion of an additional scalar multiplet, and we consider scalars in the SU(2)L singlet, triplet, and quartet representations. Here, we examine models with heavy neutral scalars, mH ~ 1–2 TeV, and the matching of the UV complete theories to the low energy effective field theory. We also demonstrate the agreement of the kinematic distributions obtained in the singlet models for the gluon fusion of a Higgs pair with the predictions of the effective field theory. Finally, the restrictions on the extended scalar sectors due to unitarity and precision electroweak measurements are summarized and lead to highly restricted regions of viable parameter space for the triplet and quartet models.

  19. Standard model EFT and extended scalar sectors

    NASA Astrophysics Data System (ADS)

    Dawson, Sally; Murphy, Christopher W.

    2017-07-01

    One of the simplest extensions of the Standard Model is the inclusion of an additional scalar multiplet, and we consider scalars in the SU(2)L singlet, triplet, and quartet representations. We examine models with heavy neutral scalars, mH ~ 1–2 TeV, and the matching of the UV complete theories to the low energy effective field theory. We demonstrate the agreement of the kinematic distributions obtained in the singlet models for the gluon fusion of a Higgs pair with the predictions of the effective field theory. The restrictions on the extended scalar sectors due to unitarity and precision electroweak measurements are summarized and lead to highly restricted regions of viable parameter space for the triplet and quartet models.

  20. Experimentally testing the standard cosmological model

    SciTech Connect

    Schramm, D.N. (Fermi National Accelerator Lab., Batavia, IL)

    1990-11-01

    The standard model of cosmology, the big bang, is now being tested and confirmed to remarkable accuracy. Recent high precision measurements relate to the microwave background and big bang nucleosynthesis. This paper focuses on the latter since that relates more directly to high energy experiments. In particular, the recent LEP (and SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard cosmology scenario. Discussion is presented on the improved light element observational data as well as the improved neutron lifetime data. Alternate nucleosynthesis scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ωb, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the standard model conclusion that Ωb ≈ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ωtotal = 1) and the need for dark baryonic matter, since Ωvisible < Ωb. Recent accelerator constraints on non-baryonic matter are discussed, showing that any massive cold dark matter candidate must now have a mass Mx ≳ 20 GeV and an interaction weaker than the Z⁰ coupling to a neutrino. It is also noted that recent hints regarding the solar neutrino experiments coupled with the see-saw model for ν-masses may imply that the ντ is a good hot dark matter candidate. 73 refs., 5 figs.

  1. A Global View Beyond the Standard Model

    SciTech Connect

    Not Available

    2008-01-20

    By 1973, the theoretical foundations of the Standard Model of fundamental interactions had been completed. In the decades that followed, new particles and phenomena predicted by the Standard Model were discovered in a dramatic series of experiments at laboratories around the world. This began with the discovery of the charm quark at SLAC and Brookhaven, predicted by Glashow, Iliopoulos and Maiani from flavor properties of the SM. The W and Z bosons were produced directly in experiments at CERN, and signals of energetic gluons were observed at DESY. Experiments eventually found a full third generation of fermions, culminating with the discovery of the top quark and tau neutrino at Fermilab. During this same period, major theoretical advances made it possible to push the accuracy of Standard Model predictions. This allowed compelling tests of the SM at the level of radiative corrections, and to test the predictions of QCD in the confining domain. Thus experiments confirmed the quantum dynamics of the SM, and validated the CKM picture of flavor mixing and CP violation. While this process took a long time, and may have appeared frustrating to many to just achieve the confirmation of the 'standard' theory, the outcome of these 30-odd years is now a cornerstone of our understanding of the natural world, occupying a deserved place next to Maxwell's electromagnetism, to relativity, and to quantum mechanics. The timescale and size of this enterprise, at the same time, give us a benchmark for the magnitude of the efforts that may be required to go beyond the Standard Model to the next level of fundamental understanding. New ideas and theories have been put forward in the attempt to understand great questions left unanswered by the Standard Model. These theories attempt to explain why nature needs both gravitational and gauge interactions, and why their energy scales are so different. They address the possible origins of matter-antimatter asymmetry, of particle masses, and

  2. The computation of standard solar models

    NASA Technical Reports Server (NTRS)

    Ulrich, Roger K.; Cox, Arthur N.

    1991-01-01

    Procedures for calculating standard solar models with the usual simplifying approximations of spherical symmetry, no mixing except in the surface convection zone, no mass loss or gain during the solar lifetime, and no separation of elements by diffusion are described. The standard network of nuclear reactions among the light elements is discussed including rates, energy production and abundance changes. Several of the equation of state and opacity formulations required for the basic equations of mass, momentum and energy conservation are presented. The usual mixing-length convection theory is used for these results. Numerical procedures for calculating the solar evolution, and current evolution and oscillation frequency results for the present sun by some recent authors are given.
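
    The basic conservation equations referred to above are the standard stellar-structure equations (quoted here as textbook background rather than content of the report):

      \[
        \frac{dm}{dr} = 4\pi r^2 \rho , \qquad
        \frac{dP}{dr} = -\frac{G m \rho}{r^2} , \qquad
        \frac{dL}{dr} = 4\pi r^2 \rho\, \epsilon , \qquad
        \frac{dT}{dr} = -\frac{3\kappa\rho L}{16\pi a c\, r^2 T^3} \ \ \text{(radiative zones)} ,
      \]

    with the mixing-length prescription replacing the last relation in the convection zone.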

  3. Search for the standard Higgs boson in the WH channel at the D0 experiment at the Tevatron; Recherche du boson de Higgs standard dans le canal WH a l'experience D0 aupres du Tevatron

    SciTech Connect

    Lellouch, Jeremie

    2008-09-26

    The Higgs mechanism provides the Standard Model with an appropriate theory of the origin of the mass of gauge bosons and elementary fermions. The Higgs boson has not yet been discovered, but a lower limit on its mass has been set at 114.4 GeV at 95% confidence level by the LEP collaborations. Higgs searches are now being pursued at the Tevatron, a proton-antiproton collider with a centre-of-mass energy of 1.96 TeV. At low mass the most sensitive channel is the associated production of a Higgs with a W boson. An analysis has been performed in the decay channel in which the Higgs goes into a b$\bar{b}$ pair and the W decays to a muon and a neutrino, with 1 fb⁻¹ of Run IIa data recorded by the DØ detector. The analysis relies on all the sub-detector components but most particularly on the calorimeter, which is essential for reconstruction of the b$\bar{b}$ system. Good tracking and a b-identification neural network tool provide improved b-tagging performance, which is crucial for this analysis. Because the energy resolution of jets is of paramount importance when hunting for a two-jet resonance bump, work has also been conducted on devising a better-performing calibration for jets which exhibit a muon and a neutrino in their fragmentation chain. The WH analysis is performed on a W+2 and W+3 jet event topology. Events containing a muon, missing transverse energy and two or three jets are selected; the jets are then b-tagged. The double-tagged and single-tagged channels are analysed separately so as to provide additional sensitivity. Signal-background separation via a neural network algorithm has furthermore been developed in the analysis in order to enhance sensitivity. This search for a standard Higgs boson has been conducted for Higgs masses ranging from 100 GeV to 150 GeV. Upper limits on production cross-section times branching ratio have been set. For a Higgs mass of 115 GeV, the upper limit is set at 2.00 pb at 95% confidence level in the channel WH →

  4. Statistical model with a standard Gamma distribution.

    PubMed

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-01-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter lambda. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity lambda. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(lambda), where particles exchange energy in a space with an effective dimension D(lambda).
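
    The microscopic exchange rule behind this class of models is compact enough to simulate directly. The sketch below uses the usual saving-propensity update (a reading of the abstract together with the standard kinetic-exchange literature, not code from the paper); a histogram of the stationary values can then be fitted by a Gamma distribution whose shape parameter grows with lambda.

      # Kinetic exchange model with saving propensity lam: two randomly chosen
      # agents pool the non-saved fraction of their holdings and split it randomly.
      # The total amount is conserved; the stationary distribution is Gamma-like
      # (the fitted shape parameter is commonly quoted as 1 + 3*lam/(1 - lam)).
      import random

      def simulate(n_agents=1000, lam=0.5, steps=200_000, seed=1):
          random.seed(seed)
          x = [1.0] * n_agents                      # everyone starts with one unit
          for _ in range(steps):
              i, j = random.randrange(n_agents), random.randrange(n_agents)
              if i == j:
                  continue
              eps = random.random()
              pool = (1.0 - lam) * (x[i] + x[j])    # amount put up for exchange
              x[i] = lam * x[i] + eps * pool
              x[j] = lam * x[j] + (1.0 - eps) * pool
          return x

      values = simulate()
      print(sum(values) / len(values))              # mean stays at 1.0 by conservation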

  5. Statistical model with a standard Gamma distribution

    NASA Astrophysics Data System (ADS)

    Chakraborti, Anirban; Patriarca, Marco

    2005-03-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ. We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ. Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T (λ), where particles exchange energy in a space with an effective dimension D (λ).

  6. Statistical model with a standard Γ distribution

    NASA Astrophysics Data System (ADS)

    Patriarca, Marco; Chakraborti, Anirban; Kaski, Kimmo

    2004-07-01

    We study a statistical model consisting of N basic units which interact with each other by exchanging a physical entity, according to a given microscopic random law, depending on a parameter λ . We focus on the equilibrium or stationary distribution of the entity exchanged and verify through numerical fitting of the simulation data that the final form of the equilibrium distribution is that of a standard Gamma distribution. The model can be interpreted as a simple closed economy in which economic agents trade money and a saving criterion is fixed by the saving propensity λ . Alternatively, from the nature of the equilibrium distribution, we show that the model can also be interpreted as a perfect gas at an effective temperature T(λ) , where particles exchange energy in a space with an effective dimension D(λ) .

  7. The Hypergeometrical Universe: Cosmology and Standard Model

    SciTech Connect

    Pereira, Marco A.

    2010-12-22

    This paper presents a simple and purely geometrical Grand Unification Theory. Quantum Gravity, Electrostatic and Magnetic interactions are shown in a unified framework. Newton's, Gauss' and Biot-Savart's Laws are derived from first principles. Unification symmetry is defined for all the existing forces. This alternative model does not require Strong and Electroweak forces. A 4D Shock -Wave Hyperspherical topology is proposed for the Universe which together with a Quantum Lagrangian Principle and a Dilator based model for matter result in a quantized stepwise expansion for the whole Universe along a radial direction within a 4D spatial manifold. The Hypergeometrical Standard Model for matter, Universe Topology and a new Law of Gravitation are presented.

  8. La normalisation linguistique dans une entreprise: le mot d'ordre mondial (Linguistic Standardization in a Business: The Worldwide Password).

    ERIC Educational Resources Information Center

    Roy, Sylvie

    2000-01-01

    Examines how linguistic standard globalization in a call centre affects the value of bilingualism and the linguistic varieties of a francophone minority population. Bilingualism grants access to a job in the information and service sector, but since the emergence of linguistic standardization in this sector only a certain selection of individuals…

  9. La normalisation linguistique dans une entreprise: le mot d'ordre mondial (Linguistic Standardization in a Business: The Worldwide Password).

    ERIC Educational Resources Information Center

    Roy, Sylvie

    2000-01-01

    Examines how linguistic standard globalization in a call centre affects the value of bilingualism and the linguistic varieties of a francophone minority population. Bilingualism grants access to a job in the information and service sector, but since the emergence of linguistic standardization in this sector only a certain selection of individuals…

  10. Quantum Field Theory and the Standard Model

    NASA Astrophysics Data System (ADS)

    Schwartz, Matthew D.

    2014-03-01

    Part I. Field Theory: 1. Microscopic theory of radiation; 2. Lorentz invariance and second quantization; 3. Classical Field Theory; 4. Old-fashioned perturbation theory; 5. Cross sections and decay rates; 6. The S-matrix and time-ordered products; 7. Feynman rules; Part II. Quantum Electrodynamics: 8. Spin 1 and gauge invariance; 9. Scalar QED; 10. Spinors; 11. Spinor solutions and CPT; 12. Spin and statistics; 13. Quantum electrodynamics; 14. Path integrals; Part III. Renormalization: 15. The Casimir effect; 16. Vacuum polarization; 17. The anomalous magnetic moment; 18. Mass renormalization; 19. Renormalized perturbation theory; 20. Infrared divergences; 21. Renormalizability; 22. Non-renormalizable theories; 23. The renormalization group; 24. Implications of Unitarity; Part IV. The Standard Model: 25. Yang-Mills theory; 26. Quantum Yang-Mills theory; 27. Gluon scattering and the spinor-helicity formalism; 28. Spontaneous symmetry breaking; 29. Weak interactions; 30. Anomalies; 31. Precision tests of the standard model; 32. QCD and the parton model; Part V. Advanced Topics: 33. Effective actions and Schwinger proper time; 34. Background fields; 35. Heavy-quark physics; 36. Jets and effective field theory; Appendices; References; Index.

  11. Physics Beyond the Standard Model at Colliders

    NASA Astrophysics Data System (ADS)

    Matchev, Konstantin

    These lectures introduce the modern machinery used in searches and studies of new physics Beyond the Standard Model (BSM) at colliders. The first lecture provides an overview of the main simulation tools used in high energy physics, including automated parton-level calculators, general purpose event generators, detector simulators, etc. The second lecture is a brief introduction to low energy supersymmetry (SUSY) as a representative BSM paradigm. The third lecture discusses the main collider signatures of SUSY and methods for measuring the masses of new particles in events with missing energy.

  12. Twisted spectral geometry for the standard model

    NASA Astrophysics Data System (ADS)

    Martinetti, Pierre

    2015-07-01

    In noncommutative geometry, the spectral triple of a manifold does not generate bosonic fields, for fluctuations of the Dirac operator vanish. A Connes-Moscovici twist forces the commutative algebra to be multiplied by matrices. Keeping the space of spinors untouched, twisted-fluctuations then yield perturbations of the spin connection. Applied to the spectral triple of the Standard Model, a similar twist yields the scalar field needed to stabilize the vacuum and to make the computation of the Higgs mass compatible with its experimental value.

  13. Topological Theory and the Standard Electroweak Model

    NASA Astrophysics Data System (ADS)

    Chew, G. F.; Finkelstein, J.

    1983-03-01

    Topological theory predicts four charged and four neutral electroweak vector bosons, together with one neutral scalar boson. There is a single coupling constant e allowing immediate prediction (up to radiative corrections), given the Fermi constant G, of a 75-GeV mass for left-handed charged vector bosons. The authors further predict vanishing of vector weak neutral-current coupling to charged leptons (gV = hVV = hVA = 0). Dynamical assumptions motivated by meson spectra yield a vector boson spectrum whose lowest-lying four states correspond to the standard model with sin²θW = 1/4 (M²Z⁰ = (4/3)M²W).
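
    The quoted numbers follow from the tree-level electroweak relations once sin²θW = 1/4 is imposed (a standard exercise, reproduced here for clarity rather than taken from the paper):

      \[
        M_W = \frac{1}{\sin\theta_W}\left(\frac{\pi\alpha}{\sqrt{2}\,G_F}\right)^{1/2}
            \approx \frac{37.3\ \mathrm{GeV}}{1/2} \approx 75\ \mathrm{GeV} ,
        \qquad
        M_{Z^0}^2 = \frac{M_W^2}{1-\sin^2\theta_W} = \tfrac{4}{3}\,M_W^2 .
      \]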

  14. A New Generation of Standard Solar Models

    NASA Astrophysics Data System (ADS)

    Vinyoles, Núria; Serenelli, Aldo M.; Villante, Francesco L.; Basu, Sarbani; Bergström, Johannes; Gonzalez-Garcia, M. C.; Maltoni, Michele; Peña-Garay, Carlos; Song, Ningqiang

    2017-02-01

    We compute a new generation of standard solar models (SSMs) that includes recent updates on some important nuclear reaction rates and a more consistent treatment of the equation of state. Models also include a novel and flexible treatment of opacity uncertainties based on opacity kernels, required in light of recent theoretical and experimental works on radiative opacity. Two large sets of SSMs, each based on a different canonical set of solar abundances with high and low metallicity (Z), are computed to determine model uncertainties and correlations among different observables. We present detailed comparisons of high- and low-Z models against different ensembles of solar observables, including solar neutrinos, surface helium abundance, depth of the convective envelope, and sound speed profile. A global comparison, including all observables, yields a p-value of 2.7σ for the high-Z model and 4.7σ for the low-Z one. When the sound speed differences in the narrow region of 0.65 < r/R⊙ < 0.70 are excluded from the analysis, results are 0.9σ and 3.0σ for high- and low-Z models respectively. These results show that high-Z models agree well with solar data but have a systematic problem right below the bottom of the convective envelope linked to steepness of molecular weight and temperature gradients, and that low-Z models lead to a much more general disagreement with solar data. We also show that, while simple parametrizations of opacity uncertainties can strongly alleviate the solar abundance problem, they are insufficient to substantially improve the agreement of SSMs with helioseismic data beyond that obtained for high-Z models due to the intrinsic correlations of theoretical predictions.

  15. RK and RK* beyond the standard model

    NASA Astrophysics Data System (ADS)

    Hiller, Gudrun; Nišandžić, Ivan

    2017-08-01

    Measurements of the ratio of B → K*μμ to B → K*ee branching fractions, RK*, by the LHCb Collaboration strengthen the hints from previous studies with pseudoscalar kaons, RK, for the breakdown of lepton universality, and therefore the Standard Model (SM), to ∼3.5σ. Complementarity between RK and RK* allows us to pin down the Dirac structure of the new contributions to be predominantly SM-like chiral, with possible admixture of chirality-flipped contributions of up to O(few × 10%). Scalar and vector leptoquark representations (S3, V1, V3) plus possible (S̃2, V2) admixture can explain RK,K* via tree-level exchange. Flavor models naturally predict leptoquark masses not exceeding a few TeV, with couplings to third-generation quarks at O(0.1), implying that this scenario can be directly tested at the LHC.
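
    For reference, the lepton-universality ratios discussed here are defined (up to the kinematic cuts applied by LHCb) as

      \[
        R_{K^{(*)}} = \frac{\mathcal{B}(B \to K^{(*)}\mu^+\mu^-)}{\mathcal{B}(B \to K^{(*)}e^+e^-)} ,
      \]

    which the Standard Model predicts to be unity to very good accuracy.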

  16. Diagonal symmetries beyond the standard model

    NASA Astrophysics Data System (ADS)

    Batra, Puneet

    We use diagonal symmetries to address experimental and conceptual shortcomings of theories "Beyond the Standard Model". We first show that embedding the Weak gauge group, SU(2)W, as the diagonal subgroup of a gauged SU(2) × SU(2) symmetry can open up dramatic new regions of parameter space for Supersymmetric models: regions where the CP-even Higgs mass is as large as ∼350 GeV (Chapter 2), where tan β < 1 (Chapter 3), and where the lightest Higgs state is charged (Chapter 3). In Chapter 4 we show that a Little Higgs theory (with a gauged SU(12) diagonal symmetry) can form the Ultraviolet completion for another Little Higgs theory (with a gauged SU(4) diagonal symmetry). This theory remains perturbative up to 100 TeV and allows for further structural extensions to yet higher cutoffs, all without introducing quadratic instability in the Weak scale.

  17. Sphaleron Rate in the Minimal Standard Model

    NASA Astrophysics Data System (ADS)

    D'Onofrio, Michela; Rummukainen, Kari; Tranberg, Anders

    2014-10-01

    We use large-scale lattice simulations to compute the rate of baryon number violating processes (the sphaleron rate), the Higgs field expectation value, and the critical temperature in the standard model across the electroweak phase transition temperature. While there is no true phase transition between the high-temperature symmetric phase and the low-temperature broken phase, the crossover is sharp and located at temperature Tc = (159.5 ± 1.5) GeV. The sphaleron rate in the symmetric phase (T > Tc) is Γ/T⁴ = (18 ± 3)αW⁵, and in the broken phase in the physically interesting temperature range 130 GeV standard model, are relevant for, e.g., low-scale leptogenesis scenarios.
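
    To put the symmetric-phase rate in perspective, a simple numerical evaluation (not an additional result of the paper) with αW = g²/4π ≈ 0.034 gives

      \[
        \Gamma/T^4 = (18 \pm 3)\,\alpha_W^5 \approx 18 \times (0.034)^5 \approx 8\times 10^{-7} .
      \]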

  18. The Standard Model Algebra - a summary

    NASA Astrophysics Data System (ADS)

    Cristinel Stoica, Ovidiu

    2017-08-01

    A generation of leptons and quarks and the gauge symmetries of the Standard Model can be obtained from the Clifford algebra ℂℓ(6). An instance of ℂℓ(6) is implicitly generated by the Dirac algebra combined with the electroweak symmetry, while the color symmetry gives another instance of ℂℓ(6) with a Witt decomposition. The minimal mathematical model proposed here results by identifying the two instances of ℂℓ(6). The left ideal decomposition generated by the Witt decomposition represents the leptons and quarks, and their antiparticles. The SU(3)c and U(1)em symmetries of the SM are the symmetries of this ideal decomposition. The patterns of electric charges, colors, chirality, weak isospins, and hypercharges follow from this, without predicting additional particles or forces, or proton decay. The electroweak symmetry is present in its broken form, due to the geometry. The predicted Weinberg angle is given by sin²θW = 0.25. The model shares common features with previously known models, particularly with Chisholm and Farwell, 1996, Trayling and Baylis, 2004, and Furey, 2016.

  19. [Standardization and modeling of surgical processes].

    PubMed

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state of the art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization so that the effectiveness and efficiency of treatment can be improved; however, care must be taken that detrimental consequences, such as loss of skills and placing too much faith in technology, are avoided by adapted training concepts.

  20. Standard model fermions and N =8 supergravity

    NASA Astrophysics Data System (ADS)

    Meissner, Krzysztof A.; Nicolai, Hermann

    2015-03-01

    In a scheme originally proposed by Gell-Mann, and subsequently shown to be realized at the SU(3)×U(1) stationary point of maximal gauged SO(8) supergravity by Warner and one of the present authors, the 48 spin-1/2 fermions of the theory remaining after the removal of eight Goldstinos can be identified with the 48 quarks and leptons (including right-chiral neutrinos) of the Standard model, provided one identifies the residual SU(3) with the diagonal subgroup of the color group SU(3)c and a family symmetry SU(3)f. However, there remained a systematic mismatch in the electric charges by a spurion charge of ±1/6. We here identify the "missing" U(1) that rectifies this mismatch, and that takes a surprisingly simple, though unexpected form.
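
    The counting behind this identification is worth making explicit (standard group-theory bookkeeping rather than a new result): the spin-1/2 fields of N = 8 supergravity fill the rank-three antisymmetric representation of SU(8), and removing the eight Goldstinos leaves exactly three families of sixteen fermions once right-chiral neutrinos are included,

      \[
        \binom{8}{3} = 56 , \qquad 56 - 8 = 48 = 3 \times 16 .
      \]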

  1. The spherically symmetric Standard Model with gravity

    NASA Astrophysics Data System (ADS)

    Balasin, H.; Böhmer, C. G.; Grumiller, D.

    2005-08-01

    Spherical reduction of generic four-dimensional theories is revisited. Three different notions of "spherical symmetry" are defined. The following sectors are investigated: Einstein-Cartan theory, spinors, (non-)abelian gauge fields and scalar fields. In each sector a different formalism seems to be most convenient: the Cartan formulation of gravity works best in the purely gravitational sector, the Einstein formulation is convenient for the Yang-Mills sector and for reducing scalar fields, and the Newman-Penrose formalism seems to be the most transparent one in the fermionic sector. Combining them the spherically reduced Standard Model of particle physics together with the usually omitted gravity part can be presented as a two-dimensional (dilaton gravity) theory.

  2. Standard model with partial gauge invariance

    NASA Astrophysics Data System (ADS)

    Chkareuli, J. L.; Kepuladze, Z.

    2012-03-01

    We argue that an exact gauge invariance may disable some generic features of the Standard Model which could otherwise manifest themselves at high energies. One of them might be related to the spontaneous Lorentz invariance violation (SLIV), which could provide an alternative dynamical approach to QED and Yang-Mills theories with photon and non-Abelian gauge fields appearing as massless Nambu-Goldstone bosons. To see some key features of the new physics expected, we propose partial rather than exact gauge invariance in an extended SM framework. This principle, applied in some minimal form to the weak hypercharge gauge field Bμ and its interactions, leads to SLIV with B field components appearing as the massless Nambu-Goldstone modes, and provides a number of distinctive Lorentz breaking effects. Being naturally suppressed at low energies, they may become detectable in high energy physics and astrophysics. Some of the most interesting SLIV processes are considered in significant detail.

  3. Beyond the standard model of particle physics.

    PubMed

    Virdee, T S

    2016-08-28

    The Large Hadron Collider (LHC) at CERN and its experiments were conceived to tackle open questions in particle physics. The mechanism of the generation of mass of fundamental particles has been elucidated with the discovery of the Higgs boson. It is clear that the standard model is not the final theory. The open questions still awaiting clues or answers, from the LHC and other experiments, include: What is the composition of dark matter and of dark energy? Why is there more matter than anti-matter? Are there more space dimensions than the familiar three? What is the path to the unification of all the fundamental forces? This talk will discuss the status of, and prospects for, the search for new particles, symmetries and forces in order to address the open questions.This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'.

  4. Standard model Higgs boson searches at CDF

    SciTech Connect

    Stancari, Michelle

    2012-01-01

    We present recent results from searches for a standard model Higgs boson by the CDF experiment at the Tevatron pp̄ collider with the full Run II data set. An excess of events above the expected background is observed and is strongest in the associated production search channels, where the Higgs is produced together with a W or Z boson and then decays to a bottom-antibottom quark pair, with a global significance of 2.5σ. Both limits and best fit values of the Higgs production cross section are presented. For a Higgs mass of 125 GeV/c², the best fit to the data gives (σWH + σZH) × Br(H → bb̄) = 291 +118/−113 fb.

  5. Sequestering the standard model vacuum energy.

    PubMed

    Kaloper, Nemanja; Padilla, Antonio

    2014-03-07

    We propose a very simple reformulation of general relativity, which completely sequesters from gravity all of the vacuum energy from a matter sector, including all loop corrections and renders all contributions from phase transitions automatically small. The idea is to make the dimensional parameters in the matter sector functionals of the 4-volume element of the Universe. For them to be nonzero, the Universe should be finite in spacetime. If this matter is the standard model of particle physics, our mechanism prevents any of its vacuum energy, classical or quantum, from sourcing the curvature of the Universe. The mechanism is consistent with the large hierarchy between the Planck scale, electroweak scale, and curvature scale, and early Universe cosmology, including inflation. Consequences of our proposal are that the vacuum curvature of an old and large universe is not zero, but very small, that w(DE) ≃ -1 is a transient, and that the Universe will collapse in the future.

  6. The Standard Solar Model and beyond

    NASA Astrophysics Data System (ADS)

    Turck-Chièze, S.

    2016-01-01

    The Standard Solar Model (SSM) is an important reference in Astrophysics, as the Sun remains today the most observed star. This model is used to predict internal observables like neutrino fluxes and oscillation frequencies, and consequently to validate its assumptions for its generalization to other stars. The model outputs result from the solution of the classical stellar equations and from knowledge of fundamental physics like nuclear reaction rates, screening, photon interaction, and plasma physics. The plasma conditions long remained unmeasurable in the laboratory due to the high temperature and high density of the solar interior. Today, neutrino detections and helioseismology aboard SoHO have largely revealed the solar interior, in particular the nuclear solar core, so one can estimate the reliability of the SSM and also its coherence with, and among, the different indicators. This has been possible thanks to a Seismic Solar Model (SeSM), which takes into account in addition the observed sound speed profile. Seismology also quantifies some internal dynamical processes that need to be properly introduced in the description of stars. This review describes the different steps of building the SSM, its predictions and the comparisons with observations. It discusses the accuracy of such a model compared to the accuracy of the SeSM. The noted differences and observational constraints put some limits on other possible processes like dark matter, magnetic fields or waves, and determine the directions of progress for the near future, which will come from precise measurements of the emitted neutrino fluxes. High-density laser facilities also promise unprecedented checks of energy transfer by photons and of nuclear reaction rates.

  7. Wisconsin's Model Academic Standards for Agricultural Education. Bulletin No. 9003.

    ERIC Educational Resources Information Center

    Fortier, John D.; Albrecht, Bryan D.; Grady, Susan M.; Gagnon, Dean P.; Wendt, Sharon, W.

    These model academic standards for agricultural education in Wisconsin represent the work of a task force of educators, parents, and business people with input from the public. The introductory section of this bulletin defines the academic standards and discusses developing the standards, using the standards, relating the standards to all…

  8. Parameterization of a complex landscape for a sediment routing model of the Le Sueur River, southern Minnesota

    NASA Astrophysics Data System (ADS)

    Belmont, P.; Viparelli, E.; Parker, G.; Lauer, W.; Jennings, C.; Gran, K.; Wilcock, P.; Melesse, A.

    2008-12-01

    Modeling sediment fluxes and pathways in complex landscapes is limited by our inability to accurately measure and integrate heterogeneous, spatially distributed sources into a single coherent, predictive geomorphic transport law. In this study, we partition the complex landscape of the Le Sueur River watershed into five distributed primary source types: bluffs (including strath terrace caps), ravines, streambanks, tributaries, and flat, agriculture-dominated uplands. The sediment contribution of each source is quantified independently and parameterized for use in a sand and mud routing model. Rigorous modeling of the evolution of this landscape and sediment flux from each source type requires consideration of substrate characteristics, heterogeneity, and spatial connectivity. The subsurface architecture of the Le Sueur drainage basin is defined by a layer cake sequence of fine-grained tills, interbedded with fluvioglacial sands. Nearly instantaneous baselevel fall of 65 m occurred at 11.5 ka, as a result of the catastrophic draining of glacial Lake Agassiz through the Minnesota River, to which the Le Sueur is a tributary. The major knickpoint that was generated from that event has propagated 40 km into the Le Sueur network, initiating an incised river valley with tall, retreating bluffs and actively incising ravines. Loading estimates constrained by river gaging records that bound the knick zone indicate that bluffs connected to the river are retreating at an average rate of less than 2 cm per year and ravines are incising at an average rate of less than 0.8 mm per year, consistent with the Holocene average incision rate on the main stem of the river of less than 0.6 mm per year. Ongoing work with cosmogenic nuclide sediment tracers, ground-based LiDAR, historic aerial photos, and field mapping will be combined to represent the diversity of erosional environments and processes in a single coherent routing model.
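
    At its core, a routing model of this kind is a mass balance of independently quantified sources against gauged loads. The toy budget below illustrates only the bookkeeping; the source categories follow the abstract, but the numbers are placeholders, not measured values from the Le Sueur study.

      # Toy sediment budget: sum independently estimated source loads and compare
      # with a downstream gauged load. All numbers are placeholders for illustration.
      sources_Mg_per_yr = {
          "bluffs":      40_000,
          "ravines":      8_000,
          "streambanks": 15_000,
          "tributaries": 10_000,
          "uplands":     12_000,
      }
      gauged_load_Mg_per_yr = 90_000

      modeled = sum(sources_Mg_per_yr.values())
      print(f"modeled load: {modeled} Mg/yr, "
            f"budget closure: {modeled / gauged_load_Mg_per_yr:.2f}")

      # Fractional apportionment, the quantity a routing model ultimately constrains
      for name, load in sources_Mg_per_yr.items():
          print(f"{name:12s} {load / modeled:6.1%}")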

  9. Toward community standards and software for whole-cell modeling

    PubMed Central

    Bergmann, Frank T.; Chelliah, Vijayalakshmi; Hucka, Michael; Krantz, Marcus; Liebermeister, Wolfram; Mendes, Pedro; Myers, Chris J.; Pir, Pinar; Alaybeyoglu, Begum; Aranganathan, Naveen K; Baghalian, Kambiz; Bittig, Arne T.; Pinto Burke, Paulo E.; Cantarelli, Matteo; Chew, Yin Hoon; Costa, Rafael S.; Cursons, Joseph; Czauderna, Tobias; Goldberg, Arthur P.; Gómez, Harold F.; Hahn, Jens; Hameri, Tuure; Hernandez Gardiol, Daniel F.; Kazakiewicz, Denis; Kiselev, Ilya; Knight-Schrijver, Vincent; Knüpfer, Christian; König, Matthias; Lee, Daewon; Lloret-Villas, Audald; Mandrik, Nikita; Medley, J. Kyle; Moreau, Bertrand; Naderi-Meshkin, Hojjat; Palaniappan, Sucheendra K.; Priego-Espinosa, Daniel; Scharm, Martin; Sharma, Mahesh; Smallbone, Kieran; Stanford, Natalie J.; Song, Je-Hoon; Theile, Tom; Tokic, Milenko; Tomar, Namrata; Touré, Vasundra; Uhlendorf, Jannis; Varusai, Thawfeek M; Watanabe, Leandro H.; Wendland, Florian; Wolfien, Markus; Yurkovich, James T.; Zhu, Yan; Zardilis, Argyris; Zhukova, Anna; Schreiber, Falk

    2017-01-01

    Objective Whole-cell (WC) modeling is a promising tool for biological research, bioengineering, and medicine. However, substantial work remains to create accurate, comprehensive models of complex cells. Methods We organized the 2015 Whole-Cell Modeling Summer School to teach WC modeling and evaluate the need for new WC modeling standards and software by recoding a recently published WC model in SBML. Results Our analysis revealed several challenges to representing WC models using the current standards. Conclusion We, therefore, propose several new WC modeling standards, software, and databases. Significance We anticipate that these new standards and software will enable more comprehensive models. PMID:27305665

  10. R-invariant new inflation model versus supersymmetric standard model

    SciTech Connect

    Ibe, M.; Shinbara, Y.

    2008-02-01

    We revisit the implications of the R-invariant new inflation model to the supersymmetric standard model in light of recent discussion of gravitino production processes by the decay of the inflaton and the supersymmetry breaking field. We show that the models with supergravity mediation do not work well together with the R-invariant new inflation model, where the gravitino abundance produced by the decay of the inflaton and the supersymmetry breaking field significantly exceeds the bounds from cosmological observations without fine-tuning. We also show that the models with gauge mediation can go together with the R-invariant new inflation model, where the dark matter abundance and the baryon asymmetry of the universe are consistently explained without severe fine-tuning.

  11. R-invariant New Inflation Model vs. Supersymmetric Standard Model

    SciTech Connect

    Ibe, Masahiro; Shinbara, Y.

    2007-10-16

    We revisit the implications of the R-invariant New Inflation model to the supersymmetric standard model in light of recent discussion of gravitino production processes by the decay of the inflaton or the supersymmetry breaking field. We show that the models with supergravity mediation do not go well with the R-invariant New Inflation model, where the gravitino abundance produced by the decay of the inflaton or the supersymmetry breaking field significantly exceeds the bounds from cosmological observations without fine-tuning. We also show that the models with gauge mediation can go together with the R-invariant New Inflation model, where the dark matter and the baryon asymmetry are consistently explained without severe fine-tuning.

  12. Connected formulas for amplitudes in standard model

    NASA Astrophysics Data System (ADS)

    He, Song; Zhang, Yong

    2017-03-01

    Witten's twistor string theory has led to new representations of the S-matrix in massless QFT as a single object, including Cachazo-He-Yuan formulas in general and connected formulas in four dimensions. As a first step towards more realistic processes of the standard model, we extend the construction to QCD tree amplitudes with massless quarks and those with a Higgs boson. For both cases, we find connected formulas in four dimensions for all multiplicities which are very similar to the one for Yang-Mills amplitudes. The formula for quark-gluon color-ordered amplitudes differs from the pure-gluon case only by a Jacobian factor that depends on flavors and orderings of the quarks. In the formula for Higgs plus multi-parton amplitudes, the massive Higgs boson is effectively described by two additional massless legs which do not appear in the Parke-Taylor factor. The latter also represents the first twistor-string/connected formula for form factors.
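
    For reference, the Parke-Taylor factor mentioned above is the familiar cyclic product of spinor brackets (standard notation, not a formula specific to this paper):

```latex
\mathrm{PT}(1,2,\dots,n) \;=\; \frac{1}{\langle 1\,2\rangle \langle 2\,3\rangle \cdots \langle n\,1\rangle}
```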

  13. Gravity, Lorentz violation, and the standard model

    NASA Astrophysics Data System (ADS)

    Kostelecký, V. Alan

    2004-05-01

    The role of the gravitational sector in the Lorentz- and CPT-violating standard-model extension (SME) is studied. A framework is developed for addressing this topic in the context of Riemann-Cartan spacetimes, which include as limiting cases the usual Riemann and Minkowski geometries. The methodology is first illustrated in the context of the QED extension in a Riemann-Cartan background. The full SME in this background is then considered, and the leading-order terms in the SME action involving operators of mass dimension three and four are constructed. The incorporation of arbitrary Lorentz and CPT violation into general relativity and other theories of gravity based on Riemann-Cartan geometries is discussed. The dominant terms in the effective low-energy action for the gravitational sector are provided, thereby completing the formulation of the leading-order terms in the SME with gravity. Explicit Lorentz symmetry breaking is found to be incompatible with generic Riemann-Cartan geometries, but spontaneous Lorentz breaking evades this difficulty.

  14. Experimental tests of the standard model.

    SciTech Connect

    Nodulman, L.

    1998-11-11

    The title implies an impossibly broad field, as the Standard Model includes the fermion matter states, as well as the forces and fields of SU(3) x SU(2) x U(1). For practical purposes, I will confine myself to electroweak unification, as discussed in the lectures of M. Herrero. Quarks and mixing were discussed in the lectures of R. Aleksan, and leptons and mixing were discussed in the lectures of K. Nakamura. I will essentially assume universality, that is, flavor independence, rather than discussing tests of it. I will not pursue tests of QED beyond noting the consistency and precision of measurements of α_EM in various processes, including the Lamb shift, the anomalous magnetic moment (g-2) of the electron, and the quantum Hall effect. The fantastic precision and agreement of these predictions and measurements is something that convinces people that there may be something to this science enterprise. Also impressive is the success of the "Universal Fermi Interaction" description of beta decay processes, or in more modern parlance, weak charged current interactions. With one coupling constant G_F, most precisely determined in muon decay, a huge number of nuclear instabilities are described. The slightly slow rate for neutron beta decay was one of the initial pieces of evidence for Cabibbo mixing, now generalized so that all charged current decays of any flavor are covered.

  15. Standard Model thermodynamics across the electroweak crossover

    NASA Astrophysics Data System (ADS)

    Laine, M.; Meyer, M.

    2015-07-01

    Even though the Standard Model with a Higgs mass m_H = 125 GeV possesses no bulk phase transition, its thermodynamics still experiences a "soft point" at temperatures around T = 160 GeV, with a deviation from ideal gas thermodynamics. Such a deviation may have an effect on precision computations of weakly interacting dark matter relic abundances if their mass is in the few TeV range, or on leptogenesis scenarios operating in this temperature range. By making use of results from lattice simulations based on a dimensionally reduced effective field theory, we estimate the relevant thermodynamic functions across the crossover. The results are tabulated in a numerical form permitting for their insertion as a background equation of state into cosmological particle production/decoupling codes. We find that Higgs dynamics induces a non-trivial "structure" visible e.g. in the heat capacity, but that in general the largest radiative corrections originate from QCD effects, reducing the energy density by a couple of percent from the free value even at T > 160 GeV.

  16. Standard Model thermodynamics across the electroweak crossover

    SciTech Connect

    Laine, M.; Meyer, M. E-mail: meyer@itp.unibe.ch

    2015-07-01

    Even though the Standard Model with a Higgs mass m_H = 125 GeV possesses no bulk phase transition, its thermodynamics still experiences a 'soft point' at temperatures around T = 160 GeV, with a deviation from ideal gas thermodynamics. Such a deviation may have an effect on precision computations of weakly interacting dark matter relic abundances if their mass is in the few TeV range, or on leptogenesis scenarios operating in this temperature range. By making use of results from lattice simulations based on a dimensionally reduced effective field theory, we estimate the relevant thermodynamic functions across the crossover. The results are tabulated in a numerical form permitting for their insertion as a background equation of state into cosmological particle production/decoupling codes. We find that Higgs dynamics induces a non-trivial 'structure' visible e.g. in the heat capacity, but that in general the largest radiative corrections originate from QCD effects, reducing the energy density by a couple of percent from the free value even at T > 160 GeV.

  17. Standard Model thermodynamics across the electroweak crossover

    SciTech Connect

    Laine, M.; Meyer, M.

    2015-07-22

    Even though the Standard Model with a Higgs mass m_H = 125 GeV possesses no bulk phase transition, its thermodynamics still experiences a “soft point” at temperatures around T = 160 GeV, with a deviation from ideal gas thermodynamics. Such a deviation may have an effect on precision computations of weakly interacting dark matter relic abundances if their mass is in the few TeV range, or on leptogenesis scenarios operating in this temperature range. By making use of results from lattice simulations based on a dimensionally reduced effective field theory, we estimate the relevant thermodynamic functions across the crossover. The results are tabulated in a numerical form permitting for their insertion as a background equation of state into cosmological particle production/decoupling codes. We find that Higgs dynamics induces a non-trivial “structure” visible e.g. in the heat capacity, but that in general the largest radiative corrections originate from QCD effects, reducing the energy density by a couple of percent from the free value even at T > 160 GeV.
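
    A minimal sketch of how such a tabulated equation of state could be consumed by a particle production/decoupling code: interpolate the tabulated energy density and pressure at whatever temperatures the solver requests. The table values below are placeholders, not the authors' lattice-based results.

```python
import numpy as np

# Hypothetical tabulated background equation of state near the crossover.
# Columns: temperature T [GeV], energy density e/T^4, pressure p/T^4.
table = np.array([
    [140.0, 106.2, 35.0],
    [160.0, 106.8, 35.2],
    [180.0, 107.1, 35.4],
])
T_tab, e_tab, p_tab = table[:, 0], table[:, 1], table[:, 2]

def eos(T):
    """Linearly interpolate e/T^4 and p/T^4 at temperature T (GeV)."""
    return np.interp(T, T_tab, e_tab), np.interp(T, T_tab, p_tab)

print(eos(162.5))
```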

  18. Cosmological perturbations from the Standard Model Higgs

    SciTech Connect

    Simone, Andrea De; Riotto, Antonio E-mail: antonio.riotto@unige.ch

    2013-02-01

    We propose that the Standard Model (SM) Higgs is responsible for generating the cosmological perturbations of the universe by acting as an isocurvature mode during a de Sitter inflationary stage. In view of the recent ATLAS and CMS results for the Higgs mass, this can happen if the Hubble rate during inflation is in the range 10^10-10^14 GeV (depending on the SM parameters). Implications for the detection of primordial tensor perturbations through the B-mode of CMB polarization via the PLANCK satellite are discussed. For example, if the Higgs mass value is confirmed to be m_h = 125.5 GeV and m_t, α_s are at their central values, our mechanism predicts tensor perturbations too small to be detected in the near future. On the other hand, if tensor perturbations are detected by PLANCK through the B-mode of CMB, then there is a definite relation between the Higgs and top masses, making the mechanism predictive and falsifiable.

  19. Standards for Distributed Modeling and Simulation

    DTIC Science & Technology

    2006-05-04

    Excerpt: Modula-2, Smalltalk, Delphi, and many others. Section 3.1, C language standard: the pre-standard language reference is "The C Programming Language" by Brian Kernighan and Dennis Ritchie, 1978 (termed K&R C); American National Standards Institute (ANSI) X3.159-1989 became "ANSI C"; C99 was adopted by ISO.

  20. Efficient modelling necessitates standards for model documentation and exchange.

    PubMed

    Gernaey, K V; Rosen, C; Batstone, D J; Alex, J

    2006-01-01

    In this paper, problems related to simulation model documentation and model exchange between users are discussed. Complex simulation models have gained popularity in the environmental field, but require extensive documentation to allow independent implementation. The existence of different simulation platforms puts high demands on the quality of the original documentation. Recent experiences from cross-platform implementations with the ASM2d and ADM1 models reveal that error-free model documentation is difficult to obtain, and as a consequence, considerable time is spent on searching for documentation and implementation errors of various sources. As such, the list of errors and coding pitfalls provided for ASM2d and ADM1 in this paper is vital information for any future implementation of both models. The time needed to obtain an error-free model implementation can be significantly reduced if a standard language for model documentation and exchange is adopted. The extensible markup language (XML) and languages based on this format may provide a remedy to the problem of platform independent model documentation and exchange. In this paper the possibility to apply this to environmental models is discussed, whereas the practical model implementation examples corroborate the necessity for a standardised approach.

  1. Wisconsin's Model Academic Standards for Dance.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    Wisconsin's Department of Public Instruction, in collaboration with Wisconsin citizens, developed academic standards in 12 curricular areas. The dance education standards go beyond emphasizing mastery of individual student areas--they weave five essential characteristics of literate individuals throughout: application of the basics, ability to…

  2. Colorado Model Content Standards for Geography.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    The geography standards for Colorado schools offer suggestions for geography education that prepare students to cope with the complexities of contemporary life. The standards give students a firm grasp of the place and terrain that surrounds them; the patterns of human development around the world; and the interactions of peoples, places, and…

  3. Le Indeterminacy in Spanish.

    ERIC Educational Resources Information Center

    Foster, David William

    The standard treatment of object pronouns in Latin American Spanish assigns a direct-object function to "lo" and "la" and an indirect-object function to "le." This study challenges this descriptive attribution in light of the contradictory and refractory evidence in Spanish morphosyntax. It is suggested that more…

  4. A Five Stage Conceptual Model for Information Technology Standards.

    ERIC Educational Resources Information Center

    Cargill, Carl F.

    The advent of anticipatory and boundary layer standards used in information technology standardization has created a need for a new base level theory that can be used to anticipate the problems that will be encountered in standards planning, creation, and implementation. To meet this need, a five-level model of standards has been developed. The…

  5. Models of Teaching: Connecting Student Learning with Standards

    ERIC Educational Resources Information Center

    Dell'Olio, Jeanine M.; Donk, Tony

    2007-01-01

    "Models of Teaching: Connecting Student Learning with Standards" features classic and contemporary models of teaching appropriate to elementary and secondary settings. Authors Jeanine M. Dell'Olio and Tony Donk use detailed case studies to discuss 10 models of teaching and demonstrate how the models can incorporate state content standards and…

  6. Models of Teaching: Connecting Student Learning with Standards

    ERIC Educational Resources Information Center

    Dell'Olio, Jeanine M.; Donk, Tony

    2007-01-01

    "Models of Teaching: Connecting Student Learning with Standards" features classic and contemporary models of teaching appropriate to elementary and secondary settings. Authors Jeanine M. Dell'Olio and Tony Donk use detailed case studies to discuss 10 models of teaching and demonstrate how the models can incorporate state content standards and…

  7. The General Linear Model and Direct Standardization: A Comparison.

    ERIC Educational Resources Information Center

    Little, Roderick J. A.; Pullum, Thomas W.

    1979-01-01

    Two methods of analyzing nonorthogonal (uneven cell sizes) cross-classified data sets are compared. The methods are direct standardization and the general linear model. The authors illustrate when direct standardization may be a desirable method of analysis. (JKS)

  8. Tool for physics beyond the standard model

    NASA Astrophysics Data System (ADS)

    Newby, Christopher A.

    The standard model (SM) of particle physics is a well-studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin, I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By means of increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where, instead of mixing between two Abelian groups, one of the groups is Nonabelian. Using this we then see that there is an inherent mass scale in the mixing strength, something that is absent in the Abelian-Abelian case. Furthermore, if the Nonabelian symmetry is the SU(2)_L of the SM, then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC. This dissertation

  9. The Standard Solar Model versus Experimental Observations

    NASA Astrophysics Data System (ADS)

    Manuel, O.

    2000-12-01

    The standard solar model (ssm) assumes that the Sun formed as a homogeneous body, its interior consists mostly of hydrogen, and its radiant energy comes from H-fusion in its core. Two sets of measurements indicate the ssm is wrong: 1. Analyses of material in the planetary system show that - (a) Fe, O, Ni, Si, Mg, S and Ca have high nuclear stability and comprise 98+% of ordinary meteorites that formed at the birth of the solar system; (b) the cores of inner planets formed in a central region consisting mostly of heavy elements like Fe, Ni and S; (c) the outer planets formed mostly from elements like H, He and C; and (d) isotopic heterogeneities accompanied these chemical gradients in debris of the supernova that exploded here 5 billion years ago to produce the solar system (See Origin of the Elements at http://www.umr.edu/~om/). 2. Analyses of material coming from the Sun show that - (a) there are not enough neutrinos for H-fusion to be its main source of energy; (b) light-weight isotopes (mass = L) of He, Ne, Ar, Kr and Xe in the solar wind are enriched relative to heavy isotopes (mass = H) by a factor, f, where log f = 4.56 log [H/L] --- Eq. (1); (c) solar flares by-pass 3.4 of these 9 stages of diffusion and deplete the light-weight isotopes of He, Ne, Mg and Ar by a factor, f*, where log f* = -1.7 log [H/L] --- Eq. (2); (d) proton-capture on N-14 increased N-15 in the solar wind over geologic time; and (e) solar flares dredge up nitrogen with less N-15 from this H-fusion reaction. Each observation above is unexplained by the ssm. After correcting photospheric abundances for diffusion [Observation 2(b)], the most abundant elements in the bulk Sun are Fe, Ni, O, Si, S, Mg and Ca, the same elements that comprise ordinary meteorites [Observation 1(a)]. The probability that Eq. (1) would randomly select these elements from the photosphere, i.e., the likelihood for a meaningless agreement between observations 2(b) and 1(a), is < 2.0E(-33). Thus, ssm does not describe the

  10. Multilevel Linkages between State Standards, Teacher Standards, and Student Achievement: Testing External versus Internal Standards-Based Education Models

    ERIC Educational Resources Information Center

    Lee, Jaekyung; Liu, Xiaoyan; Amo, Laura Casey; Wang, Weichun Leilani

    2014-01-01

    Drawing on national and state assessment datasets in reading and math, this study tested "external" versus "internal" standards-based education models. The goal was to understand whether and how student performance standards work in multilayered school systems under No Child Left Behind Act of 2001 (NCLB). Under the…

  11. Multilevel Linkages between State Standards, Teacher Standards, and Student Achievement: Testing External versus Internal Standards-Based Education Models

    ERIC Educational Resources Information Center

    Lee, Jaekyung; Liu, Xiaoyan; Amo, Laura Casey; Wang, Weichun Leilani

    2014-01-01

    Drawing on national and state assessment datasets in reading and math, this study tested "external" versus "internal" standards-based education models. The goal was to understand whether and how student performance standards work in multilayered school systems under No Child Left Behind Act of 2001 (NCLB). Under the…

  12. Neutrinos: in and out of the standard model

    SciTech Connect

    Parke, Stephen; /Fermilab

    2006-07-01

    The particle physics Standard Model has been tremendously successful in predicting the outcome of a large number of experiments. In this model Neutrinos are massless. Yet recent evidence points to the fact that neutrinos are massive particles with tiny masses compared to the other particles in the Standard Model. These tiny masses allow the neutrinos to change flavor and oscillate. In this series of Lectures, I will review the properties of Neutrinos In the Standard Model and then discuss the physics of Neutrinos Beyond the Standard Model. Topics to be covered include Neutrino Flavor Transformations and Oscillations, Majorana versus Dirac Neutrino Masses, the Seesaw Mechanism and Leptogenesis.
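
    As a concrete example of the flavor oscillations described above, the standard two-flavor vacuum oscillation probability (a textbook expression, not one specific to these lectures) is:

```latex
P(\nu_\alpha \to \nu_\beta) \;=\; \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^{2} L}{4E}\right)
```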

  13. Primordial lithium and the standard model(s)

    NASA Technical Reports Server (NTRS)

    Deliyannis, Constantine P.; Demarque, Pierre; Kawaler, Steven D.; Romanelli, Paul; Krauss, Lawrence M.

    1989-01-01

    The results of new theoretical work on surface Li-7 and Li-6 evolution in the oldest halo stars are presented, along with a new and refined analysis of the predicted primordial Li abundance resulting from big-bang nucleosynthesis. This makes it possible to determine the constraints which can be imposed on cosmology using primordial Li and both standard big-bang and stellar-evolution models. This leads to limits on the baryon density today of 0.0044-0.025 (where the Hubble constant is 100h km/sec Mpc) and imposes limitations on alternative nucleosynthesis scenarios.

  14. Modeling and Simulation Network Data Standards

    DTIC Science & Technology

    2011-09-30

    Excerpt: ...Century, Joint Network Analysis Tool, and OPNET. The Architecture Integration Management Division (AIMD), the Army Materiel Systems Analysis Activity... baseline to develop enhancements in data transfers in future projects. Subject terms: AWARS, COMBATXXI, JNAT, OPNET, network data standards, M&S.

  15. 42 CFR 403.210 - NAIC model standards.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    Title 42 (Public Health), Centers for Medicare & Medicaid Services, Department of Health and Human Services; General Provisions; Special Programs and Projects; Medicare Supplemental Policies; § 403.210 NAIC model standards. (a) NAIC model...

  16. Template and Model Driven Development of Standardized Electronic Health Records.

    PubMed

    Kropf, Stefan; Chalopin, Claire; Denecke, Kerstin

    2015-01-01

    Digital patient modeling targets the integration of distributed patient data into one overarching model. For this integration process, both a theoretical standard-based model and information structures combined with concrete instructions in the form of a lightweight development process of single standardized Electronic Health Records (EHRs) are needed. In this paper, we introduce such a process alongside a standard-based architecture. It allows the modeling and implementation of EHRs in a lightweight Electronic Health Record System (EHRS) core. The approach is demonstrated and tested by a prototype implementation. The results show that the suggested approach is useful and facilitates the development of standardized EHRSs.

  17. Standard solar model. II - g-modes

    NASA Technical Reports Server (NTRS)

    Guenther, D. B.; Demarque, P.; Pinsonneault, M. H.; Kim, Y.-C.

    1992-01-01

    The paper presents the g-mode oscillation for a set of modern solar models. Each solar model is based on a single modification or improvement to the physics of a reference solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The error in the predicted g-mode periods associated with the uncertainties in the model physics is predicted and the specific sensitivities of the g-mode periods and their period spacings to the different model structures are described. In addition, these models are compared to a sample of published observations. A remarkably good agreement is found between the 'best' solar model and the observations of Hill and Gu (1990).

  18. New results on standard solar models

    NASA Astrophysics Data System (ADS)

    Serenelli, Aldo M.

    2010-07-01

    We describe the current status of solar modeling and focus on the problems originating with the introduction of solar abundance determinations with low CNO abundance values. We use models computed with solar abundance compilations obtained during the last decade, including the newest published abundances by Asplund and collaborators. The results presented here focus on both helioseismic properties and models as well as neutrino flux predictions. We also discuss changes in radiative opacities to restore agreement between helioseismology, solar models, and solar abundances and show the effect of such modifications on solar neutrino fluxes.

  19. No-scale supersymmetric standard model

    NASA Astrophysics Data System (ADS)

    Ellis, John; Lahanas, A. B.; Nanopoulos, D. V.; Tamvakis, K.

    1984-01-01

    We propose a class of supergravity models coupled to matter in which the scales of supersymmetry breaking and of weak gauge symmetry breaking are both fixed by dimensional transmutation, not put in by hand. The models have a flat potential with zero cosmological constant before the evaluation of weak radiative corrections, which determine m_{3/2} and m_W = exp[-O(1)/α_t] m_P, with α_t = O(α). These models are consistent with all particle physics and cosmological constraints for top quark masses in the range 30 GeV < m_t < 100 GeV.

  20. Particle Physics Primer: Explaining the Standard Model of Matter.

    ERIC Educational Resources Information Center

    Vondracek, Mark

    2002-01-01

    Describes the Standard Model, a basic model of the universe that describes electromagnetic force, weak nuclear force radioactivity, and the strong nuclear force responsible for holding particles within the nucleus together. (YDS)

  1. Particle Physics Primer: Explaining the Standard Model of Matter.

    ERIC Educational Resources Information Center

    Vondracek, Mark

    2002-01-01

    Describes the Standard Model, a basic model of the universe that describes electromagnetic force, weak nuclear force radioactivity, and the strong nuclear force responsible for holding particles within the nucleus together. (YDS)

  2. NASREN: Standard reference model for telerobot control

    NASA Technical Reports Server (NTRS)

    Albus, J. S.; Lumia, R.; Mccain, H.

    1987-01-01

    A hierarchical architecture is described which supports space station telerobots in a variety of modes. The system is divided into three hierarchies: task decomposition, world model, and sensory processing. Goals at each level of the task decomposition hierarchy are divided both spatially and temporally into simpler commands for the next lower level. This decomposition is repeated until, at the lowest level, the drive signals to the robot actuators are generated. To accomplish its goals, task decomposition modules must often use information stored in the world model. The purpose of the sensory system is to update the world model as rapidly as possible to keep the model in registration with the physical world. The architecture of the entire control system hierarchy is described, along with how it can be applied to space telerobot applications.

  3. Creating Better School-Age Care Jobs: Model Work Standards.

    ERIC Educational Resources Information Center

    Haack, Peggy

    Built on the premise that good school-age care jobs are the cornerstone of high-quality services for school-age youth and their families, this guide presents model work standards for school-age care providers. The guide begins with a description of the strengths and challenges of the school-age care profession. The model work standards are…

  4. Creating Better School-Age Care Jobs: Model Work Standards.

    ERIC Educational Resources Information Center

    Haack, Peggy

    Built on the premise that good school-age care jobs are the cornerstone of high-quality services for school-age youth and their families, this guide presents model work standards for school-age care providers. The guide begins with a description of the strengths and challenges of the school-age care profession. The model work standards are…

  5. The Standard Model from LHC to future colliders.

    PubMed

    Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A

    This review summarizes the results of the activities which have taken place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issue for Standard Model physics in the LHC era and in view of possible future accelerators.

  6. Energy standards and model codes development, adoption, implementation, and enforcement

    SciTech Connect

    Conover, D.R.

    1994-08-01

    This report provides an overview of the energy standards and model codes process for the voluntary sector within the United States. The report was prepared by Pacific Northwest Laboratory (PNL) for the Building Energy Standards Program and is intended to be used as a primer or reference on this process. Building standards and model codes that address energy have been developed by organizations in the voluntary sector since the early 1970s. These standards and model codes provide minimum energy-efficient design and construction requirements for new buildings and, in some instances, existing buildings. The first step in the process is developing new or revising existing standards or codes. There are two overall differences between standards and codes. Energy standards are developed by a consensus process and are revised as needed. Model codes are revised on a regular annual cycle through a public hearing process. In addition to these overall differences, the specific steps in developing/revising energy standards differ from model codes. These energy standards or model codes are then available for adoption by states and local governments. Typically, energy standards are adopted by or adopted into model codes. Model codes are in turn adopted by states through either legislation or regulation. Enforcement is essential to the implementation of energy standards and model codes. Low-rise residential construction is generally evaluated for compliance at the local level, whereas state agencies tend to be more involved with other types of buildings. Low-rise residential buildings also may be more easily evaluated for compliance because the governing requirements tend to be less complex than for commercial buildings.

  7. Big bang nucleosynthesis - The standard model and alternatives

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1991-01-01

    The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the light element abundances for He-4, H-2, He-3, and Li-7 that are available. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos, and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.

  8. Big bang nucleosynthesis - The standard model and alternatives

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1991-01-01

    The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the light element abundances for He-4, H-2, He-3, and Li-7 that are available. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos, and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.

  9. Standardization of A Physiologic Hypoparathyroidism Animal Model

    PubMed Central

    Jung, Soo Yeon; Kim, Ha Yeong; Park, Hae Sang; Yin, Xiang Yun; Chung, Sung Min; Kim, Han Su

    2016-01-01

    Ideal hypoparathyroidism animal models are a prerequisite to developing new treatment modalities for this disorder. The purpose of this study was to evaluate the feasibility of a model whereby rats were parathyroidectomized (PTX) using a fluorescent-identification method and the ideal calcium content of the diet was determined. Thirty male rats were divided into surgical sham (SHAM, n = 5) and PTX plus 0, 0.5, and 2% calcium diet groups (PTX-FC (n = 5), PTX-NC (n = 10), and PTX-HC (n = 10), respectively). Serum parathyroid hormone levels decreased to non-detectable levels in all PTX groups. All animals in the PTX-FC group died within 4 days after the operation. All animals survived when supplied calcium in the diet. However, serum calcium levels were higher in the PTX-HC group than in the SHAM group. The PTX-NC group demonstrated the most representative modeling of primary hypoparathyroidism. Serum calcium levels decreased and phosphorus levels increased, and bone volume was increased. All animals survived without further treatment and did not show nephrotoxicity including calcium deposits. These findings demonstrate that PTX animal models produced by using the fluorescent-identification method, and fed a 0.5% calcium diet, are appropriate for hypoparathyroidism treatment studies. PMID:27695051

  10. A Standardized Generalized Dimensionality Discrepancy Measure and a Standardized Model-Based Covariance for Dimensionality Assessment for Multidimensional Models

    ERIC Educational Resources Information Center

    Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka

    2015-01-01

    The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…

  11. CKM mixings in an E_6-induced standard model extension and in the minimal supersymmetric standard model

    SciTech Connect

    Aydin, Z.Z.; Sultansoy, S.; Yilmazer, A.U. )

    1994-10-01

    The number of mixing angles and phases in the two popular extensions of the standard model (SM), the E_6-induced SM extension and the minimal supersymmetric standard model with soft symmetry-breaking terms, is discussed. It is found that two CP-violating phases appear in the minimal supersymmetric SM even for the simplest case of one family.

  12. Increasing Model Efficiency Using Standard Commercial Software

    DTIC Science & Technology

    1993-09-01

    ...used by the U.S. Army to produce Baseline Cost Estimates (BCEs) include ACEIT, OBCE, PICES, and FLEX. Spreadsheets are also used to create a large... Like other models, PICES has its strong and weak points. PICES is written in FORTRAN and thus can be compiled on many platforms. The primary computer... edit the monthly or yearly values. In EXCEL, you simply edit the DP, which consists of 1-5 rows, depending on how many years have data. The same...

  13. Physics Beyond the Standard Model: Supersymmetry

    SciTech Connect

    Nojiri, M.M.; Plehn, T.; Polesello, G.; Alexander, John M.; Allanach, B.C.; Barr, Alan J.; Benakli, K.; Boudjema, F.; Freitas, A.; Gwenlan, C.; Jager, S.; /CERN /LPSC, Grenoble

    2008-02-01

    This collection of studies on new physics at the LHC constitutes the report of the supersymmetry working group at the Workshop 'Physics at TeV Colliders', Les Houches, France, 2007. They cover the wide spectrum of phenomenology in the LHC era, from alternative models and signatures to the extraction of relevant observables, the study of the MSSM parameter space and finally to the interplay of LHC observations with additional data expected on a similar time scale. The special feature of this collection is that while not each of the studies is explicitly performed together by theoretical and experimental LHC physicists, all of them were inspired by and discussed in this particular environment.

  14. Sporulation in Bacteria: Beyond the Standard Model.

    PubMed

    Hutchison, Elizabeth A; Miller, David A; Angert, Esther R

    2014-10-01

    Endospore formation follows a complex, highly regulated developmental pathway that occurs in a broad range of Firmicutes. Although Bacillus subtilis has served as a powerful model system to study the morphological, biochemical, and genetic determinants of sporulation, fundamental aspects of the program remain mysterious for other genera. For example, it is entirely unknown how most lineages within the Firmicutes regulate entry into sporulation. Additionally, little is known about how the sporulation pathway has evolved novel spore forms and reproductive schemes. Here, we describe endospore and internal offspring development in diverse Firmicutes and outline progress in characterizing these programs. Moreover, comparative genomics studies are identifying highly conserved sporulation genes, and predictions of sporulation potential in new isolates and uncultured bacteria can be made from these data. One surprising outcome of these comparative studies is that core regulatory and some structural aspects of the program appear to be universally conserved. This suggests that a robust and sophisticated developmental framework was already in place in the last common ancestor of all extant Firmicutes that produce internal offspring or endospores. The study of sporulation in model systems beyond B. subtilis will continue to provide key information on the flexibility of the program and provide insights into how changes in this developmental course may confer advantages to cells in diverse environments.

  15. Standardized verification of fuel cycle modeling

    DOE PAGES

    Feng, B.; Dixon, B.; Sunny, E.; ...

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.

  16. Standardized verification of fuel cycle modeling

    SciTech Connect

    Feng, B.; Dixon, B.; Sunny, E.; Cuadra, A.; Jacobson, J.; Brown, N. R.; Powers, J.; Worrall, A.; Passerini, S.; Gregg, R.

    2016-04-05

    A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
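
    A minimal sketch of the kind of year-by-year check described above: compare one mass-flow time series produced by a systems code against the spreadsheet solution within a relative tolerance. The quantity, years, and numbers are illustrative, not taken from the study.

```python
# Illustrative year-by-year comparison of a single mass flow between a
# systems code and the spreadsheet solution; all values are hypothetical.
code_result = [1000.0, 1005.0, 1010.0, 1016.0]
spreadsheet = [1000.0, 1005.0, 1011.0, 1016.0]

def mismatches(a, b, rel_tol=1e-3, start_year=2020):
    """Return (year, code, spreadsheet) tuples where the two disagree."""
    bad = []
    for i, (x, y) in enumerate(zip(a, b)):
        if abs(x - y) > rel_tol * max(abs(x), abs(y), 1.0):
            bad.append((start_year + i, x, y))
    return bad

for year, x, y in mismatches(code_result, spreadsheet):
    print(f"{year}: code={x}, spreadsheet={y}")
```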

  17. A model to determine the lake nutrient standards for drinking water sources in Yunnan-Guizhou Plateau Ecoregion, China.

    PubMed

    Ji, Danfeng; Xi, Beidou; Su, Jing; Huo, Shouliang; He, Li; Liu, Hongliang; Yang, Queping

    2013-09-01

    Lake eutrophication (LE) has become an increasingly severe environmental problem recently. However, there has been no nutrient standard established for LE control in many developing countries such as China. This study proposes a structural equation model to assist in the establishment of a lake nutrient standard for drinking water sources in Yunnan-Guizhou Plateau Ecoregion (Yungui Ecoregion), China. The modeling results indicate that the most predictive indicator for designated use-attainment is total phosphorus (TP) (total effect = -0.43), and chlorophyll a (Chl-a) is recommended as the second important indicator (total effect = -0.41). The model is further used for estimating the probability of use-attainment associated with lake water as a drinking water source and various levels of candidate criteria (based on the reference conditions and the current environmental quality standards for surface water). It is found that these candidate criteria cannot satisfy the designated 100% use-attainment. To achieve the short-term target (85% attainment of the designated use), TP and Chl-a values ought to be less than 0.02 mg/L and 1.4 µg/L, respectively. When used as a long-term target (90% or greater attainment of the designated use), the TP and Chl-a values are suggested to be less than 0.018 mg/L and 1 µg/L, respectively.
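
    A minimal screening sketch that applies the recommended criteria to a measured sample; the thresholds are the values quoted in the abstract, while the function and sample record are hypothetical.

```python
# Thresholds quoted in the abstract (TP in mg/L, Chl-a in µg/L).
SHORT_TERM = {"TP": 0.02, "Chl_a": 1.4}   # ~85% use-attainment target
LONG_TERM = {"TP": 0.018, "Chl_a": 1.0}   # >=90% use-attainment target

def meets(sample, criteria):
    """True if every measured indicator is below its criterion."""
    return all(sample[key] < limit for key, limit in criteria.items())

sample = {"TP": 0.019, "Chl_a": 1.2}      # hypothetical lake sample
print("short-term target met:", meets(sample, SHORT_TERM))
print("long-term target met:", meets(sample, LONG_TERM))
```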

  18. Comparison of cosmological models using standard rulers and candles

    NASA Astrophysics Data System (ADS)

    Li, Xiao-Lei; Cao, Shuo; Zheng, Xiao-Gang; Li, Song; Biesiada, Marek

    2016-05-01

    In this paper, we used standard rulers and standard candles (separately and jointly) to explore five popular dark energy models under the assumption of the spatial flatness of the Universe. As standard rulers, we used a data set comprised of 118 galactic scale strong lensing systems (individual standard rulers if properly calibrated for the mass density profile) combined with BAO diagnostics (statistical standard ruler). Type Ia supernovae served as standard candles. Unlike most previous statistical studies involving strong lensing systems, we relaxed the assumption of a singular isothermal sphere (SIS) in favor of its generalization: the power-law mass density profile. Therefore, along with cosmological model parameters, we fitted the power law index and its first derivative with respect to the redshift (thus allowing for mass density profile evolution). It turned out that the best fitted γ parameters are in agreement with each other, irrespective of the cosmological model considered. This demonstrates that galactic strong lensing systems may provide a complementary probe to test the properties of dark energy. The fits for cosmological model parameters which we obtained are in agreement with alternative studies performed by other researchers. Because standard rulers and standard candles have different parameter degeneracies, a combination of standard rulers and standard candles gives much more restrictive results for cosmological parameters. Finally, we attempted an analysis based on model selection using information theoretic criteria (AIC and BIC). Our results support the claim that the cosmological constant model is still best and there is no (at least statistical) reason to prefer any other more complex model.
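
    The information criteria used in the model-selection step have simple closed forms; the sketch below shows the generic definitions applied to hypothetical best-fit chi-square values, not the paper's actual numbers.

```python
import numpy as np

def aic(chi2_min, k):
    """Akaike information criterion for a best-fit chi^2 with k free parameters."""
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    """Bayesian information criterion; n is the number of data points."""
    return chi2_min + k * np.log(n)

# Hypothetical comparison: LambdaCDM (k = 1 free parameter) versus a
# constant-w dark energy model (k = 2) fitted to the same n = 700 points.
print("AIC:", aic(690.0, 1), aic(689.5, 2))
print("BIC:", bic(690.0, 1, 700), bic(689.5, 2, 700))
```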

  19. Prospects and problems for standardizing model validation in systems biology.

    PubMed

    Gross, Fridolin; MacLeod, Miles

    2017-10-01

    There are currently no widely shared criteria by which to assess the validity of computational models in systems biology. Here we discuss the feasibility and desirability of implementing validation standards for modeling. Having such a standard would facilitate journal review, interdisciplinary collaboration, model exchange, and be especially relevant for applications close to medical practice. However, even though the production of predictively valid models is considered a central goal, in practice modeling in systems biology employs a variety of model structures and model-building practices. These serve a variety of purposes, many of which are heuristic and do not seem to require strict validation criteria and may even be restricted by them. Moreover, given the current situation in systems biology, implementing a validation standard would face serious technical obstacles mostly due to the quality of available empirical data. We advocate a cautious approach to standardization. However even though rigorous standardization seems premature at this point, raising the issue helps us develop better insights into the practices of systems biology and the technical problems modelers face validating models. Further it allows us to identify certain technical validation issues which hold regardless of modeling context and purpose. Informal guidelines could in fact play a role in the field by helping modelers handle these. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Colorado Model Content Standards for Theatre: Suggested Grade Level Expectations.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    This booklet lists six model content standards in theater arts for elementary and secondary school students in the state of Colorado. The six standards cited in the booklet are: (1) Students develop interpersonal skills and problem-solving capabilities through group interaction and artistic collaboration; (2) Students understand and apply the…

  1. Enhancements to ASHRAE Standard 90.1 Prototype Building Models

    SciTech Connect

    Goel, Supriya; Athalye, Rahul A.; Wang, Weimin; Zhang, Jian; Rosenberg, Michael I.; Xie, YuLong; Hart, Philip R.; Mendon, Vrushali V.

    2014-04-16

    This report focuses on enhancements to prototype building models used to determine the energy impact of various versions of ANSI/ASHRAE/IES Standard 90.1. Since the last publication of the prototype building models, PNNL has made numerous enhancements to the original prototype models compliant with the 2004, 2007, and 2010 editions of Standard 90.1. Those enhancements are described here and were made for several reasons: (1) to change or improve prototype design assumptions; (2) to improve the simulation accuracy; (3) to improve the simulation infrastructure; and (4) to add additional detail to the models needed to capture certain energy impacts from Standard 90.1 improvements. These enhancements impact simulated prototype energy use, and consequently impact the savings estimated from edition to edition of Standard 90.1.

  2. The Beyond the Standard Model Working Group: Summary Report

    SciTech Connect

    Rizzo, Thomas G.

    2002-08-08

    Various theoretical aspects of physics beyond the Standard Model at hadron colliders are discussed. Our focus will be on those issues that most immediately impact the projects pursued as part of the BSM group at this meeting.

  3. The standard data model approach to patient record transfer.

    PubMed

    Canfield, K; Silva, M; Petrucci, K

    1994-01-01

    This paper develops an approach to electronic data exchange of patient records from Ambulatory Encounter Systems (AESs). This approach assumes that the AES is based upon a standard data model. The data modeling standard used here is IDEF1X for Entity/Relationship (E/R) modeling. Each site that uses a relational database implementation of this standard data model (or a subset of it) can exchange very detailed patient data with other such sites using industry standard tools and without excessive programming efforts. This design is detailed below for a demonstration project between the research-oriented geriatric clinic at the Baltimore Veterans Affairs Medical Center (BVAMC) and the Laboratory for Healthcare Informatics (LHI) at the University of Maryland.
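
    A minimal sketch of the underlying idea: two sites that implement the same relational schema can exchange detailed records with ordinary database tooling. The two-table schema below is purely illustrative and is not the ambulatory-encounter model used in the demonstration project.

```python
import sqlite3

# Purely illustrative schema in the spirit of a shared standard data model.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE patient (
    patient_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL
);
CREATE TABLE encounter (
    encounter_id INTEGER PRIMARY KEY,
    patient_id   INTEGER NOT NULL REFERENCES patient(patient_id),
    visit_date   TEXT NOT NULL,
    diagnosis    TEXT
);
""")
conn.execute("INSERT INTO patient VALUES (1, 'Test Patient')")
conn.execute("INSERT INTO encounter VALUES (10, 1, '1994-01-15', 'hypertension')")

# Any site using the same schema can load this export without custom code.
for row in conn.execute(
        "SELECT p.name, e.visit_date, e.diagnosis "
        "FROM encounter e JOIN patient p USING (patient_id)"):
    print(row)
```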

  4. A Repository for Beyond-the-Standard-Model Tools

    SciTech Connect

    Skands, P.; Richardson, P.; Allanach, B.C.; Baer, H.; Belanger, G.; El Kacimi, M.; Ellwanger, U.; Freitas, A.; Ghodbane, N.; Goujdami, D.; Hahn, T.; Heinemeyer, S.; Kneur, J.-L.; Landsberg, G.; Lee, J.S.; Muhlleitner, M.; Ohl, T.; Perez, E.; Peskin, M.; Pilaftsis, A.; Plehn, T.

    2005-05-01

    To aid phenomenological studies of Beyond-the-Standard-Model (BSM) physics scenarios, a web repository for BSM calculational tools has been created. We here present brief overviews of the relevant codes, ordered by topic as well as by alphabet.

  5. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2009-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  6. NASA Standard for Models and Simulations: Philosophy and Requirements Overview

    NASA Technical Reports Server (NTRS)

    Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.

    2013-01-01

    Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.

  7. Standard Model of Particle Physics--a health physics perspective.

    PubMed

    Bevelacqua, J J

    2010-11-01

    The Standard Model of Particle Physics is reviewed with an emphasis on its relationship to the physics supporting the health physics profession. Concepts important to health physics are emphasized and specific applications are presented. The capability of the Standard Model to provide health physics relevant information is illustrated with application of conservation laws to neutron and muon decay and in the calculation of the neutron mean lifetime.
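
    For concreteness, the two decays mentioned above, each constrained by conservation of electric charge, baryon number, and lepton number, are the standard textbook processes:

```latex
n \;\to\; p + e^{-} + \bar{\nu}_{e},
\qquad
\mu^{-} \;\to\; e^{-} + \bar{\nu}_{e} + \nu_{\mu}
```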

  8. Sustainable model building: the role of standards and biological semantics.

    PubMed

    Krause, Falko; Schulz, Marvin; Swainston, Neil; Liebermeister, Wolfram

    2011-01-01

    Systems biology models can be reused within new simulation scenarios, as parts of more complex models or as sources of biochemical knowledge. Reusability does not come by itself but has to be ensured while creating a model. Most important, models should be designed to remain valid in different contexts (for example, for different experimental conditions) and be published in a standardized and well-documented form. Creating reusable models is worthwhile, but it requires some effort when a model is developed, implemented, documented, and published. Minimum requirements for published systems biology models have been formulated by the MIRIAM initiative. Main criteria are completeness of information and documentation, availability of machine-readable models in standard formats, and semantic annotations connecting the model elements with entries in biological Web resources. In this chapter, we discuss the assumptions behind bottom-up modeling; present important standards like MIRIAM, the Systems Biology Markup Language (SBML), and the Systems Biology Graphical Notation (SBGN); and describe software tools and services for handling semantic annotations. Finally, we show how standards can facilitate the construction of large metabolic network models.
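
    A minimal sketch of reading the machine-readable form such standards require: list the species declared in an SBML Level 3 file using only the Python standard library. The file name is hypothetical, and a real workflow would more likely use libSBML and also inspect the MIRIAM-style annotations.

```python
import xml.etree.ElementTree as ET

# SBML Level 3 Version 1 core namespace.
SBML = "{http://www.sbml.org/sbml/level3/version1/core}"

tree = ET.parse("model.xml")              # hypothetical SBML file
model = tree.getroot().find(SBML + "model")

# Print the identifier and name of every declared species.
for species in model.find(SBML + "listOfSpecies"):
    print(species.get("id"), species.get("name"))
```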

  9. Bounce inflation cosmology with Standard Model Higgs boson

    SciTech Connect

    Wan, Youping; Huang, Fa Peng; Zhang, Xinmin; Qiu, Taotao; Cai, Yi-Fu; Li, Hong E-mail: qiutt@mail.ccnu.edu.cn E-mail: yifucai@ustc.edu.cn E-mail: xmzhang@ihep.ac.cn

    2015-12-01

    It is of great interest to connect cosmology in the early universe to the Standard Model of particle physics. In this paper, we construct a bounce inflation model with the Standard Model Higgs boson, where the one-loop correction is taken into account in the effective potential of the Higgs field. In this model, a Galileon term is introduced to eliminate the ghost mode when the bounce happens. Moreover, because the fermion loop correction can make part of the Higgs potential negative, one naturally obtains a large equation-of-state (EoS) parameter in the contracting phase, which can eliminate the anisotropy problem. After the bounce, the model drives the universe into the standard Higgs inflation phase, which generates a nearly scale-invariant power spectrum.

  10. Intraday LeBaron effects

    PubMed Central

    Bianco, Simone; Corsi, Fulvio; Renò, Roberto

    2009-01-01

    We study the relation at the intraday level between serial correlation and volatility of the Standard & Poor's (S&P) 500 stock index futures returns. At daily and weekly levels, serial correlation and volatility forecasts have been found to be negatively correlated (the LeBaron effect). After finding a significant attenuation of the original effect over time, we show that a similar but more pronounced effect holds when using intraday measures such as realized volatility and the variance ratio. We also test the impact of unexpected volatility, defined as the part of volatility that cannot be forecasted, on the presence of intraday serial correlation in the time series by employing a model for realized volatility based on the heterogeneous market hypothesis. We find that intraday serial correlation is negatively correlated with volatility forecasts, whereas it is positively correlated with unexpected volatility.

  11. Animal Models of Tourette Syndrome-From Proliferation to Standardization.

    PubMed

    Yael, Dorin; Israelashvili, Michal; Bar-Gad, Izhar

    2016-01-01

    Tourette syndrome (TS) is a childhood onset disorder characterized by motor and vocal tics and associated with multiple comorbid symptoms. Over the last decade, the accumulation of findings from TS patients and the emergence of new technologies have led to the development of novel animal models with high construct validity. In addition, animal models which were previously associated with other disorders were recently attributed to TS. The proliferation of TS animal models has accelerated TS research and provided a better understanding of the mechanism underlying the disorder. This newfound success generates novel challenges, since the conclusions that can be drawn from TS animal model studies are constrained by the considerable variation across models. Typically, each animal model examines a specific subset of deficits and centers on one field of research (physiology/genetics/pharmacology/etc.). Moreover, different studies do not use a standard lexicon to characterize different properties of the model. These factors hinder the evaluation of individual model validity as well as the comparison across models, leading to a formation of a fuzzy, segregated landscape of TS pathophysiology. Here, we call for a standardization process in the study of TS animal models as the next logical step. We believe that a generation of standard examination criteria will improve the utility of these models and enable their consolidation into a general framework. This should lead to a better understanding of these models and their relationship to TS, thereby improving the research of the mechanism underlying this disorder and aiding the development of new treatments.

  12. Animal Models of Tourette Syndrome—From Proliferation to Standardization

    PubMed Central

    Yael, Dorin; Israelashvili, Michal; Bar-Gad, Izhar

    2016-01-01

    Tourette syndrome (TS) is a childhood onset disorder characterized by motor and vocal tics and associated with multiple comorbid symptoms. Over the last decade, the accumulation of findings from TS patients and the emergence of new technologies have led to the development of novel animal models with high construct validity. In addition, animal models which were previously associated with other disorders were recently attributed to TS. The proliferation of TS animal models has accelerated TS research and provided a better understanding of the mechanism underlying the disorder. This newfound success generates novel challenges, since the conclusions that can be drawn from TS animal model studies are constrained by the considerable variation across models. Typically, each animal model examines a specific subset of deficits and centers on one field of research (physiology/genetics/pharmacology/etc.). Moreover, different studies do not use a standard lexicon to characterize different properties of the model. These factors hinder the evaluation of individual model validity as well as the comparison across models, leading to a formation of a fuzzy, segregated landscape of TS pathophysiology. Here, we call for a standardization process in the study of TS animal models as the next logical step. We believe that a generation of standard examination criteria will improve the utility of these models and enable their consolidation into a general framework. This should lead to a better understanding of these models and their relationship to TS, thereby improving the research of the mechanism underlying this disorder and aiding the development of new treatments. PMID:27065791

  13. GIS-based RUSLE modelling of Leça River Basin, Northern Portugal, in two different grid scales

    NASA Astrophysics Data System (ADS)

    Petan, S.; Barbosa, J. L. P.; Mikoš, M.; Pinto, F. T.

    2009-04-01

    Soil erosion is the mechanical degradation of soil caused by natural forces and is also influenced by human activities. The biggest threats are the associated loss of fertile soil for food production and the disturbance of aquatic ecosystems, which could unbalance the environment on a wider scale. Precise predictions of soil erosion processes are therefore of major importance for preventing environmental degradation. Spatial GIS modelling and erosion maps greatly support policymaking for land planning and environmental management. The Leça River Basin, with a surface of 187 km2, is located in the northern part of Portugal and was chosen for testing the RUSLE methodology for soil loss prediction and for identifying areas with high erosion potential. The model involves daily rainfall data for rainfall erosivity estimation, topographic data for calculating the slope length and steepness factor, soil type data, and CORINE land cover and land use data. The raster model was structured at two different scales, with grid cell sizes of 10 and 30 meters, and the similarities and differences between the model results at both scales were evaluated.
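
    The RUSLE estimate itself is a cell-wise product of factor grids, so a raster implementation reduces to multiplying co-registered arrays. Below is a minimal sketch, assuming the factor grids (rainfall erosivity R, soil erodibility K, slope length-steepness LS, cover management C, support practice P) have already been derived as NumPy arrays on the same 10 m or 30 m grid; the array values and names are placeholders, not data from the study.

      import numpy as np

      def rusle_soil_loss(R, K, LS, C, P):
          """Cell-wise RUSLE estimate A = R * K * LS * C * P.

          All inputs are 2-D NumPy arrays on the same grid; the result has
          the units implied by the factors (commonly t ha^-1 yr^-1)."""
          return R * K * LS * C * P

      # Illustrative 2x2 grids (placeholder values, not basin data).
      R  = np.array([[900.0, 950.0], [880.0, 920.0]])   # rainfall erosivity
      K  = np.array([[0.030, 0.028], [0.032, 0.029]])   # soil erodibility
      LS = np.array([[1.2, 2.5], [0.8, 3.1]])           # slope length-steepness
      C  = np.array([[0.05, 0.10], [0.03, 0.20]])       # cover management
      P  = np.ones_like(R)                              # support practice

      A = rusle_soil_loss(R, K, LS, C, P)
      print(A)   # potential soil loss per cell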

  14. Improving automation standards via semantic modelling: Application to ISA88.

    PubMed

    Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès

    2017-03-01

    Standardization is essential for automation. Extensibility, scalability, and reusability are important features of automation software and rely on efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim is to systematically check the consistency of technical documents and to support improving that consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the paradigm suggested for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects worth sharing with the automation community are addressed. The study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with a systematic consistency-checking method.

  15. A Standard Kinematic Model for Flight Simulation at NASA Ames

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. E.

    1975-01-01

    A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the flight simulator for advanced aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.

  16. Tevatron searches for Higgs bosons beyond the standard model

    SciTech Connect

    Nielsen, Jason; /UC, Santa Cruz

    2007-06-01

    Theoretical frameworks beyond the standard model predict a rich Higgs sector with multiple charged and neutral Higgs bosons. Both the CDF II and D0 experiments at the Tevatron have analyzed 1 fb⁻¹ of p$\bar{p}$ collisions at $\sqrt{s}$ = 1.96 TeV in search of Higgs boson production. A complete suite of results on searches for neutral, charged, and fermiophobic Higgs bosons limits the allowed production rates and constrains extended models, including the minimal supersymmetric standard model.

  17. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    ERIC Educational Resources Information Center

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  18. Kaon physics: Probing the standard model and beyond

    SciTech Connect

    Tschirhart, R.; /Fermilab

    2009-01-01

    The status and prospects of current and future kaon physics experiments is discussed. Both precision measurements and the search for and measurement of ultra-rare decays are powerful probes of many models of new physics beyond the Standard Model. The physics reach of these experiments is briefly discussed.

  19. Informatics in radiology: an information model of the DICOM standard.

    PubMed

    Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L

    2011-01-01

    The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care.

  20. Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard

    ERIC Educational Resources Information Center

    Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.

    2017-01-01

    This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…

  1. Non-standard models and the sociology of cosmology

    NASA Astrophysics Data System (ADS)

    López-Corredoira, Martín

    2014-05-01

    I review some theoretical ideas in cosmology different from the standard "Big Bang": the quasi-steady state model, the plasma cosmology model, non-cosmological redshifts, alternatives to non-baryonic dark matter and/or dark energy, and others. Cosmologists do not usually work within the framework of alternative cosmologies because they feel that these are not at present as competitive as the standard model. Certainly, they are not so developed, and they are not so developed because cosmologists do not work on them. It is a vicious circle. The fact that most cosmologists do not pay them any attention and only dedicate their research time to the standard model is to a great extent due to a sociological phenomenon (the "snowball effect" or "groupthink"). We might well wonder whether cosmology, our knowledge of the Universe as a whole, is a science like other fields of physics or a predominant ideology.

  2. Conformal Loop quantization of gravity coupled to the standard model

    NASA Astrophysics Data System (ADS)

    Pullin, Jorge; Gambini, Rodolfo

    2016-03-01

    We consider a locally conformal-invariant coupling of the standard model to gravity that is free of any dimensional parameter. The theory is formulated so as to have a quantized version that admits a spin network description at the kinematical level, like that of loop quantum gravity. The Gauss constraint, the diffeomorphism constraint and the conformal constraint are automatically satisfied, and the standard inner product of the spin-network basis still holds. The resulting theory resembles the Bars-Steinhardt-Turok local conformal theory, except that it admits a canonical quantization in terms of loops. By considering a gauge-fixed version of the theory we show that the Standard Model coupled to gravity is recovered and the Higgs boson acquires mass. This in turn induces, via the standard mechanism, masses for the massive bosons, baryons and leptons.

  3. Explore Physics Beyond the Standard Model with GLAST

    SciTech Connect

    Lionetto, A. M.

    2007-07-12

    We give an overview of the potential of GLAST to explore theories beyond the Standard Model of particle physics. Among the wide taxonomy of such theories, we focus in particular on low-scale supersymmetry and theories with extra space-time dimensions. These theories provide a suitable dark matter candidate whose interactions and composition can be studied using a gamma-ray probe. We show the potential of GLAST to disentangle such exotic signals from a standard production background.

  4. NASA Standard for Models and Simulations: Credibility Assessment Scale

    NASA Technical Reports Server (NTRS)

    Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody

    2009-01-01

    As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005, and an interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions resulting from the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang, and Steele (Ref. iv) provided a summary review of the development process of this standard up through the start of the review by the EMB; a thorough recount of the entire development process, major issues, key decisions, and all review processes is available in Ref. v. This is the second of a pair of papers summarizing the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and gives an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper and are indicated by quotation marks.

  5. Prediction of Standard Enthalpy of Formation by a QSPR Model

    PubMed Central

    Vatani, Ali; Mehrpooya, Mehdi; Gharagheizi, Farhad

    2007-01-01

    The standard enthalpies of formation of 1115 compounds from all chemical groups were predicted using genetic-algorithm-based multivariate linear regression (GA-MLR). The five-descriptor multivariate linear model obtained by GA-MLR has a correlation coefficient of R² = 0.9830. All molecular descriptors entering this model are calculated from the chemical structure of the molecule, so applying the model to any compound is easy and accurate.
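
    As a rough illustration of the GA-MLR idea (select a small descriptor subset with a genetic algorithm, then fit a multivariate linear model on it), the sketch below runs a toy selection loop around a plain NumPy least-squares fit. It is a schematic of the approach only, with random placeholder data and a fixed subset size of five; it is not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy data: 200 "compounds" x 20 candidate descriptors (placeholders).
      X = rng.normal(size=(200, 20))
      y = X[:, [1, 4, 7, 9, 15]] @ np.array([2.0, -1.5, 0.7, 3.0, -0.4]) \
          + rng.normal(scale=0.1, size=200)

      def r2_of_subset(mask):
          """Fit ordinary least squares on the selected descriptors, return R^2."""
          if mask.sum() == 0:
              return -np.inf
          Xs = np.column_stack([np.ones(len(y)), X[:, mask]])
          beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
          resid = y - Xs @ beta
          return 1.0 - resid.var() / y.var()

      def random_mask(k=5):
          """Random selection of exactly k descriptors (five, as in the abstract)."""
          mask = np.zeros(X.shape[1], dtype=bool)
          mask[rng.choice(X.shape[1], size=k, replace=False)] = True
          return mask

      # Very small genetic algorithm: keep the best masks and mutate them.
      population = [random_mask() for _ in range(30)]
      for _ in range(50):
          population.sort(key=r2_of_subset, reverse=True)
          parents = population[:10]
          children = []
          for p in parents:
              child = p.copy()
              i, j = rng.integers(X.shape[1], size=2)
              child[i], child[j] = child[j], child[i]   # mutation: swap two slots
              children.append(child)
          population = parents + children + [random_mask() for _ in range(10)]

      best = max(population, key=r2_of_subset)
      print("selected descriptors:", np.flatnonzero(best),
            "R^2 =", round(r2_of_subset(best), 4))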

  6. Achieving Peak Flow and Sediment Loading Reductions through Increased Water Storage in the Le Sueur Watershed, Minnesota: A Modeling Approach

    NASA Astrophysics Data System (ADS)

    Mitchell, N. A.; Gran, K. B.; Cho, S. J.; Dalzell, B. J.; Kumarasamy, K.

    2015-12-01

    A combination of factors including climate change, land clearing, and artificial drainage has increased stream flows in many agricultural regions, along with the rates at which channel banks and bluffs are eroded. Increasing erosion rates within the Minnesota River Basin have contributed to higher sediment-loading rates, excess turbidity levels, and increased sedimentation rates in Lake Pepin further downstream. Water storage sites (e.g., wetlands) have been discussed as a means to address these issues. This study uses the Soil and Water Assessment Tool (SWAT) to assess a range of water retention site (WRS) implementation scenarios in the Le Sueur watershed in south-central Minnesota, a subwatershed of the Minnesota River Basin. Sediment loading from bluffs was assessed through an empirical relationship developed from gauging data. Sites were delineated as topographic depressions with specific land uses, minimum areas (3000 m2), and high compound topographic index values. Contributing areas for the WRS were manually measured and used with different site characteristics to create 210 initial WRS scenarios. A generalized relationship between WRS area and contributing area was identified from these measurements and used with different site characteristics (e.g., depth, hydraulic conductivity (K), and placement) to create 225 generalized WRS scenarios. Reductions in peak flow volumes and sediment-loading rates are generally maximized by placing sites with high K values in the upper half of the watershed; high K values allow sites to lose more water through seepage, emptying their storage between precipitation events and preventing frequent overflowing. Reductions in peak flow volumes and sediment-loading rates also level off at high WRS extents due to the decreasing frequency of high-magnitude events. The generalized WRS scenarios were also used to create a simplified empirical model capable of generating peak flows and sediment-loading rates from near
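
    The role of hydraulic conductivity described above can be illustrated with a single-bucket water balance for one retention site: water enters from the contributing area, leaves by seepage proportional to K, and spills once storage is full. The sketch below is a conceptual toy under those assumptions, not the SWAT configuration used in the study; all parameter values and names are placeholders.

      def simulate_wrs(inflow_series_m3, area_m2=3000.0, depth_m=1.0, k_m_per_day=0.1):
          """Daily water balance for one water retention site (toy model).

          Storage gains the daily inflow, loses seepage (K * wetted area),
          and spills whatever exceeds capacity. Returns total overflow."""
          capacity = area_m2 * depth_m
          storage = 0.0
          overflow_total = 0.0
          for inflow in inflow_series_m3:
              storage += inflow
              storage -= min(storage, k_m_per_day * area_m2)   # seepage loss
              if storage > capacity:                           # spill when full
                  overflow_total += storage - capacity
                  storage = capacity
          return overflow_total

      # Higher K empties the site between events, so less water spills downstream.
      events = [500.0, 0.0, 0.0, 2000.0, 0.0, 0.0, 0.0, 1500.0]
      for k in (0.01, 0.1, 0.5):
          print(f"K = {k:4.2f} m/day -> overflow = {simulate_wrs(events, k_m_per_day=k):8.1f} m3")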

  7. Peer Review of NRC Standardized Plant Analysis Risk Models

    SciTech Connect

    Anthony Koonce; James Knudsen; Robert Buell

    2011-03-01

    The Nuclear Regulatory Commission (NRC) Standardized Plant Analysis Risk (SPAR) models underwent a peer review using the ASME PRA standard (Addendum C), as endorsed by the NRC in Regulatory Guide (RG) 1.200. The review was performed by a mix of industry probabilistic risk analysis (PRA) experts and NRC PRA experts. Representative SPAR models, one PWR and one BWR, were reviewed against Capability Category I of the ASME PRA standard, which was selected as the basis for review because of the specific uses and applications of the SPAR models. The BWR SPAR model was reviewed against 331 ASME PRA standard supporting requirements; however, given the Capability Category I level of review and the absence of internal flooding and containment performance (LERF) logic, only 216 requirements were determined to be applicable. Based on the review, the BWR SPAR model met 139 of the 216 supporting requirements. The review also generated 200 findings or suggestions: 142 findings and 58 suggestions. The PWR SPAR model was evaluated against the same 331 supporting requirements, of which only 215 were deemed appropriate for the review (for the same reason as noted for the BWR). The PWR review determined that 125 of the 215 supporting requirements met Capability Category I or greater, and it identified 101 findings or suggestions (76 findings and 25 suggestions). These findings and suggestions identify areas where the SPAR models could be enhanced. A process to prioritize the findings and suggestions and incorporate them into the SPAR models is being developed; the prioritization focuses on those findings that will enhance the accuracy, completeness, and usability of the SPAR models.

  8. Solar Luminosity on the Main Sequence, Standard Model and Variations

    NASA Astrophysics Data System (ADS)

    Ayukov, S. V.; Baturin, V. A.; Gorshkov, A. B.; Oreshina, A. V.

    2017-05-01

    According to the Standard Solar Model, the Sun became a main-sequence star 4.6 Gyr ago; at that time its luminosity was 30% lower than the current value. This conclusion is based on the assumption that the Sun is fueled by thermonuclear reactions. If Earth's albedo and infrared emissivity were unchanged during Earth's history, the oceans would have had to be frozen 2.3 Gyr ago. This contradicts the geological data: there was liquid water on Earth 3.6-3.8 Gyr ago. This problem is known as the Faint Young Sun Paradox. We analyze the luminosity change in standard solar evolution theory. The increase of the mean molecular weight in the central part of the Sun, due to the conversion of hydrogen to helium, leads to a gradual increase of luminosity with time on the main sequence. We also consider several exotic models: a fully mixed Sun; a drastic change of the pp reaction rate; and a Sun consisting of hydrogen and helium only. Solar neutrino observations, however, exclude most non-standard solar models.

  9. Lee-Wick standard model at finite temperature

    NASA Astrophysics Data System (ADS)

    Lebed, Richard F.; Long, Andrew J.; TerBeek, Russell H.

    2013-10-01

    The Lee-Wick Standard Model at temperatures near the electroweak scale is considered, with the aim of studying the electroweak phase transition. While Lee-Wick theories possess states of negative norm, they are not pathological but instead are treated by imposing particular boundary conditions and using particular integration contours in the calculation of S-matrix elements. It is not immediately clear how to extend this prescription to formulate the theory at finite temperature; we explore two different pictures of finite-temperature Lee-Wick theories, and calculate the thermodynamic variables and the (one-loop) thermal effective potential. We apply these results to study the Lee-Wick Standard Model and find that the electroweak phase transition is a continuous crossover, much like in the Standard Model. However, the high-temperature behavior is modified due to cancellations between thermal corrections arising from the negative- and positive-norm states.

  10. Test of a Power Transfer Model for Standardized Electrofishing

    USGS Publications Warehouse

    Miranda, L.E.; Dolan, C.R.

    2003-01-01

    Standardization of electrofishing in waters with differing conductivities is critical when monitoring temporal and spatial differences in fish assemblages. We tested a model that can help improve the consistency of electrofishing by allowing control over the amount of power that is transferred to the fish. The primary objective was to verify, under controlled laboratory conditions, whether the model adequately described fish immobilization responses elicited with various electrical settings over a range of water conductivities. We found that the model accurately described empirical observations over conductivities ranging from 12 to 1,030 µS/cm for DC and various pulsed-DC settings. Because the model requires knowledge of a fish's effective conductivity, an attribute that is likely to vary according to species, size, temperature, and other variables, a second objective was to gather available estimates of the effective conductivity of fish to examine the magnitude of variation and to assess whether in practical applications a standard effective conductivity value for fish may be assumed. We found that applying a standard fish effective conductivity of 115 µS/cm introduced relatively little error into the estimation of the peak power density required to immobilize fish with electrofishing. However, this standard was derived from few estimates of fish effective conductivity and a limited number of species; more estimates are needed to validate our working standard.
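
    A common statement of electrofishing power-transfer theory is that the fraction of the in-water power density transferred to the fish depends only on the mismatch between water conductivity Cw and fish effective conductivity Cf, peaking when the two are equal. The sketch below assumes the matching-law form Pfish/Pwater = 4·Cf·Cw/(Cf + Cw)², which is consistent with the behaviour described in the abstract but is not quoted from the paper; the 115 µS/cm standard is taken from the abstract, and everything else is illustrative.

      def power_transfer_ratio(cw_uS_cm, cf_uS_cm=115.0):
          """Assumed matching-law fraction of in-water power density reaching the
          fish; maximal (equal to 1) when water and fish conductivities match."""
          return 4.0 * cf_uS_cm * cw_uS_cm / (cf_uS_cm + cw_uS_cm) ** 2

      def required_applied_power(goal_fish_power, cw_uS_cm, cf_uS_cm=115.0):
          """Applied power density needed in water of conductivity cw so that the
          fish receives goal_fish_power (how a standardized goal would rescale)."""
          return goal_fish_power / power_transfer_ratio(cw_uS_cm, cf_uS_cm)

      for cw in (12, 115, 500, 1030):   # conductivity range tested in the study, uS/cm
          print(cw, round(power_transfer_ratio(cw), 3),
                round(required_applied_power(60.0, cw), 1))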

  11. Non-standard Hubbard models in optical lattices: a review.

    PubMed

    Dutta, Omjyoti; Gajda, Mariusz; Hauke, Philipp; Lewenstein, Maciej; Lühmann, Dirk-Sören; Malomed, Boris A; Sowiński, Tomasz; Zakrzewski, Jakub

    2015-06-01

    Originally, the Hubbard model was derived to describe the behavior of strongly correlated electrons in solids. However, for over a decade now, variations of it have also routinely been implemented with ultracold atoms in optical lattices, allowing their study in a clean, essentially defect-free environment. Here, we review some of the vast literature on this subject, with a focus on more recent non-standard forms of the Hubbard model. After giving an introduction to standard (fermionic and bosonic) Hubbard models, we briefly discuss common models for mixtures, as well as the so-called extended Bose-Hubbard models, which include interactions between neighboring sites, next-neighbor sites, and so on. The main part of the review discusses the importance of additional terms that appear when the tight-binding approximation for the original physical Hamiltonian is refined. Even when restricting the models to the lowest Bloch band is justified, the standard approach neglects the density-induced tunneling (which has the same origin as the usual on-site interaction). The importance of these contributions is discussed for both contact and dipolar interactions. For sufficiently strong interactions, effects related to higher Bloch bands also become important, even for deep optical lattices. Different approaches that aim to incorporate these effects, mainly by dressing the basis Wannier functions with interactions and thereby arriving at effective, density-dependent Hubbard-type models, are reviewed. We also discuss examples of Hubbard-like models that explicitly involve higher p orbitals, as well as models that dynamically couple spin and orbital degrees of freedom. Finally, we review mean-field nonlinear Schrödinger models of the Salerno type, which share with the non-standard Hubbard models a nonlinear coupling between adjacent sites; in that part, discrete solitons are the main subject of consideration. We conclude by listing some open problems to be addressed in the future.
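
    For reference, a schematic of the Hamiltonians discussed above can be written as follows: the first line is the standard Bose-Hubbard model, the second adds the nearest-neighbour interaction of the extended model and a density-induced tunneling term of the kind mentioned in the abstract. Coefficients and sign conventions vary between papers; this is a generic form, not a formula quoted from the review.

      H_{\mathrm{BH}} = -t \sum_{\langle i,j \rangle} \bigl( \hat b_i^{\dagger} \hat b_j + \mathrm{h.c.} \bigr)
                        + \frac{U}{2} \sum_i \hat n_i (\hat n_i - 1) - \mu \sum_i \hat n_i ,

      H_{\mathrm{ext}} = H_{\mathrm{BH}} + V \sum_{\langle i,j \rangle} \hat n_i \hat n_j
                         - T \sum_{\langle i,j \rangle} \bigl[ \hat b_i^{\dagger} (\hat n_i + \hat n_j) \hat b_j + \mathrm{h.c.} \bigr] .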

  12. Characterization and control of chaotic stress oscillations in a model for the Portevin-Le Chatelier effect

    SciTech Connect

    Markworth, A.J.; Gupta, A.; Rollins, R.W.

    1998-08-04

    The Portevin-Le Chatelier (PLC) effect, otherwise known as serrated yielding, repeated yielding, or jerky flow, has been a subject of investigation for many years. Modeling studies and experiments have shown that the oscillatory PLC stress-strain dynamics can sometimes be a form of deterministic chaos: the spontaneously occurring oscillations are aperiodic and are characterized by at least one positive Lyapunov exponent. Whether chaotic or not, these oscillations are detrimental to the mechanical integrity of the material, so some means of suppressing them would be desirable. In the work described below, a strategy for suppressing these oscillations is developed and applied to a model for the PLC effect. The strategy is physically realistic in the sense that it is based on feeding back a control signal, obtained from a measurable quantity, to an accessible (i.e., controllable) parameter. Application is made here to a case in which the oscillations are chaotic, although the approach is applicable to periodic stress oscillations as well.
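
    The control idea, feeding a signal derived from a measured quantity back to an accessible parameter, can be illustrated on a generic chaotic system. The sketch below uses a chaotic logistic map as a stand-in for the oscillating stress (it is not the PLC model from the paper); a small proportional correction is applied to the parameter only when the state is already near the unstable fixed point, which is a standard way to capture and stabilize a chaotic orbit.

      # Generic chaos-suppression illustration (not the PLC constitutive model).
      r0 = 3.9                      # nominal parameter value (chaotic regime)
      x_star = 1.0 - 1.0 / r0       # unstable fixed point of the logistic map
      gain = 10.0                   # feedback gain, chosen from the linearized map
      window = 0.05                 # apply control only inside this neighbourhood

      x, trajectory = 0.3, []
      for n in range(400):
          r = r0
          if n >= 100 and abs(x - x_star) < window:   # feedback active after n = 100
              r = r0 + gain * (x - x_star)            # small correction to the parameter
          x = r * x * (1.0 - x)
          trajectory.append(x)

      print("target fixed point:", round(x_star, 4))
      print("last five values:  ", [round(v, 4) for v in trajectory[-5:]])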

  13. Constraining new physics with collider measurements of Standard Model signatures

    NASA Astrophysics Data System (ADS)

    Butterworth, Jonathan M.; Grellscheid, David; Krämer, Michael; Sarrazin, Björn; Yallup, David

    2017-03-01

    A new method providing general consistency constraints for Beyond-the-Standard-Model (BSM) theories, using measurements at particle colliders, is presented. The method, `Constraints On New Theories Using Rivet', Contur, exploits the fact that particle-level differential measurements made in fiducial regions of phase space have a high degree of model independence. These measurements can therefore be compared to BSM physics implemented in Monte Carlo generators in a very generic way, allowing a wider array of final states to be considered than is typically the case. The Contur approach should be seen as complementary to the discovery potential of direct searches, being designed to eliminate inconsistent BSM proposals in a context where many (but perhaps not all) measurements are consistent with the Standard Model. We demonstrate, using a competitive simplified dark matter model, the power of this approach. The Contur method is highly scalable to other models and future measurements.

  14. Loop Corrections to Standard Model fields in inflation

    NASA Astrophysics Data System (ADS)

    Chen, Xingang; Wang, Yi; Xianyu, Zhong-Zhi

    2016-08-01

    We calculate 1-loop corrections to the Schwinger-Keldysh propagators of Standard-Model-like fields of spin-0, 1/2, and 1, with all renormalizable interactions during inflation. We pay special attention to the late-time divergences of loop corrections, and show that the divergences can be resummed into finite results in the late-time limit using dynamical renormalization group method. This is our first step toward studying both the Standard Model and new physics in the primordial universe.

  15. Reheating the Standard Model from a hidden sector

    NASA Astrophysics Data System (ADS)

    Tenkanen, Tommi; Vaskonen, Ville

    2016-10-01

    We consider a scenario where the inflaton decays to a hidden sector thermally decoupled from the visible Standard Model sector. A tiny portal coupling between the hidden and the visible sectors later heats the visible sector so that the Standard Model degrees of freedom come to dominate the energy density of the Universe before big bang nucleosynthesis. We find that this scenario is viable, although obtaining the correct dark matter abundance and retaining successful big bang nucleosynthesis is not obvious. We also show that the isocurvature perturbations constituted by a primordial Higgs condensate are not problematic for the viability of the scenario.

  16. Expressing hNF-LE397K results in abnormal gaiting in a transgenic model of CMT2E

    PubMed Central

    Dale, Jeffrey M.; Villalon, Eric; Shannon, Stephen G.; Barry, Devin M.; Markey, Rachel M.; Garcia, Virginia B.; Garcia, Michael L.

    2012-01-01

    Charcot-Marie-Tooth disease (CMT) is the most commonly inherited peripheral neuropathy. CMT disease signs include distal limb neuropathy, abnormal gaiting, exacerbation of neuropathy, sensory defects, and deafness. We generated a novel line of CMT2E mice expressing a hNF-LE397K transgene, which displayed muscle atrophy of the lower limbs without denervation, a proximal reduction in large-caliber axons, and decreased nerve conduction velocity. In this study, we demonstrated that hNF-LE397K mice develop an abnormal gait of the hind limbs. The identification of severe gaiting defects in combination with the previously observed muscle atrophy, reduced axon caliber, and decreased nerve conduction velocity suggests that hNF-LE397K mice recapitulate many of the clinical signs associated with CMT2E. Therefore, hNF-LE397K mice provide a context for potential therapeutic intervention. PMID:22288874

  17. Status of the AIAA Modeling and Simulation Format Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2008-01-01

    The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.

  18. Search for the standard model Higgs boson in $l\

    SciTech Connect

    Li, Dikai

    2013-01-01

    Humans have always attempted to understand the mysteries of Nature, and physicists have established theories to describe the observed phenomena. The most recent framework is a gauge quantum field theory called the Standard Model (SM), which describes, in the most unified way, the elementary matter particles and the interaction particles that carry the fundamental forces. The Standard Model is built on the internal symmetries of the gauge group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; it also describes how quarks interact with each other through all three of these interactions, how leptons interact with each other through the electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.

  19. Radiative breaking of conformal symmetry in the Standard Model

    NASA Astrophysics Data System (ADS)

    Arbuzov, A. B.; Nazmitdinov, R. G.; Pavlov, A. E.; Pervushin, V. N.; Zakharov, A. F.

    2016-02-01

    A radiative mechanism of conformal symmetry breaking in a conformal-invariant version of the Standard Model is considered. The Coleman-Weinberg mechanism of dimensional transmutation in this system gives rise to finite vacuum expectation values and, consequently, to masses of the scalar and spinor fields. A natural bootstrap between the energy scales of the top quark and the Higgs boson is suggested.

  20. View of a five inch standard Mark III model 1 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of a five-inch standard Mark III Model 1 #39, manufactured in 1916 at the Naval Gun Factory, Watervliet, NY; this is the only gun remaining on Olympia dating from the period when it was in commission; note the ammunition lift at the left side of the photograph. - USS Olympia, Penn's Landing, 211 South Columbus Boulevard, Philadelphia, Philadelphia County, PA

  1. Precision tests of quantum chromodynamics and the standard model

    SciTech Connect

    Brodsky, S.J.; Lu, H.J.

    1995-06-01

    The authors discuss three topics relevant to testing the Standard Model to high precision: commensurate scale relations, which relate observables to each other in perturbation theory without renormalization scale or scheme ambiguity, the relationship of compositeness to anomalous moments, and new methods for measuring the anomalous magnetic and quadrupole moments of the W and Z.

  2. Beyond the Standard Model at the LHC and Beyond

    SciTech Connect

    Ellis, John

    2007-11-20

    Many of the open questions beyond the Standard Model will be addressed by the LHC, including the origin of mass, supersymmetry, dark matter and the possibility of large extra dimensions. A linear e+e- collider (LC) with sufficient centre-of-mass energy would add considerable value to the capabilities of the LHC.

  3. Mathematical Modeling, Sense Making, and the Common Core State Standards

    ERIC Educational Resources Information Center

    Schoenfeld, Alan H.

    2013-01-01

    On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…

  4. Model Child Care Licensing Standards for Indian Reservations.

    ERIC Educational Resources Information Center

    Southwest Educational Development Lab., Austin, TX.

    A grant to develop a model child care licensing code for Indian reservations gave Southwest Educational Development Laboratory a chance to determine whether realistic licensing standards were possible for reservation child care programs. Beginning in 1976, a task force composed of 14 federal and tribal representatives drafted licensing standards…

  5. Home Economics Education Career Path Guide and Model Curriculum Standards.

    ERIC Educational Resources Information Center

    California State Univ., Northridge.

    This curriculum guide developed in California and organized in 10 chapters, provides a home economics education career path guide and model curriculum standards for high school home economics programs. The first chapter contains information on the following: home economics education in California, home economics careers for the future, home…

  6. Specification for a Standard Radar Sea Clutter Model

    DTIC Science & Technology

    1990-09-01

    Technical Document 1917, September 1990: Specification for a Standard Radar Sea Clutter Model, by Richard A. Paulus. Contents include radar parameters, model output, grazing angle at the sea surface, and radar clutter cross section.

  7. Searches for Standard Model Higgs at the Tevatron

    SciTech Connect

    Cortavitarte, Rocio Vilar; /Cantabria Inst. of Phys.

    2007-11-01

    A summary of the latest results of Standard Model Higgs boson searches from CDF and D0 presented at the DIS 2007 conference is reported in this paper. All analyses presented use 1 fb⁻¹ of Tevatron data. The strategy of the different analyses is determined by the Higgs production mechanism and decay channel.

  8. Searches for standard model Higgs at the Tevatron

    SciTech Connect

    Vilar Cortabitarte, Rocio; /Cantabria U., Santander

    2007-04-01

    A summary of the latest results of Standard Model Higgs boson searches from CDF and D0 presented at the DIS 2007 conference is reported in this paper. All analyses presented use 1 fb⁻¹ of Tevatron data. The strategy of the different analyses is determined by the Higgs production mechanism and decay channel.

  9. Teacher Leader Model Standards: Implications for Preparation, Policy, and Practice

    ERIC Educational Resources Information Center

    Berg, Jill Harrison; Carver, Cynthia L.; Mangin, Melinda M.

    2014-01-01

    Teacher leadership is increasingly recognized as a resource for instructional improvement. Consequently, teacher leader initiatives have expanded rapidly despite limited knowledge about how to prepare and support teacher leaders. In this context, the "Teacher Leader Model Standards" represent an important development in the field. In…

  10. Mathematical Modeling, Sense Making, and the Common Core State Standards

    ERIC Educational Resources Information Center

    Schoenfeld, Alan H.

    2013-01-01

    On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…

  11. Home Economics Education Career Path Guide and Model Curriculum Standards.

    ERIC Educational Resources Information Center

    California State Univ., Northridge.

    This curriculum guide developed in California and organized in 10 chapters, provides a home economics education career path guide and model curriculum standards for high school home economics programs. The first chapter contains information on the following: home economics education in California, home economics careers for the future, home…

  12. 42 CFR 403.210 - NAIC model standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 42, Public Health, Volume 2 (2012-10-01): NAIC model standards, Section 403.210. Centers for Medicare & Medicaid Services, Department of Health and Human Services; General Provisions; Special Programs and Projects; Medicare Supplemental Policies; General Provisions; § 403.210 NAIC...

  13. 42 CFR 403.210 - NAIC model standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 42, Public Health, Volume 2 (2014-10-01): NAIC model standards, Section 403.210. Centers for Medicare & Medicaid Services, Department of Health and Human Services; General Provisions; Special Programs and Projects; Medicare Supplemental Policies; General Provisions; § 403.210 NAIC...

  14. 42 CFR 403.210 - NAIC model standards.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    Title 42, Public Health, Volume 2 (2011-10-01): NAIC model standards, Section 403.210. Centers for Medicare & Medicaid Services, Department of Health and Human Services; General Provisions; Special Programs and Projects; Medicare Supplemental Policies; General Provisions; § 403.210 NAIC...

  15. 42 CFR 403.210 - NAIC model standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 42, Public Health, Volume 2 (2013-10-01): NAIC model standards, Section 403.210. Centers for Medicare & Medicaid Services, Department of Health and Human Services; General Provisions; Special Programs and Projects; Medicare Supplemental Policies; General Provisions; § 403.210 NAIC...

  16. Ontology based standardization of petri net modeling for signaling pathways.

    PubMed

    Takai-Igarashi, Takako

    2011-01-01

    Given the great availability of Petri nets for modeling and analyzing large, complicated signaling networks, the semantics of Petri nets needs to be systematized for the sake of consistency and reusability of the models. This paper reports on the standardization of units of Petri nets on the basis of an ontology that gives an intrinsic definition to the process of signaling in signaling pathways.
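
    As a concrete illustration of the kind of unit being standardized, the sketch below implements a minimal place/transition Petri net and fires one phosphorylation-like transition. The place and transition names are invented for illustration and are not drawn from the ontology described in the paper.

      class PetriNet:
          """Minimal place/transition net: places hold token counts, transitions
          consume tokens from input places and produce tokens in output places."""

          def __init__(self, marking):
              self.marking = dict(marking)      # place name -> token count
              self.transitions = {}             # transition name -> (inputs, outputs)

          def add_transition(self, name, inputs, outputs):
              self.transitions[name] = (inputs, outputs)

          def enabled(self, name):
              inputs, _ = self.transitions[name]
              return all(self.marking[p] >= n for p, n in inputs.items())

          def fire(self, name):
              if not self.enabled(name):
                  raise ValueError(f"transition {name!r} is not enabled")
              inputs, outputs = self.transitions[name]
              for p, n in inputs.items():
                  self.marking[p] -= n
              for p, n in outputs.items():
                  self.marking[p] = self.marking.get(p, 0) + n

      # Hypothetical signaling step: kinase + substrate -> kinase + phospho-substrate.
      net = PetriNet({"kinase": 1, "substrate": 3, "p_substrate": 0})
      net.add_transition("phosphorylation",
                         inputs={"kinase": 1, "substrate": 1},
                         outputs={"kinase": 1, "p_substrate": 1})
      net.fire("phosphorylation")
      print(net.marking)   # {'kinase': 1, 'substrate': 2, 'p_substrate': 1}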

  17. Further tests of the Scalar Expectancy Theory (SET) and the Learning-to-Time (LeT) model in a temporal bisection task.

    PubMed

    Machado, Armando; Arantes, Joana

    2006-06-01

    To contrast two models of timing, Scalar Expectancy Theory (SET) and Learning to Time (LeT), pigeons were exposed to a double temporal bisection procedure. On half of the trials, they learned to choose a red key after a 1-s signal and a green key after a 4-s signal; on the other half of the trials, they learned to choose a blue key after a 4-s signal and a yellow key after a 16-s signal. This was Phase A of an ABA design. In Phase B, the pigeons were divided into two groups and exposed to a new bisection task in which the signals ranged from 1 to 16 s and the choice keys were blue and green. One group was reinforced for choosing blue after 1-s signals and green after 16-s signals, and the other group was reinforced for the opposite mapping (green after 1-s signals and blue after 16-s signals). Whereas SET predicted no differences between the groups, LeT predicted that the former group would learn the new discrimination faster than the latter group. The results were consistent with LeT. Finally, the pigeons returned to Phase A. Only LeT made specific predictions regarding the reacquisition of the four temporal discriminations; these predictions were only partly consistent with the results.

  18. Standard fire behavior fuel models: a comprehensive set for use with Rothermel's surface fire spread model

    Treesearch

    Joe H. Scott; Robert E. Burgan

    2005-01-01

    This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.

  19. Accuracy of hospital standardized mortality rates: effects of model calibration.

    PubMed

    Kipnis, Patricia; Liu, Vincent; Escobar, Gabriel J

    2014-04-01

    Risk-adjusted mortality rates are commonly used in quality report cards to compare hospital performance. The risk adjustment depends on models that are assessed for goodness of fit using various discrimination and calibration measures; however, the relationship between model fit and the accuracy of hospital comparisons is not well characterized. We evaluated the impact of imperfect model calibration (miscalibration) on the accuracy of hospital comparisons. We constructed Monte Carlo simulations in which a risk-adjustment model is applied to a population with a different mortality distribution than the one used to develop the original model. We estimated the power of calibration metrics to detect miscalibration, and we estimated the sensitivity and specificity of a hospital-comparison method under different miscalibration scenarios using an empirical method. The U-statistics showed the highest power to detect intercept and slope deviations in the calibration curve, followed by the Hosmer-Lemeshow test and the calibration intercept and slope tests. Specificity decreased with increasing intercept and slope deviations and with hospital size. The effect of an imperfect model fit on sensitivity is a function of the true standardized mortality ratio, the underlying mortality rate, the sample size, and the observed intercept and slope deviations. Poorly performing hospitals can appear as good performers and vice versa, depending on the magnitude and direction of the deviation. Deviations from perfect model calibration have a direct impact on the accuracy of hospital comparisons. Publishing the calibration intercept and slope of risk-adjustment models would allow users to monitor their performance against the true standard population.
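
    The calibration intercept and slope mentioned above are commonly estimated by regressing the observed outcomes on the logit of the model's predicted probabilities; a perfectly calibrated model gives intercept 0 and slope 1. The sketch below shows this check with scikit-learn on simulated data and is a generic illustration of the metric, not the authors' simulation code; all names and values are placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)

      # Simulated predicted probabilities and a miscalibrated "true" process
      # (true log-odds shifted and shrunk relative to the predictions).
      p_pred = rng.uniform(0.01, 0.5, size=20000)
      logit_pred = np.log(p_pred / (1 - p_pred))
      true_logit = 0.3 + 0.8 * logit_pred            # intercept/slope deviation
      y = rng.binomial(1, 1 / (1 + np.exp(-true_logit)))

      # Calibration model: regress the outcome on the logit of the predictions.
      cal = LogisticRegression(C=1e6).fit(logit_pred.reshape(-1, 1), y)
      print("calibration intercept:", round(cal.intercept_[0], 3))   # ~0.3
      print("calibration slope:    ", round(cal.coef_[0][0], 3))     # ~0.8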

  20. A first test of the framed standard model against experiment

    NASA Astrophysics Data System (ADS)

    Bordes, José; Chan, Hong-Mo; Tsou, Sheung Tsun

    2015-04-01

    The framed standard model (FSM) is obtained from the standard model by incorporating, as field variables, the frame vectors (vielbeins) in internal symmetry space. It gives the standard Higgs boson and 3 generations of quarks and leptons as immediate consequences. It gives moreover a fermion mass matrix of the form m = m_T αα†, where α is a vector in generation space, independent of the fermion species and rotating with changing scale, which has already been shown to lead, generically, to up-down mixing, neutrino oscillations and mass hierarchy. In this paper, pushing the FSM further, one first derives to 1-loop order the RGE for the rotation of α, and then applies it to fit mass and mixing data as a first test of the model. With 7 real adjustable parameters, 18 measured quantities are fitted, most (12) to within experimental error or to better than 0.5 percent, and the rest (6) not far off. (A summary of this fit can be found in Table 2 of this paper.) Two notable features, both generic to the FSM rather than specific to the fit, are: (i) that a theta-angle of order unity in the instanton term in QCD would translate via rotation into a Kobayashi-Maskawa phase in the CKM matrix of about the observed magnitude (J ~ 10⁻⁵), and (ii) that it comes out correctly that m_u < m_d, despite the fact that m_t ≫ m_b and m_c ≫ m_s. Of the 18 quantities fitted, 12 are deemed independent in the usual formulation of the standard model. In fact, the fit gives a total of 17 independent parameters of the standard model, but 5 of these have not been measured by experiment.
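
    For readers unfamiliar with the notation, the quoted mass matrix is rank one, which is the algebraic origin of the hierarchy. The short note below is generic linear algebra under that rank-one form, not a derivation taken from the paper:

      m = m_T\, \alpha \alpha^{\dagger}, \qquad \alpha \in \mathbb{C}^{3}, \quad \alpha^{\dagger}\alpha = 1
      \;\;\Longrightarrow\;\;
      m\,\alpha = m_T\,\alpha, \qquad m\,v = 0 \ \ \text{for every } v \perp \alpha ,

    so at any single scale only the state aligned with α is massive; as described in the abstract, the lighter generations acquire their masses because α rotates with changing scale.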

  1. Comparing standardized coefficients in structural equation modeling: a model reparameterization approach.

    PubMed

    Kwan, Joyce L Y; Chan, Wai

    2011-09-01

    We propose a two-stage method for comparing standardized coefficients in structural equation modeling (SEM). At stage 1, we transform the original model of interest into the standardized model by model reparameterization, so that the model parameters appearing in the standardized model are equivalent to the standardized parameters of the original model. At stage 2, we impose appropriate linear equality constraints on the standardized model and use a likelihood ratio test to make statistical inferences about the equality of standardized coefficients. Unlike other existing methods for comparing standardized coefficients, the proposed method does not require specific modeling features (e.g., specification of nonlinear constraints), which are available only in certain SEM software programs. Moreover, this method allows researchers to compare two or more standardized coefficients simultaneously in a standard and convenient way. Three real examples are given to illustrate the proposed method, using EQS, a popular SEM software program. Results show that the proposed method performs satisfactorily for testing the equality of standardized coefficients.

  2. Development of tools for evaluating rainfall estimation models in real- time using the Integrated Meteorological Observation Network in Castilla y León (Spain)

    NASA Astrophysics Data System (ADS)

    Merino, Andres; Guerrero-Higueras, Angel Manuel; López, Laura; Gascón, Estibaliz; Sánchez, José Luis; Lorente, José Manuel; Marcos, José Luis; Matía, Pedro; Ortiz de Galisteo, José Pablo; Nafría, David; Fernández-González, Sergio; Weigand, Roberto; Hermida, Lucía; García-Ortega, Eduardo

    2014-05-01

    The integration of various public and private observation networks into the Observation Network of Castile-León (ONet_CyL), Spain, allows us to monitor risks in real time. One of the most frequent risks in this region is severe precipitation. The data from the network allow us to determine the areas where precipitation was registered and to follow precipitation in real time. The observation network is managed with a Linux system. The observation platform makes it possible to consult the observation data at a specific point in the region, or to view the spatial distribution of precipitation in a user-defined area and time interval. In this study, we compared several rainfall estimation models based on satellite data for Castile-León with precipitation data from the meteorological observation network. Rainfall estimation models derived from meteorological satellite data provide a precipitation field covering a wide area, although their operational use requires prior evaluation against ground-truth data. The aim is to develop a real-time evaluation tool for rainfall estimation models that allows us to monitor their forecasting accuracy. This tool makes it possible to visualise different skill scores (Probability of Detection, False Alarm Ratio and others) for each rainfall estimation model in real time, so that we not only know the areas where the models indicate precipitation but can also validate each model in real time for each specific meteorological situation. Acknowledgements: The authors would like to thank the Regional Government of Castile-León for its financial support through the project LE220A11-2. This study was supported by the following grants: GRANIMETRO (CGL2010-15930); MICROMETEO (IPT-310000-2010-22).
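
    The skill scores named above come from a simple 2x2 contingency table of model versus gauge detections (hits, misses, false alarms, correct negatives). The sketch below computes Probability of Detection and False Alarm Ratio from binary rain/no-rain fields; the threshold value and array names are illustrative, not taken from the operational tool.

      import numpy as np

      def detection_scores(estimated_mm, observed_mm, threshold_mm=0.1):
          """Return POD and FAR for a rainfall estimate against gauge observations.

          A location counts as 'rain' when the amount exceeds threshold_mm.
          POD = hits / (hits + misses); FAR = false alarms / (hits + false alarms)."""
          est = np.asarray(estimated_mm) >= threshold_mm
          obs = np.asarray(observed_mm) >= threshold_mm
          hits = np.sum(est & obs)
          misses = np.sum(~est & obs)
          false_alarms = np.sum(est & ~obs)
          pod = hits / (hits + misses) if hits + misses else np.nan
          far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
          return pod, far

      # Toy example: 6 stations, satellite estimate vs gauge report (mm).
      pod, far = detection_scores([0.0, 1.2, 3.4, 0.0, 0.5, 0.0],
                                  [0.0, 0.8, 2.9, 1.1, 0.0, 0.0])
      print(f"POD = {pod:.2f}, FAR = {far:.2f}")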

  3. Characteristics of the LeRC/Hughes J-series 30-cm engineering model thruster

    NASA Technical Reports Server (NTRS)

    Collett, C. R.; Poeschel, R. L.; Kami, S.

    1981-01-01

    As a consequence of endurance and structural tests performed on 900-series engineering model thrusters (EMT), several modifications in design were found to be necessary for achieving performance goals. The modified thruster is known as the J-series EMT. The most important of the design modifications affect the accelerator grid, gimbal mount, cathode polepiece, and wiring harness. The paper discusses the design modifications incorporated, the condition(s) they corrected, and the characteristics of the modified thruster.

  4. Phenomenological studies of minimal extensions of standard model

    NASA Astrophysics Data System (ADS)

    Kumar, Nilanjana

    The ATLAS and CMS experiments at the Large Hadron Collider (LHC) have confirmed the existence of the Higgs boson, the last missing piece of the Standard Model, making this a good time to look for physics beyond the Standard Model that could explain the deficiencies of the Standard Model. The research described here is strongly motivated by supersymmetry, which appears as an extension of the Standard Model, and by its phenomenological consequences. In the first project, the prospects for LHC discovery of a narrow resonance that decays to two Higgs bosons are studied using the $b\bar{b}\gamma\gamma$ final state. This study is inspired by the compressed Minimal Supersymmetric Standard Model, which allows the production of stoponium (a bound state of the supersymmetric partners of the top quark and its antiquark) and its decay to Higgs boson pairs, but it is applicable to any other di-Higgs resonance produced by gluon fusion. The cross section needed for a 5-sigma discovery of such a narrow di-Higgs resonance at the 14 TeV LHC is estimated as a function of the integrated luminosity, using the invariant mass distributions of the $b\bar{b}$ and $\gamma\gamma$ systems. I have also found the integrated luminosity required for discovery of stoponium as a function of its mass. The second project explores a viable extension of the Standard Model that incorporates vectorlike fermions near the electroweak scale. Vectorlike quarks and leptons are exotic new fermions that transform in non-chiral representations of the unbroken Standard Model gauge group. Two models are considered, in which the vectorlike leptons are weak isosinglets or isodoublets; the vectorlike leptons decay to tau leptons. I have studied the prospects for excluding or discovering vectorlike leptons using multilepton events at the LHC. If the vectorlike leptons are weak isosinglets, then discovery in multilepton states is found to be extremely challenging

  5. Progress Toward a Format Standard for Flight Dynamics Models

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Hildreth, Bruce L.

    2006-01-01

    In the beginning, there was FORTRAN, and it was... not so good. But it was universal, and all flight simulator equations of motion were coded with it. Then came ACSL, C, Ada, C++, C#, Java, FORTRAN-90, Matlab/Simulink, and a number of other programming languages. Since the halcyon punch card days of 1968, models of aircraft flight dynamics have proliferated in training devices, desktop engineering and development computers, and control design textbooks. With the rise of industry teaming and increased reliance on simulation for procurement decisions, aircraft and missile simulation models are created, updated, and exchanged with increasing frequency. However, there is no real lingua franca to facilitate the exchange of models from one simulation user to another. The current state of the art is such that several staff-months, if not staff-years, are required to 'rehost' each release of a flight dynamics model from one simulation environment to another. If a standard data package or exchange format were universally adopted, the cost and time of sharing and updating aerodynamics, control laws, mass and inertia, and other flight dynamic components of the equations of motion of an aircraft or spacecraft simulation could be drastically reduced. A 2002 paper estimated that over $6 million in savings could be realized for one military aircraft type alone. This paper describes the efforts of the American Institute of Aeronautics and Astronautics (AIAA) to develop a flight dynamics model exchange standard based on XML and HDF-5 data formats.
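
    The following is only an illustrative sketch of the underlying idea, not the AIAA exchange format itself: tabulated flight-dynamics data (here a hypothetical lift-coefficient table) shipped in a portable container such as HDF-5 can be read and interpolated by any rehosting simulation without hand-translating source code. All group and dataset names below are assumptions.

        # Sketch: reading a tabulated aerodynamic coefficient from an HDF5 package.
        import h5py
        import numpy as np

        def load_lift_table(path):
            """Read an angle-of-attack breakpoint vector and a CL table (names assumed)."""
            with h5py.File(path, "r") as f:
                alpha_deg = f["aero/alpha_breakpoints_deg"][...]
                cl_table = f["aero/CL_vs_alpha"][...]
            return alpha_deg, cl_table

        def lift_coefficient(alpha_deg, breakpoints, table):
            """Linear interpolation of CL at a requested angle of attack."""
            return np.interp(alpha_deg, breakpoints, table)

        # Usage, assuming such a file exists:
        # bp, cl = load_lift_table("aircraft_model.h5")
        # print(lift_coefficient(4.5, bp, cl))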

  6. Rare B decays as tests of the Standard Model

    NASA Astrophysics Data System (ADS)

    Blake, Thomas; Lanfranchi, Gaia; Straub, David M.

    2017-01-01

    One of the most interesting puzzles in particle physics today is that new physics is expected at the TeV energy scale to solve the hierarchy problem and stabilise the Higgs mass, but so far no unambiguous signal of new physics has been found. Strong constraints on the energy scale of new physics can be derived from precision tests of the electroweak theory and from flavour-changing or CP-violating processes in strange, charm and beauty hadron decays. Decays that proceed via flavour-changing-neutral-current processes are forbidden at the lowest perturbative order in the Standard Model and are, therefore, rare. Rare b hadron decays are playing a central role in the understanding of the underlying patterns of Standard Model physics and in setting up new directions in model building for new physics contributions. In this article the status and prospects of this field are reviewed.

  7. Distinguishing standard model extensions using monotop chirality at the LHC

    NASA Astrophysics Data System (ADS)

    Allahverdi, Rouzbeh; Dalchenko, Mykhailo; Dutta, Bhaskar; Flórez, Andrés; Gao, Yu; Kamon, Teruki; Kolev, Nikolay; Mueller, Ryan; Segura, Manuel

    2016-12-01

    We present two minimal extensions of the standard model, each giving rise to baryogenesis. They include heavy color-triplet scalars interacting with a light Majorana fermion that can be the dark matter (DM) candidate. The electroweak charges of the new scalars govern their couplings to quarks of different chirality, which leads to different collider signals. These models predict monotop events at the LHC and the energy spectrum of decay products of highly polarized top quarks can be used to establish the chiral nature of the interactions involving the heavy scalars and the DM. Detailed simulation of signal and standard model background events is performed, showing that top quark chirality can be distinguished in hadronic and leptonic decays of the top quarks.

  8. One Higgs and a Standard Model, No Need for Supersymmetry

    NASA Astrophysics Data System (ADS)

    Neto, David

    2017-01-01

    With the detection of a Higgs-like boson at 125 GeV in the summer of 2012, the Standard Model of particle physics was complete. However, theoretical problems remain with the SM, such as naturalness and the hierarchy problem. For many years, extensions of the SM such as supersymmetry have provided interesting theoretical solutions to many of these problems, in addition to the wealth of beyond-the-Standard-Model physics these theories predict. Yet, with the latest LHC data, there is still no sign of SUSY, and the SM holds up extremely well to the highest energies we have been able to probe experimentally. Here, we examine some of the ``theoretical shortcomings'' of the SM. We intend to question, based on LHC data, whether there is a need for SUSY, or whether the SM, with the single Higgs boson discovered so far, can remain a consistent model of particle physics at yet higher energies.

  9. Tests and Problems of the Standard Model in Cosmology

    NASA Astrophysics Data System (ADS)

    López-Corredoira, Martín

    2017-06-01

    The main foundations of the standard Λ CDM model of cosmology are that: (1) the redshifts of the galaxies are due to the expansion of the Universe plus peculiar motions; (2) the cosmic microwave background radiation and its anisotropies derive from the high energy primordial Universe when matter and radiation became decoupled; (3) the abundance pattern of the light elements is explained in terms of primordial nucleosynthesis; and (4) the formation and evolution of galaxies can be explained only in terms of gravitation within an inflation + dark matter + dark energy scenario. Numerous tests have been carried out on these ideas and, although the standard model works pretty well in fitting many observations, there are also many data that present apparent caveats to be understood within it. In this paper, I offer a review of these tests and problems, as well as some examples of alternative models.

  10. Our sun. I. The standard model: Successes and failures

    SciTech Connect

    Sackmann, I. J.; Boothroyd, A. I.; Fowler, W. A.

    1990-09-01

    The results of computing a number of standard solar models are reported. A presolar helium content of Y = 0.278 is obtained, and a Cl-37 capture rate of 7.7 SNUs, consistently several times the observed rate of 2.1 SNUs, is determined. Thus, the solar neutrino problem remains. The solar Z value is determined primarily by the observed Z/X ratio and is affected very little by differences in solar models. Even large changes in the low-temperature molecular opacities have no effect on Y, nor even on conditions at the base of the convective envelope. Large molecular opacities do cause a large increase in the mixing-length parameter alpha but do not cause the convective envelope to reach deeper. The temperature remains too low for lithium burning, and there is no surface lithium depletion; thus, the lithium problem of the standard solar model remains. 103 refs.

  11. Tests and Problems of the Standard Model in Cosmology

    NASA Astrophysics Data System (ADS)

    López-Corredoira, Martín

    2017-02-01

    The main foundations of the standard Λ CDM model of cosmology are that: (1) the redshifts of the galaxies are due to the expansion of the Universe plus peculiar motions; (2) the cosmic microwave background radiation and its anisotropies derive from the high energy primordial Universe when matter and radiation became decoupled; (3) the abundance pattern of the light elements is explained in terms of primordial nucleosynthesis; and (4) the formation and evolution of galaxies can be explained only in terms of gravitation within an inflation + dark matter + dark energy scenario. Numerous tests have been carried out on these ideas and, although the standard model works pretty well in fitting many observations, there are also many data that present apparent caveats to be understood within it. In this paper, I offer a review of these tests and problems, as well as some examples of alternative models.

  12. Using geodetic VLBI to test Standard-Model Extension

    NASA Astrophysics Data System (ADS)

    Hees, Aurélien; Lambert, Sébastien; Le Poncin-Lafitte, Christophe

    2016-04-01

    Accurate modeling of the relativistic delay in geodetic techniques is essential for obtaining accurate geodetic products. Conversely, geodetic techniques can be used to measure the relativistic delay and to constrain parameters describing the theory of relativity. The effective field theory framework called the Standard-Model Extension (SME) has been developed in order to systematically parametrize hypothetical violations of Lorentz symmetry, both in the Standard Model and in the gravitational sector. For light deflection by a massive body like the Sun, one expects a dependence on the elongation angle that differs from general relativity (GR). In this communication, we use geodetic VLBI observations of quasars made in the frame of the permanent geodetic VLBI monitoring program to constrain the first SME coefficient. Our results do not show any deviation from GR, and they improve current constraints on both GR and SME parameters.

  13. Higgs decays in gauge extensions of the standard model

    NASA Astrophysics Data System (ADS)

    Bunk, Don; Hubisz, Jay; Jain, Bithika

    2014-02-01

    We explore the phenomenology of virtual spin-1 contributions to the h→γγ and h→Zγ decay rates in gauge extensions of the standard model. We consider generic Lorentz and gauge-invariant vector self-interactions, which can have nontrivial structure after diagonalizing the quadratic part of the action. Such features are phenomenologically relevant in models where the electroweak gauge bosons mix with additional spin-1 fields, such as occurs in little Higgs models, extra dimensional models, strongly coupled variants of electroweak symmetry breaking, and other gauge extensions of the standard model. In models where nonrenormalizable operators mix field strengths of gauge groups, the one-loop Higgs decay amplitudes can be logarithmically divergent, and we provide power counting for the size of the relevant counterterm. We provide an example calculation in a four-site moose model that contains degrees of freedom that model the effects of vector and axial-vector resonances arising from TeV scale strong dynamics.

  14. Effects of Radion Mixing on the Standard Model Higgs Boson

    SciTech Connect

    Rizzo, Thomas G.

    2002-09-09

    We discuss how mixing between the Standard Model Higgs boson and the radion of the Randall-Sundrum model can lead to significant shifts in the expected properties of the Higgs boson. In particular, we show that the total and partial decay widths of the Higgs, as well as the h → gg branching fraction, can be substantially altered from their SM expectations, while the remaining branching fractions are modified by less than about 5% for most of the parameter space volume. Precision measurements of Higgs boson properties at a Linear Collider are shown to probe a large region of the Randall-Sundrum model parameter space.

  15. Bounds for lighter Higgses in extensions of the Standard Model

    NASA Astrophysics Data System (ADS)

    Deandrea, Aldo

    2017-07-01

    Higgs bosons lighter than the 125 GeV one can be present in extensions of the Standard Model, and exclusion or discovery of such a possibility allows us to improve our knowledge of the Higgs sector. I consider Two Higgs Doublet Models, which constitute the minimal scalar structure of a large number of models, in order to explore the possibility of constraining a neutral scalar particle lighter than the 125 GeV Higgs boson. Such a lighter particle is not yet completely excluded by present data. I show that some new constraints can be obtained at the LHC for these light particles.

  16. Quantum gravity and Standard-Model-like fermions

    NASA Astrophysics Data System (ADS)

    Eichhorn, Astrid; Lippoldt, Stefan

    2017-04-01

    We discover that chiral symmetry does not act as an infrared attractor of the renormalization group flow under the impact of quantum gravity fluctuations. Thus, observationally viable quantum gravity models must respect chiral symmetry. In our truncation, asymptotically safe gravity does, as a chiral fixed point exists. A second non-chiral fixed point with massive fermions provides a template for models with dark matter. This fixed point disappears for more than 10 fermions, suggesting that an asymptotically safe ultraviolet completion for the standard model plus gravity enforces chiral symmetry.

  17. Towards realistic standard model from D-brane configurations

    SciTech Connect

    Leontaris, G. K.; Tracas, N. D.; Korakianitis, O.; Vlachos, N. D.

    2007-12-01

    Effective low energy models arising in the context of D-brane configurations with standard model (SM) gauge symmetry extended by several gauged Abelian factors are discussed. The models are classified according to their hypercharge embeddings consistent with the SM spectrum hypercharge assignment. Particular cases are analyzed according to their prospects and viability as low energy effective field theory candidates. The resulting string scale is determined by means of a two-loop renormalization group calculation. Their implications for Yukawa couplings, neutrinos and flavor changing processes are also presented.

  18. Extending the Standard Model with Confining and Conformal Dynamics

    NASA Astrophysics Data System (ADS)

    McRaven, John Emory

    This dissertation will provide a survey of models that involve extending the standard model with confining and conformal dynamics. We will study a series of models, describe them in detail, outline their phenomenology, and provide some search strategies for finding them. The Gaugephobic Higgs model provides an interpolation between three different models of electroweak symmetry breaking: Higgsless models, Randall-Sundrum models, and the Standard Model. At parameter points between the extremes, Standard Model Higgs signals are present at reduced rates, and Higgsless Kaluza-Klein excitations are present with shifted masses and couplings, as well as signals from exotic quarks necessary to protect the Zbb coupling. Using a new implementation of the model in SHERPA, we show the LHC signals which differentiate the generic Gaugephobic Higgs model from its limiting cases. These are all signals involving a Higgs coupling to a Kaluza-Klein gauge boson or quark. We identify the clean signal pp → W(i) → WH mediated by a Kaluza-Klein W, which can be present at large rates and is enhanced for even Kaluza-Klein numbers. Due to the very hard lepton coming from the W± decay, this signature has little background, and provides a better discovery channel for the Higgs than any of the Standard Model modes, over its entire mass range. A Higgs radiated from new heavy quarks also has large rates, but is much less promising due to very high multiplicity final states. The AdS/CFT correspondence conjectures a relation between extra-dimensional models in AdS5 space, such as the Gaugephobic Higgs Model, and 4D conformal field theories. The notion of conformality has found its way into several phenomenological models for TeV-scale physics extending the standard model. We proceed to explore the phenomenology of a new heavy quark that transforms under a hidden strongly coupled conformal gauge group in addition to transforming under QCD. This object would form states similar to R-Hadrons. The heavy state

  19. Beyond standard model physics at current and future colliders

    NASA Astrophysics Data System (ADS)

    Liu, Zhen

    The Large Hadron Collider (LHC), a multinational experiment which began running in 2009, is highly expected to discover new physics that will help us understand the nature of the universe and begin to find solutions to many of the unsolved puzzles of particle physics. For over 40 years the Standard Model has been the accepted theory of elementary particle physics, except for one unconfirmed component, the Higgs boson. The experiments at the LHC have recently discovered this Standard-Model-like Higgs boson. This discovery is one of the most exciting achievements in elementary particle physics. Yet, a profound question remains: Is this rather light, weakly-coupled boson nothing but a Standard Model Higgs, or a first manifestation of a deeper theory? Also, the recent discoveries of neutrino mass and mixing, the experimental evidence for dark matter and dark energy, and the matter-antimatter asymmetry indicate that our understanding of fundamental physics is currently incomplete. For the next decade and more, the LHC and future colliders will be at the cutting edge of particle physics discoveries and will shed light on many of these unanswered questions. There are many promising beyond-Standard-Model theories that may help solve the central puzzles of particle physics. To fill the gaps in our knowledge, we need to know how these theories will manifest themselves in controlled experiments, such as high energy colliders. In this work, I discuss how we can probe fundamental physics at current and future colliders, directly through searches for new phenomena such as resonances, rare Higgs decays and exotic displaced signatures, and indirectly through precision Higgs measurements. I explore beyond standard model physics effects from different perspectives, including explicit models such as supersymmetry, generic models in terms of resonances, as well as an effective field theory approach in terms of higher dimensional operators. This work provides a generic and broad overview of the physics

  20. E-health stakeholders experiences with clinical modelling and standardizations.

    PubMed

    Gøeg, Kirstine Rosenbeck; Elberg, Pia Britt; Højen, Anne Randorff

    2015-01-01

    Stakeholders in e-health such as governance officials, health IT implementers and vendors have to co-operate to achieve the goal of a future-proof interoperable e-health infrastructure. Co-operation requires knowledge of the responsibilities and competences of the different stakeholder groups. To increase awareness of clinical modeling and standardization, we conducted a workshop for Danish and a few Norwegian e-health stakeholders and had them discuss their views on different aspects of clinical modeling, using a theoretical model as a point of departure. Based on the model, we traced stakeholders' experiences. Our results showed a tendency for stakeholders to be more familiar with e-health requirements than with design methods, clinical information models and clinical terminology as they are described in the scientific literature. The workshop made it possible for stakeholders to discuss their roles and their expectations of each other.

  1. Comparison of Scalar Expectancy Theory (SET) and the Learning-to-Time (LeT) model in a successive temporal bisection task.

    PubMed

    Arantes, Joana

    2008-06-01

    The present research tested the generality of the "context effect" previously reported in experiments using temporal double bisection tasks [e.g., Arantes, J., Machado, A. Context effects in a temporal discrimination task: Further tests of the Scalar Expectancy Theory and Learning-to-Time models. J. Exp. Anal. Behav., in press]. Pigeons learned two temporal discriminations in which all the stimuli appear successively: 1s (red) vs. 4s (green) and 4s (blue) vs. 16s (yellow). Then, two tests were conducted to compare predictions of two timing models, Scalar Expectancy Theory (SET) and the Learning-to-Time (LeT) model. In one test, two psychometric functions were obtained by presenting pigeons with intermediate signal durations (1-4s and 4-16s). Results were mixed. In the critical test, pigeons were exposed to signals ranging from 1 to 16s and followed by the green or the blue key. Whereas SET predicted that the relative response rate to each of these keys should be independent of the signal duration, LeT predicted that the relative response rate to the green key (compared with the blue key) should increase with the signal duration. Results were consistent with LeT's predictions, showing that the context effect is obtained even when subjects do not need to make a choice between two keys presented simultaneously.

  2. Challenges to the standard model of Big Bang nucleosynthesis.

    PubMed Central

    Steigman, G

    1993-01-01

    Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of 2H, 3He, 4He, and 7Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio to a narrow window near 3.7 (in units of 10^-10). These constraints challenge the standard model and suggest that no new light particles may be allowed (the effective number of light neutrino species inferred from BBN, N(BBN)ν, is bounded close to the standard value of three).

  3. Beyond the Standard Model Physics with Lattice Simulations

    NASA Astrophysics Data System (ADS)

    Rinaldi, Enrico

    2016-03-01

    Lattice simulations of gauge theories are a powerful tool to investigate strongly interacting systems like Quantum ChromoDynamics (QCD). In recent years, the expertise gathered from lattice QCD studies has been used to explore new extensions of the Standard Model of particle physics that include strong dynamics. This change of gear in lattice field theories is related to the growing experimental search for new physics, from accelerator facilities like the Large Hadron Collider (LHC) to dark matter detectors like LUX or ADMX. In my presentation I will explore different plausible scenarios for physics beyond the standard model where strong dynamics play a dominant role and can be tackled by numerical lattice simulations. The importance of lattice field theories is highlighted in the context of dark matter searches and the search for new resonances at the LHC. The support of the DOE under Contract DE-AC52-07NA27344 (LLNL) is acknowledged.

  4. Search for Beyond the Standard Model Physics at D0

    SciTech Connect

    Kraus, James

    2011-08-01

    The standard model (SM) of particle physics has been remarkably successful at predicting the outcomes of particle physics experiments, but there are reasons to expect new physics at the electroweak scale. Over the last several years, there have been a number of searches for beyond the standard model (BSM) physics at D0. Here, we limit our focus to three: searches for diphoton events with large missing transverse energy (E_T), searches for leptonic jets and E_T, and searches for single vector-like quarks. We have discussed three recent searches at D0. There are many more, including limits on a heavy neutral gauge boson in the ee channel, a search for scalar top quarks, a search for quirks, and limits on a new resonance decaying to WW or WZ.

  5. Challenges to the standard model of Big Bang nucleosynthesis.

    PubMed

    Steigman, G

    1993-06-01

    Big Bang nucleosynthesis provides a unique probe of the early evolution of the Universe and a crucial test of the consistency of the standard hot Big Bang cosmological model. Although the primordial abundances of 2H, 3He, 4He, and 7Li inferred from current observational data are in agreement with those predicted by Big Bang nucleosynthesis, recent analysis has severely restricted the consistent range for the nucleon-to-photon ratio to a narrow window near 3.7 (in units of 10^-10). These constraints challenge the standard model and suggest that no new light particles may be allowed (the effective number of light neutrino species inferred from BBN, N(BBN)ν, is bounded close to the standard value of three).

  6. Precision Electroweak Measurements and Constraints on the Standard Model

    SciTech Connect

    Not Available

    2011-11-11

    This note presents constraints on Standard Model parameters using published and preliminary precision electroweak results measured at the electron-positron colliders LEP and SLC. The results are compared with precise electroweak measurements from other experiments, notably CDF and D0 at the Tevatron. Constraints on the input parameters of the Standard Model are derived from the results obtained in high-Q² interactions, and used to predict results in low-Q² experiments, such as atomic parity violation, Moller scattering, and neutrino-nucleon scattering. The main changes with respect to the experimental results presented in 2007 are new combinations of results on the W-boson mass and width and the mass of the top quark.

  7. Conformal loop quantum gravity coupled to the standard model

    NASA Astrophysics Data System (ADS)

    Campiglia, Miguel; Gambini, Rodolfo; Pullin, Jorge

    2017-01-01

    We argue that a conformally invariant extension of general relativity coupled to the standard model is the fundamental theory that needs to be quantized. We show that it can be treated by loop quantum gravity techniques. Through a gauge fixing and a modified Higgs mechanism particles acquire mass and one recovers general relativity coupled to the standard model. The theory suggests new views with respect to the definition of the Hamiltonian constraint in loop quantum gravity, the semi-classical limit and the issue of finite renormalization in quantum field theory in quantum space-time. It also gives hints about the elimination of ambiguities that arise in quantum field theory in quantum space-time in the calculation of back-reaction.

  8. Development of a standard documentation protocol for communicating exposure models.

    PubMed

    Ciffroy, P; Altenpohl, A; Fait, G; Fransman, W; Paini, A; Radovnikovic, A; Simon-Cornu, M; Suciu, N; Verdonck, F

    2016-10-15

    An important step in building a computational model is its documentation; comprehensive and structured documentation can improve the model's applicability and transparency in science/research and for regulatory purposes. This is particularly crucial and challenging for environmental and/or human exposure models that aim to establish quantitative relationships between personal exposure levels and their determinants. Exposure models simulate the transport and fate of a contaminant from the source to the receptor and may involve a large set of entities (e.g. all the media the contaminant may pass through). Such complex models are difficult to describe in a comprehensive, unambiguous and accessible way. Poor communication of assumptions, theory, structure and/or parameterization can lead to a lack of confidence on the part of the user and may be a source of errors. The goal of this paper is to propose a standard documentation protocol (SDP) for exposure models, i.e. a generic format and a standard structure by which all exposure models could be documented. For this purpose, a CEN (European Committee for Standardisation) workshop was set up with the objective of agreeing on minimum requirements for the amount and type of information to be provided in exposure model documentation, along with guidelines for the structure and presentation of the information. The resulting CEN workshop agreement (CWA) was expected to facilitate a more rigorous formulation of exposure model descriptions and their understanding by users. This paper describes the process followed for defining the SDP, the standardisation approach, as well as the main components of the SDP resulting from a wide consultation of interested stakeholders. The main outcome is a CEN CWA which establishes terms and definitions for exposure models and their elements, specifies minimum requirements for the amount and type of information to be documented, and proposes a structure for communicating the documentation to different

  9. Cosmology and the noncommutative approach to the standard model

    SciTech Connect

    Nelson, William; Sakellariadou, Mairi

    2010-04-15

    We study cosmological consequences of the noncommutative approach to the standard model of particle physics. Neglecting the nonminimal coupling of the Higgs field to the curvature, noncommutative corrections to Einstein's equations are present only for inhomogeneous and anisotropic space-times. Considering the nonminimal coupling however, corrections are obtained even for background cosmologies. Links with dilatonic gravity as well as chameleon cosmology are briefly discussed, and potential experimental consequences are mentioned.

  10. Naturalness and renormalization group in the standard model

    NASA Astrophysics Data System (ADS)

    Pivovarov, Grigorii B.

    2016-10-01

    I define a naturalness criterion formalizing the intuitive notion of naturalness discussed in the literature. After that, using ϕ4 as an example, I demonstrate that a theory may be natural in the MS-scheme and, at the same time, unnatural in the Gell-Mann-Low scheme. Finally, I discuss the prospects of using a version of the Gell-Mann-Low scheme in the Standard Model.

  11. Naturalness and Renormalization Group in the Standard Model

    NASA Astrophysics Data System (ADS)

    Pivovarov, Grigorii B.

    I define a naturalness criterion formalizing the intuitive notion of naturalness discussed in the literature. After that, using ϕ4 as an example, I demonstrate that a theory may be natural in the MS-scheme and, at the same time, unnatural in the Gell-Mann-Low scheme. Finally, I discuss the prospects of using a version of the Gell-Mann-Low scheme in the Standard Model.

  12. Standard model parameters and the search for new physics

    SciTech Connect

    Marciano, W.J.

    1988-04-01

    In these lectures, my aim is to present an up-to-date status report on the standard model and some key tests of electroweak unification. Within that context, I also discuss how and where hints of new physics may emerge. To accomplish those goals, I have organized my presentation as follows: I discuss the standard model parameters with particular emphasis on the gauge coupling constants and vector boson masses. Examples of new physics appendages are also briefly commented on. In addition, because these lectures are intended for students and thus somewhat pedagogical, I have included an appendix on dimensional regularization and a simple computational example that employs that technique. Next, I focus on weak charged current phenomenology. Precision tests of the standard model are described and up-to-date values for the Cabibbo-Kobayashi-Maskawa (CKM) mixing matrix parameters are presented. Constraints implied by those tests for a 4th generation, supersymmetry, extra Z′ bosons, and compositeness are also discussed. I discuss weak neutral current phenomenology and the extraction of sin²θ_W from experiment. The results presented there are based on a recently completed global analysis of all existing data. I have chosen to concentrate that discussion on radiative corrections, the effect of a heavy top quark mass, and implications for grand unified theories (GUTS). The potential for further experimental progress is also commented on. I depart from the narrowest version of the standard model and discuss effects of neutrino masses and mixings. I have chosen to concentrate on oscillations, the Mikheyev-Smirnov-Wolfenstein (MSW) effect, and electromagnetic properties of neutrinos. On the latter topic, I will describe some recent work on resonant spin-flavor precession. Finally, I conclude with a prospectus on hopes for the future. 76 refs.

  13. On the Standard Model prediction for RK and RK*

    NASA Astrophysics Data System (ADS)

    Pattori, A.

    2016-11-01

    In this article a recent work is reviewed, in which we evaluated the impact of radiative corrections in RK and RK*. We find that, employing the cuts presently applied by the LHCb Collaboration, such corrections do not exceed a few percent. Moreover, their effect is well described (and corrected for) by existing Monte Carlo codes. Our analysis reinforces the interest of these observables as clean probes of physics beyond the Standard Model.

  14. Framework for an asymptotically safe standard model via dynamical breaking

    NASA Astrophysics Data System (ADS)

    Abel, Steven; Sannino, Francesco

    2017-09-01

    We present a consistent embedding of the matter and gauge content of the Standard Model into an underlying asymptotically safe theory that has a well-determined interacting UV fixed point in the large color/flavor limit. The scales of symmetry breaking are determined by two mass-squared parameters with the breaking of electroweak symmetry being driven radiatively. There are no other free parameters in the theory apart from gauge couplings.

  15. Gold-standard performance for 2D hydrodynamic modeling

    NASA Astrophysics Data System (ADS)

    Pasternack, G. B.; MacVicar, B. J.

    2013-12-01

    Two-dimensional, depth-averaged hydrodynamic (2D) models are emerging as an increasingly useful tool for environmental water resources engineering. One of the remaining technical hurdles to the wider adoption and acceptance of 2D modeling is the lack of standards for 2D model performance evaluation when the riverbed undulates, causing lateral flow divergence and convergence. The goal of this study was to establish a gold standard that quantifies the upper limit of model performance for 2D models of undulating riverbeds when topography is perfectly known and surface roughness is well constrained. A review was conducted of published model performance metrics and the value ranges exhibited by models thus far for each one. Typically, predicted velocity differs from observed by 20 to 30% and the coefficient of determination between the two ranges from 0.5 to 0.8, though there tends to be a bias toward overpredicting low velocity and underpredicting high velocity. To establish a gold standard as to the best performance possible for a 2D model of an undulating bed, two straight, rectangular-walled flume experiments were done with no bed slope and only different bed undulations and water surface slopes. One flume tested model performance in the presence of a porous, homogeneous gravel bed with a long flat section, then a linear slope down to a flat pool bottom, and then the same linear slope back up to the flat bed. The other flume had a PVC plastic solid bed with a long flat section followed by a sequence of five identical riffle-pool pairs in close proximity, so it tested model performance given frequent undulations. Detailed water surface elevation and velocity measurements were made for both flumes. Comparing predicted versus observed velocity magnitude for 3 discharges with the gravel-bed flume and 1 discharge for the PVC-bed flume, the coefficient of determination ranged from 0.952 to 0.987 and the slope for the regression line was 0.957 to 1.02. Unsigned velocity
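
    As a small illustration of the performance metrics quoted above (the regression slope and coefficient of determination between predicted and observed velocity magnitude), the sketch below computes both for placeholder arrays; the values are invented and do not come from the flume experiments.

        # Sketch: regression slope and R^2 for predicted vs observed velocity.
        import numpy as np

        def performance_metrics(observed, predicted):
            observed = np.asarray(observed, dtype=float)
            predicted = np.asarray(predicted, dtype=float)
            # Least-squares fit: predicted = slope * observed + intercept
            slope, intercept = np.polyfit(observed, predicted, 1)
            # Coefficient of determination from the linear correlation
            r = np.corrcoef(observed, predicted)[0, 1]
            return {"slope": slope, "intercept": intercept, "R2": r ** 2}

        obs  = [0.21, 0.35, 0.48, 0.62, 0.80]   # measured velocity (m/s), placeholder
        pred = [0.19, 0.37, 0.45, 0.66, 0.78]   # modeled velocity (m/s), placeholder
        print(performance_metrics(obs, pred))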

  16. Phenomenology of the N = 3 Lee-Wick Standard Model

    NASA Astrophysics Data System (ADS)

    TerBeek, Russell Henry

    With the discovery of the Higgs Boson in 2012, particle physics has decidedly moved beyond the Standard Model into a new epoch. Though the Standard Model particle content is now completely accounted for, there remain many theoretical issues about the structure of the theory in need of resolution. Among these is the hierarchy problem: since the renormalized Higgs mass receives quadratic corrections from a higher cutoff scale, what keeps the Higgs boson light? Many possible solutions to this problem have been advanced, such as supersymmetry, Randall-Sundrum models, or sub-millimeter corrections to gravity. One such solution has been advanced by the Lee-Wick Standard Model. In this theory, higher-derivative operators are added to the Lagrangian for each Standard Model field, which result in propagators that possess two physical poles and fall off more rapidly in the ultraviolet regime. It can be shown by an auxiliary field transformation that the higher-derivative theory is identical to positing a second, manifestly renormalizable theory in which new fields with opposite-sign kinetic and mass terms are found. These so-called Lee-Wick fields have opposite-sign propagators, and famously cancel off the quadratic divergences that plague the renormalized Higgs mass. The states in the Hilbert space corresponding to Lee-Wick particles have negative norm, and implications for causality and unitarity are examined. This dissertation explores a variant of the theory called the N = 3 Lee-Wick Standard Model. The Lagrangian of this theory features a yet-higher derivative operator, which produces a propagator with three physical poles and possesses even better high-energy behavior than the minimal Lee-Wick theory. An analogous auxiliary field transformation takes this higher-derivative theory into a renormalizable theory with states of alternating positive, negative, and positive norm. The phenomenology of this theory is examined in detail, with particular emphasis on the collider

  17. Impersonating the Standard Model Higgs boson: Alignment without decoupling

    SciTech Connect

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; Wagner, Carlos E. M.

    2014-04-03

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. In addition, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the mA – tan β parameter space.

  18. Impersonating the Standard Model Higgs boson: Alignment without decoupling

    DOE PAGES

    Carena, Marcela; Low, Ian; Shah, Nausheen R.; ...

    2014-04-03

    In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. In addition, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the mA – tan β parameter space.

  19. BOOK REVIEW: Supersymmetry and String Theory: Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Rocek, Martin

    2007-11-01

    When I was asked to review Michael Dine's new book, 'Supersymmetry and String Theory', I was pleased to have a chance to read a book by such an established authority on how string theory might become testable. The book is most useful as a list of current topics of interest in modern theoretical physics. It gives a succinct summary of a huge variety of subjects, including the standard model, symmetry, Yang-Mills theory, quantization of gauge theories, the phenomenology of the standard model, the renormalization group, lattice gauge theory, effective field theories, anomalies, instantons, solitons, monopoles, dualities, technicolor, supersymmetry, the minimal supersymmetric standard model, dynamical supersymmetry breaking, extended supersymmetry, Seiberg-Witten theory, general relativity, cosmology, inflation, bosonic string theory, the superstring, the heterotic string, string compactifications, the quintic, string dualities, large extra dimensions, and, in the appendices, Goldstone's theorem, path integrals, and exact beta-functions in supersymmetric gauge theories. Its breadth is both its strength and its weakness: it is not (and could not possibly be) either a definitive reference for experts, where the details of thorny technical issues are carefully explored, or a textbook for graduate students, with detailed pedagogical expositions. As such, it complements rather than replaces the much narrower and more focussed String Theory I and II volumes by Polchinski, with their deep insights, as well as the two older volumes by Green, Schwarz, and Witten, which develop string theory pedagogically.

  20. Wisconsin Model Early Learning Standards Alignment with Wisconsin Common Core State Standards for English Language Arts and Mathematics

    ERIC Educational Resources Information Center

    Wisconsin Department of Public Instruction, 2011

    2011-01-01

    Wisconsin's adoption of the Common Core State Standards provides an excellent opportunity for Wisconsin school districts and communities to define expectations from birth through preparation for college and work. By aligning the existing Wisconsin Model Early Learning Standards with the Wisconsin Common Core State Standards, expectations can be…

  1. No Evidence for Extensions to the Standard Cosmological Model

    NASA Astrophysics Data System (ADS)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (Λ CDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (ln B = -7.8), nonzero scalar-to-tensor ratio (ln B = -4.3), running of the spectral index (ln B = -4.7), curvature (ln B = -3.6), nonstandard numbers of neutrinos (ln B = -3.1), nonstandard neutrino masses (ln B = -3.2), nonstandard lensing potential (ln B = -4.6), evolving dark energy (ln B = -3.2), sterile neutrinos (ln B = -6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (ln B = -10.8). Other models are less strongly disfavored with respect to flat Λ CDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does Λ CDM become disfavored, and only mildly, compared with a dynamical dark energy model (ln B ~ +2).
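
    A minimal sketch of how the quoted numbers compare models: the Bayes factor B is the ratio of model evidences, so ln B is the difference of log-evidences, with ln B < 0 disfavoring the extension relative to flat Λ CDM. The log-evidence values below are placeholders, not the Planck results.

        # Sketch: turning two log-evidences into a Bayes factor ln B.
        import math

        def log_bayes_factor(log_evidence_extension, log_evidence_lcdm):
            """ln B = ln Z_extension - ln Z_LCDM."""
            return log_evidence_extension - log_evidence_lcdm

        # Hypothetical log-evidences for an extended model and the baseline
        ln_b = log_bayes_factor(-1012.3, -1005.4)
        print(f"ln B = {ln_b:.1f}, i.e. odds of about {math.exp(ln_b):.1e} to 1")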

  2. Scalar dark matter in scale invariant standard model

    NASA Astrophysics Data System (ADS)

    Ghorbani, Karim; Ghorbani, Hossein

    2016-04-01

    We investigate single and two-component scalar dark matter scenarios in classically scale invariant standard model which is free of the hierarchy problem in the Higgs sector. We show that despite the very restricted space of parameters imposed by the scale invariance symmetry, both single and two-component scalar dark matter models overcome the direct and indirect constraints provided by the Planck/WMAP observational data and the LUX/Xenon100 experiment. We comment also on the radiative mass corrections of the classically massless scalon that plays a crucial role in our study.

  3. Elementary particles, dark matter candidate and new extended standard model

    NASA Astrophysics Data System (ADS)

    Hwang, Jaekwang

    2017-01-01

    Elementary particle decays and reactions are discussed in terms of the three-dimensional quantized space model beyond the standard model. Three generations of the leptons and quarks correspond to the lepton charges. Three heavy leptons and three heavy quarks are introduced, and the bastons (new particles) are proposed as possible candidates for the dark matter. Dark matter force, weak force and strong force are explained consistently. Possible rest masses of the new particles are tentatively proposed for experimental searches. For more details, see the conference paper at https://www.researchgate.net/publication/308723916.

  4. New model of inflation with nonminimal derivative coupling of standard model Higgs boson to gravity.

    PubMed

    Germani, Cristiano; Kehagias, Alex

    2010-07-02

    In this Letter we show that there is a unique nonminimal derivative coupling of the standard model Higgs boson to gravity such that it propagates no more degrees of freedom than general relativity sourced by a scalar field, reproduces a successful inflating background within the standard model Higgs parameters, and finally does not suffer from dangerous quantum corrections.

  5. FCNC decays of standard model fermions into a dark photon

    NASA Astrophysics Data System (ADS)

    Gabrielli, Emidio; Mele, Barbara; Raidal, Martti; Venturini, Elena

    2016-12-01

    We analyze a new class of FCNC processes, the f → f′γ̄ decays of a fermion f into a lighter (same-charge) fermion f′ plus a massless neutral vector boson, a dark photon γ̄. A massless dark photon does not interact at tree level with observable fields, and the f → f′γ̄ decay presents a characteristic signature where the final fermion f′ is balanced by a massless invisible system. Models recently proposed to explain the exponential spread in the standard-model Yukawa couplings can indeed foresee an extra unbroken dark U(1) gauge group, and the possibility to couple on-shell dark photons to standard-model fermions via one-loop magnetic-dipole kind of FCNC interactions. The latter are suppressed by the characteristic scale related to the mass of heavy messengers, connecting the standard model particles to the dark sector. We compute the corresponding decay rates for the top, bottom, and charm decays (t → cγ̄, uγ̄; b → sγ̄, dγ̄; and c → uγ̄), and for the charged-lepton decays (τ → μγ̄, eγ̄; and μ → eγ̄) in terms of model parameters. We find that large branching ratios for both quark and lepton decays are allowed in case the messenger masses are in the discovery range of the LHC. Implications of these new decay channels at present and future collider experiments are briefly discussed.

  6. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    DOE PAGES

    King, Zachary A.; Lu, Justin; Drager, Andreas; ...

    2015-10-17

    In this study, genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.

  7. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    SciTech Connect

    King, Zachary A.; Lu, Justin; Drager, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.

    2015-10-17

    In this study, genome-scale metabolic models are mathematically structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data.

  8. BiGG Models: A platform for integrating, standardizing and sharing genome-scale models

    PubMed Central

    King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.

    2016-01-01

    Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
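
    A hedged sketch of the programmatic access described above; the API root and the JSON field names are assumptions for illustration only, and http://bigg.ucsd.edu should be consulted for the authoritative interface documentation.

        # Sketch: listing genome-scale models through an assumed BiGG Models web API.
        import requests

        BASE_URL = "http://bigg.ucsd.edu/api/v2"   # assumed API root

        def list_models():
            """Fetch the catalogue of genome-scale models as JSON."""
            resp = requests.get(f"{BASE_URL}/models", timeout=30)
            resp.raise_for_status()
            return resp.json()

        if __name__ == "__main__":
            data = list_models()
            # Print whatever identifiers the payload exposes (structure assumed).
            for entry in data.get("results", [])[:5]:
                print(entry)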

  9. A unified model of the standard genetic code

    PubMed Central

    Morgado, Eberto R.

    2017-01-01

    The Rodin–Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but they are biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N any of them). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether, these results cannot be attained in either two or three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model. PMID:28405378

  10. A unified model of the standard genetic code.

    PubMed

    José, Marco V; Zamudio, Gabriel S; Morgado, Eberto R

    2017-03-01

    The Rodin-Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but they are biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N any of them). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether, these results cannot be attained in either two or three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model.

  11. Domain walls and gravitational waves in the Standard Model

    NASA Astrophysics Data System (ADS)

    Krajewski, Tomasz; Lalak, Zygmunt; Lewicki, Marek; Olszewski, Paweł

    2016-12-01

    We study domain walls which can be created in the Standard Model under the assumption that it is valid up to very high energy scales. We focus on domain walls interpolating between the physical electroweak vacuum and the global minimum appearing at very high field strengths. The creation of the network which ends up in the electroweak vacuum percolating through the Universe is not as difficult to obtain as one may expect, although it requires certain tuning of initial conditions. Our numerical simulations confirm that such domain walls would swiftly decay and thus cannot dominate the Universe. We discuss the possibility of detection of gravitational waves produced in this scenario. We have found that for the standard cosmology the energy density of these gravitational waves is too small to be observed in present and planned detectors.

  12. Standardization of Thermo-Fluid Modeling in Modelica.Fluid

    SciTech Connect

    Franke, Rudiger; Casella, Francesco; Sielemann, Michael; Proelss, Katrin; Otter, Martin; Wetter, Michael

    2009-09-01

    This article discusses the Modelica.Fluid library that has been included in the Modelica Standard Library 3.1. Modelica.Fluid provides interfaces and basic components for the device-oriented modeling of one-dimensional thermo-fluid flow in networks containing vessels, pipes, fluid machines, valves and fittings. A unique feature of Modelica.Fluid is that the component equations and the media models as well as pressure loss and heat transfer correlations are decoupled from each other. All components are implemented such that they can be used for media from the Modelica.Media library. This means that an incompressible or compressible medium, a single or a multiple substance medium with one or more phases might be used with one and the same model as long as the modeling assumptions made hold. Furthermore, trace substances are supported. Modeling assumptions can be configured globally in an outer System object. This covers in particular the initialization, uni- or bi-directional flow, and dynamic or steady-state formulation of mass, energy, and momentum balance. All assumptions can be locally refined for every component. While Modelica.Fluid contains a reasonable set of component models, the goal of the library is not to provide a comprehensive set of models, but rather to provide interfaces and best practices for the treatment of issues such as connector design and implementation of energy, mass and momentum balances. Applications from various domains are presented.

  13. Toward Standardizing a Lexicon of Infectious Disease Modeling Terms.

    PubMed

    Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M; Moghadas, Seyed M

    2016-01-01

    Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models' assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in health policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain.

  14. Rare radiative charm decays within the standard model and beyond

    NASA Astrophysics Data System (ADS)

    de Boer, Stefan; Hiller, Gudrun

    2017-08-01

    We present standard model (SM) estimates for exclusive c → uγ processes in heavy quark and hybrid frameworks. Measured branching ratios B(D^0 → (φ, K̄^{*0})γ) are at or somewhat above the upper range of the SM estimates and suggest slow convergence of the 1/m_D, α_s expansion. Model-independent constraints on |ΔC| = |ΔU| = 1 dipole operators from B(D^0 → ρ^0 γ) data are obtained. Predictions and implications for leptoquark models are worked out. While branching ratios are SM-like, CP asymmetries ≲ 10% can be induced. In SUSY, deviations from the SM can be even larger, with CP asymmetries of O(0.1). If Λ_c baryons are produced polarized, such as at the Z-pole, an angular asymmetry in Λ_c → pγ decays can be studied that is sensitive to chirality-flipped contributions.

  15. On a radiative origin of the Standard Model from trinification

    NASA Astrophysics Data System (ADS)

    Camargo-Molina, José Eliel; Morais, António P.; Pasechnik, Roman; Wessén, Jonas

    2016-09-01

    In this work, we present a trinification-based grand unified theory incorporating a global SU(3) family symmetry that, after spontaneous breaking, leads to a left-right symmetric model. Already at the classical level, this model can accommodate the matter content and the quark Cabibbo mixing in the Standard Model (SM) with only one Yukawa coupling at the unification scale. Considering the minimal low-energy scenario with the least amount of light states, we show that the resulting effective theory enables dynamical breaking of its gauge group down to that of the SM by means of radiative corrections accounted for by the renormalisation group evolution at one loop. This result paves the way for a consistent explanation of the SM breaking scale and fermion mass hierarchies.

  16. On light dilaton extensions of the Standard Model

    NASA Astrophysics Data System (ADS)

    Megías, Eugenio; Pujolàs, Oriol; Quirós, Mariano

    2016-11-01

    We discuss the presence of a light dilaton in Conformal Field Theories deformed by a single scalar operator, in the holographic realization consisting of confining Renormalization Group flows. Then, we apply this formalism to study the extension of the Standard Model with a light dilaton in a 5D warped model. We study the spectrum of scalar and vector perturbations, compare the model predictions with Electroweak Precision Tests and find the corresponding bounds for the lightest modes. Finally, we analyze the possibility that the Higgs resonance found at the LHC is in fact a dilaton. Presented by E. Megías at the 4th International Conference on New Frontiers in Physics (ICNFP 2015), 23-30 August 2015, Kolymbari, Crete, Greece.

  17. The Beyond the standard model working group: Summary report

    SciTech Connect

    G. Azuelos et al.

    2004-03-18

    In this working group we have investigated a number of aspects of searches for new physics beyond the Standard Model (SM) at the running or planned TeV-scale colliders. For the most part, we have considered hadron colliders, as they will define particle physics at the energy frontier for the next ten years at least. The variety of models for Beyond the Standard Model (BSM) physics has grown immensely. It is clear that only future experiments can provide the needed direction to clarify the correct theory. Thus, our focus has been on exploring the extent to which hadron colliders can discover and study BSM physics in various models. We have placed special emphasis on scenarios in which the new signal might be difficult to find or of a very unexpected nature. For example, in the context of supersymmetry (SUSY), we have considered: how to make fully precise predictions for the Higgs bosons as well as the superparticles of the Minimal Supersymmetric Standard Model (MSSM) (parts III and IV); MSSM scenarios in which most or all SUSY particles have rather large masses (parts V and VI); the ability to sort out the many parameters of the MSSM using a variety of signals and study channels (part VII); whether the no-lose theorem for MSSM Higgs discovery can be extended to the next-to-minimal Supersymmetric Standard Model (NMSSM) in which an additional singlet superfield is added to the minimal collection of superfields, potentially providing a natural explanation of the electroweak value of the parameter μ (part VIII); sorting out the effects of CP violation using Higgs plus squark associate production (part IX); the impact of lepton flavor violation of various kinds (part X); experimental possibilities for the gravitino and its sgoldstino partner (part XI); what the implications for SUSY would be if the NuTeV signal for di-muon events were interpreted as a sign of R-parity violation (part XII). Our other main focus was on the phenomenological implications of extra

  18. Object Oriented Design and the Standard Model of particle physics

    NASA Astrophysics Data System (ADS)

    Lipovaca, Samir

    2007-04-01

    Inspired by the computer as both tool and metaphor, a new path emerges toward understanding life, physics, and existence. The path leads throughout all of nature, from the interior of cells to inside black holes. This view of science is based on the idea that information is the ultimate ``substance'' from which all things are made. Exploring this view, we will focus on Object-Oriented (OO) design as one of the most important designs in software development. The OO design views the world as composed of objects with well-defined properties. The dynamics is pictured as interactions among objects. Interactions are mediated by messages that objects exchange with each other. This description closely resembles the view of the elementary particles world created by the Standard Model of particle physics. The object model (OM) provides a theoretical foundation upon which the OO design is built. The OM is based on the principles of abstraction, encapsulation, modularity and hierarchy. We will show that the Standard Model of particle physics follows the OM principles.

  19. Dark matter candidates in the constrained exceptional supersymmetric standard model

    NASA Astrophysics Data System (ADS)

    Athron, P.; Thomas, A. W.; Underwood, S. J.; White, M. J.

    2017-02-01

    The exceptional supersymmetric standard model is a low energy alternative to the minimal supersymmetric standard model (MSSM) with an extra U(1) gauge symmetry and three generations of matter filling complete 27-plet representations of E6. This provides both new D and F term contributions that raise the Higgs mass at tree level, and a compelling solution to the μ-problem of the MSSM by forbidding such a term with the extra U(1) symmetry. Instead, an effective μ-term is generated from the vacuum expectation value of an SM singlet which breaks the extra U(1) symmetry at low energies, giving rise to a massive Z'. We explore the phenomenology of the constrained version of this model in substantially more detail than has been carried out previously, performing a ten-dimensional scan that reveals a large volume of viable parameter space. We classify the different mechanisms for generating the measured relic density of dark matter found in the scan, including the identification of a new mechanism involving mixed bino/inert-Higgsino dark matter. We show which mechanisms can evade the latest direct detection limits from the LUX 2016 experiment. Finally we present benchmarks consistent with all the experimental constraints and which could be discovered with the XENON1T experiment.

  20. Dark Matter and Color Octets Beyond the Standard Model

    SciTech Connect

    Krnjaic, Gordan Zdenko

    2012-07-01

    Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues.

  1. Standard model and supersymmetric Higgs searches at CDF

    SciTech Connect

    Kilminster, Ben; /Ohio State U.

    2005-10-01

    We present the results of the searches for SM and MSSM Higgs boson production in proton-antiproton collisions at √s = 1.96 TeV with the CDF detector. The Higgs bosons are searched for in various production and decay channels, with data samples corresponding to 400 pb⁻¹. Using these measurements, we set an upper limit on the production cross section times branching fraction for the Standard Model Higgs as a function of the Higgs mass, and we obtain exclusion regions in the tan β vs. mass plane for the neutral MSSM Higgs, and in the branching fraction vs. mass plane for the charged Higgs.

  2. Naturalness of CP Violation in the Standard Model

    SciTech Connect

    Gibbons, Gary W.; Gielen, Steffen; Pope, C. N.; Turok, Neil

    2009-03-27

    We construct a natural measure on the space of Cabibbo-Kobayashi-Maskawa matrices in the standard model, assuming the fermion mass matrices are randomly selected from a distribution which incorporates the observed quark mass hierarchy. This measure allows us to assess the likelihood of Jarlskog's CP violation parameter J taking its observed value J ≈ 3×10⁻⁵. We find that the observed value, while well below the mathematically allowed maximum, is in fact typical once the observed quark masses are assumed.
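
    For reference, the CP-violation parameter discussed above is the Jarlskog invariant. In one common convention, and using the Wolfenstein parametrization of the CKM matrix, it is (schematically)

        J = \mathrm{Im}\left( V_{ud}\, V_{cs}\, V_{us}^{*}\, V_{cd}^{*} \right) \simeq A^{2}\lambda^{6}\bar{\eta} \approx 3\times 10^{-5},
        \qquad |J|_{\max} = \frac{1}{6\sqrt{3}} \approx 0.096,

    where the second number is the mathematically allowed maximum referred to in the abstract.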

  3. Electroweak baryogenesis in the exceptional supersymmetric standard model

    SciTech Connect

    Chao, Wei

    2015-08-28

    We study electroweak baryogenesis in the E₆-inspired exceptional supersymmetric standard model (E₆SSM). The relaxation coefficients driven by the singlinos and the new gaugino, as well as the transport equation of the Higgs supermultiplet number density in the E₆SSM, are calculated. Our numerical simulation shows that the CP-violating source terms from the singlinos and from the new gaugino can each, on their own, give rise to the correct baryon asymmetry of the Universe via the electroweak baryogenesis mechanism.

  4. Standard Model parton distributions at very high energies

    NASA Astrophysics Data System (ADS)

    Bauer, Christian W.; Ferland, Nicolas; Webber, Bryan R.

    2017-08-01

    We compute the leading-order evolution of parton distribution functions for all the Standard Model fermions and bosons up to energy scales far above the electroweak scale, where electroweak symmetry is restored. Our results include the 52 PDFs of the unpolarized proton, evolving according to the SU(3), SU(2), U(1), mixed SU(2)×U(1) and Yukawa interactions. We illustrate the numerical effects on parton distributions at large energies, and show that this can lead to important corrections to parton luminosities at a future 100 TeV collider.
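
    Schematically, the evolution referred to above is of DGLAP type, generalized so that the splitting kernels and couplings run over all of the unbroken-phase Standard Model interactions; suppressing polarization and flavour details, the leading-order form is

        \frac{\partial f_i(x,\mu^2)}{\partial \ln\mu^2}
          = \sum_{I} \frac{\alpha_I(\mu^2)}{2\pi} \sum_{j} \int_x^1 \frac{dz}{z}\, P^{I}_{ij}(z)\, f_j\!\left(\frac{x}{z},\mu^2\right),

    with I running over the SU(3), SU(2), U(1), mixed SU(2)×U(1) and Yukawa interactions listed in the abstract. This is only a schematic sketch of the structure of the equations, not the full system solved in the paper.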

  5. a Holographic View of Beyond the Standard Model Physics

    NASA Astrophysics Data System (ADS)

    Gherghetta, T.

    2011-03-01

    We provide an introduction to the physics of a warped extra dimension and the AdS/CFT correspondence. An AdS/CFT dictionary is given which leads to a 4D holographic view of the 5th dimension. With a particular emphasis on beyond the standard model physics, this provides a window into the strong dynamics associated with electroweak symmetry breaking and/or supersymmetry breaking. In this way hierarchies associated with the electroweak and/or supersymmetry breaking scale, together with the fermion mass spectrum, can be addressed in a consistent framework.

  6. Future high precision experiments and new physics beyond Standard Model

    SciTech Connect

    Luo, Mingxing

    1993-04-01

    High precision (< 1%) electroweak experiments that have been done or are likely to be done in this decade are examined on the basis of Standard Model (SM) predictions of fourteen weak neutral current observables and fifteen W and Z properties to the one-loop level. The implications of the corresponding experimental measurements for various types of possible new physics that enter at the tree or loop level are investigated. Certain experiments appear to have special promise as probes of the new physics considered here.

  7. Future high precision experiments and new physics beyond Standard Model

    SciTech Connect

    Luo, Mingxing.

    1993-01-01

    High precision (< 1%) electroweak experiments that have been done or are likely to be done in this decade are examined on the basis of Standard Model (SM) predictions of fourteen weak neutral current observables and fifteen W and Z properties to the one-loop level. The implications of the corresponding experimental measurements for various types of possible new physics that enter at the tree or loop level are investigated. Certain experiments appear to have special promise as probes of the new physics considered here.

  8. A theorem on the Higgs sector of the Standard Model

    NASA Astrophysics Data System (ADS)

    Frasca, Marco

    2016-06-01

    We provide the solution of the classical theory for the Higgs sector of the Standard Model, obtaining the exact Green's function for the broken phase. Solving the Dyson-Schwinger equations for the Higgs field, we show that the propagator coincides with that of the classical theory, confirming the spectrum also at the quantum level. In this way we obtain a proof of triviality using the Källén-Lehmann representation. This has as a consequence that higher excited states must exist for the Higgs particle, representing an internal spectrum for it. Higher excited states have exponentially smaller amplitudes and, so, their production rates are significantly depressed.

  9. Searches for the standard model Higgs boson at the Tevatron

    SciTech Connect

    Dorigo, Tommaso; /Padua U.

    2005-05-01

    The CDF and D0 experiments at the Tevatron have searched for the Standard Model Higgs boson in data collected between 2001 and 2004. Upper limits have been placed on the production cross section times branching ratio to bb̄ pairs or W⁺W⁻ pairs as a function of the Higgs boson mass. Projections indicate that the Tevatron experiments have a chance of discovering a M_H = 115 GeV Higgs with the total dataset foreseen by 2009, or excluding it at 95% C.L. up to a mass of 135 GeV.

  10. Quantum corrections in Higgs inflation: the Standard Model case

    SciTech Connect

    George, Damien P.; Mooij, Sander; Postma, Marieke E-mail: sander.mooij@ing.uchile.cl

    2016-04-01

    We compute the one-loop renormalization group equations for Standard Model Higgs inflation. The calculation is done in the Einstein frame, using a covariant formalism for the multi-field system. All counterterms, and thus the beta functions, can be extracted from the radiative corrections to the two-point functions; the calculation of higher n-point functions then serves as a consistency check of the approach. We find that the theory is renormalizable in the effective field theory sense in the small-, mid- and large-field regimes. In the large field regime our results differ slightly from those found in the literature, due to a different treatment of the Goldstone bosons.

  11. Searches for the standard model Higgs at the Tevatron

    SciTech Connect

    Kilminster, Ben; /Ohio State U.

    2007-05-01

    The CDF and D0 experiments at the Tevatron are currently the only experiments capable of searching for the Standard Model Higgs boson. This article describes their most sensitive searches in the expected Higgs mass range, focusing on advanced methods used to extract the maximal sensitivity from the data. CDF presents newly updated results for H → W⁺W⁻ and ZH → l⁺l⁻bb̄. D0 presents two new searches for WH → lνbb̄. These new analyses use the same 1 fb⁻¹ dataset as previous searches, but with improved techniques resulting in markedly improved sensitivity.

  12. Toward Standardizing a Lexicon of Infectious Disease Modeling Terms

    PubMed Central

    Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M.; Moghadas, Seyed M.

    2016-01-01

    Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models’ assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in health policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain. PMID:27734014

  13. Radiative electroweak symmetry breaking in standard model extensions

    NASA Astrophysics Data System (ADS)

    Babu, K. S.; Gogoladze, Ilia; Khan, S.

    2017-05-01

    We study the possibility of radiative electroweak symmetry breaking where loop corrections to the mass parameter of the Higgs boson trigger the symmetry breaking in various extensions of the Standard Model (SM). Although the mechanism fails in the SM, it is shown to be quite successful in several extensions which share a common feature of having an additional scalar around the TeV scale. The positive Higgs mass parameter at a high energy scale is turned negative in the renormalization group flow to lower energy by the cross couplings between the scalars in the Higgs potential. The type-II seesaw model with a TeV scale weak scalar triplet, a two-loop radiative neutrino mass model with new scalars at the TeV scale, the inert doublet model, scalar singlet dark matter model, and a universal seesaw model with an additional U(1) broken at the TeV scale are studied and shown to exhibit successful radiative electroweak symmetry breaking.

  14. Big bang nucleosynthesis: The standard model and alternatives

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1991-01-01

    Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances ranging from He-4 at 24% by mass through H-2 and He-3 at parts in 10⁵ down to Li-7 at parts in 10¹⁰. Furthermore, the recent Large Electron Positron (LEP) collider (and Stanford Linear Collider, SLC) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≈ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible is less than Ω_b.

  16. Colorado Model Content Standards for Science: Suggested Grade Level Expectations.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    This document outlines the content standards for science in the state of Colorado. The document is organized into six standards, each of which is subdivided into a set of guiding questions exemplifying the standard and a series of lists defining what is expected of students at each grade level within the standard. The standards are that students…

  17. Medical Education Again Provides a Model for Law Schools: The Standardized Patient becomes the Standardized Client.

    ERIC Educational Resources Information Center

    Grosberg, Lawrence M.

    2001-01-01

    Describes how medical schools have successfully used the "standardized patient" teaching technique, and the use of "standardized clients" at New York Law School. Proposes establishing consortiums among small groups of law schools to implement the standardized client technique, and using the technique in high stakes testing. (EV)

  18. Modeling the possible consequences of future global bioenergy demand for wood and forests in France

    Treesearch

    Joseph Buongiorno; Ronald Raunikar; Shushuai Zhu

    2011-01-01

    The article presents an exploration, carried out with a global forest-sector model, of the effect on the French forest sector of current and foreseeable changes in the global demand for biomass-derived energy. Two contrasting scenarios are tested. The results are put into perspective and highlight the potential conflict between uses of wood: wood...

  19. Modeling the wet bulb globe temperature using standard meteorological measurements.

    PubMed

    Liljegren, James C; Carhart, Richard A; Lawday, Philip; Tschopp, Stephen; Sharp, Robert

    2008-10-01

    The U.S. Army has a need for continuous, accurate estimates of the wet bulb globe temperature to protect soldiers and civilian workers from heat-related injuries, including those involved in the storage and destruction of aging chemical munitions at depots across the United States. At these depots, workers must don protective clothing that increases their risk of heat-related injury. Because of the difficulty in making continuous, accurate measurements of wet bulb globe temperature outdoors, the authors have developed a model of the wet bulb globe temperature that relies only on standard meteorological data available at each storage depot for input. The model is composed of separate submodels of the natural wet bulb and globe temperatures that are based on fundamental principles of heat and mass transfer, has no site-dependent parameters, and achieves an accuracy of better than 1 degree C based on comparisons with wet bulb globe temperature measurements at all depots.
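
    Once the natural wet bulb temperature and globe temperature have been estimated from the submodels described above, the wet bulb globe temperature itself is the standard weighted sum used for outdoor exposure assessment. The sketch below shows only this final combination step; the heat- and mass-transfer submodels that estimate the two component temperatures from standard meteorological data are not reproduced here.

        def wbgt_outdoor(t_nwb, t_globe, t_air):
            """Standard outdoor wet bulb globe temperature, all inputs in deg C.

            t_nwb   -- natural wet bulb temperature (from the wet-bulb submodel)
            t_globe -- globe temperature (from the globe submodel)
            t_air   -- dry bulb (air) temperature
            """
            # ISO 7243 weighting for outdoor conditions with a solar load.
            return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_air

        # Example with illustrative values only.
        print(round(wbgt_outdoor(25.0, 45.0, 33.0), 1))  # 29.8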

  20. Modeling the wet bulb globe temperature using standard meteorological measurements.

    SciTech Connect

    Liljegren, J. C.; Carhart, R. A.; Lawday, P.; Tschopp, S.; Sharp, R.; Decision and Information Sciences

    2008-10-01

    The U.S. Army has a need for continuous, accurate estimates of the wet bulb globe temperature to protect soldiers and civilian workers from heat-related injuries, including those involved in the storage and destruction of aging chemical munitions at depots across the United States. At these depots, workers must don protective clothing that increases their risk of heat-related injury. Because of the difficulty in making continuous, accurate measurements of wet bulb globe temperature outdoors, the authors have developed a model of the wet bulb globe temperature that relies only on standard meteorological data available at each storage depot for input. The model is composed of separate submodels of the natural wet bulb and globe temperatures that are based on fundamental principles of heat and mass transfer, has no site-dependent parameters, and achieves an accuracy of better than 1 C based on comparisons with wet bulb globe temperature measurements at all depots.

  1. Using clinical element models for pharmacogenomic study data standardization.

    PubMed

    Zhu, Qian; Freimuth, Robert R; Pathak, Jyotishman; Chute, Christopher G

    2013-01-01

    Standardized representations for pharmacogenomics data are seldom used, which leads to data heterogeneity and hinders data reuse and integration. In this study, we attempted to represent data elements from the Pharmacogenomics Research Network (PGRN) that are related to four categories, patient, drug, disease and laboratory, in a standard way using Clinical Element Models (CEMs), which have been adopted in the Strategic Health IT Advanced Research Project, secondary use of EHR (SHARPn) as a library of common logical models that facilitate consistent data representation, interpretation, and exchange within and across heterogeneous sources and applications. This was accomplished by grouping PGRN data elements into categories based on UMLS semantic type, then mapping each to one or more CEM attributes using a web-based tool that was developed to support curation activities. This study demonstrates the successful application of SHARPn CEMs to the pharmacogenomic domain. It also identified several categories of data elements that are not currently supported by SHARPn CEMs, which represent opportunities for further development and collaboration.
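
    A minimal sketch of the curation workflow described above, grouping data elements by UMLS semantic type before mapping each one to CEM attributes; every element name, semantic type and category below is a hypothetical placeholder rather than an actual PGRN or SHARPn identifier.

        from collections import defaultdict

        # Hypothetical PGRN-style data elements: (element name, UMLS semantic type).
        elements = [
            ("warfarin_daily_dose", "Pharmacologic Substance"),
            ("inr_result", "Laboratory Procedure"),
            ("atrial_fibrillation_dx", "Disease or Syndrome"),
            ("patient_birth_date", "Age Group"),
        ]

        # Hypothetical routing from semantic type to the four study categories.
        category_of = {
            "Pharmacologic Substance": "drug",
            "Laboratory Procedure": "laboratory",
            "Disease or Syndrome": "disease",
            "Age Group": "patient",
        }

        grouped = defaultdict(list)
        for name, semantic_type in elements:
            grouped[category_of[semantic_type]].append(name)

        # A curator would then map each element to one or more CEM attributes.
        for category in sorted(grouped):
            print(category, "->", grouped[category])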

  2. Penguin-like diagrams from the standard model

    SciTech Connect

    Ping, Chia Swee

    2015-04-24

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavor changing processes estimated.

  3. Electric-Magnetic Duality and the Dualized Standard Model

    NASA Astrophysics Data System (ADS)

    Tsou, Sheung Tsun

    In these lectures I shall explain how a new-found nonabelian duality can be used to solve some outstanding questions in particle physics. The first lecture introduces the concept of electromagnetic duality and goes on to present its nonabelian generalization in terms of loop space variables. The second lecture discusses certain puzzles that remain with the Standard Model of particle physics, particularly aimed at nonexperts. The third lecture presents a solution to these problems in the form of the Dualized Standard Model, first proposed by Chan and the author, using nonabelian dual symmetry. The fundamental particles exist in three generations, and if this is a manifestation of dual colour symmetry, which by 't Hooft's theorem is necessarily broken, then we have a natural explanation of the generation puzzle, together with tested and testable consequences not only in particle physics, but also in astrophysics, nuclear and atomic physics. Reported is mainly work done in collaboration with Chan Hong-Mo, and also various parts with Peter Scharbach, Jacqueline Faridani, José Bordes, Jakov Pfaudler, Ricardo Gallego severally.

  4. Vacuum stability in an extended standard model with a leptoquark

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Priyotosh; Mandal, Rusa

    2017-02-01

    We investigate the standard model with the extension of a charged scalar having fractional electromagnetic charge of -1/3 unit and with lepton and baryon number-violating couplings at tree level. Without directly taking part in the electroweak (EW) symmetry breaking, this scalar can affect stability of the EW vacuum via loop effects. The impact of such a scalar, i.e., a leptoquark, on the perturbativity of standard model dimensionless couplings as well as on new physics couplings has been studied at two-loop order. The vacuum stability of the Higgs potential is checked using the one-loop renormalization group-improved effective potential approach with a two-loop beta function for all the couplings. From the stability analysis, various bounds are drawn on parameter space by identifying the region corresponding to the metastability and stability of the EW vacuum. Later, we also address the Higgs mass fine-tuning issue via the Veltman condition, and the presence of such a scalar increases the scale up to which the theory can be considered as reasonably fine-tuned. All these constraints give a very predictive parameter space for leptoquark couplings which can be tested at present and future colliders. Especially, a leptoquark with mass O(TeV) can give rise to lepton-quark flavor-violating signatures via decaying into the tτ channel at tree level, which can be tested at the LHC or future colliders.
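
    For context, the Veltman condition mentioned above asks that the one-loop quadratically divergent correction to the Higgs mass parameter approximately cancel. In the Standard Model it is often written, up to convention-dependent prefactors, as

        \delta m_H^2 \;\propto\; \frac{\Lambda^2}{16\pi^2 v^2}\left( m_H^2 + 2 m_W^2 + m_Z^2 - 4 m_t^2 \right) \;\approx\; 0,

    a combination that is large and negative in the SM alone, since the top term dominates; additional scalars coupled to the Higgs can contribute terms that partially offset this, which is the sense in which they can raise the scale up to which the theory remains acceptably fine-tuned. This is a schematic reminder, not the specific expressions used in the paper.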

  5. Topological Higgs inflation: Origin of Standard Model criticality

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Oda, Kin-ya; Takahashi, Fuminobu

    2014-11-01

    The measured values of the Higgs and top masses and of the strong gauge coupling constant point to the near-criticality of the Standard Model, where two vacua at the electroweak and Planck scales are quasidegenerate. We argue that the criticality is required by the occurrence of an eternal topological inflation induced by the Higgs potential. The role of this inflation is to continuously create a sufficiently flat and homogeneous Universe, providing the necessary initial condition for the subsequent slow-roll inflation that generates the density perturbations of the right magnitude. While the condition for the topological Higgs inflation is only marginally satisfied in the Standard Model, it can be readily satisfied if one introduces the right-handed neutrinos and/or the nonminimal coupling to gravity; currently unknown quantum gravity corrections to the potential may also help. We also discuss the B-L Higgs inflation as a possible origin of the observed density perturbations. Its necessary initial condition, the restored B-L symmetry, can be naturally realized by the preceding topological Higgs inflation.

  6. Penguin-like diagrams from the standard model

    NASA Astrophysics Data System (ADS)

    Ping, Chia Swee

    2015-04-01

    The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavor changing processes estimated.

  7. Long-term archiving and data access: modelling and standardization

    NASA Technical Reports Server (NTRS)

    Hoc, Claude; Levoir, Thierry; Nonon-Latapie, Michel

    1996-01-01

    This paper reports on the multiple difficulties inherent in the long-term archiving of digital data, and in particular on the different possible causes of definitive data loss. It defines the basic principles which must be respected when creating long-term archives. Such principles concern both the archival systems and the data. The archival systems should have two primary qualities: independence of architecture with respect to technological evolution, and generic-ness, i.e., the capability of ensuring identical service for heterogeneous data. These characteristics are implicit in the Reference Model for Archival Services, currently being designed within an ISO-CCSDS framework. A system prototype has been developed at the French Space Agency (CNES) in conformance with these principles, and its main characteristics will be discussed in this paper. Moreover, the data archived should be capable of abstract representation regardless of the technology used, and should, to the extent that it is possible, be organized, structured and described with the help of existing standards. The immediate advantage of standardization is illustrated by several concrete examples. Both the positive facets and the limitations of this approach are analyzed. The advantages of developing an object-oriented data model within this context are then examined.

  8. Standard model with spontaneously broken quantum scale invariance

    NASA Astrophysics Data System (ADS)

    Ghilencea, D. M.; Lalak, Z.; Olszewski, P.

    2017-09-01

    We explore the possibility that scale symmetry is a quantum symmetry that is broken only spontaneously and apply this idea to the standard model. We compute the quantum corrections to the potential of the Higgs field (ϕ) in the classically scale-invariant version of the standard model (m_ϕ = 0 at tree level) extended by the dilaton (σ). The tree-level potential of ϕ and σ, dictated by scale invariance, may contain nonpolynomial effective operators, e.g., ϕ⁶/σ², ϕ⁸/σ⁴, ϕ¹⁰/σ⁶, etc. The one-loop scalar potential is scale invariant, since the loop calculations manifestly preserve the scale symmetry, with the dimensional regularization subtraction scale μ generated spontaneously by the dilaton vacuum expectation value, μ ∼ ⟨σ⟩. The Callan-Symanzik equation of the potential is verified in the presence of the gauge, Yukawa, and the nonpolynomial operators. The couplings of the nonpolynomial operators have nonzero beta functions that we can actually compute from the quantum potential. At the quantum level, the Higgs mass is protected by spontaneously broken scale symmetry, even though the theory is nonrenormalizable. We compare the one-loop potential to its counterpart computed in the "traditional" dimensional regularization scheme that breaks scale symmetry explicitly (μ = constant) in the presence at the tree level of the nonpolynomial operators.

  9. Generalized Pure Density Matrices and the Standard Model

    NASA Astrophysics Data System (ADS)

    Brannen, Carl

    2015-04-01

    We consider generalizations of pure density matrices that have ρρ = ρ , but give up the trace=1 requirement. Given a representation of a quantum algebra in N × N complex matrices, the elements that satisfy ρρ = ρ can be taken to be pure density matrix states. In the Standard Model, particles from different ``superselection sectors'' cannot form linear superpositions. For example, it is impossible to form a linear superposition between an electron and a neutrino. We report that some quantum algebras give symmetry, particle and generation content, gauge freedom, and superselection sectors that are similar to those of the Standard Model. Our lecture will consider as an example the 4 × 4 complex matrices. There are 16 that are diagonal with ρρ = ρ . The 4 with trace=1 give the usual pure density matrices. We will show that the 6 with trace=2 form an SU(3) triplet of three superselection sectors, with each sector consisting of an SU(2) doublet. Considering one of these sectors, the mapping to SU(2) is not unique; there is an SU(2) gauge freedom. This gauge freedom is an analogy to the U(1) gauge freedom that arises when converting a pure density matrix to a state vector.
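
    The counting quoted above for the 4 × 4 diagonal case is easy to reproduce: diagonal solutions of ρρ = ρ must have diagonal entries 0 or 1, giving 2⁴ = 16 matrices, of which C(4,1) = 4 have trace 1 and C(4,2) = 6 have trace 2. A short numerical check (illustration only):

        import itertools
        import numpy as np

        # Diagonal 4x4 matrices satisfying rho @ rho == rho have 0/1 diagonal entries.
        idempotents = [np.diag(d) for d in itertools.product((0, 1), repeat=4)]
        assert all(np.array_equal(r @ r, r) for r in idempotents)

        counts = {}
        for r in idempotents:
            counts[int(np.trace(r))] = counts.get(int(np.trace(r)), 0) + 1

        # trace 1: the 4 usual pure density matrices; trace 2: the 6 states discussed above.
        print(dict(sorted(counts.items())))  # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}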

  10. Myocardial scar imaging by standard single-energy and dual-energy late enhancement CT: Comparison with pathology and electroanatomic map in an experimental chronic infarct porcine model.

    PubMed

    Truong, Quynh A; Thai, Wai-Ee; Wai, Bryan; Cordaro, Kevin; Cheng, Teresa; Beaudoin, Jonathan; Xiong, Guanglei; Cheung, Jim W; Altman, Robert; Min, James K; Singh, Jagmeet P; Barrett, Conor D; Danik, Stephan

    2015-01-01

    Myocardial scar is a substrate for ventricular tachycardia and sudden cardiac death. Late enhancement CT imaging can detect scar, but it remains unclear whether newer late enhancement dual-energy (LE-DECT) acquisition has benefit over standard single-energy late enhancement (LE-CT). We aim to compare late enhancement CT using newer LE-DECT acquisition and single-energy LE-CT acquisitions with pathology and electroanatomic map (EAM) in an experimental chronic myocardial infarction (MI) porcine study. In 8 pigs with chronic myocardial infarction (59 ± 5 kg), we performed dual-source CT, EAM, and pathology. For CT imaging, we performed 3 acquisitions at 10 minutes after contrast administration: LE-CT 80 kV, LE-CT 100 kV, and LE-DECT with 2 postprocessing software settings. Of the sequences, LE-CT 100 kV provided the best contrast-to-noise ratio (all P ≤ .03) and correlation to pathology for scar (ρ = 0.88). LE-DECT overestimated scar (both P = .02), whereas LE-CT images did not (both P = .08). On a segment basis (n = 136), all CT sequences had high specificity (87%-93%) and modest sensitivity (50%-67%), with LE-CT 100 kV having the highest specificity of 93% for scar detection compared to pathology and agreement with EAM (κ = 0.69). Standard single-energy LE-CT, particularly 100 kV, matched better to pathology and EAM than dual-energy LE-DECT for scar detection. Larger human trials as well as more technical studies that optimize varying different energies with newer hardware and software are warranted. Copyright © 2015 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  11. A new apparatus for standardization of experimental burn models.

    PubMed

    Arda, Mehmet Surhan; Koçman, Atacan Emre; Söztutar, Erdem; Baksan, Bedri; Çetin, Cengiz

    2017-09-01

    Burns impose a severe economic burden on families and countries, so their treatment modalities are of utmost importance, and numerous experimental and clinical studies have been reported accordingly. Although contact burns are frequently used models, most of them have been designed manually, with only the elapsed time recorded, even though the real-time contact surface temperature (T) and weight force (WF) are fundamental characteristics of a burn model. The aim of this study is to create a standard burn model by recording real-time variables with a custom-designed apparatus. A custom-designed apparatus was manufactured in which the real-time T, WF and elapsed time could be set and recorded, with a vertical angle provided to ensure the applied WF. Sprague-Dawley rats were randomly divided into four groups: (1) burned at 60±1°C with low WF (G60WFL), (2) burned at 60±1°C with high WF (G60WFH), (3) burned at 80±1°C with low WF (G80WFL), (4) burned at 80±1°C with high WF (G80WFH). The healthy skin thickness and burn depth were measured, and the percentage of burn depth relative to healthy skin thickness was used for statistical analysis. Constant T and WF were achieved. The pressure applied on the skin did not differ significantly between the low [G60WFL vs G80WFL (p=0.1704)] and high [G60WFH vs G80WFH (p=0.2369)] WF groups. However, the percentage of burn depth increased with the applied WF in the 60°C group [G60WFL vs G60WFH (p=0.0125)] and in the 80°C group [G80WFL vs G80WFH (p<0.0001)], and it also increased significantly with the set T, in the low WF group [G60WFL vs G80WFL (p<0.0001)] and the high WF group [G60WFH vs G80WFH (p<0.0001)]; neither T nor WF has priority over the other. Without recording the real-time T and WF, it is infeasible to achieve a standard burn model. For a standard depth of burn, these variables should be under control, as with our custom-designed apparatus. Copyright © 2017 Elsevier Ltd and ISBI. All rights reserved.
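
    A trivial sketch of the outcome measure used above, expressing burn depth as a percentage of the healthy skin thickness measured at the same site (illustrative values only; both inputs in the same units):

        def burn_depth_percentage(burn_depth, healthy_skin_thickness):
            """Burn depth as a percentage of healthy skin thickness."""
            if healthy_skin_thickness <= 0:
                raise ValueError("healthy skin thickness must be positive")
            return 100.0 * burn_depth / healthy_skin_thickness

        # Example with made-up numbers: a 1.2 mm deep burn in 2.0 mm thick skin.
        print(burn_depth_percentage(1.2, 2.0))  # 60.0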

  12. Type II Supernovae: Model Light Curves and Standard Candle Relationships

    NASA Astrophysics Data System (ADS)

    Kasen, Daniel; Woosley, S. E.

    2009-10-01

    A survey of Type II supernovae explosion models has been carried out to determine how their light curves and spectra vary with their mass, metallicity, and explosion energy. The presupernova models are taken from a recent survey of massive stellar evolution at solar metallicity supplemented by new calculations at subsolar metallicity. Explosions are simulated by the motion of a piston near the edge of the iron core and the resulting light curves and spectra are calculated using full multi-wavelength radiation transport. Formulae are developed that describe approximately how the model observables (light curve luminosity and duration) scale with the progenitor mass, explosion energy, and radioactive nucleosynthesis. Comparison with observational data shows that the explosion energy of typical supernovae (as measured by kinetic energy at infinity) varies by nearly an order of magnitude, from 0.5 to 4.0 × 10⁵¹ ergs, with a typical value of ~0.9 × 10⁵¹ ergs. Despite the large variation, the models exhibit a tight relationship between luminosity and expansion velocity, similar to that previously employed empirically to make SNe IIP standardized candles. This relation is explained by the simple behavior of hydrogen recombination in the supernova envelope, but we find a sensitivity to progenitor metallicity and mass that could lead to systematic errors. Additional correlations between light curve luminosity, duration, and color might enable the use of SNe IIP to obtain distances accurate to ~20% using only photometric data.

  13. TYPE II SUPERNOVAE: MODEL LIGHT CURVES AND STANDARD CANDLE RELATIONSHIPS

    SciTech Connect

    Kasen, Daniel; Woosley, S. E.

    2009-10-01

    A survey of Type II supernovae explosion models has been carried out to determine how their light curves and spectra vary with their mass, metallicity, and explosion energy. The presupernova models are taken from a recent survey of massive stellar evolution at solar metallicity supplemented by new calculations at subsolar metallicity. Explosions are simulated by the motion of a piston near the edge of the iron core and the resulting light curves and spectra are calculated using full multi-wavelength radiation transport. Formulae are developed that describe approximately how the model observables (light curve luminosity and duration) scale with the progenitor mass, explosion energy, and radioactive nucleosynthesis. Comparison with observational data shows that the explosion energy of typical supernovae (as measured by kinetic energy at infinity) varies by nearly an order of magnitude, from 0.5 to 4.0 × 10⁵¹ ergs, with a typical value of ≈0.9 × 10⁵¹ ergs. Despite the large variation, the models exhibit a tight relationship between luminosity and expansion velocity, similar to that previously employed empirically to make SNe IIP standardized candles. This relation is explained by the simple behavior of hydrogen recombination in the supernova envelope, but we find a sensitivity to progenitor metallicity and mass that could lead to systematic errors. Additional correlations between light curve luminosity, duration, and color might enable the use of SNe IIP to obtain distances accurate to ≈20% using only photometric data.

  14. Experimental validation of Swy-2 clay standard's PHREEQC model

    NASA Astrophysics Data System (ADS)

    Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György

    2017-04-01

    One of the challenges of the present century is to limit greenhouse gas emissions to mitigate climate change, which is possible, for example, through a transitional technology such as CCS (Carbon Capture and Storage) and, among others, by increasing the proportion of nuclear power in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2 and they are also the main constituents of bentonite in high level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments of well-known standards and coupled geochemical models. Such experimentally validated models are scarce even though they allow deriving more precise long-term predictions of mineral reactions and rock and bentonite degradation underground and, therefore, help ensure the safety of the above technologies and increase their public acceptance. This ongoing work aims to create a kinetic geochemical model of Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock:water ratio under atmospheric conditions, and with an inert or supercritical CO2 phase at 100 bar and 80 °C, conditions relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to in-parallel measured references (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed by PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and numerous modelling results has been automated with R. Experiments and models show very fast
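
    For reference, the kinetic rate expressions of Palandri and Kharaka (2004) used in such PHREEQC models have, schematically, the general form of acid, neutral and base mechanisms, each with an Arrhenius temperature dependence, multiplied by an affinity term:

        r \;=\; S\left[ k_{\mathrm{acid}}\, a_{\mathrm{H^+}}^{\,n_1} + k_{\mathrm{neutral}} + k_{\mathrm{base}}\, a_{\mathrm{H^+}}^{\,n_2} \right]\left(1-\Omega^{\,p}\right)^{q},
        \qquad
        k_i \;=\; k_i^{298}\exp\!\left[\frac{-E_{a,i}}{R}\left(\frac{1}{T}-\frac{1}{298.15\,\mathrm{K}}\right)\right],

    where S is the reactive surface area and Ω the mineral saturation ratio; the mineral-specific rate constants, activation energies and exponents are tabulated in the report cited above. This is a schematic reminder of the rate-law structure, not the calibrated Swy-2 model itself.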

  15. Application of standards and models in body composition analysis.

    PubMed

    Müller, Manfred J; Braun, Wiebke; Pourhassan, Maryam; Geisler, Corinna; Bosy-Westphal, Anja

    2016-05-01

    The aim of this review is to extend present concepts of body composition and to integrate it into physiology. In vivo body composition analysis (BCA) has a sound theoretical and methodological basis. Present methods used for BCA are reliable and valid. Individual data on body components, organs and tissues are included into different models, e.g. a 2-, 3-, 4- or multi-component model. Today the so-called 4-compartment model as well as whole body MRI (or computed tomography) scans are considered as gold standards of BCA. In practice the use of the appropriate method depends on the question of interest and the accuracy needed to address it. Body composition data are descriptive and used for normative analyses (e.g. generating normal values, centiles and cut offs). Advanced models of BCA go beyond description and normative approaches. The concept of functional body composition (FBC) takes into account the relationships between individual body components, organs and tissues and related metabolic and physical functions. FBC can be further extended to the model of healthy body composition (HBC) based on horizontal (i.e. structural) and vertical (e.g. metabolism and its neuroendocrine control) relationships between individual components as well as between component and body functions using mathematical modelling with a hierarchical multi-level multi-scale approach at the software level. HBC integrates into whole body systems of cardiovascular, respiratory, hepatic and renal functions. To conclude BCA is a prerequisite for detailed phenotyping of individuals providing a sound basis for in depth biomedical research and clinical decision making.
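
    As a classical example of the simplest (two-component) approach mentioned above, body density from underwater weighing or plethysmography can be converted to a fat fraction with Siri's equation; this is a textbook illustration, not a formula taken from the review itself:

        \mathrm{fat\ fraction} \;=\; \frac{4.95}{D_b} - 4.50,

    with the body density D_b in g/cm³, so that D_b = 1.07 g/cm³ corresponds to roughly 13% body fat.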

  16. Wisconsin's Model Academic Standards for Art and Design Education.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    This Wisconsin academic standards guide for art and design explains what is meant by academic standards. The guide declares that academic standards specify what students should know and be able to do; what students might be asked to do to give evidence of standards; how well students must perform; and that content, performance, and proficiency…

  17. Evolution of Climate Science Modelling Language within international standards frameworks

    NASA Astrophysics Data System (ADS)

    Lowe, Dominic; Woolf, Andrew

    2010-05-01

    The Climate Science Modelling Language (CSML) was originally developed as part of the NERC Data Grid (NDG) project in the UK. It was one of the first Geography Markup Language (GML) application schemas describing complex feature types for the metocean domain. CSML feature types can be used to describe typical climate products such as model runs or atmospheric profiles. CSML has been successfully used within NDG to provide harmonised access to a number of different data sources. For example, meteorological observations held in heterogeneous databases by the British Atmospheric Data Centre (BADC) and Centre for Ecology and Hydrology (CEH) were served uniformly as CSML features via Web Feature Service. CSML has now been substantially revised to harmonise it with the latest developments in OGC and ISO conceptual modelling for geographic information. In particular, CSML is now aligned with the near-final ISO 19156 Observations & Measurements (O&M) standard. CSML combines the O&M concept of 'sampling features' together with an observation result based on the coverage model (ISO 19123). This general pattern is specialised for particular data types of interest, classified on the basis of sampling geometry and topology. In parallel work, the OGC Met Ocean Domain Working Group has established a conceptual modelling activity. This is a cross-organisational effort aimed at reaching consensus on a common core data model that could be re-used in a number of met-related application areas: operational meteorology, aviation meteorology, climate studies, and the research community. It is significant to note that this group has also identified sampling geometry and topology as a key classification axis for data types. Using the Model Driven Architecture (MDA) approach as adopted by INSPIRE we demonstrate how the CSML application schema is derived from a formal UML conceptual model based on the ISO TC211 framework. By employing MDA tools which map consistently between UML and GML we

  18. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increased number of nations around the world need to develop tsunami mitigation plans which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency management. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes

  19. Delayed standard neural network models for control systems.

    PubMed

    Liu, Meiqin

    2007-09-01

    To conveniently analyze the stability of recurrent neural networks (RNNs) and to synthesize controllers for nonlinear systems, a novel neural network model, the delayed standard neural network model (DSNNM), is presented, playing a role similar to that of the nominal model in linear robust control theory: it is the interconnection of a linear dynamic system and a bounded static delayed (or nondelayed) nonlinear operator. By combining a number of different Lyapunov functionals with the S-procedure, useful criteria for global asymptotic stability and global exponential stability of continuous-time DSNNMs (CDSNNMs) and discrete-time DSNNMs (DDSNNMs) are derived, with conditions formulated as linear matrix inequalities (LMIs). Based on the stability analysis, state-feedback control laws for the DSNNM with input and output are designed to stabilize the closed-loop systems. Most RNNs and neurocontrolled nonlinear systems with (or without) time delays can be transformed into DSNNMs so that stability analysis and stabilization synthesis can be carried out in a unified way. In this paper, the DSNNMs are applied to analyzing the stability of continuous-time and discrete-time RNNs with or without time delays, and to synthesizing state-feedback controllers for a chaotic neural network system and a discrete-time nonlinear system. It turns out that the DSNNM makes the stability conditions of the RNNs easy to verify, and provides a new approach to the synthesis of controllers for nonlinear systems.
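
    As a rough illustration of the kind of LMI feasibility test involved, the sketch below checks a circle-criterion-style condition for a non-delayed Lur'e system x' = Ax + Bphi(Cx) with phi confined to the sector [0, I]; the matrices and the specific LMI are illustrative assumptions and do not reproduce the delayed criteria derived in the paper.

    ```python
    # Minimal LMI feasibility sketch in the spirit of the DSNNM stability tests:
    # linear block x' = A x + B phi(C x), static nonlinearity phi in sector [0, I].
    import numpy as np
    import cvxpy as cp

    A = np.array([[-2.0, 0.5],
                  [-0.3, -1.5]])
    B = np.array([[0.4],
                  [0.2]])
    C = np.array([[1.0, 0.0]])
    n, m = A.shape[0], B.shape[1]

    P = cp.Variable((n, n), symmetric=True)   # Lyapunov matrix
    lam = cp.Variable(m, nonneg=True)         # S-procedure multipliers
    L = cp.diag(lam)

    # Circle-criterion LMI: negative definiteness certifies absolute stability.
    lmi = cp.bmat([[A.T @ P + P @ A, P @ B + C.T @ L],
                   [B.T @ P + L @ C, -2 * L]])

    eps = 1e-6
    constraints = [P >> eps * np.eye(n),
                   lmi << -eps * np.eye(n + m)]

    prob = cp.Problem(cp.Minimize(0), constraints)
    prob.solve()
    print("LMI feasible (global asymptotic stability certified):",
          prob.status == cp.OPTIMAL)
    ```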

  20. New extended standard model, dark matters and relativity theory

    NASA Astrophysics Data System (ADS)

    Hwang, Jae-Kwang

    2016-03-01

    Three-dimensional quantized space model is newly introduced as the extended standard model. Four three-dimensional quantized spaces, with 12 dimensions in total, are used to explain the universes including ours. Electric (EC), lepton (LC) and color (CC) charges are defined to be the charges of the x1x2x3, x4x5x6 and x7x8x9 warped spaces, respectively. Then, the lepton is the xi(EC)-xj(LC) correlated state, which makes 3x3 = 9 leptons, and the quark is the xi(EC)-xj(LC)-xk(CC) correlated state, which makes 3x3x3 = 27 quarks. Three new bastons with the xi(EC) state are proposed as the dark matter seen in the x1x2x3 space, too. The matter-universe question, the three generations of leptons and quarks, dark matter and dark energy, hadronization, the big bang, quantum entanglement, quantum mechanics and general relativity are briefly discussed in terms of this new model. The details can be found in the article titled "Journey into the universe; three-dimensional quantized spaces, elementary particles and quantum mechanics" at https://www.researchgate.net/profile/J_Hwang2.

  1. Quantum gravity and Lorentz invariance violation in the standard model.

    PubMed

    Alfaro, Jorge

    2005-06-10

    The most important problem of fundamental physics is the quantization of the gravitational field. A main difficulty is the lack of available experimental tests that discriminate among the theories proposed to quantize gravity. Recently, Lorentz invariance violation by quantum gravity (QG) has been the source of growing interest. However, the predictions depend on an ad hoc hypothesis and too many arbitrary parameters. Here we show that the standard model itself contains tiny Lorentz invariance violation terms coming from QG. All terms depend on one arbitrary parameter alpha that sets the scale of QG effects. This parameter can be estimated using data from the ultrahigh energy cosmic ray spectrum to be |α| ≲ 10⁻²²-10⁻²³.

  2. Sakurai Prize: Beyond the Standard Model Higgs Boson

    NASA Astrophysics Data System (ADS)

    Haber, Howard

    2017-01-01

    The discovery of the Higgs boson strongly suggests that the first elementary spin 0 particle has been observed. Is the Higgs boson a solo act, or are there additional Higgs bosons to be discovered? Given that there are three generations of fundamental fermions, one might also expect the sector of fundamental scalars of nature to be non-minimal. However, there are already strong constraints on the possible structure of an extended Higgs sector. In this talk, I review the theoretical motivations that have been put forward for an extended Higgs sector and discuss its implications in light of the observation that the properties of the observed Higgs boson are close to those predicted by the Standard Model. supported in part by the U.S. Department of Energy Grant Number DE-SC0010107.

  3. A Hierarchical Model for Accuracy and Choice on Standardized Tests.

    PubMed

    Culpepper, Steven Andrew; Balamuta, James Joseph

    2015-11-25

    This paper assesses the psychometric value of allowing test-takers choice in standardized testing. New theoretical results examine the conditions where allowing choice improves score precision. A hierarchical framework is presented for jointly modeling the accuracy of cognitive responses and item choices. The statistical methodology is disseminated in the 'cIRT' R package. An 'answer two, choose one' (A2C1) test administration design is introduced to avoid challenges associated with nonignorable missing data. Experimental results suggest that the A2C1 design and payout structure encouraged subjects to choose items consistent with their cognitive trait levels. Substantively, the experimental data suggest that item choices yielded comparable information and discrimination ability as cognitive items. Given there are no clear guidelines for writing more or less discriminating items, one practical implication is that choice can serve as a mechanism to improve score precision.
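
    The abstract does not give the accuracy model's functional form; as an illustrative assumption, hierarchical IRT frameworks of this kind commonly build on a two-parameter logistic (2PL) accuracy component, with the item-choice component linked to it at a higher level:

    ```latex
    % Illustrative 2PL accuracy component (an assumption; the paper's exact
    % hierarchical parameterization is not given in the abstract):
    \[
      \Pr\!\left(Y_{ij}=1 \mid \theta_j\right)
      \;=\;
      \frac{1}{1+\exp\!\left[-a_i\left(\theta_j-b_i\right)\right]},
    \]
    % with cognitive trait \theta_j for examinee j, and discrimination a_i and
    % difficulty b_i for item i; item choices are modeled jointly through
    % correlated latent traits at a higher level of the hierarchy.
    ```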

  4. Baryon number dissipation at finite temperature in the standard model

    SciTech Connect

    Mottola, E.; Raby, S. (Dept. of Physics); Starkman, G. (Dept. of Astronomy)

    1990-01-01

    We analyze the phenomenon of baryon number violation at finite temperature in the standard model, and derive the relaxation rate for the baryon density in the high-temperature electroweak plasma. The relaxation rate γ is given in terms of real-time correlation functions of the operator E·B, and is directly proportional to the sphaleron transition rate Γ: γ ≲ n_f Γ/T³. Hence it is not instanton suppressed, as claimed by Cohen, Dugan and Manohar (CDM). We show explicitly how this result is consistent with the methods of CDM, once it is recognized that a new anomalous commutator is required in their approach. 19 refs., 2 figs.
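
    In equation form, the quoted result says that the baryon density relaxes at a rate set by the sphaleron transition rate (the notation follows the abstract; the precise prefactor is not reproduced here):

    ```latex
    % Relaxation of the baryon density in the hot electroweak plasma (schematic).
    \[
      \frac{\mathrm{d} n_B}{\mathrm{d} t} \;=\; -\,\gamma\, n_B,
      \qquad
      \gamma \;\lesssim\; \frac{n_f\,\Gamma}{T^{3}},
    \]
    % with n_f the number of fermion families and \Gamma the sphaleron rate
    % per unit volume.
    ```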

  5. DsixTools: the standard model effective field theory toolkit

    NASA Astrophysics Data System (ADS)

    Celis, Alejandro; Fuentes-Martín, Javier; Vicente, Avelino; Virto, Javier

    2017-06-01

    We present DsixTools, a Mathematica package for the handling of the dimension-six Standard Model effective field theory. Among other features, DsixTools allows the user to perform the full one-loop renormalization group evolution of the Wilson coefficients in the Warsaw basis. This is achieved thanks to the SMEFTrunner module, which implements the full one-loop anomalous dimension matrix previously derived in the literature. In addition, DsixTools also contains modules devoted to the matching onto the ΔB = ΔS = 1,2 and ΔB = ΔC = 1 operators of the Weak Effective Theory at the electroweak scale, and their QCD and QED renormalization group evolution below the electroweak scale.
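
    Schematically, the evolution performed by the SMEFTrunner module has the generic one-loop form below; the basis ordering and normalization conventions used by DsixTools itself are defined in its documentation and are not reproduced here.

    ```latex
    % Generic one-loop SMEFT running of the Warsaw-basis Wilson coefficients C_i,
    % with anomalous dimension matrix gamma_{ij} (schematic form only).
    \[
      \mu\,\frac{\mathrm{d} C_i}{\mathrm{d}\mu}
      \;=\;
      \frac{1}{16\pi^{2}}\,\gamma_{ij}\,C_j .
    \]
    ```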

  6. CosPA 2015 and the Standard Model

    NASA Astrophysics Data System (ADS)

    Pauchy Hwang, W.-Y.

    2016-07-01

    In this keynote speech, I describe briefly "The Universe", a journal/newsletter launched by the APCosPA Organization, and my lifetime research on the Standard Model of particle physics. In this 21st century, we should declare that we live in the quantum 4-dimensional Minkowski space-time with the force-field gauge-group structure SU_c(3) × SU_L(2) × U(1) × SU_f(3) built in from the very beginning. This background can see the lepton world, of atomic size, and offers us the eyes to see other things. It also can see the quark world, of Fermi size, and this fact makes this entire world much more interesting.

  7. A predictive standard model for heavy electron systems

    SciTech Connect

    Yang, Yifeng; Curro, N J; Fisk, Z; Pines, D

    2010-01-01

    We propose a predictive standard model for heavy electron systems based on a detailed phenomenological two-fluid description of existing experimental data. It leads to a new phase diagram that replaces the Doniach picture, describes the emergent anomalous scaling behavior of the heavy electron (Kondo) liquid measured below the lattice coherence temperature T* (seen by many different experimental probes), which marks the onset of collective hybridization, and enables one to obtain important information on quantum criticality and the superconducting/antiferromagnetic states at low temperatures. Because T* ≈ J²ρ/2, the nearest-neighbor RKKY interaction, knowledge of the single-ion Kondo coupling J to the background conduction-electron density of states ρ makes it possible to predict Kondo liquid behavior, and to estimate its maximum superconducting transition temperature in both existing and newly discovered heavy electron families.

  8. Image contrast enhancement based on a local standard deviation model

    SciTech Connect

    Chang, Dah-Chung; Wu, Wen-Rong

    1996-12-31

    The adaptive contrast enhancement (ACE) algorithm is a widely used image enhancement method, which needs a contrast gain to adjust the high-frequency components of an image. In the literature, the gain is usually either inversely proportional to the local standard deviation (LSD) or a constant. Both choices cause problems in practical applications, namely noise over-enhancement and ringing artifacts. In this paper a new gain is developed, based on Hunt's Gaussian image model, to prevent these two defects. The new gain is a nonlinear function of the LSD and has the desired characteristic of emphasizing the LSD regions in which details are concentrated. We have applied the new ACE algorithm to chest X-ray images, and the simulations show the effectiveness of the proposed algorithm.
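
    A minimal sketch of such an ACE step is shown below; the particular nonlinear gain, which peaks at moderate local standard deviation and relaxes for very low (noise) and very high (edge) values, is an illustrative stand-in, not the gain derived from Hunt's Gaussian image model in the paper.

    ```python
    # Sketch of adaptive contrast enhancement (ACE) with an LSD-dependent gain.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def ace_enhance(image: np.ndarray, window: int = 15,
                    gain_max: float = 3.0) -> np.ndarray:
        """Amplify high-frequency detail with a gain that depends on the local std."""
        img = image.astype(float)
        local_mean = uniform_filter(img, size=window)
        local_sqr_mean = uniform_filter(img ** 2, size=window)
        local_std = np.sqrt(np.maximum(local_sqr_mean - local_mean ** 2, 0.0))

        # Nonlinear gain: close to 1 in flat (low-LSD) regions to avoid noise
        # over-enhancement, peaks at moderate LSD where detail concentrates,
        # and relaxes toward 1 at strong edges to limit ringing.
        s0 = local_std.mean() + 1e-6
        r = local_std / s0
        gain = 1.0 + (gain_max - 1.0) * r * np.exp(1.0 - r)

        enhanced = local_mean + gain * (img - local_mean)
        return np.clip(enhanced, 0.0, 255.0)

    # Usage (assuming an 8-bit grayscale chest radiograph as a NumPy array):
    # enhanced = ace_enhance(xray_array, window=31, gain_max=2.5)
    ```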

  9. Physics Beyond the Standard Model from Molecular Hydrogen Spectroscopy

    NASA Astrophysics Data System (ADS)

    Ubachs, Wim; Salumbides, Edcel John; Bagdonaite, Julija

    2015-06-01

    The spectrum of molecular hydrogen can be measured in the laboratory to very high precision using advanced laser and molecular beam techniques, as well as frequency-comb based calibration [1,2]. The quantum level structure of this smallest neutral molecule can now be calculated to very high precision, based on a very accurate (10⁻¹⁵ precision) Born-Oppenheimer potential [3] and including subtle non-adiabatic, relativistic and quantum electrodynamic effects [4]. Comparison between theory and experiment yields a test of QED, and in fact of the Standard Model of Physics, since the weak, strong and gravitational forces have a negligible effect. Even fifth forces beyond the Standard Model can be searched for [5]. Astronomical observation of molecular hydrogen spectra, using the largest telescopes on Earth and in space, may reveal possible variations of fundamental constants on a cosmological time scale [6]. A study has been performed at a 'look-back' time of 12.5 billion years [7]. In addition, the possible dependence of a fundamental constant on a gravitational field has been investigated from observation of molecular hydrogen in the photospheres of white dwarfs [8]. The latter involves a test of Einstein's equivalence principle. [1] E.J. Salumbides et al., Phys. Rev. Lett. 107, 143005 (2011). [2] G. Dickenson et al., Phys. Rev. Lett. 110, 193601 (2013). [3] K. Pachucki, Phys. Rev. A 82, 032509 (2010). [4] J. Komasa et al., J. Chem. Theory Comput. 7, 3105 (2011). [5] E.J. Salumbides et al., Phys. Rev. D 87, 112008 (2013). [6] F. van Weerdenburg et al., Phys. Rev. Lett. 106, 180802 (2011). [7] J. Bagdonaite et al., Phys. Rev. Lett. 114, 071301 (2015). [8] J. Bagdonaite et al., Phys. Rev. Lett. 113, 123002 (2014).

  10. Physics beyond the Standard Model from hydrogen spectroscopy

    NASA Astrophysics Data System (ADS)

    Ubachs, W.; Koelemeij, J. C. J.; Eikema, K. S. E.; Salumbides, E. J.

    2016-02-01

    Spectroscopy of hydrogen can be used for a search into physics beyond the Standard Model. Differences between the absorption spectra of the Lyman and Werner bands of H2 as observed at high redshift and those measured in the laboratory can be interpreted in terms of possible variations of the proton-electron mass ratio μ = m_p/m_e over cosmological history. Investigation of ten such absorbers in the redshift range z = 2.0-4.2 yields a constraint of |Δμ/μ| < 5 × 10⁻⁶ at 3σ. Observation of H2 from the photospheres of white dwarf stars inside our Galaxy delivers a constraint of similar magnitude on a dependence of μ on a gravitational potential 10⁴ times as strong as that on the Earth's surface. While such astronomical studies aim at finding quintessence in an indirect manner, laboratory precision measurements target such additional quantum fields in a direct manner. Laser-based precision measurements of dissociation energies, vibrational splittings and rotational level energies in H2 molecules and their deuterated isotopomers HD and D2 produce values for the rovibrational binding energies fully consistent with quantum ab initio calculations including relativistic and quantum electrodynamical (QED) effects. Similarly, precision measurements of high-overtone vibrational transitions of HD+ ions, captured in ion traps and sympathetically cooled to mK temperatures, also result in transition frequencies fully consistent with calculations including QED corrections. Precision measurements of inter-Rydberg transitions in H2 can be extrapolated to yield accurate values for level splittings in the H2+ ion. These comprehensive results of laboratory precision measurements on neutral and ionic hydrogen molecules can be interpreted to set bounds on the existence of possible fifth forces and of higher dimensions, phenomena describing physics beyond the Standard Model.

  11. Theories beyond the standard model, one year before the LHC

    NASA Astrophysics Data System (ADS)

    Dimopoulos, Savas

    2006-04-01

    Next year the Large Hadron Collider at CERN will begin what may well be a new golden era of particle physics. I will discuss three theories that will be tested at the LHC. I will begin with the supersymmetric standard model, proposed with Howard Georgi in 1981. This theory made a precise quantitative prediction, the unification of couplings, which was experimentally confirmed in 1991 by experiments at CERN and SLAC. This established it as the leading theory for physics beyond the standard model. Its main prediction, the existence of supersymmetric particles, will be tested at the Large Hadron Collider. I will next overview theories with large new dimensions, proposed with Nima Arkani-Hamed and Gia Dvali in 1998. This links the weakness of gravity to the presence of sub-millimeter-size dimensions, which are presently searched for in experiments looking for deviations from Newton's law at short distances. In this framework quantum gravity, string theory, and black holes may be experimentally investigated at the Large Hadron Collider. I will end with the recent proposal of split supersymmetry with Nima Arkani-Hamed. This theory is motivated by the possible existence of an enormous number of ground states in the fundamental theory, as suggested by the cosmological constant problem and recent developments in string theory and cosmology. It can be tested at the Large Hadron Collider and, if confirmed, it will lend support to the idea that our universe and its laws are not unique and that there is an enormous variety of universes each with its own distinct physical laws.

  12. Standard Model in multiscale theories and observational constraints

    NASA Astrophysics Data System (ADS)

    Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David

    2016-08-01

    We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called the fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t* < 10⁻²³ s. For the natural choice α₀ = 1/2 of the fractional exponent in the measure, this bound is strengthened to t* < 10⁻²⁹ s, corresponding to ℓ* < 10⁻²⁰ m and E* > 28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t* < 10⁻¹³ s and E* > 35 MeV. For α₀ = 1/2, the Lamb shift alone yields t* < 10⁻²⁷ s, ℓ* < 10⁻¹⁹ m and E* > 450 GeV.

  13. Observational consequences of the standard model Higgs inflation variants

    SciTech Connect

    Popa, L.A.

    2011-10-01

    We consider the possibility to observationally differentiate the Standard Model (SM) Higgs driven inflation with non-minimal coupling to gravity from other variants of SM Higgs inflation based on the scalar field theories with non-canonical kinetic term such as Galileon-like kinetic term and kinetic term with non-minimal derivative coupling to the Einstein tensor. In order to ensure consistent results, we study the SM Higgs inflation variants by using the same method, computing the full dynamics of the background and perturbations of the Higgs field during inflation at quantum level. Assuming that all the SM Higgs inflation variants are consistent theories, we use the MCMC technique to derive constraints on the inflationary parameters and the Higgs boson mass from their fit to WMAP7+SN+BAO data set. We conclude that a combination of the SM Higgs mass measurement by the LHC and accurate determination by the PLANCK satellite of the spectral index of curvature perturbations and tensor-to-scalar ratio will enable to distinguish among these models. We also show that the consistency relations of the SM Higgs inflation variants are distinct enough to differentiate among them.

  14. Wisconsin's Model Academic Standards for Business. Bulletin No. 9004.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    This document contains standards for the academic content of the Wisconsin K-12 curriculum in the area of business. Developed by task forces of educators, parents, board of education members, and employers and employees, the standards cover content, performance, and proficiency areas. They are cross-referenced to the state standards for English…

  15. Setting, Evaluating, and Maintaining Certification Standards with the Rasch Model.

    ERIC Educational Resources Information Center

    Grosse, Martin E.; Wright, Benjamin D.

    1986-01-01

    Based on the standard setting procedures of the American Board of Preventive Medicine for their Core Test, this article describes how Rasch measurement can facilitate using test content judgments in setting a standard. Rasch measurement can then be used to evaluate and improve the precision of the standard and to hold it constant across time…

  17. Locally-excited (LE) versus charge-transfer (CT) excited state competition in a series of para-substituted neutral green fluorescent protein (GFP) chromophore models.

    PubMed

    Olsen, Seth

    2015-02-12

    In this paper, I provide a characterization of the low-energy electronic structure of a series of para-substituted neutral green fluorescent protein (GFP) chromophore models using a theoretical approach that blends linear free energy relationships (LFERs) with state-averaged complete-active-space self-consistent field (SA-CASSCF) theory. The substituents are chosen to sample the Hammett σ(p) scale from R = F to NH2, and a model of the neutral GFP chromophore structure (R = OH) is included. I analyze the electronic structure for different members of the series in a common complete-active-space valence-bond (CASVB) representation, exploiting an isolobal analogy between active-space orbitals for different members of the series. I find that the electronic structure of the lowest adiabatic excited state is a strong mixture of weakly coupled states with charge-transfer (CT) or locally excited (LE) character and that the dominant character changes as the series is traversed. Chromophores with strongly electron-donating substituents have a CT-like excited state such as expected for a push-pull polyene or asymmetric cyanine. Chromophores with weakly electron-donating (or electron-withdrawing) substituents have an LE-like excited state with an ionic biradicaloid structure localized to the ground-state bridge π bond.

  18. 29 CFR 1990.151 - Model standard pursuant to section 6(b) of the Act.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... OCCUPATIONAL CARCINOGENS Model Standards § 1990.151 Model standard pursuant to section 6(b) of the Act... and labels provision for a Category II potential carcinogen. (q) Recordkeeping—(1) Exposure monitoring...

  19. Early, computer-Aided Design/Computer-Aided Modeling Planned, Le Fort I Advancement With Internal Distractors to Treat Severe Maxillary Hypoplasia in Cleft Lip and Palate.

    PubMed

    Chang, Catherine S; Swanson, Jordan; Yu, Jason; Taylor, Jesse A

    2017-04-11

    Traditionally, maxillary hypoplasia in the setting of cleft lip and palate is treated via orthognathic surgery at skeletal maturity, which condemns these patients to abnormal facial proportions during adolescence. The authors sought to determine the safety profile of computer-aided design/computer-aided modeling (CAD/CAM) planned Le Fort I distraction osteogenesis with internal distractors in select patients presenting at a young age with severe maxillary retrusion. The authors retrospectively reviewed their "early" Le Fort I distraction osteogenesis experience at a single institution: patients treated for severe maxillary retrusion (≥12 mm underjet) after canine eruption but prior to skeletal maturity. Patient demographics, cleft characteristics, CAD/CAM operative plans, surgical complications, postoperative imaging, and outcomes were analyzed. Four patients were reviewed, with a median age of 12.8 years at surgery (range 8.6-16.1 years). Overall mean advancement was 17.95 ± 2.9 mm (range 13.7-19.9 mm), with mean SNA improved by 18.4° to 87.4 ± 5.7°. Similarly, ANB improved by 17.7° to a postoperative mean of 2.4 ± 3.1°. Mean follow-up was 100.7 weeks, with 3 of 4 patients in a Class I occlusion at moderate-term follow-up; 1 of 4 will need an additional maxillary advancement due to pseudo-relapse. In conclusion, Le Fort I distraction osteogenesis with internal distractors is a safe procedure to treat severe maxillary hypoplasia after canine eruption but before skeletal maturity. Short-term follow-up demonstrates the safety of the procedure and the relative stability of the advancement. Pseudo-relapse is a risk of the procedure that must be discussed at length with patients and families.

  20. Dark matter and color octets beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Krnjaic, Gordan Z.

    Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues. Although there is abundant indirect evidence for the existence of dark matter, terrestrial efforts to observe its interactions have yielded conflicting results. We address this situation with a simple model of dark matter that features hydrogen-like bound states that scatter off SM nuclei by undergoing inelastic hyperfine transitions. We explore the available parameter space that results from demanding that DM self-interactions satisfy experimental bounds and ameliorate the tension between positive and null signals at the DAMA and CDMS experiments respectively. However, this simple model does not explain the cosmological abundance of dark matter and also encounters a Landau pole at a low energy scale. We, therefore, extend the field content and gauge group of the dark sector to resolve these issues with a renormalizable UV completion. We also explore the galactic dynamics of unbound dark matter and find that "dark ions" settle into a diffuse isothermal halo that differs from that of the bound states. This suppresses the local dark-ion density and expands the model's viable parameter space. We also consider the > 3σ excess in W plus dijet events recently observed at the Tevatron collider. We show that decays of a color-octet, electroweak-triplet scalar particle ("octo-triplet") can yield the

  1. The pion: an enigma within the Standard Model

    NASA Astrophysics Data System (ADS)

    Horn, Tanja; Roberts, Craig D.

    2016-07-01

    Quantum chromodynamics (QCD) is the strongly interacting part of the Standard Model. It is supposed to describe all of nuclear physics; and yet, almost 50 years after the discovery of gluons and quarks, we are only just beginning to understand how QCD builds the basic bricks for nuclei: neutrons and protons, and the pions that bind them together. QCD is characterised by two emergent phenomena: confinement and dynamical chiral symmetry breaking (DCSB). They have far-reaching consequences, expressed with great force in the character of the pion; and pion properties, in turn, suggest that confinement and DCSB are intimately connected. Indeed, since the pion is both a Nambu-Goldstone boson and a quark-antiquark bound-state, it holds a unique position in nature and, consequently, developing an understanding of its properties is critical to revealing some very basic features of the Standard Model. We describe experimental progress toward meeting this challenge that has been made using electromagnetic probes, highlighting both dramatic improvements in the precision of charged-pion form factor data that have been achieved in the past decade and new results on the neutral-pion transition form factor, both of which challenge existing notions of pion structure. We also provide a theoretical context for these empirical advances, which begins with an explanation of how DCSB works to guarantee that the pion is un-naturally light; but also, nevertheless, ensures that the pion is the best object to study in order to reveal the mechanisms that generate nearly all the mass of hadrons. In canvassing advances in these areas, our discussion unifies many aspects of pion structure and interactions, connecting the charged-pion elastic form factor, the neutral-pion transition form factor and the pion's leading-twist parton distribution amplitude. It also sketches novel ways in which experimental and theoretical studies of the charged-kaon electromagnetic form factor can provide

  2. Challenging the standard model at the Tevatron collider

    SciTech Connect

    Filthaut, Frank; /Nijmegen U.

    2011-03-01

    Even at a time when the world's eyes are focused on the Large Hadron Collider at CERN, which reached the energy frontier in 2010, many important results are still being obtained from data analyses performed at the Tevatron collider at Fermilab. This contribution discusses recent highlights in the areas of B hadron, electroweak, top quark, and Higgs boson physics. The standard model (SM) of particle physics forms the cornerstone of our understanding of elementary particles and their interactions, and many of its aspects have been investigated in great detail. Yet it is generally suspected to be incomplete (e.g. by not allowing for the incorporation of gravity in a field theoretical setting) and un-natural (e.g. the mass of the Higgs boson is not well protected against radiative corrections). In addition, it does not explain the dark matter and dark energy content of the Universe. It is therefore of eminent importance to test the limits of validity of the SM. In the decade since its upgrade to a centre-of-mass energy √s = 1.96 TeV, the Tevatron pp̄ collider has delivered an integrated luminosity of about 10 fb⁻¹, up to 9 fb⁻¹ of which are available for analysis by its CDF and D0 collaborations. These large datasets allow for stringent tests of the SM in two areas: direct searches for particles or final states that are not very heavy but that suffer from small production cross sections (e.g. the Higgs boson), and searches for indirect manifestations of beyond-the-standard-model (BSM) effects through virtual effects. The latter searches can often be carried out by precise measurements of otherwise known processes. This contribution describes such tests of the SM carried out by the CDF and D0 collaborations. In particular, recent highlights in the areas of B hadron physics, electroweak physics, top quark physics, and Higgs boson physics are discussed. Recent results of tests of QCD and of direct searches for new phenomena are described in

  3. The pion: an enigma within the Standard Model

    SciTech Connect

    Horn, Tanja; Roberts, Craig D.

    2016-05-27

    Quantum chromodynamics (QCD) is the strongly interacting part of the Standard Model. It is supposed to describe all of nuclear physics; and yet, almost 50 years after the discovery of gluons and quarks, we are only just beginning to understand how QCD builds the basic bricks for nuclei: neutrons and protons, and the pions that bind them together. QCD is characterised by two emergent phenomena: confinement and dynamical chiral symmetry breaking (DCSB). They have far-reaching consequences, expressed with great force in the character of the pion; and pion properties, in turn, suggest that confinement and DCSB are intimately connected. Indeed, since the pion is both a Nambu–Goldstone boson and a quark–antiquark bound-state, it holds a unique position in nature and, consequently, developing an understanding of its properties is critical to revealing some very basic features of the Standard Model. We describe experimental progress toward meeting this challenge that has been made using electromagnetic probes, highlighting both dramatic improvements in the precision of charged-pion form factor data that have been achieved in the past decade and new results on the neutral-pion transition form factor, both of which challenge existing notions of pion structure. We also provide a theoretical context for these empirical advances, which begins with an explanation of how DCSB works to guarantee that the pion is un-naturally light; but also, nevertheless, ensures that the pion is the best object to study in order to reveal the mechanisms that generate nearly all the mass of hadrons. In canvassing advances in these areas, our discussion unifies many aspects of pion structure and interactions, connecting the charged-pion elastic form factor, the neutral-pion transition form factor and the pion's leading-twist parton distribution amplitude. It also sketches novel ways in which experimental and theoretical studies of the charged-kaon electromagnetic form factor can provide

  4. CP violation outside the standard model phenomenology for pedestrians

    SciTech Connect

    Lipkin, H.J.

    1993-09-23

    So far the only experimental evidence for CP violation is the 1964 discovery of K_L → 2π, where the two mass eigenstates produced by neutral meson mixing both decay into the same CP eigenstate. This result is described by two parameters ε and ε′. Today ε remains close to its 1964 value, ε′ data are still inconclusive and there is no new evidence for CP violation. One might expect to observe similar phenomena in other systems and also direct CP violation as charge asymmetries between decays of charge-conjugate hadrons H± → f±. Why is it so hard to find CP violation? How can B physics help? Does CP lead beyond the standard model? The author presents a pedestrian symmetry approach which exhibits the difficulties and future possibilities of these two types of CP-violation experiments, neutral meson mixing and direct charge asymmetry: what may work, what doesn't work and why.

  5. Gravitational wave background from Standard Model physics: qualitative features

    SciTech Connect

    Ghiglieri, J.; Laine, M. E-mail: laine@itp.unibe.ch

    2015-07-01

    Because of physical processes ranging from microscopic particle collisions to macroscopic hydrodynamic fluctuations, any plasma in thermal equilibrium emits gravitational waves. For the largest wavelengths the emission rate is proportional to the shear viscosity of the plasma. In the Standard Model at T > 160 GeV, the shear viscosity is dominated by the most weakly interacting particles, right-handed leptons, and is relatively large. We estimate the order of magnitude of the corresponding spectrum of gravitational waves. Even though at small frequencies (corresponding to the sub-Hz range relevant for planned observatories such as eLISA) this background is tiny compared with that from non-equilibrium sources, the total energy carried by the high-frequency part of the spectrum is non-negligible if the production continues for a long time. We suggest that this may constrain (weakly) the highest temperature of the radiation epoch. Observing the high-frequency part directly sets a very ambitious goal for future generations of GHz-range detectors.

  6. Preheating the Universe from the Standard Model Higgs

    NASA Astrophysics Data System (ADS)

    Figueroa, Daniel G.

    2010-06-01

    We discuss preheating after an inflationary stage driven by the Standard Model (SM) Higgs field non-minimally coupled to gravity. We find that preheating is driven by a complex process in which perturbative and non-perturbative effects occur simultaneously. The Higgs field, initially an oscillating coherent condensate, produces W and Z gauge fields non-perturbatively. These decay very rapidly into fermions, thus preventing the gauge bosons from accumulating and, consequently, blocking the usual parametric resonance. The energy transferred into the fermionic species is, nevertheless, not enough to reheat the Universe, and resonant effects eventually develop. Soon after the resonance becomes effective, backreaction from the gauge bosons onto the Higgs condensate also becomes relevant. We have determined the time evolution of the energy distribution among the remnant Higgs condensate and the non-thermal distributions of the SM fermions and gauge fields, until the moment at which backreaction becomes important. Beyond backreaction our approximations break down, and numerical simulations and theoretical considerations beyond this work are required in order to study the evolution of the system until thermalization.

  7. Effects of the Noncommutative Standard Model in WW Scattering

    SciTech Connect

    Conley, John A.; Hewett, JoAnne L.

    2008-12-02

    We examine W pair production in the Noncommutative Standard Model constructed with the Seiberg-Witten map. Consideration of partial wave unitarity in the reactions WW → WW and e⁺e⁻ → WW shows that the latter process is more sensitive and that tree-level unitarity is violated when scattering energies are of order a TeV and the noncommutative scale is below about a TeV. We find that WW production at the LHC is not sensitive to scales above the unitarity bounds. WW production in e⁺e⁻ annihilation, however, provides a good probe of such effects, with noncommutative scales below 300-400 GeV being excluded at LEP-II, and the ILC being sensitive to scales up to 10-20 TeV. In addition, we find that the ability to measure the helicity states of the final-state W bosons at the ILC provides a diagnostic tool to determine and disentangle the different possible noncommutative contributions.

  8. The Framed Standard Model (II) -- A First Test Against Experiment

    NASA Astrophysics Data System (ADS)

    Chan, Hong-Mo; Tsou, Sheung Tsun

    Apart from the qualitative features described in Paper I (Ref. 1), the renormalization group equation derived for the rotation of the fermion mass matrices is amenable to quantitative study. The equation depends on a coupling and a fudge factor and, on integration, on 3 integration constants. Its application to data analysis, however, requires the input from experiment of the heaviest generation masses m_t, m_b, m_τ, m_ν3, all of which are known except for m_ν3. Together with the theta angle in the QCD action, there are in all 7 real unknown parameters. Determining these 7 parameters by fitting to the experimental values of the masses m_c, m_μ, m_e, the CKM elements |V_us|, |V_ub|, and the neutrino oscillation angle sin²θ_13, one can then calculate and compare with experiment the following 12 other quantities: m_s, m_u/m_d, |V_ud|, |V_cs|, |V_tb|, |V_cd|, |V_cb|, |V_ts|, |V_td|, J, sin²2θ_12, sin²2θ_23, and the results all agree reasonably well with data, often to within the stringent experimental error now achieved. Counting the predictions not yet measured by experiment, this means that 17 independent parameters of the standard model are now replaced by 7 in the FSM...

  9. Gravitational wave background from Standard Model physics: qualitative features

    SciTech Connect

    Ghiglieri, J.; Laine, M.

    2015-07-16

    Because of physical processes ranging from microscopic particle collisions to macroscopic hydrodynamic fluctuations, any plasma in thermal equilibrium emits gravitational waves. For the largest wavelengths the emission rate is proportional to the shear viscosity of the plasma. In the Standard Model at T>160 GeV, the shear viscosity is dominated by the most weakly interacting particles, right-handed leptons, and is relatively large. We estimate the order of magnitude of the corresponding spectrum of gravitational waves. Even though at small frequencies (corresponding to the sub-Hz range relevant for planned observatories such as eLISA) this background is tiny compared with that from non-equilibrium sources, the total energy carried by the high-frequency part of the spectrum is non-negligible if the production continues for a long time. We suggest that this may constrain (weakly) the highest temperature of the radiation epoch. Observing the high-frequency part directly sets a very ambitious goal for future generations of GHz-range detectors.

  10. On the fate of the Standard Model at finite temperature

    NASA Astrophysics Data System (ADS)

    Rose, Luigi Delle; Marzo, Carlo; Urbano, Alfredo

    2016-05-01

    In this paper we revisit and update the computation of thermal corrections to the stability of the electroweak vacuum in the Standard Model. At zero temperature, we make use of the full two-loop effective potential, improved by three-loop beta functions with two-loop matching conditions. At finite temperature, we include one-loop thermal corrections together with resummation of daisy diagrams. We solve the bounce equation numerically, both at zero and finite temperature, thus providing an accurate description of the thermal tunneling. Assuming a maximum temperature in the early Universe of the order of 10¹⁸ GeV, we find that the instability bound excludes values of the top mass M_t ≳ 173.6 GeV, with M_h ≃ 125 GeV and including uncertainties on the strong coupling. We discuss the validity and temperature dependence of this bound in the early Universe, with a special focus on the reheating phase after inflation.
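
    Schematically, the thermal tunneling computation solves the O(3)-symmetric bounce equation for the three-dimensional Euclidean action S3(T) and uses the standard estimate for the decay rate per unit volume (a generic sketch, not the paper's precise expressions):

    ```latex
    % O(3)-symmetric thermal bounce and the associated nucleation rate (generic form).
    \[
      \frac{\mathrm{d}^{2}\phi}{\mathrm{d}r^{2}}
      + \frac{2}{r}\,\frac{\mathrm{d}\phi}{\mathrm{d}r}
      = \frac{\partial V_{\mathrm{eff}}(\phi,T)}{\partial\phi},
      \qquad
      \phi'(0)=0,\quad \phi(r\to\infty)=\phi_{\mathrm{false}},
    \]
    \[
      \frac{\Gamma}{V} \;\sim\; T^{4}\left(\frac{S_{3}(T)}{2\pi T}\right)^{3/2}
      \mathrm{e}^{-S_{3}(T)/T}.
    \]
    ```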

  11. Implications of Higgs’ universality for physics beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Goldman, T.; Stephenson, G. J.

    2017-06-01

    We emulate Cabibbo by assuming a kind of universality for fermion mass terms in the Standard Model. We show that this is consistent with all current data and with the concept that deviations from what we term Higgs’ universality are due to corrections from currently unknown physics of nonetheless conventional form. The application to quarks is straightforward, while the application to leptons makes use of the recognition that Dark Matter can provide the “sterile” neutrinos needed for the seesaw mechanism. Requiring agreement with neutrino oscillation results leads to the prediction that the mass eigenstates of the sterile neutrinos are separated by quadratically larger ratios than for the charged fermions. Using consistency with the global fit to LSND-like, short-baseline oscillations to determine the scale of the lowest mass sterile neutrino strongly suggests that the recently observed astrophysical 3.55 keV γ-ray line is also consistent with the mass expected for the second most massive sterile neutrino in our analysis.

  12. A new model for the standardization of experimental burn wounds.

    PubMed

    Venter, Neil G; Monte-Alto-Costa, Andréa; Marques, Ruy G

    2015-05-01

    Burns are common and recurrent events treated by physicians on a daily basis at most emergency rooms around the world. There is a constant need to understand the pathophysiology of burns, so as to minimize their devastating results. The objective of the present report is to describe a burn apparatus, in association with an innovative method of animal fixation, able to produce burns of varying sizes and depths. Rats were subjected to burns of 60 °C, 70 °C, and 80 °C for 10 s; after 3 days half of the rats in each group were killed and the resulting lesions were analyzed using histological techniques. In the other half of the rats the wound was measured weekly until complete re-epithelialization. All burns were easily visible, and the histological feature for the 60 °C burn was a superficial second-degree burn (28% of the dermis), for 70 °C we observed a deep second-degree burn (72% of the dermis), and in the 80 °C group a third-degree burn was present (100% of the dermis). This is a safe, reliable model that is easy to construct and use and that produces a regular, uniform, and reproducible burn, owing to precise temperature control associated with standardized animal positioning. Copyright © 2014 Elsevier Ltd and ISBI. All rights reserved.

  13. Affine group formulation of the Standard Model coupled to gravity

    SciTech Connect

    Chou, Ching-Yi; Ita, Eyo; Soo, Chopin

    2014-04-15

    In this work we apply the affine group formalism for four-dimensional gravity of Lorentzian signature, which is based on Klauder's affine algebraic program, to the formulation of the Hamiltonian constraint of the interaction of matter and all forces, including gravity with non-vanishing cosmological constant Λ, as an affine Lie algebra. We use the hermitian action of fermions coupled to gravitation and Yang–Mills theory to find the density-weight-one fermionic super-Hamiltonian constraint. This term, combined with the Yang–Mills and Higgs energy densities, is composed with York's integrated time functional. The result, when combined with the imaginary part of the Chern–Simons functional Q, forms the affine commutation relation with the volume element V(x). Affine algebraic quantization of gravitation and matter on an equal footing implies a fundamental uncertainty relation which is predicated upon a non-vanishing cosmological constant. Highlights:
    • Wheeler–DeWitt equation (WDW) quantized as an affine algebra, realizing Klauder's program.
    • WDW formulated for the interaction of matter and all forces, including gravity, as an affine algebra.
    • WDW features Hermitian generators in spite of fermionic content: the Standard Model is addressed.
    • A family of physical states is constructed for the full, coupled theory via affine coherent states.
    • Fundamental uncertainty relation, predicated on a non-vanishing cosmological constant.

  14. Review of Standard Model Higgs results at the ATLAS experiment

    NASA Astrophysics Data System (ADS)

    Calandri, Alessandro; ATLAS Collaboration

    2017-07-01

    The investigation of the mechanism of electroweak (EW) symmetry breaking has been one of the main goals of the research program of the ATLAS experiment [1] at the CERN Large Hadron Collider (LHC). In the Standard Model (SM) of particle physics, the breaking of the EW symmetry is realised by introducing a complex doublet scalar field, which is related to the existence of a neutral particle, the Higgs boson. The Higgs scalar field is responsible for the masses of the other particles; the mass of its mediator, the Higgs boson, is the only free parameter in the theory. In 2012, the ATLAS and CMS collaborations discovered a new particle consistent with the SM Higgs boson and two years later measured its mass, in the H → γγ and H → ZZ* → 4l channels, to be m_H = 125.09 ± 0.21 (stat) ± 0.11 (sys) GeV with data collected in 2011 and 2012 (LHC Run 1). This document covers the SM Higgs analyses performed with approximately 15 fb⁻¹ of pp collision data collected until summer 2016 during the so-called Run 2 LHC data-taking at a center-of-mass energy √s = 13 TeV.

  15. On push-forward representations in the standard gyrokinetic model

    SciTech Connect

    Miyato, N.; Yagi, M.; Scott, B. D.

    2015-01-15

    Two representations of fluid moments in terms of a gyro-center distribution function and gyro-center coordinates, which are called push-forward representations, are compared in the standard electrostatic gyrokinetic model. In the representation conventionally used to derive the gyrokinetic Poisson equation, the pull-back transformation of the gyro-center distribution function contains effects of the gyro-center transformation and therefore electrostatic potential fluctuations, which is described by the Poisson brackets between the distribution function and scalar functions generating the gyro-center transformation. Usually, only the lowest order solution of the generating function at first order is considered to explicitly derive the gyrokinetic Poisson equation. This is true in explicitly deriving representations of scalar fluid moments with polarization terms. One also recovers the particle diamagnetic flux at this order because it is associated with the guiding-center transformation. However, higher-order solutions are needed to derive finite Larmor radius terms of particle flux including the polarization drift flux from the conventional representation. On the other hand, the lowest order solution is sufficient for the other representation, in which the gyro-center transformation part is combined with the guiding-center one and the pull-back transformation of the distribution function does not appear.

  16. Standard Model with a real singlet scalar and inflation

    SciTech Connect

    Enqvist, Kari; Nurmi, Sami; Tenkanen, Tommi; Tuominen, Kimmo E-mail: sami.nurmi@helsinki.fi E-mail: kimmo.i.tuominen@helsinki.fi

    2014-08-01

    We study the post-inflationary dynamics of the Standard Model Higgs and a real singlet scalar s, coupled together through a renormalizable coupling λ_sh h²s², in a Z₂-symmetric model that may explain the observed dark matter abundance and/or the origin of baryon asymmetry. The initial values for the Higgs and s condensates are given by inflationary fluctuations, and we follow their dissipation and relaxation to the low-energy vacua. We find that both the lowest-order perturbative and the non-perturbative decays are blocked by thermal effects and large background fields, and that the condensates decay by two-loop thermal effects. Assuming instant reheating at T = 10¹⁶ GeV, the characteristic temperature for the Higgs condensate thermalization is found to be T_h ∼ 10¹⁴ GeV, whereas s thermalizes typically around T_s ∼ 10⁶ GeV. By that time, the amplitude of the singlet is driven very close to the vacuum value by the expansion of the universe, unless the portal coupling takes a value λ_sh ≲ 10⁻⁷, in which case the singlet s never thermalizes. With these values of the coupling, it is possible to slowly produce a sizeable fraction of the observed dark matter abundance via singlet condensate fragmentation and thermal Higgs scattering. Physics below the electroweak scale can therefore also be affected by the non-vacuum initial conditions generated by inflation.

  17. Flavor democracy in standard models at high energies

    NASA Astrophysics Data System (ADS)

    Cvetič, G.; Kim, C. S.

    1993-10-01

    It is possible that the standard model (SM) is replaced around some transition energy Λ by a new, possibly Higgsless, "flavor gauge theory" such that the Yukawa (running) parameters of the SM at E ∼ Λ exhibit an (approximate) flavor democracy (FD). We investigate the latter possibility by studying the renormalization group equations for the Yukawa couplings of the SM with one and two Higgs doublets, evolving them from given physical values at low energies (E ≃ 1 GeV) to Λ (∼ Λ_pole) and comparing the resulting fermion masses and CKM matrix elements at E ≃ Λ for various m_t^phys and ratios v_u/v_d of vacuum expectation values. We find that the minimal SM and the closely related SM with two Higgs doublets (type I) show increasing deviation from FD when energy is increased, but that the SM with two Higgs doublets (type II) clearly tends to FD with increasing energy, in both the quark and the leptonic sector (q-q and l-l FD). Furthermore, we find within the type-II model that, for Λ_pole ≪ Λ_Planck, m_t^phys can be less than 200 GeV for most chosen values of v_u/v_d. Under the assumption that the corresponding Yukawa couplings in the quark and the leptonic sector at E ≃ Λ are also equal (l-q FD), we derive estimated bounds on the masses of the top quark and tau neutrino, which are compatible with experimental bounds.
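
    For reference, flavor democracy refers to Yukawa (mass) matrices whose entries all become approximately equal at the scale Λ; an exactly democratic matrix has one massive and two massless eigenstates, which is why small breaking terms must generate the light generations:

    ```latex
    % Exactly democratic mass matrix and its spectrum.
    \[
      M_{\mathrm{FD}} \;=\; \frac{m}{3}
      \begin{pmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{pmatrix},
      \qquad
      \text{eigenvalues: } \{\, m,\ 0,\ 0 \,\}.
    \]
    ```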

  18. Use and abuse of the model waveform accuracy standards

    SciTech Connect

    Lindblom, Lee

    2009-09-15

    Accuracy standards have been developed to ensure that the waveforms used for gravitational-wave data analysis are good enough to serve their intended purposes. These standards place constraints on certain norms of the frequency-domain representations of the waveform errors. Examples are given here of possible misinterpretations and misapplications of these standards, whose effect could be to vitiate the quality control they were intended to enforce. Suggestions are given for ways to avoid these problems.

  19. Accuracy of the Generalizability-Model Standard Errors for the Percents of Examinees Reaching Standards.

    ERIC Educational Resources Information Center

    Li, Yuan H.; Schafer, William D.

    An empirical study of the Yen (W. Yen, 1997) analytic formula for the standard error of a percent-above-cut [SE(PAC)] was conducted. This formula was derived from variance component information gathered in the context of generalizability theory. SE(PAC)s were estimated by different methods of estimating variance components (e.g., W. Yens…

  20. Comparison of Standard Wind Turbine Models with Vendor Models for Power System Stability Analysis: Preprint

    SciTech Connect

    Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.; Muljadi, Eduard

    2016-11-01

    The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. This Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed operation implementation. In fact, active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are included in the paper.

  1. Wisconsin's Model Academic Standards for Marketing Education. Bulletin No. 9005.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    This document contains standards for the academic content of the Wisconsin K-12 curriculum in the area of marketing education. Developed by task forces of educators, parents, board of education members, and employers and employees, the standards cover content, performance, and proficiency areas. The first part of the guide is an introduction that…

  2. Judgmental Standard Setting Using a Cognitive Components Model.

    ERIC Educational Resources Information Center

    McGinty, Dixie; Neel, John H.

    A new standard setting approach is introduced, called the cognitive components approach. Like the Angoff method, the cognitive components method generates minimum pass levels (MPLs) for each item. In both approaches, the item MPLs are summed for each judge, then averaged across judges to yield the standard. In the cognitive components approach,…
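
    In both methods the cut score follows the same arithmetic: each judge assigns a minimum pass level (MPL) to every item, item MPLs are summed within judge, and the judge totals are averaged. In symbols (notation introduced here only for illustration):

    ```latex
    % Cut score from judge-by-item minimum pass levels (illustrative notation).
    \[
      \text{standard}
      \;=\;
      \frac{1}{J}\sum_{j=1}^{J}\sum_{i=1}^{I}\mathrm{MPL}_{ij},
    \]
    % where judge j (of J) assigns MPL_{ij} to item i (of I).
    ```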

  3. A Visual Model for the Variance and Standard Deviation

    ERIC Educational Resources Information Center

    Orris, J. B.

    2011-01-01

    This paper shows how the variance and standard deviation can be represented graphically by looking at each squared deviation as a graphical object--in particular, as a square. A series of displays show how the standard deviation is the size of the average square.
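
    In symbols, each squared deviation is the area of a square with side |x_i - x̄|; the variance is the mean of those areas, and the standard deviation is the side of the square having that mean area (whether the article divides by n or n - 1 is not stated in the abstract):

    ```latex
    % Variance as mean square area; standard deviation as the side of that square.
    \[
      s^{2} \;=\; \frac{1}{n}\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2},
      \qquad
      s \;=\; \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_{i}-\bar{x}\right)^{2}}.
    \]
    ```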

  4. Wisconsin's Model Academic Standards for Information and Technology Literacy. Bulletin No. 90002.

    ERIC Educational Resources Information Center

    Wisconsin State Dept. of Public Instruction, Madison.

    This publication defines academic standards and describes the development, adoption, and use of Wisconsin's Model Academic Standards, as well as the benefits of academic standards and the application of the standards across the curriculum. The five categories that applications fall into include: (1) application of the basics; (2) ability to think; (3) skill…

  5. Le recours aux modeles dans l'enseignement de la biologie au secondaire : Conceptions d'enseignantes et d'enseignants et modes d'utilisation

    NASA Astrophysics Data System (ADS)

    Varlet, Madeleine

    The use of models and modeling is mentioned in the scientific literature as a way to foster the implementation of constructivist teaching-learning practices and thereby address learning difficulties in science. A prior study of teachers' relationship to models and modeling is therefore relevant for understanding their teaching practices and for identifying elements that, if taken into account in initial and subject-specific training, can contribute to the development of constructivist science teaching. Several studies have examined these conceptions without distinguishing between the subjects taught, such as physics, chemistry, or biology, even though models are not necessarily used or understood in the same way across these disciplines. Our research focused on the conceptions held by secondary-school biology teachers about scientific models, about some forms of representation of these models, and about the ways they are used in class. The results, obtained through a series of semi-structured interviews, indicate that their conceptions of models are, on the whole, compatible with the scientifically accepted one, but vary with respect to the forms of representation of the models. An examination of these conceptions reveals a limited knowledge of models that varies with the subject taught. Level of education, prior training, teaching experience, and a possible compartmentalization of subjects could explain the different conceptions identified. In addition, time-related, conceptual, and technical difficulties can hinder their attempts at modeling with students. Nevertheless, our results support the hypothesis that teachers' own conceptions of models, of their forms of representation, and of their approach

  6. Fourth standard model family neutrino at future linear colliders

    SciTech Connect

    Ciftci, A.K.; Ciftci, R.; Sultansoy, S.

    2005-09-01

    It is known that flavor democracy favors the existence of the fourth standard model (SM) family. In order to give nonzero masses to the first three family fermions, flavor democracy has to be slightly broken. A parametrization for democracy breaking is proposed which gives the correct values for the fundamental fermion masses and, at the same time, predicts quark and lepton Cabibbo-Kobayashi-Maskawa (CKM) matrices in good agreement with the experimental data. The pair production of the fourth SM family Dirac (ν4) and Majorana (N1) neutrinos at future linear colliders with √s = 500 GeV, 1 TeV, and 3 TeV is considered. The cross section for the process e+e- → ν4ν4 (N1N1) and the branching ratios for the possible decay modes of both neutrinos are determined. The decays of the fourth family neutrinos into muon channels (ν4(N1) → μ±W±) provide the cleanest signature at e+e- colliders; moreover, in our parametrization this channel is dominant. W bosons produced in the decays of the fourth family neutrinos will be seen in the detector as either di-jets or isolated leptons. As an example, we consider the production of fourth family neutrinos with a mass of 200 GeV at √s = 500 GeV linear colliders, taking di-muon plus four-jet events as the signature.

  7. Standard Model and Beyond with Neutron Beta Decay Experiments

    NASA Astrophysics Data System (ADS)

    Liu, Jianglai

    2010-11-01

    The underlying charged-current weak interaction of neutron beta decay connects the Fermi constant GF, the CKM matrix element Vud, the nucleon axial weak coupling constant gA, and the free neutron lifetime τn. Consequently, the combination of direct measurements of these quantities provides stringent constraints on the Standard Model. At present, GF and Vud have been measured to a precision of 5 ppm and 225 ppm, respectively, whereas the data on gA and τn are less precise, and both exhibit significant inconsistency among measurements. With polarized neutrons, gA can be determined by measuring the angular correlation of the decay electrons with the neutron spin (the so-called β-asymmetry). In the past, the β-asymmetry has been measured in cold neutron beam experiments, yielding a range of results much wider than the reported uncertainties. A new β-asymmetry measurement, UCNA (Ultracold Neutron Asymmetry), has been developed using the solid deuterium pulse spallation ultracold neutron (UCN) source at the Los Alamos Neutron Science Center, where UCN are transported in a guide system, fully polarized, and then loaded into a decay trap within a solenoidal beta spectrometer. Utilizing UCN gives this experiment very different systematics compared to cold neutron experiments. In this talk, I will give a brief review of the neutron beta decay measurements of the angular correlations as well as the lifetime. The main focus of this talk will be the UCNA experiment. I will discuss the experimental techniques and present the new results from the data taken in 2008 and 2009. The implication of the new results, combined with the world data on the β-asymmetry, Vud, and τn, will also be discussed.
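
    For orientation, the way these quantities combine is often quoted in the following form (the numerical constant, which absorbs the phase-space factor and electroweak radiative corrections, is the commonly cited Marciano-Sirlin value; it is shown here only as a hedged illustration of how τn, gA, and Vud constrain one another):

        |V_{ud}|^2 \, \tau_n \, (1 + 3\lambda^2) \approx (4908.7 \pm 1.9)\ \mathrm{s}, \qquad \lambda \equiv g_A / g_V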

  8. The pion: an enigma within the Standard Model

    SciTech Connect

    Horn, Tanja; Roberts, Craig D.

    2016-05-27

    Almost 50 years after the discovery of gluons & quarks, we are only just beginning to understand how QCD builds the basic bricks for nuclei: neutrons, protons, and the pions that bind them. QCD is characterised by two emergent phenomena: confinement & dynamical chiral symmetry breaking (DCSB). They are expressed with great force in the character of the pion. In turn, pion properties suggest that confinement & DCSB are closely connected. As both a Nambu-Goldstone boson and a quark-antiquark bound-state, the pion is unique in Nature. Developing an understanding of its properties is thus critical to revealing basic features of the Standard Model. We describe experimental progress in this direction, made using electromagnetic probes, highlighting both improvements in the precision of charged-pion form factor data, achieved in the past decade, and new results on the neutral-pion transition form factor. Both challenge existing notions of pion structure. We also provide a theoretical context for these empirical advances, first explaining how DCSB works to guarantee that the pion is unnaturally light; but also, nevertheless, ensures the pion is key to revealing the mechanisms that generate nearly all the mass of hadrons. Our discussion unifies the charged-pion elastic and neutral-pion transition form factors, and the pion's twist-2 parton distribution amplitude. It also indicates how studies of the charged-kaon form factor can provide significant contributions. Importantly, recent predictions for the large-$Q^2$ behaviour of the pion form factor can be tested by experiments planned at JLab 12. Those experiments will extend precise charged-pion form factor data to momenta that can potentially serve in validating factorisation theorems in QCD, exposing the transition between the nonperturbative and perturbative domains, and thereby reaching a goal that has long driven hadro-particle physics.

  9. Improved anatomy of ɛ'/ɛ in the Standard Model

    NASA Astrophysics Data System (ADS)

    Buras, Andrzej J.; Gorbahn, Martin; Jäger, Sebastian; Jamin, Matthias

    2015-11-01

    We present a new analysis of the ratio ɛ'/ɛ within the Standard Model (SM) using a formalism that is manifestly independent of the values of the leading (V−A) ⊗ (V−A) QCD-penguin and EW-penguin hadronic matrix elements of the operators Q4, Q9, and Q10, and applies to the SM as well as to extensions with the same operator structure. It is valid under the assumption that the SM exactly describes the data on CP-conserving K → ππ amplitudes. As a result of this and the high precision now available for CKM and quark mass parameters, to high accuracy ɛ'/ɛ depends only on two non-perturbative parameters, B6^(1/2) and B8^(3/2), and perturbatively calculable Wilson coefficients. Within the SM, we are separately able to determine the hadronic matrix element ⟨Q4⟩0 from CP-conserving data, significantly more precisely than presently possible with lattice QCD. Employing B6^(1/2) = 0.57 ± 0.19 and B8^(3/2) = 0.76 ± 0.05, extracted from recent results by the RBC-UKQCD collaboration, we obtain ɛ'/ɛ = (1.9 ± 4.5) × 10⁻⁴, substantially more precise than the recent RBC-UKQCD prediction and 2.9σ below the experimental value (16.6 ± 2.3) × 10⁻⁴, with the error being fully dominated by that on B6^(1/2). Even discarding lattice input completely, but employing the recently obtained bound B6^(1/2) ≤ B8^(3/2) ≤ 1 from the large-N approach, the SM value is found more than 2σ below the experimental value. At B6^(1/2) = B8^(3/2) = 1, varying all other parameters within one sigma, we find ɛ'/ɛ = (8.6 ± 3.2) × 10⁻⁴. We present a detailed anatomy of the various SM uncertainties, including all sub-leading hadronic matrix elements, briefly commenting on the possibility of underestimated SM contributions as well as on the impact of our results on new physics models.

  10. Data Format Standardization of Space Weather Model Output at the Community Coordinated Modeling Center

    NASA Technical Reports Server (NTRS)

    Maddox, M.; Rastatter, L.; Hesse, M.

    2005-01-01

    The disparate nature of space weather model output provides many challenges with regard to the portability and reuse of not only the data itself, but also any tools that are developed for analysis and visualization. We are developing and implementing a comprehensive data format standardization methodology that allows heterogeneous model output data to be stored uniformly in any common science data format. We will discuss our approach to identifying core metadata elements that can be used to supplement raw model output data, thus creating self-descriptive files. The metadata should also contain information describing the simulation grid. This will ultimately assist in the development of efficient data access tools capable of extracting data at any given point and time. We will also discuss our experiences standardizing the output of two global magnetospheric models, and how we plan to apply similar procedures when standardizing the output of the solar, heliospheric, and ionospheric models that are also currently hosted at the Community Coordinated Modeling Center.
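
    The "self-descriptive file" idea can be sketched in any common science data format; the snippet below uses netCDF through the netCDF4 Python library, and the attribute names (model_name, grid_type, run_time) are hypothetical placeholders rather than the CCMC's actual core metadata elements.

        import numpy as np
        from netCDF4 import Dataset

        # Store raw model output together with descriptive metadata in one file.
        with Dataset("magnetosphere_output.nc", "w") as nc:    # hypothetical file name
            nc.model_name = "ExampleMHDModel"                  # placeholder metadata
            nc.grid_type = "uniform cartesian"                 # describes the simulation grid
            nc.run_time = "2005-01-01T00:00:00Z"
            nc.createDimension("x", 64)
            nc.createDimension("y", 64)
            rho = nc.createVariable("density", "f4", ("x", "y"))
            rho.units = "amu/cm^3"
            rho[:] = np.random.rand(64, 64)                    # stand-in for real model output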

  11. Benchmarks for double Higgs production in the singlet-extended standard model at the LHC

    NASA Astrophysics Data System (ADS)

    Lewis, Ian; Sullivan, Matthew

    2017-08-01

    The simplest extension of the standard model is to add a gauge singlet scalar, S: the singlet-extended standard model. In the absence of a Z2 symmetry S → −S, and if the new scalar is sufficiently heavy, this model can lead to resonant double Higgs production, significantly increasing the production rate over the standard model prediction. While searches for this signal are being performed, it is important to have benchmark points and models with which to compare the experimental results. In this paper we determine these benchmarks by maximizing the double Higgs production rate at the LHC in the singlet-extended standard model. We find that, within current constraints, the branching ratio of the new scalar into two standard model-like Higgs bosons can be upwards of 0.76, and the double Higgs rate can be increased upwards of 30 times the standard model prediction.

  12. Le ciment brûle toujours

    PubMed Central

    Lebreton, T.; Fontaine, M.; Le Floch, R.

    2017-01-01

    Summary: Chemical burns caused by cement are a frequent cause of skin corrosion in France and often require surgical treatment. Our retrospective study covers all patients admitted to the unit for a cement burn between 2004 and 2016. Forty-nine patients aged 21 to 71 years were treated at the burn centre of the Centre Hospitalier Saint Joseph Saint Luc in Lyon between 2004 and 2016. The population concerned was predominantly male, relatively young (44 years on average), and professionally active. The burns occurred mainly during domestic accidents (78%). They were deep and mostly affected the lower limbs, bilaterally. The burned area represented 3% of total body surface area. Almost all patients (98%) required surgical management with excision and split-thickness skin autografting; only one patient was managed with directed wound healing. The mean time between the burn and surgery was 13 days, and the mean hospital stay was 8 days. Seven patients required transfer to a rehabilitation centre on discharge from the unit. This study confirms the severity of chemical burns caused by cement. It also highlights the socio-economic impact that this type of burn can have in a population of patients who are mostly young and working. It stresses that measures must be taken to inform this mostly non-professional population of the risks incurred when cement is misused. The current regulations, which classify cement as an irritant, do not take its corrosive nature into account and should be amended. PMID:28592929

  13. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard

    PubMed Central

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong

    2014-01-01

    Objectives: Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Methods: A multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. Results: In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. Conclusions: A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models. PMID:24627817
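
    One layer of the multilayered validation described here, syntactic checking of a CCR or CCR+ XML instance against a schema, can be sketched with the lxml library; the file names are hypothetical, and in the actual workflow the extended-model validation file is generated from the metadata registry rather than written by hand.

        from lxml import etree

        # Hypothetical file names for a CCR+ schema and a patient record instance.
        schema = etree.XMLSchema(etree.parse("ccr_plus_validation.xsd"))
        document = etree.parse("patient_record_ccr_plus.xml")

        if schema.validate(document):
            print("Record is syntactically valid against the extended model.")
        else:
            for error in schema.error_log:
                print(error.message)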

  14. CCR+: Metadata Based Extended Personal Health Record Data Model Interoperable with the ASTM CCR Standard.

    PubMed

    Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han

    2014-01-01

    Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.

  15. Missing experimental challenges to the Standard Model of particle physics

    NASA Astrophysics Data System (ADS)

    Perovic, Slobodan

    The success of particle detection in high energy physics colliders critically depends on the criteria for selecting a small number of interactions from the overwhelming number that occur in the detector. It also depends on the selection of the exact data to be analyzed and on the techniques of analysis. The introduction of automation into the detection process has traded the direct involvement of the physicist at each stage of selection and analysis for the efficient handling of vast amounts of data. This tradeoff, in combination with the organizational changes in laboratories of increasing size and complexity, has resulted in automated and semi-automated systems of detection. Various aspects of the semi-automated regime were greatly diminished in more generic automated systems, but turned out to be essential to a number of surprising discoveries of anomalous processes that led to theoretical breakthroughs, notably the establishment of the Standard Model of particle physics. The automated systems are much more efficient in confirming specific hypotheses in narrow energy domains than in performing broad exploratory searches. Thus, in the main, detection processes relying excessively on automation are more likely to miss potential anomalies and impede potential theoretical advances. I suggest that putting substantially more effort into the study of electron-positron colliders and increasing its funding could minimize the likelihood of missing potential anomalies, because detection in such an environment can be handled by the semi-automated regime, unlike detection in hadron colliders. Despite the virtually unavoidable excessive reliance on automated detection in hadron colliders, their development has been deemed a priority because they can operate at the currently highest energy levels. I suggest, however, that a focus on collisions at the highest achievable energy levels diverts funds from searches for potential anomalies overlooked due to tradeoffs at the previous energy

  16. Topics in physics beyond the standard model with strong interactions

    NASA Astrophysics Data System (ADS)

    Gomez Sanchez, Catalina

    In this thesis we study a few complementary topics related to some of the open questions in the Standard Model (SM). We first consider the scalar spectrum of gauge theories with walking dynamics. The question of whether or not a light pseudo-Nambu-Goldstone boson associated with the spontaneous breaking of approximate dilatation symmetry appears in these theories has been long-standing. We derive an effective action for the scalars, including new terms not previously considered in the literature, and obtain solutions for the lightest scalar's momentum-dependent form factor that determines the value of its pole mass. Our results for the lowest-lying state suggest that this scalar is never expected to be light, but it can have some properties that closely resemble the SM Higgs boson. We then propose a new leptonic charge-asymmetry observable well suited for the study of some Beyond the SM (BSM) physics objects at the LHC. New resonances decaying to one or many leptons could constitute the first signs of BSM physics that we observe at the LHC; if these new objects carry QCD charge they may have an associated charge asymmetry in their daughter leptons. Our observable can be used in events with single or multiple leptons in the final state. We discuss this measurement in the context of coloured scalar diquarks, as well as that of top-antitop pairs. We argue that, although a fainter signal is expected relative to other charge asymmetry observables, the low systematic uncertainties keep this particular observable relevant, especially in cases where reconstruction of the parent particle is not a viable strategy. Finally, we propose a simple dark-sector extension to the SM that communicates with ordinary quarks and leptons only through a small kinetic mixing of the dark photon and the photon. The dark sector is assumed to undergo a series of phase transitions such that monopoles and strings arise. These objects form long-lived states that eventually decay and can

  17. A Journey in Standard Development: The Core Manufacturing Simulation Data (CMSD) Information Model.

    PubMed

    Lee, Yung-Tsun Tina

    2015-01-01

    This report documents a journey "from research to an approved standard" of a NIST-led standard development activity. That standard, Core Manufacturing Simulation Data (CMSD) information model, provides neutral structures for the efficient exchange of manufacturing data in a simulation environment. The model was standardized under the auspices of the international Simulation Interoperability Standards Organization (SISO). NIST started the research in 2001 and initiated the standardization effort in 2004. The CMSD standard was published in two SISO Products. In the first Product, the information model was defined in the Unified Modeling Language (UML) and published in 2010 as SISO-STD-008-2010. In the second Product, the information model was defined in Extensible Markup Language (XML) and published in 2013 as SISO-STD-008-01-2012. Both SISO-STD-008-2010 and SISO-STD-008-01-2012 are intended to be used together.

  18. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    NASA Astrophysics Data System (ADS)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
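
    The standardized interface described in this record (control functions plus self-description functions) can be illustrated with a minimal sketch; the method names below are generic stand-ins in the spirit of interfaces such as CSDMS's Basic Model Interface, not the exact signatures of any particular framework.

        class SimpleProcessModel:
            """Toy process model exposing control and description functions."""

            def initialize(self, config=None):
                self.time, self.dt = 0.0, 1.0
                self.state = {"water_depth": 0.0}            # illustrative state variable

            def update(self):
                self.state["water_depth"] += 0.1 * self.dt   # placeholder physics
                self.time += self.dt

            def finalize(self):
                self.state.clear()

            # Self-description functions a coupling framework could query.
            def get_output_var_names(self):
                return ["water_depth"]

            def get_var_units(self, name):
                return {"water_depth": "m"}[name]

        # A framework can drive any such model without knowing its internals:
        model = SimpleProcessModel()
        model.initialize()
        for _ in range(10):
            model.update()
        print(model.get_output_var_names(), model.get_var_units("water_depth"), model.time)
        model.finalize()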

  19. Prototyping an online wetland ecosystem services model using open model sharing standards

    USGS Publications Warehouse

    Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.

    2011-01-01

    Great interest currently exists for developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from variant sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with Open Geospatial Consortium (OGC) Web Processing Services (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
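
    Because the disciplinary models are published behind the OGC Web Processing Service (WPS) standard, a client needs only plain HTTP to discover them. The sketch below issues a standard WPS 1.0.0 GetCapabilities request with the requests library; the service URL is a placeholder, not the project's actual endpoint.

        import requests

        WPS_URL = "https://example.org/wps"   # placeholder endpoint

        response = requests.get(
            WPS_URL,
            params={"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"},
            timeout=30,
        )
        response.raise_for_status()
        print(response.text[:500])   # XML listing of the published process models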

  20. Implementing the Standards: Incorporating Mathematical Modeling into the Curriculum.

    ERIC Educational Resources Information Center

    Swetz, Frank

    1991-01-01

    Following a brief historical review of the mechanism of mathematical modeling, examples are included that associate a mathematical model with given data (changes in sea level) and that model a real-life situation (process of parallel parking). Also provided is the rationale for the curricular implementation of mathematical modeling. (JJK)

  1. Implementing the Standards: Incorporating Mathematical Modeling into the Curriculum.

    ERIC Educational Resources Information Center

    Swetz, Frank

    1991-01-01

    Following a brief historical review of the mechanism of mathematical modeling, examples are included that associate a mathematical model with given data (changes in sea level) and that model a real-life situation (process of parallel parking). Also provided is the rationale for the curricular implementation of mathematical modeling. (JJK)

  2. Massive neutrinos in the standard model and beyond

    NASA Astrophysics Data System (ADS)

    Thalapillil, Arun Madhav

    The generation of the fermion mass hierarchy in the standard model of particle physics is a long-standing puzzle. The recent discoveries from neutrino physics suggests that the mixing in the lepton sector is large compared to the quark mixings. To understand this asymmetry between the quark and lepton mixings is an important aim for particle physics. In this regard, two promising approaches from the theoretical side are grand unified theories and family symmetries. In the first part of my thesis we try to understand certain general features of grand unified theories with Abelian family symmetries by taking the simplest SU(5) grand unified theory as a prototype. We construct an SU(5) toy model with U(1) F ⊗Z'2 ⊗Z'' 2⊗Z''' 2 family symmetry that, in a natural way, duplicates the observed mass hierarchy and mixing matrices to lowest approximation. The system for generating the mass hierarchy is through a Froggatt-Nielsen type mechanism. One idea that we use in the model is that the quark and charged lepton sectors are hierarchical with small mixing angles while the light neutrino sector is democratic with larger mixing angles. We also discuss some of the difficulties in incorporating finer details into the model without making further assumptions or adding a large scalar sector. In the second part of my thesis, the interaction of high energy neutrinos with weak gravitational fields is explored. The form of the graviton-neutrino vertex is motivated from Lorentz and gauge invariance and the non-relativistic interpretations of the neutrino gravitational form factors are obtained. We comment on the renormalization conditions, the preservation of the weak equivalence principle and the definition of the neutrino mass radius. We associate the neutrino gravitational form factors with specific angular momentum states. Based on Feynman diagrams, spin-statistics, CP invariance and symmetries of the angular momentum states in the neutrino-graviton vertex, we deduce

  3. Le Chatelier--Right or Wrong?

    ERIC Educational Resources Information Center

    Helfferich, Friedrich G.

    1985-01-01

    Presents a class exercise designed to find out how well students understand the nature and consequences of the mass action law and Le Chatelier's principle as applied to chemical equilibria. The exercise relates to a practical situation and provides simple relations for maximizing equilibrium quantities not found in standard textbooks. (JN)

  4. Colorado Model Content Standards for Dance: Suggested Grade Level Expectations.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    The state of Colorado has set forth six content standards for dance education in its public schools: (1) students will understand and demonstrate dance skills; (2) students will understand and apply the principles of choreography; (3) students will create, communicate, and problem solve through dance; (4) students will understand and relate the…

  5. Addressing Standardized Testing through a Novel Assessment Model

    ERIC Educational Resources Information Center

    Schifter, Catherine C.; Carey, Martha

    2014-01-01

    The No Child Left Behind (NCLB) legislation spawned a plethora of standardized testing services for all the high-stakes testing required by the law. We argue that one-size-fits-all assessments disadvantage students in the USA who are English Language Learners, as well as students with limited economic resources, special needs, and not reading on…

  6. ISO 9000 quality standards: a model for blood banking?

    PubMed

    Nevalainen, D E; Lloyd, H L

    1995-06-01

    The recent American Association of Blood Banks publications Quality Program and Quality Systems in the Blood Bank and Laboratory Environment, the FDA's draft guidelines, and recent changes in the GMP regulations all discuss the benefits of implementing quality systems in blood center and/or manufacturing operations. While the medical device GMPs in the United States have been rewritten to accommodate a quality system approach similar to ISO 9000, the Center for Biologics Evaluation and Research of the FDA is also beginning to make moves toward adopting "quality systems audits" as an inspection process rather than using the historical approach of record reviews. The approach is one of prevention of errors rather than detection after the fact (Tourault MA, oral communication, November 1994). The ISO 9000 series of standards is a quality system that has worldwide scope and can be applied in any industry or service. The use of such international standards in blood banking should raise the level of quality within an organization, among organizations on a regional level, within a country, and among nations on a worldwide basis. Whether an organization wishes to become registered to a voluntary standard or not, the use of such standards to become ISO 9000-compliant would be a move in the right direction and would be a positive sign to the regulatory authorities and the public that blood banking is making a visible effort to implement world-class quality systems in its operations. Implementation of quality system standards such as the ISO 9000 series will provide an organized approach for blood banks and blood bank testing operations. With the continued trend toward consolidation and mergers, resulting in larger operational units with more complexity, quality systems will become even more important as the industry moves into the future.(ABSTRACT TRUNCATED AT 250 WORDS)

  7. Structural Equation Models of Latent Interactions: An Appropriate Standardized Solution and Its Scale-Free Properties

    ERIC Educational Resources Information Center

    Wen, Zhonglin; Marsh, Herbert W.; Hau, Kit-Tai

    2010-01-01

    Standardized parameter estimates are routinely used to summarize the results of multiple regression models of manifest variables and structural equation models of latent variables, because they facilitate interpretation. Although the typical standardization of interaction terms is not appropriate for multiple regression models, straightforward…

  8. Modelling force deployment from army intelligence using the transportation system capability (TRANSCAP) model : a standardized approach.

    SciTech Connect

    Burke, J. F., Jr.; Love, R. J.; Macal, C. M.; Decision and Information Sciences

    2004-07-01

    Argonne National Laboratory (Argonne) developed the transportation system capability (TRANSCAP) model to simulate the deployment of forces from Army bases, in collaboration with and under the sponsorship of the Military Transportation Management Command Transportation Engineering Agency (MTMCTEA). TRANSCAP's design separates its pre- and post-processing modules (developed in Java) from its simulation module (developed in MODSIM III). This paper describes TRANSCAP's modelling approach, emphasizing Argonne's highly detailed, object-oriented, multilanguage software design principles. Fundamental to these design principles is TRANSCAP's implementation of an improved method for standardizing the transmission of simulated data to output analysis tools and the implementation of three Army deployment/redeployment community standards, all of which are in the final phases of community acceptance. The first is the extensive hierarchy and object representation for transport simulations (EXHORT), which is a reusable, object-oriented deployment simulation source code framework of classes. The second and third are algorithms for rail deployment operations at a military base.

  9. Plot Scale Factor Models for Standard Orthographic Views

    ERIC Educational Resources Information Center

    Osakue, Edward E.

    2007-01-01

    Geometric modeling provides graphic representations of real or abstract objects. Realistic representation requires three dimensional (3D) attributes since natural objects have three principal dimensions. CAD software gives the user the ability to construct realistic 3D models of objects, but often prints of these models must be generated on two…

  10. Job Grading Standard for Model Maker, WG-4714.

    ERIC Educational Resources Information Center

    Civil Service Commission, Washington, DC. Bureau of Policies and Standards.

    The pamphlet explains the different job requirements for different grades of model maker (WG-14 and WG-15) and contrasts them to the position of premium journeyman. It includes comment on what a model maker is (a nonsupervisory job involved in planning and fabricating complex research and prototype models which are made from a variety of materials…

  11. Extending the standard model effective field theory with the complete set of dimension-7 operators

    NASA Astrophysics Data System (ADS)

    Lehman, Landon

    2014-12-01

    We present a complete list of the independent dimension-7 operators that are constructed using the standard model degrees of freedom and are invariant under the standard model gauge group. This list contains only 20 independent operators, far fewer than the 63 operators available at dimension 6. All of these dimension-7 operators contain fermions and violate lepton number, and 7 of the 20 violate baryon number as well. This result extends the standard model effective field theory and allows a more detailed exploration of the structure and properties of possible deformations from the standard model Lagrangian.

  12. A NUMERICAL MODEL OF STANDARD TO BLOWOUT JETS

    SciTech Connect

    Archontis, V.; Hood, A. W.

    2013-06-01

    We report on three-dimensional (3D) MHD simulations of the formation of jets produced during the emergence and eruption of solar magnetic fields. The interaction between an emerging and an ambient magnetic field in the solar atmosphere leads to (external) reconnection and the formation of "standard" jets with an inverse Y-shaped configuration. Eventually, low-atmosphere (internal) reconnection of sheared fieldlines in the emerging flux region produces an erupting magnetic flux rope and a reconnection jet underneath it. The erupting plasma blows out the ambient field and, moreover, it unwinds as it is ejected into the outer solar atmosphere. The fast emission of the cool material that erupts together with the hot outflows due to external/internal reconnection forms a wider "blowout" jet. We show the transition from "standard" to "blowout" jets and report on their 3D structure. The physical plasma properties of the jets are consistent with observational studies.

  13. Relating electrophotographic printing model and ISO13660 standard attributes

    NASA Astrophysics Data System (ADS)

    Barney Smith, Elisa H.

    2010-01-01

    A mathematical model of the electrophotographic printing process has been developed. This model can be used for analysis. From this a print simulation process has been developed to simulate the effects of the model components on toner particle placement. A wide variety of simulated prints are produced from the model's three main inputs, laser spread, charge to toner proportionality factor and toner particle size. While the exact placement of toner particles is a random process, the total effect is not. The effect of each model parameter on the ISO 13660 print quality attributes line width, fill, raggedness and blurriness is described.

  14. Standard Scenarios for the Less-Lethal Weapons Evaluation Model

    DTIC Science & Technology

    1975-08-01

    consideration. The scenario forms a standard basis for evaluation. It sets the criteria for estimating the probabilities of occurrence of both the...the particular context, though they might be unacceptable in another set of circumstances. Disobedience to certain laws may be of little consequence...These criteria are set forth because the safety of the hostage is the primary concern of the police, and they desire to subdue the offender

  15. Battery Ownership Model - Medium Duty HEV Battery Leasing & Standardization

    SciTech Connect

    Kelly, Ken; Smith, Kandler; Cosgrove, Jon; Prohaska, Robert; Pesaran, Ahmad; Paul, James; Wiseman, Marc

    2015-12-01

    Prepared for the U.S. Department of Energy, this milestone report focuses on the economics of leasing versus owning batteries for medium-duty hybrid electric vehicles as well as various battery standardization scenarios. The work described in this report was performed by members of the Energy Storage Team and the Vehicle Simulation Team in NREL's Transportation and Hydrogen Systems Center along with members of the Vehicles Analysis Team at Ricardo.

  16. A Journey in Standard Development: The Core Manufacturing Simulation Data (CMSD) Information Model

    PubMed Central

    Lee, Yung-Tsun Tina

    2015-01-01

    This report documents a journey “from research to an approved standard” of a NIST-led standard development activity. That standard, Core Manufacturing Simulation Data (CMSD) information model, provides neutral structures for the efficient exchange of manufacturing data in a simulation environment. The model was standardized under the auspices of the international Simulation Interoperability Standards Organization (SISO). NIST started the research in 2001 and initiated the standardization effort in 2004. The CMSD standard was published in two SISO Products. In the first Product, the information model was defined in the Unified Modeling Language (UML) and published in 2010 as SISO-STD-008-2010. In the second Product, the information model was defined in Extensible Markup Language (XML) and published in 2013 as SISO-STD-008-01-2012. Both SISO-STD-008-2010 and SISO-STD-008-01-2012 are intended to be used together. PMID:26958450

  17. Two Models for Evaluating Alignment of State Standards and Assessments: Competing or Complementary Perspectives?

    ERIC Educational Resources Information Center

    Newton, Jill A.; Kasten, Sarah E.

    2013-01-01

    The release of the Common Core State Standards for Mathematics and their adoption across the United States calls for careful attention to the alignment between mathematics standards and assessments. This study investigates 2 models that measure alignment between standards and assessments, the Surveys of Enacted Curriculum (SEC) and the Webb…

  18. Development and Field Testing of a Model to Simulate a Demonstration of Le Chatelier's Principle Using the Wheatstone Bridge Circuit.

    ERIC Educational Resources Information Center

    Vickner, Edward Henry, Jr.

    An electronic simulation model was designed, constructed, and then field tested to determine student opinion of its effectiveness as an instructional aid. The model was designated as the Equilibrium System Simulator (ESS). The model was built on the principle of electrical symmetry applied to the Wheatstone bridge and was constructed from readily…

  19. New framework for standardized notation in wastewater treatment modelling.

    PubMed

    Corominas, L L; Rieger, L; Takács, I; Ekama, G; Hauduc, H; Vanrolleghem, P A; Oehmen, A; Gernaey, K V; van Loosdrecht, M C M; Comeau, Y

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. Only those levels that make the name unique within the model context are needed in creating the symbol. The paper describes specific problems encountered with the currently used notation, presents the proposed framework and provides additional practical examples. The overall result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper is to establish a consistent set of rules that can be applied to existing and most importantly, future models. Applying the proposed notation should make it easier for everyone active in the wastewater treatment field to read, write and review documents describing modelling projects.
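
    As a hedged illustration of the naming scheme (a main letter plus only as many subscript levels as are needed for uniqueness), the symbols below follow the familiar activated-sludge convention that S denotes soluble and X particulate components; they are given for illustration only and are not quoted from the proposed framework.

        # Invented examples in the spirit of "main letter + subscript levels".
        state_variables = {
            "S_B":   "soluble, biodegradable organics",
            "S_NHx": "soluble ammonia nitrogen",
            "X_OHO": "particulate ordinary heterotrophic organisms",
        }
        for symbol, meaning in state_variables.items():
            print(f"{symbol:6s} -> {meaning}")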

  20. Why does the Standard GARCH(1, 1) Model Work Well?

    NASA Astrophysics Data System (ADS)

    Jafari, G. R.; Bahraminasab, A.; Norouzzadeh, P.

    The AutoRegressive Conditional Heteroskedasticity (ARCH) family of models and its generalized version (GARCH) have grown to encompass a wide range of specifications, each of which is designed to enhance the ability of the model to capture the characteristics of stochastic data, such as financial time series. The existing literature provides little guidance on how to select optimal parameters, which are critical to the efficiency of the model, from the infinite range of available parameters. We introduce a new criterion for finding suitable parameters in GARCH models by using the Markov length, which is the minimum time interval over which the data can be considered as constituting a Markov process. This criterion is applied to various time series, and its results support the known idea that the GARCH(1, 1) model works well.
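
    For readers unfamiliar with the model named in the title, the GARCH(1, 1) recursion itself is compact enough to simulate in a few lines; the parameter values below are arbitrary illustrative choices satisfying the usual stationarity condition alpha + beta < 1 and are unrelated to the Markov-length criterion proposed in the record.

        import random

        def simulate_garch_1_1(n, omega=0.05, alpha=0.10, beta=0.85, seed=0):
            """Simulate returns r_t = sigma_t * z_t with
            sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
            rng = random.Random(seed)
            var = omega / (1.0 - alpha - beta)    # start at the unconditional variance
            returns = []
            for _ in range(n):
                r = (var ** 0.5) * rng.gauss(0.0, 1.0)
                returns.append(r)
                var = omega + alpha * r * r + beta * var
            return returns

        series = simulate_garch_1_1(1000)
        print(sum(r * r for r in series) / len(series))   # near omega/(1-alpha-beta) = 1.0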

  1. Symmetry Breaking, Unification, and Theories Beyond the Standard Model

    SciTech Connect

    Nomura, Yasunori

    2009-07-31

    A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints while avoiding excessive fine-tuning. We have also studied the implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.

  2. Standard Codon Substitution Models Overestimate Purifying Selection for Nonstationary Data

    PubMed Central

    Yap, Von Bing; Huttley, Gavin A.

    2017-01-01

    Estimation of natural selection on protein-coding sequences is a key comparative genomics approach for de novo prediction of lineage-specific adaptations. Selective pressure is measured on a per-gene basis by comparing the rate of nonsynonymous substitutions to the rate of synonymous substitutions. All published codon substitution models have been time-reversible and thus assume that sequence composition does not change over time. We previously demonstrated that if time-reversible DNA substitution models are applied in the presence of changing sequence composition, the number of substitutions is systematically biased towards overestimation. We extend these findings to the case of codon substitution models and further demonstrate that the ratio of nonsynonymous to synonymous rates of substitution tends to be underestimated over three data sets of mammals, vertebrates, and insects. Our basis for comparison is a nonstationary codon substitution model that allows sequence composition to change. Goodness-of-fit results demonstrate that our new model tends to fit the data better. Direct measurement of nonstationarity shows that bias in estimates of natural selection and genetic distance increases with the degree of violation of the stationarity assumption. Additionally, inferences drawn under time-reversible models are systematically affected by compositional divergence. As genomic sequences accumulate at an accelerating rate, the importance of accurate de novo estimation of natural selection increases. Our results establish that our new model provides a more robust perspective on this fundamental quantity. PMID:28175284
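
    For reference, the two ingredients at issue here can be written compactly (the notation is generic, not taken from the paper): the per-gene selection measure is the ratio of nonsynonymous to synonymous substitution rates, and a time-reversible model assumes detailed balance with respect to a fixed composition π, which is exactly the stationarity assumption the new model relaxes.

        \omega = \frac{d_N}{d_S}, \qquad \pi_i \, q_{ij} = \pi_j \, q_{ji} \quad \text{(time reversibility: constant composition } \pi \text{)}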

  3. Physics beyond the standard model: Focusing on the muon anomaly

    SciTech Connect

    Chavez, Helder; Ferreira, Cristine N.; Helayel-Neto, Jose A.

    2006-08-01

    We present a model based on the implication of an exceptional E6 GUT symmetry for the anomalous magnetic moment of the muon. We follow a particular chain of breakings with Higgses in the 78 and 351 representations. We analyze the radiative correction contributions to the muon mass and the effects of the breaking of the so-called Weinberg symmetry. We also estimate the range of values of the parameters of our model.

  4. Higgs boson mass in the standard model at two-loop order and beyond

    SciTech Connect

    Martin, Stephen P.; Robertson, David G.

    2014-10-01

    We calculate the mass of the Higgs boson in the standard model in terms of the underlying Lagrangian parameters at complete 2-loop order with leading 3-loop corrections. A computer program implementing the results is provided. The program also computes and minimizes the standard model effective potential in Landau gauge at 2-loop order with leading 3-loop corrections.
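
    At lowest order the quantity being computed is just the curvature of the Higgs potential at its minimum; the tree-level textbook relation below (not the paper's full two-loop result) fixes the conventions:

        V(\phi) = -\tfrac{1}{2} m^2 \phi^2 + \tfrac{1}{4} \lambda \phi^4, \qquad v^2 = \frac{m^2}{\lambda}, \qquad M_h^2 = 2 \lambda v^2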

  5. The Model Standards Project: Creating Inclusive Systems for LGBT Youth in Out-of-Home Care

    ERIC Educational Resources Information Center

    Wilber, Shannan; Reyes, Carolyn; Marksamer, Jody

    2006-01-01

    This article describes the Model Standards Project (MSP), a collaboration of Legal Services for Children and the National Center for Lesbian Rights. The MSP developed a set of model professional standards governing the care of lesbian, gay, bisexual and transgender (LGBT) youth in out-of-home care. This article provides an overview of the…

  6. Standard model large-ET processes and searches for new physics at HERA

    NASA Astrophysics Data System (ADS)

    Krasny, M. W.; Spiesberger, H.

    1999-07-01

    Existing and missing calculations of standard model processes producing large transverse energy in electron-proton interactions at HERA are reviewed. The adequacy of the existing standard model Monte Carlo programs for generic searches of exotic processes is analysed. This is a shortened version of Krasny M W and Spiesberger H (1998 Preprint hep-th 9901359).

  7. Needed: A Standard Information Processing Model of Learning and Learning Processes.

    ERIC Educational Resources Information Center

    Carifio, James

    One strategy to prevent confusion as new paradigms emerge is to have professionals in the area develop and use a standard model of the phenomenon in question. The development and use of standard models in physics, genetics, archaeology, and cosmology have been very productive. The cognitive revolution in psychology and education has produced a…

  8. The Model Standards Project: Creating Inclusive Systems for LGBT Youth in Out-of-Home Care

    ERIC Educational Resources Information Center

    Wilber, Shannan; Reyes, Carolyn; Marksamer, Jody

    2006-01-01

    This article describes the Model Standards Project (MSP), a collaboration of Legal Services for Children and the National Center for Lesbian Rights. The MSP developed a set of model professional standards governing the care of lesbian, gay, bisexual and transgender (LGBT) youth in out-of-home care. This article provides an overview of the…

  9. Physical Education Teachers Fidelity to and Perspectives of a Standardized Curricular Model

    ERIC Educational Resources Information Center

    Kloeppel, Tiffany; Stylianou, Michalis; Kulinna, Pamela Hodges

    2014-01-01

    Relatively little is known about the use of standardized physical education curricular models and teachers' perceptions of and fidelity to such curricula. The purpose of this study was to examine teachers' perceptions of and fidelity to a standardized physical education curricular model (i.e., Dynamic Physical Education [DPE]). Participants for this…

  10. Physical Education Teachers Fidelity to and Perspectives of a Standardized Curricular Model

    ERIC Educational Resources Information Center

    Kloeppel, Tiffany; Stylianou, Michalis; Kulinna, Pamela Hodges

    2014-01-01

    Relatively little is known about the use of standardized physical education curricular models and teachers' perceptions of and fidelity to such curricula. The purpose of this study was to examine teachers' perceptions of and fidelity to a standardized physical education curricular model (i.e., Dynamic Physical Education [DPE]). Participants for this…

  11. COHERENT search strategy for beyond standard model neutrino interactions

    NASA Astrophysics Data System (ADS)

    Shoemaker, Ian M.

    2017-06-01

    We study the sensitivity of the COHERENT experiment's stopped-pion source of neutrinos at the Spallation Neutron Source (SNS) at Oak Ridge National Laboratory to neutrino non-standard interactions (NSI). In particular, we find that simple event counting above threshold can improve constraints on electron- and muon-flavored NSI, with the strongest constraints arising for flavor-diagonal NSI coupling to μ neutrinos. However, if the detector resolution is sufficient to allow for even a coarse spectral study of events, COHERENT will also be sensitive to the mass scale of NSI. We demonstrate that this can yield new limits on Z' completions of NSI if the gauge boson mass is ≲ 1 GeV.

  12. Testing the Standard Model with the Primordial Inflation Explorer

    NASA Technical Reports Server (NTRS)

    Kogut, Alan J.

    2011-01-01

    The Primordial Inflation Explorer is an Explorer-class mission to measure the gravity-wave signature of primordial inflation through its distinctive imprint on the linear polarization of the cosmic microwave background. PIXIE uses an innovative optical design to achieve background-limited sensitivity in 400 spectral channels spanning 2.5 decades in frequency from 30 GHz to 6 THz (1 cm to 50 micron wavelength). The principal science goal is the detection and characterization of linear polarization from an inflationary epoch in the early universe, with tensor-to-scalar ratio r < 10⁻³ at 5 standard deviations. The rich PIXIE data set will also constrain physical processes ranging from Big Bang cosmology to the nature of the first stars to physical conditions within the interstellar medium of the Galaxy. I describe the PIXIE instrument and mission architecture needed to detect the inflationary signature using only 4 semiconductor bolometers.

  13. Leveraging Strengths Assessment and Intervention Model (LeStAIM): A Theoretical Strength-Based Assessment Framework

    ERIC Educational Resources Information Center

    Laija-Rodriguez, Wilda; Grites, Karen; Bouman, Doug; Pohlman, Craig; Goldman, Richard L.

    2013-01-01

    Current assessments in the schools are based on a deficit model (Epstein, 1998). "The National Association of School Psychologists (NASP) Model for Comprehensive and Integrated School Psychological Services" (2010), federal initiatives and mandates, and experts in the field of assessment have highlighted the need for the comprehensive…

  14. Leveraging Strengths Assessment and Intervention Model (LeStAIM): A Theoretical Strength-Based Assessment Framework

    ERIC Educational Resources Information Center

    Laija-Rodriguez, Wilda; Grites, Karen; Bouman, Doug; Pohlman, Craig; Goldman, Richard L.

    2013-01-01

    Current assessments in the schools are based on a deficit model (Epstein, 1998). "The National Association of School Psychologists (NASP) Model for Comprehensive and Integrated School Psychological Services" (2010), federal initiatives and mandates, and experts in the field of assessment have highlighted the need for the comprehensive…

  15. The SLq(2) extension of the standard model

    NASA Astrophysics Data System (ADS)

    Finkelstein, Robert J.

    2015-06-01

    The idea that the elementary particles might have the symmetry of knots has had a long history. In any modern formulation of this idea, however, the knot must be quantized. The present review is a summary of a small set of papers that began as an attempt to correlate the properties of quantized knots with empirical properties of the elementary particles. As the ideas behind these papers have developed over a number of years, the model has evolved, and this review is intended to present the model in its current form. The original picture of an elementary fermion as a solitonic knot of field, described by the trefoil representation of SUq(2), has expanded into its present form in which a knotted field is complementary to a composite structure composed of three preons that in turn are described by the fundamental representation of SLq(2). Higher representations of SLq(2) are interpreted as describing composite particles composed of three or more preons bound by a knotted field. This preon model unexpectedly agrees in important detail with the Harari-Shupe model. There is an associated Lagrangian dynamics capable in principle of describing the interactions and masses of the particles generated by the model.

  16. Beyond the standard gauging: gauge symmetries of Dirac sigma models

    NASA Astrophysics Data System (ADS)

    Chatzistavrakidis, Athanasios; Deser, Andreas; Jonke, Larisa; Strobl, Thomas

    2016-08-01

    In this paper we study the general conditions that have to be met for a gauged extension of a two-dimensional bosonic σ-model to exist. In an inversion of the usual approach of identifying a global symmetry and then promoting it to a local one, we focus directly on the gauge symmetries of the theory. This allows for action functionals which are gauge invariant for rather general background fields in the sense that their invariance conditions are milder than the usual case. In particular, the vector fields that control the gauging need not be Killing. The relaxation of isometry for the background fields is controlled by two connections on a Lie algebroid L in which the gauge fields take values, in a generalization of the common Lie-algebraic picture. Here we show that these connections can always be determined when L is a Dirac structure in the H-twisted Courant algebroid. This also leads us to a derivation of the general form for the gauge symmetries of a wide class of two-dimensional topological field theories called Dirac σ-models, which interpolate between the G/G Wess-Zumino-Witten model and the (Wess-Zumino-term twisted) Poisson sigma model.

  17. Ex-Nihilo: Obstacles Surrounding Teaching the Standard Model

    ERIC Educational Resources Information Center

    Pimbblet, Kevin A.

    2002-01-01

    The model of the Big Bang is an integral part of the national curricula in England and Wales. Previous work (e.g. Baxter 1989) has shown that pupils often come into education with many and varied prior misconceptions emanating from both internal and external sources. Whilst virtually all of these misconceptions can be remedied, there will remain…

  19. A model for predicting the wearout lifetime of the LeRC/Hughes 30-cm mercury ion thruster

    NASA Technical Reports Server (NTRS)

    Beattie, J. R.

    1979-01-01

    An investigation of parameters that affect the erosion rates of 30-cm-diameter mercury-ion-thruster components is described. A sputter-erosion model is formulated in terms of the design, operational, and material characteristics of the thruster. The erosion model is applied to the screen electrode, which is assumed to be the life-limiting component of the 30-cm thruster, resulting in a model of wearout lifetime. Results of short-term erosion-rate tests are presented that illustrate the dependence of component wear rates on variables such as discharge voltage, accelerator-grid open-area fraction, ion energy, electrode material, and the partial pressure of facility residual gases such as nitrogen. Test results are compared with wearout rates predicted by the sputter-erosion model.
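
    The abstract does not reproduce the erosion model itself. As a rough, order-of-magnitude sketch of how a wearout estimate of this general kind can be organized, the snippet below divides an allowable eroded screen-grid thickness by a sputter-erosion rate built from an assumed ion current density, sputter yield, and material properties; every parameter value and the simple product form are hypothetical and are not Beattie's model.

```python
# Order-of-magnitude wearout sketch: lifetime = allowable eroded thickness / erosion rate.
# All values below are hypothetical and the product form is a simplification, not the
# detailed sputter-erosion model developed in the paper.
E_CHARGE = 1.602e-19  # elementary charge, C

def wearout_hours(allowable_thickness_m, current_density_A_m2, sputter_yield,
                  atomic_mass_kg, density_kg_m3):
    atoms_removed_per_m2_s = sputter_yield * current_density_A_m2 / E_CHARGE
    erosion_rate_m_per_s = atoms_removed_per_m2_s * atomic_mass_kg / density_kg_m3
    return allowable_thickness_m / erosion_rate_m_per_s / 3600.0

# Hypothetical molybdenum screen grid eroded by low-energy ions from the discharge plasma.
print(f"estimated wearout: {wearout_hours(2.0e-4, 5.0, 1.0e-2, 1.59e-25, 1.02e4):.0f} h")
```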

  20. Dimensional reduction of the Standard Model coupled to a new singlet scalar field

    NASA Astrophysics Data System (ADS)

    Brauner, Tomáš; Tenkanen, Tuomas V. I.; Tranberg, Anders; Vuorinen, Aleksi; Weir, David J.

    2017-03-01

    We derive an effective dimensionally reduced theory for the Standard Model augmented by a real singlet scalar. We treat the singlet as a superheavy field and integrate it out, leaving an effective theory involving only the Higgs and SU(2)_L × U(1)_Y gauge fields, identical to the one studied previously for the Standard Model. This opens up the possibility of efficiently computing the order and strength of the electroweak phase transition, numerically and nonperturbatively, in this extension of the Standard Model. Understanding the phase diagram is crucial for models of electroweak baryogenesis and for studying the production of gravitational waves at thermal phase transitions.

  1. Solving the standard model problems in softened gravity

    NASA Astrophysics Data System (ADS)

    Salvio, Alberto

    2016-11-01

    The Higgs naturalness problem is solved if the growth of Einstein's gravitational interaction is softened at an energy ≲ 10¹¹ GeV (softened gravity). We work here within an explicit realization where the Einstein-Hilbert Lagrangian is extended to include terms quadratic in the curvature and a nonminimal coupling with the Higgs. We show that this solution is preserved by adding three right-handed neutrinos with masses below the electroweak scale, accounting for neutrino oscillations, dark matter and the baryon asymmetry. The smallness of the right-handed neutrino masses (compared to the Planck scale) and the QCD θ-term are also shown to be natural. We prove that a possible gravitational source of CP violation cannot spoil the model, thanks to the presence of right-handed neutrinos. Inflation is approximately described by the Starobinsky model in this context and can occur even if we live in a metastable vacuum.

  2. Modeling of filtration efficiency of nanoparticles in standard filter media

    NASA Astrophysics Data System (ADS)

    Wang, J.; Chen, D. R.; Pui, D. Y. H.

    2007-01-01

    The goal of this study is to model the data from the experiments of nanoparticle filtration performed at the Particle Technology Lab, University of Minnesota and at the 3M Company. Comparison shows that the experimental data for filter efficiency are bounded by the values computed from theoretical expressions which do not consider thermal rebound. Therefore thermal rebound in the tested filter media is not detected down to 3 nm particles in the present analysis. The efficiency measured experimentally is in good agreement with the theoretical expression by Stechkina (1966, Dokl. Acad. Nauk SSSR 167, 1327) when the Peclet number Pe is larger than 100; it agrees well with the theoretical expression by Kirsch and Stechkina (1978, Fundamentals of Aerosol Science. Wiley, New York) when Pe is of the order of unity. We develop an empirical power law model for the efficiency depending on the Peclet number, which leads to satisfactory agreement with experimental results.
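
    The abstract reports an empirical power-law dependence of the efficiency on the Peclet number but does not give its coefficients. As a minimal sketch of how such a law, E = a·Pe^b, could be fitted by linear regression in log-log space, the snippet below uses invented data points; the resulting coefficients are purely illustrative and are not the values obtained in the study.

```python
import numpy as np

# Hypothetical single-fiber efficiency data versus Peclet number (illustrative only;
# the study's measurements and fitted coefficients are not reproduced here).
pe = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
eff = np.array([0.92, 0.70, 0.42, 0.23, 0.11, 0.05])

# Fit an empirical power law E = a * Pe**b by linear regression in log-log space.
slope, intercept = np.polyfit(np.log(pe), np.log(eff), 1)
a = np.exp(intercept)
print(f"E ≈ {a:.3f} * Pe^({slope:.3f})")
```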

  3. Beyond the Standard Model Searches at the Tevatron

    SciTech Connect

    Sajot, G.

    2007-11-20

    Recent searches for non-SUSY exotics in pp-bar collisions at a center-of-mass energy of 1.96 TeV at the Tevatron Run II are reported. The emphasis is put on the results of model-driven analyses which were updated to the full Run IIa datasets corresponding to integrated luminosities of about 1 fb⁻¹.

  4. Metabolomics, Standards, and Metabolic Modeling for Synthetic Biology in Plants

    PubMed Central

    Hill, Camilla Beate; Czauderna, Tobias; Klapperstück, Matthias; Roessner, Ute; Schreiber, Falk

    2015-01-01

    Life on earth depends on dynamic chemical transformations that enable cellular functions, including electron transfer reactions, as well as synthesis and degradation of biomolecules. Biochemical reactions are coordinated in metabolic pathways that interact in a complex way to allow adequate regulation. Biotechnology, food, biofuel, agricultural, and pharmaceutical industries are highly interested in metabolic engineering as an enabling technology of synthetic biology to exploit cells for the controlled production of metabolites of interest. These approaches have only recently been extended to plants due to their greater metabolic complexity (such as primary and secondary metabolism) and highly compartmentalized cellular structures and functions (including plant-specific organelles) compared with bacteria and other microorganisms. Technological advances in analytical instrumentation in combination with advances in data analysis and modeling have opened up new approaches to engineer plant metabolic pathways and allow the impact of modifications to be predicted more accurately. In this article, we review challenges in the integration and analysis of large-scale metabolic data, present an overview of current bioinformatics methods for the modeling and visualization of metabolic networks, and discuss approaches for interfacing bioinformatics approaches with metabolic models of cellular processes and flux distributions in order to predict phenotypes derived from specific genetic modifications or subjected to different environmental conditions. PMID:26557642

  5. Le Figaro. Revised.

    ERIC Educational Resources Information Center

    Crawford, Linda

    These instructional materials are designed for students with some French reading skills and vocabulary in late beginning or early intermediate senior high school French. The objectives are to introduce students to a French newspaper, "Le Figaro," and develop reading skills for skimming, gathering specific information, and relying on cognates. The…

  6. Research and development of the evolving architecture for beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Cho, Kihyeon; Kim, Jangho; Kim, Junghyun

    2015-12-01

    The Standard Model (SM) has been successfully validated with the discovery of the Higgs boson. However, the model is not yet fully regarded as a complete description. There are efforts to develop phenomenological models that are collectively termed beyond the standard model (BSM). The BSM requires several orders of magnitude more simulations compared with those required for the Higgs boson events. On the other hand, particle physics research involves major investments in hardware coupled with large-scale theoretical and computational efforts along with experiments. These fields include simulation toolkits based on an evolving computing architecture. Using the simulation toolkits, we study particle physics beyond the standard model. Here, we describe the state of this research and development effort for the evolving computing architecture of high-throughput computing (HTC) and graphics processing units (GPUs) for searches beyond the standard model.

  7. Standardization of Operational Solar Wind Model Inputs, Products and Verification Schemes

    NASA Astrophysics Data System (ADS)

    Fry, C. D.; Dryer, M.; Smith, Z.; Deehr, C. S.; Sun, W.; Detman, T. R.

    2006-12-01

    The Hakamada-Akasofu-Fry Version 2 (HAFv2) solar wind model was implemented at the Air Force Weather Agency on 29 August 2006 after extensive development, validation and verification. AFWA is now using HAFv2 to provide operational real-time forecast products to DOD customers. Other solar wind models (e.g., 3D-MHD) are being evaluated for implementation in the near future. As the number of operational solar wind models increases and as ensemble forecast systems are developed, the need will increase for standardized model inputs, products and metrics for assessing forecast skill. The international cooperation and collaboration fostered by the IHY (International Heliophysical Year) offers a unique opportunity to develop such standards. This talk will present the case for standardization to improve operational space weather forecasting, a methodology for determining these standards, and a list of recommended standards for consideration.

  8. Functional Competency Development Model for Academic Personnel Based on International Professional Qualification Standards in Computing Field

    ERIC Educational Resources Information Center

    Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon

    2016-01-01

    This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…

  9. New perspectives in physics beyond the standard model

    SciTech Connect

    Weiner, Neal Jonathan

    2000-09-01

    In 1934 Fermi postulated a theory for weak interactions containing a dimensionful coupling with a size of roughly 250 GeV. Only now are we finally exploring this energy regime. What arises is an open question: supersymmetry and large extra dimensions are two possible scenarios. Meanwhile, other experiments will begin providing definitive information into the nature of neutrino masses and CP violation. In this paper, we explore features of possible theoretical scenarios, and study the phenomenological implications of various models addressing the open questions surrounding these issues.

  10. Exploring New Physics Beyond the Standard Model: Final Technical Report

    SciTech Connect

    Wang, Liantao

    2016-10-17

    This grant, covering 2015 to 2016, supported research in theoretical High Energy Physics. The research focused mainly on the energy frontier, but it also has connections to both the cosmic and intensity frontiers. Lian-Tao Wang (PI) concentrated on signals of new physics at colliders. The period 2015-2016, covered by this grant, was an exciting one for digesting the influx of LHC data, understanding its meaning, and using it to refine strategies for deeper exploration. The PI proposed new methods of searching for new physics at the LHC, such as searches for compressed stops. He also investigated in detail the signals of composite Higgs models, focusing on spin-1 composite resonances in the di-boson channel, and considered di-photons as a probe of such models. He contributed to formulating search strategies for dark matter at the LHC, resulting in two documents with recommendations. The PI has also been active in studying the physics potential of future colliders, including Higgs factories and 100 TeV pp colliders; he gave a comprehensive overview of the physics potential of the high-energy proton collider and outlined its luminosity targets. He also studied the use of lepton colliders to probe the fermionic Higgs portal and bottom-quark couplings to the Z boson.

  11. Ex-nihilo: obstacles surrounding teaching the Standard Model

    NASA Astrophysics Data System (ADS)

    Pimbblet, Kevin A.

    2002-11-01

    The model of the Big Bang is an integral part of the national curricula in England and Wales. Previous work (e.g. Baxter 1989) has shown that pupils often come into education with many and varied prior misconceptions emanating from both internal and external sources. Whilst virtually all of these misconceptions can be remedied, there will remain (by its very nature) the obstacle of ex-nihilo, as characterized by the question `how do you get something from nothing?' There are two origins of this obstacle: conceptual (i.e. knowledge-based) and cultural (e.g. deeply held religious viewpoints). This article shows how the citizenship section of the national curriculum, which came `online' in England from September 2002, presents a new opportunity for exploiting these.

  12. Myocardial Scar Imaging by Standard Single-Energy and Dual-Energy Late Enhancement Computed Tomography: Comparison to Pathology and Electroanatomical Map in an Experimental Chronic Infarct Porcine Model

    PubMed Central

    Truong, Quynh A.; Thai, Wai-ee; Wai, Bryan; Cordaro, Kevin; Cheng, Teresa; Beaudoin, Jonathan; Xiong, Guanglei; Cheung, Jim W.; Altman, Robert; Min, James K.; Singh, Jagmeet P.; Barrett, Conor D.; Danik, Stephan

    2015-01-01

    Background Myocardial scar is a substrate for ventricular tachycardia and sudden cardiac death. Late enhancement computed tomography (CT) imaging can detect scar, but it remains unclear whether newer late enhancement dual-energy (LE-DECT) acquisition has benefit over standard single-energy late enhancement (LE-CT). Objective We aim to compare late enhancement CT using newer LE-DECT acquisition and single-energy LE-CT acquisitions to pathology and electroanatomical map (EAM) in an experimental chronic myocardial infarction (MI) porcine study. Methods In 8 chronic MI pigs (59±5 kg), we performed dual-source CT, EAM, and pathology. For CT imaging, we performed 3 acquisitions at 10 minutes post-contrast: LE-CT 80 kV, LE-CT 100 kV, and LE-DECT with two post-processing software settings. Results Of the sequences, LE-CT 100 kV provided the best contrast-to-noise ratio (all p≤0.03) and correlation to pathology for scar (ρ=0.88). While LE-DECT overestimated scar (both p=0.02), LE-CT images did not (both p=0.08). On a segment basis (n=136), all CT sequences had high specificity (87–93%) and modest sensitivity (50–67%), with LE-CT 100 kV having the highest specificity of 93% for scar detection compared to pathology and agreement with EAM (κ = 0.69). Conclusions Standard single-energy LE-CT, particularly 100 kV, matched better to pathology and EAM than dual-energy LE-DECT for scar detection. Larger human trials, as well as further technical studies that optimize the choice of energies with newer hardware and software, are warranted. PMID:25977115
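
    For reference, the segment-level statistics quoted above (sensitivity, specificity, and the kappa agreement with EAM) can be computed from a 2×2 confusion matrix as in the sketch below; the counts used are hypothetical, since the per-sequence tables are not reproduced in the abstract.

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    n = tp + fp + fn + tn
    p_obs = (tp + tn) / n                                           # observed agreement
    p_exp = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (p_obs - p_exp) / (1 - p_exp)
    return sens, spec, kappa

# Hypothetical counts for 136 myocardial segments (scar vs. no scar against pathology).
sens, spec, kappa = diagnostic_stats(tp=20, fp=7, fn=10, tn=99)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, kappa={kappa:.2f}")
```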

  13. Cosmic strings in hidden sectors: 1. Radiation of standard model particles

    SciTech Connect

    Long, Andrew J.; Hyde, Jeffrey M.; Vachaspati, Tanmay E-mail: jmhyde@asu.edu

    2014-09-01

    In hidden sector models with an extra U(1) gauge group, new fields can interact with the Standard Model only through gauge kinetic mixing and the Higgs portal. After the U(1) is spontaneously broken, these interactions couple the resultant cosmic strings to Standard Model particles. We calculate the spectrum of radiation emitted by these "dark strings" in the form of Higgs bosons, Z bosons, and Standard Model fermions, assuming that the string tension is above the TeV scale. We also calculate the scattering cross sections of Standard Model fermions on dark strings due to the Aharonov-Bohm interaction. These radiation and scattering calculations will be applied in a subsequent paper to study the cosmological evolution and observational signatures of dark strings.

  14. O(θ) Feynman rules for quadrilinear gauge boson couplings in the noncommutative standard model

    NASA Astrophysics Data System (ADS)

    Sajadi, Seyed Shams; Boroun, G. R.

    2017-02-01

    We examine the electroweak gauge sector of the noncommutative standard model and, in particular, obtain the O(θ) Feynman rules for all quadrilinear gauge boson couplings. Surprisingly, an electroweak-chromodynamics mixing appears in the gauge sector of the noncommutative standard model, where the photon as well as the neutral weak boson is coupled directly to three gluons. The phenomenological perspectives of the model in W⁻W⁺ → ZZ scattering are studied and it is shown that there is a characteristic oscillatory behavior in the azimuthal distribution of the scattering cross sections that can be interpreted as a direct signal of the noncommutative standard model. Assuming an integrated luminosity of 100 fb⁻¹, the number of W⁻W⁺ → ZZ subprocesses is estimated for several values of the noncommutative scale Λ_NC at different center-of-mass energies and the results are compared with predictions of the standard model.

  15. Radiation Observations from CREAM & CREDO and Comparison with Standard Models

    NASA Astrophysics Data System (ADS)

    Dyer, C.; Watson, C.; Truscott, P.; Peerless, C.

    1996-12-01

    The Cosmic Radiation Environment and Activation Monitor (CREAM) has flown on six Shuttle flights between September 1991 and February 1995, covering the full range of inclinations as well as altitudes between 210 and 550 km. Meanwhile the Cosmic Radiation Environment and Dosimetry experiment (CREDO) has operated continuously on UOSAT-3 in an 800 km, 98.7 degree orbit since April 1990. Similar detectors were launched on KITSAT-1 (1330 km, 66 degree inclination) in August 1992 and POSAT-1 (790 km, 98.7 degree inclination) in September 1993. Since the summer of 1994, CREDO-II versions have been operating on APEX in an eccentric orbit (350x2486 km) at 70 degree inclination, and on STRV in geostationary transfer orbit (298x35953 km, 7 degree inclination). These experiments are designed to measure protons, cosmic rays and accumulated dose. Through the variety of missions employed they have now achieved wide coverage of the magnetosphere as well as a significant portion of a solar cycle. The LEO observations have shown the westward drift of the South Atlantic Anomaly, new regimes of trapped protons in the region of L=2.6 following solar flare events in March 1991 and October 1992, and an altitude dependence of trapped protons which is at variance with AP8. On STRV the background channel of the Cold Ion Detector serves as a complementary electron detector and shows the extreme time variability of the outer radiation belt, while the total dose is significantly less than AE8 predictions. In addition to the data on trapped radiation, important results are being obtained on the linear energy transfer spectra from cosmic rays. Detailed shielding models of the APEX and STRV spacecraft have been constructed and used to compare the observations of dose and LET spectra with predictions from AE8, AP8 and CREME for a variety of shielding depths. Consistent results on the LET spectra are obtained from APEX and STRV when data are selected by cut-off rigidity. The influence of spacecraft

  16. Search for the Standard Model Higgs Boson Produced in Association with Top Quarks

    SciTech Connect

    Wilson, Jonathan Samuel

    2011-01-01

    We have performed a search for the Standard Model Higgs boson produced in association with top quarks in the lepton plus jets channel. We impose no constraints on the decay of the Higgs boson. We employ ensembles of neural networks to discriminate events containing a Higgs boson from the dominant tt̄ background, and set upper bounds on the Higgs production cross section. At a Higgs boson mass mH = 120 GeV/c2, we expect to exclude a cross section 12.7 times the Standard Model prediction, and we observe an exclusion 27.4 times the Standard Model prediction with 95% confidence.

  17. On the Origin of Mass in the Standard Model

    NASA Astrophysics Data System (ADS)

    Sundman, Stig

    2013-01-01

    A model is proposed in which the presently existing elementary particles are the result of an evolution proceeding from the simplest possible particle state to successively more complex states via a series of symmetry-breaking transitions. The properties of two fossil particles — the tauon and muon — together with the observed photon-baryon number ratio provide information that makes it possible to track the early development of particles. A computer simulation of the evolution reveals details about the purpose and history of all presently known elementary particles. In particular, it is concluded that the heavy Higgs particle that generates the bulk of the mass of the Z and W bosons also comes in a light version, which generates small mass contributions to the charged leptons. The predicted mass of this "flyweight" Higgs boson is 0.505 MeV/c2, 106.086 eV/c2 or 12.0007 μeV/c2 (corresponding to a photon of frequency 2.9018 GHz) depending on whether it is associated with the tauon, muon or electron. Support for the conclusion comes from the Brookhaven muon g-2 experiment, which indicates the existence of a Higgs particle lighter than the muon.

  18. Le mouvement du pôle

    NASA Astrophysics Data System (ADS)

    Bizouard, Christian

    2012-03-01

    The variations of the Earth's rotation. By conditioning our daily life, our perception of the sky, and a good number of geophysical phenomena such as cyclone formation, the rotation of the Earth lies at the crossroads of several disciplines. If the phenomenon were uniform, the subject would quickly be settled; it is because the Earth's rotation varies, even imperceptibly to our senses, in its angular velocity as well as in the direction of its axis, that it arouses great interest. First, for practical reasons: not only do the irregularities of the Earth's rotation modify, over time, astrometric pointings made at a given instant of the day, they also influence measurements made by space techniques; consequently, exploiting those measurements, for example to determine the orbits of the satellites involved or to perform positioning on the ground, requires precise knowledge of these variations. More fundamentally, they reflect the global properties of the Earth and the physical processes taking place within it, so that by analysing the causes of the observed fluctuations we have a means of better understanding our globe. The progressive discovery of the fluctuations of the Earth's rotation has a long history. From the standpoint of observing techniques, three eras stand out: first, that of naked-eye astrometric pointing, with wooden or metal instruments (mural quadrants, for example). From the 17th century onwards came telescopic astrometry, whose pointings were complemented by increasingly precise timings thanks to the invention of pendulum-regulated clocks. This second era ended around 1960 with the advent of space techniques: astrometric pointings were abandoned in favour of the ultra-precise measurement of durations or frequencies of electromagnetic signals, thanks to the invention of clocks

  19. Quantitative calculation model of dilution ratio based on reaching standard of water function zone

    NASA Astrophysics Data System (ADS)

    Du, Zhong; Dong, Zengchuan; Wu, Huixiu; Yang, Lin

    2017-03-01

    Dilution ratio is an important indicator of water quality assessment, and it is difficult to calculate quantitatively. This paper proposes a quantitative calculation model of the dilution ratio based on the permissible pollution bearing capacity model of the water function zone. The model contains three concentration parameters. In particular, the 1-D model has three additional river-characteristic parameters. Applications of the model are based on the national standard of wastewater discharge concentration and the reaching-standard concentration. The results show an inverse correlation between the dilution ratio and C_P and C_0, and a positive correlation with C_s. The quantitative maximum control standard of the dilution ratio is 12.50% by the 0-D model and 22.96% by the 1-D model. Moreover, we propose to choose the minimum parameter and find the invalid pollution bearing capacity.
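
    The abstract does not give the model equations. As a minimal sketch that is consistent with the correlations stated above, the snippet below computes a maximum dilution ratio from a 0-D mass balance, under the assumed definition of the dilution ratio as the effluent fraction of the fully mixed flow; the concentrations used are hypothetical, and the 1-D variant with river-characteristic parameters is not reproduced.

```python
def max_dilution_ratio_0d(c_p, c_0, c_s):
    """Largest effluent fraction r such that the fully mixed concentration
    (1 - r) * C_0 + r * C_P still meets the standard C_s.
    Illustrative 0-D mass balance only, not the paper's exact formulation."""
    if c_p <= c_s:
        return 1.0  # effluent already meets the standard, no dilution limit needed
    return (c_s - c_0) / (c_p - c_0)

# Hypothetical concentrations (mg/L): effluent C_P, background C_0, standard C_s.
r_max = max_dilution_ratio_0d(c_p=30.0, c_0=2.0, c_s=8.0)
print(f"maximum dilution ratio ≈ {r_max:.2%}")  # rises with C_s, falls with C_P and C_0
```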

  20. Standard virtual biological parts: a repository of modular modeling components for synthetic biology.

    PubMed

    Cooling, M T; Rouilly, V; Misirli, G; Lawson, J; Yu, T; Hallinan, J; Wipat, A

    2010-04-01

    Fabrication of synthetic biological systems is greatly enhanced by incorporating engineering design principles and techniques such as computer-aided design. To this end, the ongoing standardization of biological parts presents an opportunity to develop libraries of standard virtual parts in the form of mathematical models that can be combined to inform system design. We present an online Repository, populated with a collection of standardized models that can readily be recombined to model different biological systems using the inherent modularity support of the CellML 1.1 model exchange format. The applicability of this approach is demonstrated by modeling gold-medal winning iGEM machines. The Repository is available online as part of http://models.cellml.org. We hope to stimulate the worldwide community to reuse and extend the models therein, and contribute to the Repository of Standard Virtual Parts thus founded. Systems Model architecture information for the Systems Model described here, along with an additional example and a tutorial, is also available as Supplementary information. The example Systems Model from this manuscript can be found at http://models.cellml.org/workspace/bugbuster. The Template models used in the example can be found at http://models.cellml.org/workspace/SVP_Templates200906.

  1. CP violation in neutrino oscillations in Minimal Supersymmetric extension of the Standard Model

    SciTech Connect

    Delepine, David; Gonzalez Macias, Vannia

    2008-07-02

    In this talk, we estimate the size of lepton flavor and CP violation in neutrino oscillations in the framework of the Minimal Supersymmetric extension of the Standard Model (MSSM). We find that we may have significant CP-violating contributions up to an order of magnitude (≈10⁻²) smaller than the standard four-Fermi couplings.

  2. Correlates of Mexican American Students' Standardized Test Scores: An Integrated Model Approach

    ERIC Educational Resources Information Center

    Morales, M. Cristina; Saenz, Rogelio

    2007-01-01

    The use of standardized testing to evaluate academic achievement is a widely debated topic. Despite controversies, standardized testing is used in all educational levels from elementary school to college entrance examinations. One of the ethnic groups particularly affected by this is the Mexican-origin population. An integrated model (individual,…

  3. Model Core Teaching Standards: A Resource for State Dialogue. (Draft for Public Comment)

    ERIC Educational Resources Information Center

    Council of Chief State School Officers, 2010

    2010-01-01

    With this document, the Council of Chief State School Officers (CCSSO) offers for public dialogue and comment a set of model core teaching standards that outline what teachers should know and be able to do to help all students reach the goal of being college- and career-ready in today's world. These standards are an update of the 1992 Interstate…

  4. Can Cognitive Writing Models Inform the Design of the Common Core State Standards?

    ERIC Educational Resources Information Center

    Hayes, John R.; Olinghouse, Natalie G.

    2015-01-01

    In this article, we compare the Common Core State Standards in Writing to the Hayes cognitive model of writing, adapted to describe the performance of young and developing writers. Based on the comparison, we propose the inclusion of standards for motivation, goal setting, writing strategies, and attention by writers to the text they have just…

  5. Diagnostic Profiles: A Standard Setting Method for Use with a Cognitive Diagnostic Model

    ERIC Educational Resources Information Center

    Skaggs, Gary; Hein, Serge F.; Wilkins, Jesse L. M.

    2016-01-01

    This article introduces the Diagnostic Profiles (DP) standard setting method for setting a performance standard on a test developed from a cognitive diagnostic model (CDM), the outcome of which is a profile of mastered and not-mastered skills or attributes rather than a single test score. In the DP method, the key judgment task for panelists is a…

  7. Colorado Model Content Standards for Geography, Grades K-8. Suggested Grade Level Expectations.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Education, Denver.

    This guide lists six model content standards for geography education in Colorado elementary schools. Standards cited in the guide are: (1) students know how to use and construct maps, globes, and other geographic tools to locate and derive information about people, places, and environments; (2) students know the physical and human characteristics…

  8. An extended standard model and its Higgs geometry from the matrix model

    NASA Astrophysics Data System (ADS)

    Steinacker, Harold C.; Zahn, Jochen

    2014-08-01

    We find a simple brane configuration in the IKKT matrix model which resembles the standard model at low energies, with a second Higgs doublet and right-handed neutrinos. The electroweak sector is realized geometrically in terms of two minimal fuzzy ellipsoids, which can be interpreted in terms of four point-branes in the extra dimensions. The electroweak Higgs connects these branes and is an indispensable part of the geometry. Fermionic would-be zero modes arise at the intersections with two larger branes, leading precisely to the correct chiral matter fields at low energy, along with right-handed neutrinos which can acquire a Majorana mass due to a Higgs singlet. The larger branes give rise to SU(3)_c, extended by U(1)_B and another U(1) which are anomalous at low energies and expected to disappear. At higher energies, mirror fermions and additional fields arise, completing the full N = 4 supersymmetry. The brane configuration is a solution of the model, assuming a suitable effective potential and a non-linear stabilization of the singlet Higgs. The basic results can be carried over to N = 4 SU(N) super Yang-Mills on ordinary Minkowski space with sufficiently large N.

  9. 40 CFR 86.410-90 - Emission standards for 1990 and later model year motorcycles.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... model year motorcycles. 86.410-90 Section 86.410-90 Protection of Environment ENVIRONMENTAL PROTECTION... ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-90 Emission standards for 1990 and later model year motorcycles. (a)(1) Exhaust emissions from 1990 and later model year...

  10. 40 CFR 86.410-90 - Emission standards for 1990 and later model year motorcycles.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... model year motorcycles. 86.410-90 Section 86.410-90 Protection of Environment ENVIRONMENTAL PROTECTION... ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-90 Emission standards for 1990 and later model year motorcycles. (a)(1) Exhaust emissions from 1990 and later model year...

  11. 40 CFR 86.410-90 - Emission standards for 1990 and later model year motorcycles.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... model year motorcycles. 86.410-90 Section 86.410-90 Protection of Environment ENVIRONMENTAL PROTECTION... ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-90 Emission standards for 1990 and later model year motorcycles. (a)(1) Exhaust emissions from 1990 and later model year...

  12. 40 CFR 86.410-90 - Emission standards for 1990 and later model year motorcycles.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... model year motorcycles. 86.410-90 Section 86.410-90 Protection of Environment ENVIRONMENTAL PROTECTION... ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-90 Emission standards for 1990 and later model year motorcycles. (a)(1) Exhaust emissions from 1990 and later model year...

  13. 40 CFR 86.410-90 - Emission standards for 1990 and later model year motorcycles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... model year motorcycles. 86.410-90 Section 86.410-90 Protection of Environment ENVIRONMENTAL PROTECTION... ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-90 Emission standards for 1990 and later model year motorcycles. (a)(1) Exhaust emissions from 1990 and later model year...

  14. A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series

    ERIC Educational Resources Information Center

    Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.

    2011-01-01

    Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…
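
    As a reminder of the general form of a sandwich-type estimator, the sketch below assembles a parameter covariance matrix as A⁻¹ B A⁻ᵀ from a "bread" matrix A and a "meat" matrix B; with dependent time-series data, B would be built from lag-weighted cross-products rather than under an independence assumption. The matrices shown are hypothetical, and this is not the authors' exact estimator.

```python
import numpy as np

def sandwich_covariance(bread, meat):
    """Sandwich-type covariance A^{-1} B A^{-T}: `bread` plays the role of the
    (expected) Hessian of the fit function, `meat` the covariance of its gradient.
    Illustrative linear algebra only, not the estimator derived in the article."""
    a_inv = np.linalg.inv(bread)
    return a_inv @ meat @ a_inv.T

# Hypothetical 2-parameter example.
A = np.array([[4.0, 0.5], [0.5, 3.0]])
B = np.array([[2.0, 0.3], [0.3, 1.5]])
print("standard errors:", np.sqrt(np.diag(sandwich_covariance(A, B))))
```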

  15. Using the Modification Index and Standardized Expected Parameter Change for Model Modification

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.

    2012-01-01

    Model modification is oftentimes conducted after discovering a badly fitting structural equation model. During the modification process, the modification index (MI) and the standardized expected parameter change (SEPC) are 2 statistics that may be used to aid in the selection of parameters to add to a model to improve the fit. The purpose of this…

  17. Tests of local Lorentz invariance violation of gravity in the standard model extension with pulsars.

    PubMed

    Shao, Lijing

    2014-03-21

    The standard model extension is an effective field theory introducing all possible Lorentz-violating (LV) operators to the standard model and general relativity (GR). In the pure-gravity sector of the minimal standard model extension, nine coefficients describe dominant observable deviations from GR. We systematically implemented 27 tests from 13 pulsar systems to tightly constrain eight linear combinations of these coefficients with extensive Monte Carlo simulations. It constitutes the first detailed and systematic test of the pure-gravity sector of the minimal standard model extension with the state-of-the-art pulsar observations. No deviation from GR was detected. The limits of the LV coefficients are expressed in the canonical Sun-centered celestial-equatorial frame for the convenience of further studies. They are all improved by significant factors of tens to hundreds compared with existing ones. As a consequence, Einstein's equivalence principle is verified substantially further by pulsar experiments in terms of local Lorentz invariance in gravity.

  18. Model Standards and Techniques for Control of Radon in New Residential Buildings

    EPA Pesticide Factsheets

    This document is intended to serve as a model for use to develop and adopt building codes, appendices to codes, or standards specifically applicable to unique local or regional radon control requirements.

  19. Classical conformality in the Standard Model from Coleman’s theory

    NASA Astrophysics Data System (ADS)

    Kawana, Kiyoharu

    2016-09-01

    Classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of baby universes.

  20. Natural or controlled experiment? Disentangling anthropogenic and geologic contributions to the sediment load of the Le Sueur River, MN, USA

    NASA Astrophysics Data System (ADS)

    Finnegan, N. J.; Gran, K. B.

    2012-12-01

    Catastrophic draining of glacial Lake Agassiz at the end of the Pleistocene triggered a pulse of incision along the Minnesota River, MN, USA, that is currently propagating into tributary channels and elevating channel incision rates far above regional background levels. At the same time, installation of artificial drainage to remove excess soil water (tiling) in tributaries of the Minnesota has resulted in shorter and higher amplitude hydrographs during spring snow melt and storm events. Thus both natural and anthropogenic explanations exist for high sediment loads from tributaries to the Minnesota River, among them the Le Sueur River, which is currently impaired for turbidity under EPA Clean Water Act standards. Here we investigate the transient incision history of the Le Sueur River to aid in the development of Total Maximum Daily Loads (TMDLs) for sediment in the Le Sueur. Establishing TMDLs for the Le Sueur requires separation of anthropogenic and geologic contributions to current sediment loads. Towards this end, we ran a series of numerical simulations of the excavation of the Le Sueur River valley over the Holocene in order to constrain pre-settlement rates of sediment export. Our approach relies on coupling (with varying strength) a 2D numerical model for river meandering to various 1D numerical models for river incision. Fortuitously, both the initial profile of the Le Sueur (prior to the flood from Lake Agassiz) as well as the timing of the flood itself can be reasonably constrained from LiDAR data and previous Quaternary studies, respectively. Additionally, LiDAR-mapping of discontinuous, unpaired strath terraces combined with OSL and/or 14C dates on 18 strath terrace deposits pin pieces of the long profile of the Le Sueur River in time and space. By minimizing the model misfit for strath terrace ages, the current river elevation long profile, and the width between bluffs along the Le Sueur River valley, we identify a preferred valley excavation history

  1. A modeling analysis of alternative primary and secondary US ozone standards in urban and rural areas

    NASA Astrophysics Data System (ADS)

    Nopmongcol, Uarporn; Emery, Chris; Sakulyanontvittaya, Tanarit; Jung, Jaegun; Knipping, Eladio; Yarwood, Greg

    2014-12-01

    This study employed the High-Order Decoupled Direct Method (HDDM) of sensitivity analysis in a photochemical grid model to determine US anthropogenic emissions reductions required from 2006 levels to meet alternative US primary (health-based) and secondary (welfare-based) ozone (O3) standards. Applying the modeling techniques developed by Yarwood et al. (2013), we specifically evaluated sector-wide emission reductions needed to meet primary standards in the range of 60-75 ppb, and secondary standards in the range of 7-15 ppm-h, in 22 cities and at 20 rural sites across the US for NOx-only, combined NOx and VOC, and VOC-only scenarios. Site-specific model biases were taken into account by applying adjustment factors separately for the primary and secondary standard metrics, analogous to the US Environmental Protection Agency's (EPA) relative response factor technique. Both bias-adjusted and unadjusted results are presented and analyzed. We found that the secondary metric does not necessarily respond to emission reductions the same way the primary metric does, indicating sensitivity to their different forms. Combined NOx and VOC reductions are most effective for cities, whereas NOx-only reductions are sufficient at rural sites. Most cities we examined require more than 50% US anthropogenic emission reductions from 2006 levels to meet the current primary 75 ppb US standard and secondary 15 ppm-h target. Most rural sites require less than 20% reductions to meet the primary 75 ppb standard and less than 40% reductions to meet the secondary 15 ppm-h target. Whether the primary standard is protective of the secondary standard depends on the combination of alternative standard levels. Our modeling suggests that the current 75 ppb standard achieves a 15 ppm-h secondary target in most (17 of 22) cities, but only half of the rural sites; the inability for several western cities and rural areas to achieve the seasonally-summed secondary 15 ppm-h target while meeting the 75 ppb
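
    The bias adjustment described above is said to be analogous to EPA's relative response factor (RRF) technique; the sketch below illustrates the general RRF idea of scaling an observed base-year design value by the modeled ratio between a control scenario and the base case. The values are hypothetical, and the study's actual site-specific adjustment factors are not reproduced here.

```python
def rrf_adjusted_value(observed_base, model_base, model_control):
    """Relative-response-factor style projection: scale the observed base-year design
    value by the modeled relative change between the base and control runs.
    A generic sketch of the EPA-style technique, not this study's exact adjustment."""
    rrf = model_control / model_base
    return observed_base * rrf

# Hypothetical ozone design values (ppb): observed base year, modeled base, modeled control.
print(f"projected design value: {rrf_adjusted_value(82.0, 78.0, 64.0):.1f} ppb")
```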

  2. Search for New Physics Beyond the Standard Model at BaBar

    SciTech Connect

    Barrett, Matthew; /Brunel U.

    2008-04-16

    A review of selected recent BaBar results is presented that illustrates the ability of the experiment to search for physics beyond the standard model. The decays B → τν and B → sγ provide constraints on the mass of a charged Higgs. Searches for Lepton Flavour Violation could provide a clear signal for beyond the standard model physics. BaBar does not observe any signal for New Physics with the current dataset.

  3. From many body wee partons dynamics to perfect fluid: a standard model for heavy ion collisions

    SciTech Connect

    Venugopalan, R.

    2010-07-22

    We discuss a standard model of heavy ion collisions that has emerged both from experimental results of the RHIC program and associated theoretical developments. We comment briefly on the impact of early results of the LHC program on this picture. We consider how this standard model of heavy ion collisions could be solidified or falsified in future experiments at RHIC, the LHC and a future Electron-Ion Collider.

  4. Higgs production cross-section in a Standard Model with four generations at the LHC

    SciTech Connect

    Furlan E.; Anastasiou, C.; Buehler, S.; Herzog, F.; Lazopoulos, A.

    2011-07-12

    We present theoretical predictions for the Higgs boson production cross-section via gluon fusion at the LHC in a Standard Model with four generations. We include QCD corrections through NLO retaining the full dependence on the quark masses, and the NNLO corrections in the heavy quark effective theory approximation. We also include electroweak corrections through three loops. Electroweak and bottom-quark contributions are suppressed in comparison to the Standard Model with three generations.

  5. The model standards project: creating inclusive systems for LGBT youth in out-of-home care.

    PubMed

    Wilber, Shannan; Reyes, Carolyn; Marksamer, Jody

    2006-01-01

    This article describes the Model Standards Project (MSP), a collaboration of Legal Services for Children and the National Center for Lesbian Rights. The MSP developed a set of model professional standards governing the care of lesbian, gay, bisexual and transgender (LGBT) youth in out-of-home care. This article provides an overview of the experiences of LGBT youth in state custody, drawing from existing research, as well as the actual experiences of youth who participated in the project or spoke with project staff. It will describe existing professional standards applicable to child welfare and juvenile justice systems, and the need for standards specifically focused on serving LGBT youth. The article concludes with recommendations for implementation of the standards in local jurisdictions.

  6. Search for Standard Model $ZH \\to \\ell^+\\ell^-b\\bar{b}$ at DØ

    SciTech Connect

    Jiang, Peng

    2014-07-01

    We present a search for the Standard Model Higgs boson in the ZH → ℓ⁺ℓ⁻$b\bar{b}$ channel, using data collected with the DØ detector at the Fermilab Tevatron Collider. This analysis is based on a sample of reprocessed data incorporating several improvements relative to a previously published result, and a modified multivariate analysis strategy. For a Standard Model Higgs boson of mass 125 GeV, the expected cross section limit over the Standard Model prediction is improved by about 5% compared to the previously published results in this channel from the DØ Collaboration.

  7. Promoting Coordinated Development of Community-Based Information Standards for Modeling in Biology: The COMBINE Initiative

    PubMed Central

    Hucka, Michael; Nickerson, David P.; Bader, Gary D.; Bergmann, Frank T.; Cooper, Jonathan; Demir, Emek; Garny, Alan; Golebiewski, Martin; Myers, Chris J.; Schreiber, Falk; Waltemath, Dagmar; Le Novère, Nicolas

    2015-01-01

    The Computational Modeling in Biology Network (COMBINE) is a consortium of groups involved in the development of open community standards and formats used in computational modeling in biology. COMBINE’s aim is to act as a coordinator, facilitator, and resource for different standardization efforts whose domains of use cover related areas of the computational biology space. In this perspective article, we summarize COMBINE, its general organization, and the community standards and other efforts involved in it. Our goals are to help guide readers toward standards that may be suitable for their research activities, as well as to direct interested readers to relevant communities where they can best expect to receive assistance in how to develop interoperable computational models. PMID:25759811

  8. Modelling and mapping the local distribution of representative species on the Le Danois Bank, El Cachucho Marine Protected Area (Cantabrian Sea)

    NASA Astrophysics Data System (ADS)

    García-Alegre, Ana; Sánchez, Francisco; Gómez-Ballesteros, María; Hinz, Hilmar; Serrano, Alberto; Parra, Santiago

    2014-08-01

    The management and protection of potentially vulnerable species and habitats require the availability of detailed spatial data. However, such data are often not readily available in particular areas that are challenging for sampling by traditional sampling techniques, for example seamounts. Within this study habitat modelling techniques were used to create predictive maps of six species of conservation concern for the Le Danois Bank (El Cachucho Marine Protected Area in the South of the Bay of Biscay). The study used data from ECOMARG multidisciplinary surveys that aimed to create a representative picture of the physical and biological composition of the area. Classical fishing gear (otter trawl and beam trawl) was used to sample benthic communities that inhabit sedimentary areas, and non-destructive visual sampling techniques (ROV and photogrammetric sled) were used to determine the presence of epibenthic macrofauna in complex and vulnerable habitats. Multibeam echosounder data, high-resolution seismic profiles (TOPAS system) and geological data from box-corer were used to characterize the benthic terrain. ArcGIS software was used to produce high-resolution maps (75×75 m2) of such variables in the entire area. The Maximum Entropy (MAXENT) technique was used to process these data and create Habitat Suitability maps for six species of special conservation interest. The model used seven environmental variables (depth, rugosity, aspect, slope, Bathymetric Position Index (BPI) in fine and broad scale and morphosedimentary characteristics) to identify the most suitable habitats for such species and indicates which environmental factors determine their distribution. The six species models performed highly significantly better than random (p<0.0001; Mann-Whitney test) when Area Under the Curve (AUC) values were tested. This indicates that the environmental variables chosen are relevant to distinguish the distribution of these species. The Jackknife test estimated depth
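
    MAXENT itself is a dedicated presence-background package; as a rough stand-in for the workflow described above, the sketch below fits a simple logistic classifier to synthetic presence/background samples with a few environmental predictors and scores it with the AUC. The predictors, data, and choice of classifier are assumptions for illustration, not the study's configuration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Synthetic presence (1) / background (0) samples with three standardized predictors
# standing in for, e.g., depth, slope and rugosity; a logistic classifier is used here
# as a simple stand-in for the MAXENT software applied in the study.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
presence = (rng.random(500) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(int)

model = LogisticRegression().fit(X, presence)
suitability = model.predict_proba(X)[:, 1]   # habitat suitability in [0, 1] per sample
print("AUC:", round(roc_auc_score(presence, suitability), 3))
```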

  9. Panel C report: Standards needed for the use of ISO Open Systems Interconnection - basic reference model

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The use of an International Standards Organization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for the ADS networking planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.

  10. 40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... model year motorcycles. 86.410-2006 Section 86.410-2006 Protection of Environment ENVIRONMENTAL... VEHICLES AND ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...

  11. 40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... model year motorcycles. 86.410-2006 Section 86.410-2006 Protection of Environment ENVIRONMENTAL... VEHICLES AND ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...

  12. 40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... model year motorcycles. 86.410-2006 Section 86.410-2006 Protection of Environment ENVIRONMENTAL... VEHICLES AND ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...

  13. 40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... model year motorcycles. 86.410-2006 Section 86.410-2006 Protection of Environment ENVIRONMENTAL... VEHICLES AND ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...

  14. 40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... model year motorcycles. 86.410-2006 Section 86.410-2006 Protection of Environment ENVIRONMENTAL... VEHICLES AND ENGINES Emission Regulations for 1978 and Later New Motorcycles, General Provisions § 86.410-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...

  15. Standards-Based Evaluation and Teacher Career Satisfaction: A Structural Equation Modeling Analysis

    ERIC Educational Resources Information Center

    Conley, Sharon; Muncey, Donna E.; You, Sukkyung

    2005-01-01

    Structural equation modeling was used to assess the plausibility of a conceptual model specifying hypothesized linkages among perceptions of characteristics of standards-based evaluation, work environment mediators, and career satisfaction and other outcomes. Four comprehensive high schools located in two neighboring counties in southern…

  16. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    ERIC Educational Resources Information Center

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  17. Use of the Probit Model to Estimate School Performance in Student Attainment of Achievement Testing Standards

    ERIC Educational Resources Information Center

    Finch, W. Holmes; Cassady, Jerrell C.

    2014-01-01

    In the USA, trends in educational accountability have driven several models attempting to provide quality data for decision making at the national, state, and local levels, regarding the success of schools in meeting standards for competence. Statistical methods to generate data for such decisions have generally included (a) status models that…
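
    As a generic illustration of the kind of probit fit referred to in the title, the sketch below regresses a binary "met the standard" indicator on a single predictor with statsmodels; the data and the one-predictor specification are invented and do not correspond to the authors' model.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic school-level data: one predictor and a binary indicator of whether each
# school met the achievement-testing standard (illustrative only).
rng = np.random.default_rng(1)
x = rng.normal(size=200)
met_standard = (x + rng.normal(size=200) > 0).astype(int)

X = sm.add_constant(x)
probit_fit = sm.Probit(met_standard, X).fit(disp=False)
print(probit_fit.params)          # intercept and slope on the probit scale
print(probit_fit.predict(X)[:5])  # predicted probabilities of meeting the standard
```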

  19. Stationary distribution and extinction of a stochastic SIRS epidemic model with standard incidence

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Jiang, Daqing; Shi, Ningzhong; Hayat, Tasawar; Alsaedi, Ahmed

    2017-03-01

    In this paper, we consider a stochastic SIRS epidemic model with standard incidence. By constructing a suitable stochastic Lyapunov function, we establish sufficient conditions for the existence of an ergodic stationary distribution of the model. Moreover, we also establish sufficient conditions for extinction of the disease.
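
    For concreteness, one common way of writing a stochastic SIRS model with standard incidence is sketched below; the exact parameterization used by the authors may differ, so this is only an illustrative form. Here Λ is the recruitment rate, μ the natural death rate, β the transmission rate, γ the recovery rate, δ the rate of immunity loss, α the disease-induced death rate, the σ_i are noise intensities, and the B_i(t) are independent Brownian motions:

      \begin{aligned}
      \mathrm{d}S &= \Big[\Lambda - \mu S - \frac{\beta S I}{S+I+R} + \delta R\Big]\,\mathrm{d}t + \sigma_1 S\,\mathrm{d}B_1(t),\\
      \mathrm{d}I &= \Big[\frac{\beta S I}{S+I+R} - (\mu + \alpha + \gamma) I\Big]\,\mathrm{d}t + \sigma_2 I\,\mathrm{d}B_2(t),\\
      \mathrm{d}R &= \big[\gamma I - (\mu + \delta) R\big]\,\mathrm{d}t + \sigma_3 R\,\mathrm{d}B_3(t).
      \end{aligned}

    "Standard incidence" refers to the βSI/(S+I+R) term, as opposed to the bilinear incidence βSI.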

  20. Streptococcus pneumoniae, le transformiste.

    PubMed

    Johnston, Calum; Campo, Nathalie; Bergé, Matthieu J; Polard, Patrice; Claverys, Jean-Pierre

    2014-03-01

    Streptococcus pneumoniae (the pneumococcus) is an important human pathogen. Natural genetic transformation, which was discovered in this species, involves internalization of exogenous single-stranded DNA and its incorporation into the chromosome. It allows acquisition of pathogenicity islands and antibiotic resistance and promotes vaccine escape via capsule switching. This opinion article discusses how recent advances regarding several facets of pneumococcal transformation support the view that the process has evolved to maximize plasticity potential in this species, making the pneumococcus le transformiste of the bacterial kingdom and providing an advantage in the constant struggle between this pathogen and its host.

  1. Standardization Process for Space Radiation Models Used for Space System Design

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Daly, Eamonn; Brautigam, Donald

    2005-01-01

    The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.

  2. Extracting business vocabularies from business process models: SBVR and BPMN standards-based approach

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis

    2013-10-01

    Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common Information Systems Development practice, business modeling activities are still largely empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business-modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
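
    As a rough illustration of the extraction idea (a minimal sketch, not the authors' tool), the following Python snippet collects the names of task-like elements from a BPMN 2.0 XML file as candidate business-vocabulary entries; the input file name and the restriction to task, userTask, and serviceTask elements are assumptions made only for the example:

      import xml.etree.ElementTree as ET

      # OMG BPMN 2.0 model namespace used in .bpmn XML files.
      BPMN_NS = "http://www.omg.org/spec/BPMN/20100524/MODEL"

      def candidate_terms(bpmn_path):
          # Parse the BPMN file and collect the human-readable names of
          # task-like elements as candidate vocabulary entries.
          root = ET.parse(bpmn_path).getroot()
          terms = []
          for tag in ("task", "userTask", "serviceTask"):
              for el in root.iter("{%s}%s" % (BPMN_NS, tag)):
                  name = (el.get("name") or "").strip()
                  if name:
                      terms.append(name)
          return terms

      if __name__ == "__main__":
          # "order_process.bpmn" is a hypothetical example file.
          for term in candidate_terms("order_process.bpmn"):
              print(term)

    A real SBVR extraction would go further, e.g. splitting each name into noun and verb concepts, but that linguistic step is outside this sketch.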

  3. Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia; Zimmerman, Curtis

    2011-01-01

    An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.

  4. Effects of LeY glycan expression on embryo implantation.

    PubMed

    Gu, J; Sui, L-L; Cui, D; Ma, Y-N; Zhu, C-Y; Kong, Y

    2016-08-01

    To investigate the correlation between LeY glycan expression and embryo implantation, uterine epithelial cells were transfected before implantation with FUT1 siRNA to inhibit expression of FUT1 (the gene encoding the key enzyme of LeY synthesis) and were treated with 10 ng/ml leukemia inhibitory factor (LIF). An in vitro murine embryo implantation model was prepared by co-culturing late blastocysts of identical morphology with the treated uterine epithelial cells. RT-PCR, dot blot, and observation of embryo attachment were used to analyze FUT1 gene expression and LeY synthesis in uterine epithelial cells and to study further the correlation between LeY expression level and embryo implantation. FUT1 gene expression and LeY synthesis declined after cells were transfected with FUT1 siRNA, whereas LIF promoted FUT1 expression and LeY synthesis. When FUT1 expression was inhibited, the embryo attachment rate decreased, but LIF up-regulated FUT1 expression and increased the attachment rate. These results indicate that regulating FUT1 expression affects LeY synthesis and that LeY in turn regulates uterus-embryo recognition and attachment, thereby participating in embryo implantation.

  5. A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.

    NASA Astrophysics Data System (ADS)

    Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.

    2012-12-01

    The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information-management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open-source software to lower the adoption barrier and to build on existing, community-supported software. The components of this system have been developed and evaluated to support the data management activities of the interagency Great Lakes Restoration Initiative, the Department of the Interior's Climate Science Centers, and the WaterSmart National Water Census. Much of the research and development of this system has been carried out in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline-specific data types and interfaces. This approach has allowed the adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in creating model parameters and forcings from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking of a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision-support tools that rely on a suite of standard data types and interfaces rather than on particular manually curated model-derived datasets. Recent progress in data and web-service standards related to sensor- and/or model-derived station time series, dynamic web processing, and metadata management is central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system…
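
    As a minimal illustration of the "standard web services" aspect (a sketch under assumed conditions, not the project's actual software), the snippet below issues an OGC-style KVP GetCapabilities request with Python's requests library; the endpoint URL is a placeholder, and the choice of a Sensor Observation Service is made only for the example:

      import requests
      import xml.etree.ElementTree as ET

      # Placeholder endpoint; substitute a real OGC Sensor Observation Service URL.
      SOS_ENDPOINT = "https://example.org/sos"

      def get_capabilities(url=SOS_ENDPOINT):
          # Issue a standard KVP GetCapabilities request and parse the XML reply.
          resp = requests.get(url, params={"service": "SOS", "request": "GetCapabilities"}, timeout=30)
          resp.raise_for_status()
          return ET.fromstring(resp.content)

      if __name__ == "__main__":
          root = get_capabilities()
          # List the top-level sections of the capabilities document.
          for child in root:
              print(child.tag)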

  6. A unification of models for meta-analysis of diagnostic accuracy studies without a gold standard.

    PubMed

    Liu, Yulun; Chen, Yong; Chu, Haitao

    2015-06-01

    Several statistical methods for the meta-analysis of diagnostic accuracy studies have been discussed in the presence of a gold standard. However, in practice, the selected reference test may be imperfect due to measurement error, non-existence, invasive nature, or the expensive cost of a gold standard. It has been suggested that treating an imperfect reference test as a gold standard can lead to substantial bias in the estimation of diagnostic test accuracy. Recently, two models have been proposed to account for an imperfect reference test, namely, a multivariate generalized linear mixed model (MGLMM) and a hierarchical summary receiver operating characteristic (HSROC) model. Both models are very flexible in accounting for heterogeneity in the accuracies of tests across studies as well as the dependence between tests. In this article, we show that these two models, although differently formulated, are closely related and are equivalent in the absence of study-level covariates. Furthermore, we provide the exact relations between the parameters of these two models and the assumptions under which the two models can be reduced to equivalent submodels. On the other hand, we show that some submodels of the MGLMM do not have corresponding equivalent submodels of the HSROC model, and vice versa. With three real examples, we illustrate the cases when fitting the MGLMM and HSROC models leads to equivalent submodels and hence identical inference, and the cases when the inferences from the two models differ slightly. Our results generalize the important relations between the bivariate generalized linear mixed model and the HSROC model when the reference test is a gold standard. © 2014, The International Biometric Society.
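
    To make the setting concrete, a stripped-down latent-class core of such models can be written as follows, assuming conditional independence between the index test (subscript 1) and the imperfect reference test (subscript 2) given disease status, and ignoring the random-effects and covariate structure that the MGLMM and HSROC formulations add. With π_i the disease prevalence in study i and (Se, Sp) the test sensitivities and specificities, the probabilities of the four cross-classified outcomes are

      \begin{aligned}
      p_{11,i} &= \pi_i\,Se_{1i}Se_{2i} + (1-\pi_i)(1-Sp_{1i})(1-Sp_{2i}),\\
      p_{10,i} &= \pi_i\,Se_{1i}(1-Se_{2i}) + (1-\pi_i)(1-Sp_{1i})Sp_{2i},\\
      p_{01,i} &= \pi_i\,(1-Se_{1i})Se_{2i} + (1-\pi_i)Sp_{1i}(1-Sp_{2i}),\\
      p_{00,i} &= \pi_i\,(1-Se_{1i})(1-Se_{2i}) + (1-\pi_i)Sp_{1i}Sp_{2i},
      \end{aligned}

    with the observed 2x2 cell counts in each study modeled as multinomial with these probabilities; the MGLMM then places normal random effects on the logits of the Se and Sp parameters across studies.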

  7. Standard setting with dichotomous and constructed response items: some Rasch model approaches.

    PubMed

    MacCann, Robert G

    2009-01-01

    Using real data comprising responses to both dichotomously scored and constructed-response items, this paper shows how Rasch modeling may be used to facilitate standard setting. The modeling uses Andrich's Extended Logistic Model, which is incorporated into the RUMM software package. After a review of the fundamental equations of the model, an application to Bookmark standard setting is given, showing how to calculate the bookmark difficulty location (BDL) for both dichotomous items and tests containing a mixture of item types. An example showing how the bookmark is set is also discussed. The Rasch model is then applied in various ways to the Angoff standard-setting methods. In the first Angoff approach, the judges' item ratings are compared to Rasch model expected scores, allowing the judges to find items where their ratings differ significantly from the Rasch model values. In the second Angoff approach, the distribution of item ratings is converted to a distribution of possible cutscores, from which a final cutscore may be selected. In the third Angoff approach, the Rasch model provides a comprehensive information set to the judges. For every total score on the test, the model provides a column of item ratings (expected scores) for the ability associated with that total score. The judges consider each column of item ratings as a whole and select the column that best fits the expected pattern of responses of a marginal candidate. The total score corresponding to the selected column is then the performance-band cutscore.
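
    For the dichotomous case (the Extended Logistic Model used in the paper also covers polytomous items), the quantities behind these procedures can be sketched as follows, where θ is ability, b_i the difficulty of item i, and RP the response-probability criterion (commonly 0.67) used in Bookmark standard setting:

      P(X_i = 1 \mid \theta) = \frac{\exp(\theta - b_i)}{1 + \exp(\theta - b_i)},
      \qquad
      \mathrm{BDL}_i = b_i + \ln\frac{RP}{1-RP},
      \qquad
      E[T \mid \theta] = \sum_i P(X_i = 1 \mid \theta).

    A judge's Angoff rating for item i can then be compared with the model-expected item score P(X_i = 1 | θ_cut) at the ability corresponding to the proposed cutscore.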

  8. Search for mono-Higgs signals at the LHC in the B−L supersymmetric standard model

    NASA Astrophysics Data System (ADS)

    Abdallah, W.; Hammad, A.; Khalil, S.; Moretti, S.

    2017-03-01

    We study mono-Higgs signatures emerging in the B−L supersymmetric standard model induced by new channels not present in the minimal supersymmetric standard model, i.e., via topologies in which the mediator is either a heavy Z', with mass of O(2 TeV), or an intermediate h' (the lightest CP-even Higgs state of B−L origin), with a mass of O(0.2 TeV). The mono-Higgs probe considered is the standard model-like Higgs state recently discovered at the Large Hadron Collider, so as to enforce its mass reconstruction for background reduction purposes. With this in mind, its two cleanest signatures are selected: γγ and ZZ* → 4l (l = e, μ). We show how both of these can be accessed with foreseen energy and luminosity options using a dedicated kinematic analysis performed in the presence of partonic, showering, hadronization and detector effects.

  9. Exploring the Standard Model with the High Luminosity, Polarized Electron-Ion Collider

    SciTech Connect

    Milner, Richard G.

    2009-08-04

    The Standard Model is only a few decades old and has been successfully confirmed by experiment, particularly at the high energy frontier. This will continue with renewed vigor at the LHC. However, many important elements of the Standard Model remain poorly understood. In particular, the exploration of the strong interaction theory Quantum Chromodynamics is in its infancy. How does the spin-1/2 of the proton arise from the fundamental quark and gluon constituents? Can we understand the new QCD world of virtual quarks and gluons in the nucleon? Using precision measurements can we test the limits of the Standard Model and look for new physics? To address these and other important questions, physicists have developed a concept for a new type of accelerator, namely a high luminosity, polarized electron-ion collider. Here the scientific motivation is summarized and the accelerator concepts are outlined.

  10. B → K*ℓ+ℓ− decays at large recoil in the Standard Model: a theoretical reappraisal

    NASA Astrophysics Data System (ADS)

    Ciuchini, Marco; Fedele, Marco; Franco, Enrico; Mishima, Satoshi; Paul, Ayan; Silvestrini, Luca; Valli, Mauro

    2016-06-01

    We critically reassess the theoretical uncertainties in the Standard Model calculation of the B → K*ℓ+ℓ− observables, focusing on the low-q² region. We point out that even optimized observables are affected by sizable uncertainties, since hadronic contributions generated by current-current operators with charm are difficult to estimate, especially for q² ∼ 4m_c² ≃ 6.8 GeV². We perform a detailed numerical analysis and present both predictions and results from the fit obtained using the most recent data. We find that non-factorizable power corrections of the expected order of magnitude are sufficient to give a good description of current experimental data within the Standard Model. We discuss in detail the q² dependence of the corrections and their possible interpretation as shifts of the Standard Model Wilson coefficients.

  11. Connecting dark matter annihilation to the vertex functions of Standard Model fermions

    NASA Astrophysics Data System (ADS)

    Kumar, Jason; Light, Christopher

    2017-07-01

    We consider scenarios in which dark matter is a Majorana fermion which couples to Standard Model fermions through the exchange of charged mediating particles. The matrix elements for various dark matter annihilation processes are then related to one-loop corrections to the fermion-photon vertex, where dark matter and the charged mediators run in the loop. In particular, in the limit where Standard Model fermion helicity mixing is suppressed, the cross section for dark matter annihilation to various final states is related to corrections to the Standard Model fermion charge form factor. These corrections can be extracted in a gauge-invariant manner from collider cross sections. Although current measurements from colliders are not precise enough to provide useful constraints on dark matter annihilation, improved measurements at future experiments, such as the International Linear Collider, could improve these constraints by several orders of magnitude, allowing them to surpass the limits obtainable by direct observation.

  12. Parameter recovery, bias and standard errors in the linear ballistic accumulator model.

    PubMed

    Visser, Ingmar; Poessé, Rens

    2017-05-01

    The linear ballistic accumulator (LBA) model (Brown & Heathcote, Cogn. Psychol., 57, 153) is increasingly popular for modelling response times from experimental data. An R package, glba, has been developed to fit the LBA model using maximum likelihood estimation and is validated by means of a parameter recovery study. At sufficient sample sizes parameter recovery is good, whereas at smaller sample sizes there can be large bias in the parameters. In a second simulation study, two methods for computing parameter standard errors are compared. The Hessian-based method is found to be adequate and is (much) faster than the alternative bootstrap method. The use of parameter standard errors in model selection and inference is illustrated in an example using data from an implicit learning experiment (Visser et al., Mem. Cogn., 35, 1502). It is shown that typical implicit learning effects are captured by different parameters of the LBA model. © 2017 The British Psychological Society.
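
    The Hessian-based standard errors referred to here are the usual maximum-likelihood ones, sketched below in general form rather than as the specific glba implementation. With ℓ(θ) the log-likelihood and θ̂ the maximum likelihood estimate,

      \widehat{\mathrm{SE}}(\hat{\theta}_j) = \sqrt{\big[H(\hat{\theta})^{-1}\big]_{jj}},
      \qquad
      H(\hat{\theta}) = -\left.\frac{\partial^2 \ell(\theta)}{\partial \theta\,\partial \theta^{\top}}\right|_{\theta=\hat{\theta}},

    i.e. square roots of the diagonal of the inverse observed information, whereas the bootstrap alternative refits the model to resampled data sets and takes the standard deviation of the resulting parameter estimates.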

  13. Distinguishing Standard Model Extensions using MonoTop Chirality at the LHC

    NASA Astrophysics Data System (ADS)

    Mueller, Ryan; Allahverdi, Rouzbeh; Dalchenko, Mykhailo; Dutta, Bhaskar; Flórez, Andrés; Gao, Yu; Kamon, Teruki; Kolev, Nikolay; Segura, Manuel

    2017-01-01

    Spectral analysis of top quark final states is a promising method for distinguishing physics beyond the standard model (BSM) from the SM. Many BSM scenarios with top quark final states feature top quarks with right- or left-handed polarization. The energy spectrum of the top quark decay products can be used to distinguish the top quark helicity. A Delphes simulation of a minimal standard model extension featuring a color-triplet scalar that decays into a left-handed top and a dark matter (DM) candidate is compared with a right-handed model to demonstrate how such an energy spectrum varies and differentiates models. Both the hadronic and leptonic decay channels of the top quark are considered in the analysis. In the hadronic channel, the right- and left-handed models are separated at 95% CL for a production cross section of 20 fb and 100 fb⁻¹ of integrated luminosity of 13 TeV proton-proton collisions at the LHC.

  14. Adventures in model-building beyond the Standard Model and esoterica in six dimensions

    NASA Astrophysics Data System (ADS)

    Stone, David C.

    This dissertation is most easily understood as two distinct periods of research. The first three chapters are dedicated to phenomenological interests in physics. An anomalous measurement of the top quark forward-backward asymmetry in both detectors at the Tevatron collider is explained by particle content from beyond the Standard Model. The extra field content is assumed to have originated from a grand unified group SU(5), and so only specific content may be added. Methods for spontaneously breaking the R-symmetry of supersymmetric theories, of phenomenological interest for any realistic supersymmetric model, are studied in the context of two-loop Coleman-Weinberg potentials. For a superpotential with a certain structure, which must include two different couplings, a robust method of spontaneously breaking the R-symmetry is established. The phenomenological studies conclude with an isospin analysis of B decays to kaons and pions. When the parameters of the analysis are fit to data, it is seen that an enhancement of matrix elements in certain representations of isospin emerges. This is highly reminiscent of the infamous and unexplained enhancements seen in the K → ππ system. We conjecture that this enhancement may be a universal feature of the flavor group, isospin in this case, rather than of just the K → ππ system. The final two chapters approach the problem of counting degrees of freedom in quantum field theories. We examine the form of the Weyl anomaly in six dimensions with the Weyl consistency conditions. These consistency conditions impose constraints that lead to a candidate for the a-theorem in six dimensions. This candidate has all the properties that the equivalent theorems in two and four dimensions did, and, in fact, we show that in an even number of dimensions the form of the Euler density, the generalized Einstein tensor, and the Weyl transformations guarantee such a candidate exists. We go on to show that, unlike in two and four dimensions

  15. Performance of preproduction model cesium beam frequency standards for spacecraft applications

    NASA Technical Reports Server (NTRS)

    Levine, M. W.

    1978-01-01

    A cesium beam frequency standard for spaceflight application on Navigation Development Satellites was designed and fabricated, and preliminary testing was completed. The cesium standard evolved from an earlier prototype model launched aboard NTS-2 and the engineering development model to be launched aboard NTS satellites during 1979. A number of design innovations, including a hybrid analog/digital integrator and the replacement of analog filters and phase detectors by clocked digital sampling techniques, are discussed. Thermal and thermal-vacuum testing was concluded and test data are presented. Stability data for averaging intervals of 10 to 10,000 seconds, measured under laboratory conditions, are shown.

  16. Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report

    SciTech Connect

    Andersen, J.R.; et al.

    2016-05-16

    This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) the new PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) new developments in Monte Carlo event generators.

  17. Testing non-standard inflationary models with the cosmic microwave background

    NASA Astrophysics Data System (ADS)

    Landau, Susana J.

    2015-03-01

    The emergence of the seeds of cosmic structure from an isotropic and homogeneous universe has not been clearly explained by the standard version of inflationary models. We review a proposal that attempts to deal with this problem by introducing "the self-induced collapse hypothesis". As a consequence of this modification of standard inflationary scenarios, the predicted primordial power spectrum and the CMB spectrum are modified. We show the results of statistical analyses comparing the predictions of these models with recent CMB observations and the matter power spectrum from galaxy surveys.

  18. Standard Model Extension and Casimir effect for fermions at finite temperature

    NASA Astrophysics Data System (ADS)

    Santos, A. F.; Khanna, Faqir C.

    2016-11-01

    Lorentz and CPT symmetries are foundations for important processes in particle physics. Recent studies of the Standard Model Extension (SME) at high energy indicate that these symmetries may be violated. Modifications to the Lagrangian are necessary to achieve a Hermitian Hamiltonian. The fermion sector of the Standard Model Extension is used to calculate the effects of Lorentz and CPT violation on the Casimir effect at zero and finite temperature. The Casimir effect and the Stefan-Boltzmann law at finite temperature are calculated using the thermo field dynamics formalism.

  19. Beyond Standard Model Physics: At the Frontiers of Cosmology and Particle Physics

    NASA Astrophysics Data System (ADS)

    Lopez-Suarez, Alejandro O.

    I begin to write this thesis at a time of great excitement in the fields of cosmology and particle physics. The aim of this thesis is to study and search for beyond-the-standard-model (BSM) physics in the cosmological and high-energy particle fields. There are two main questions that this thesis aims to address: 1) what can we learn about the inflationary epoch utilizing the pioneering gravitational-wave detector Adv. LIGO? and 2) what are the dark matter particle properties and interactions with the standard model particles? This thesis will focus on advances in answering both questions.

  20. Blowout Jets: Hinode X-Ray Jets that Don't Fit the Standard Model

    NASA Technical Reports Server (NTRS)

    Moore, Ronald L.; Cirtain, Jonathan W.; Sterling, Alphonse C.; Falconer, David A.

    2010-01-01

    Nearly half of all H-alpha macrospicules in polar coronal holes appear to be miniature filament eruptions. This suggests that there is a large class of X-ray jets in which the jet-base magnetic arcade undergoes a blowout eruption as in a CME, instead of remaining static as in most solar X-ray jets, the standard jets that fit the model advocated by Shibata. Along with a cartoon depicting the standard model, we present a cartoon depicting the signatures expected of blowout jets in coronal X-ray images. From Hinode/XRT movies and STEREO/EUVI snapshots in polar coronal holes, we present examples of (1) X-ray jets that fit the standard model, and (2) X-ray jets that do not fit the standard model but do have features appropriate for blowout jets. These features are (1) a flare arcade inside the jet-base arcade in addition to the small flare arcade (bright point) outside that standard jets have, (2) a filament of cool (T ≈ 80,000 K) plasma that erupts from the core of the jet-base arcade, and (3) an extra jet strand that should not be made by the reconnection for standard jets but could be made by reconnection between the ambient unipolar open field and the opposite-polarity leg of the filament-carrying flux-rope core field of the erupting jet-base arcade. We therefore infer that these non-standard jets are blowout jets, jets made by miniature versions of the sheared-core-arcade eruptions that make CMEs