Shears, Tara
2012-02-28
The Standard Model is the theory used to describe the interactions between the fundamental particles and the fundamental forces. It is remarkably successful at predicting the outcome of particle physics experiments. However, the theory has not yet been completely verified; in particular, one of its most vital constituents, the Higgs boson, has not yet been observed. This paper describes the Standard Model, the experimental tests of the theory that have led to its acceptance, and its shortcomings.
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Demarque, P.; Kim, Y.-C.; Pinsonneault, M. H.
1992-01-01
A set of solar models has been constructed, each based on a single modification to the physics of a reference solar model. In addition, a model combining several of the improvements has been calculated to provide a best solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The impact of these improvements on both the structure of the model and the frequencies of its low-l p-modes is discussed. It is found that the combined solar model, which is based on the best physics available (and does not contain any ad hoc assumptions), reproduces the observed oscillation spectrum (for low l) within the errors associated with the uncertainties in the model physics (primarily opacities).
Premise for Standardized Sepsis Models.
Remick, Daniel G; Ayala, Alfred; Chaudry, Irshad; Coopersmith, Craig M; Deutschman, Clifford; Hellman, Judith; Moldawer, Lyle; Osuchowski, Marcin
2018-06-05
Sepsis morbidity and mortality exact a toll on patients and contribute significantly to healthcare costs. Preclinical models of sepsis have been used to study disease pathogenesis and test new therapies, but divergent outcomes have been observed with the same treatment even when using the same sepsis model. Other disorders such as diabetes, cancer, malaria, obesity and cardiovascular diseases have used standardized, preclinical models that allow laboratories to compare results. Standardized models accelerate the pace of research, and such models have been used to test new therapies or changes in treatment guidelines. The National Institutes of Health (NIH) mandated that investigators increase data reproducibility and the rigor of scientific experiments and has also issued research funding announcements about the development and refinement of standardized models. Our premise is that refinement and standardization of preclinical sepsis models may accelerate the development and testing of potential therapeutics for human sepsis, as has been the case with preclinical models for other disorders. As a first step towards creating standardized models, we suggest 1) standardizing the technical features of the widely used cecal ligation and puncture model and 2) creating a list of appropriate organ injury and immune dysfunction parameters. Standardized sepsis models could enhance reproducibility, allow comparison of results between laboratories, and may accelerate our understanding of the pathogenesis of sepsis.
Reference and Standard Atmosphere Models
NASA Technical Reports Server (NTRS)
Johnson, Dale L.; Roberts, Barry C.; Vaughan, William W.; Parker, Nelson C. (Technical Monitor)
2002-01-01
This paper describes the development of standard and reference atmosphere models, along with the history of their origin and use since the mid-19th century. The first "Standard Atmospheres" were established by international agreement in the 1920s. Later, some countries, notably the United States, also developed and published "Standard Atmospheres". The term "Reference Atmospheres" is used to identify atmosphere models for specific geographical locations; the Range Reference Atmosphere Models first developed during the 1960s are examples of these descriptions of the atmosphere. This paper discusses the various models, their scope, applications, and limitations relative to use in aerospace industry activities.
Colorado Model Content Standards: Science
ERIC Educational Resources Information Center
Colorado Department of Education, 2007
2007-01-01
The Colorado Model Content Standards for Science specify what all students should know and be able to do in science as a result of their school studies. Specific expectations are given for students completing grades K-2, 3-5, 6-8, and 9-12. Five standards outline the essential level of science knowledge and skills needed by Colorado citizens to…
Standard model of knowledge representation
NASA Astrophysics Data System (ADS)
Yin, Wensheng
2016-09-01
Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic links between these knowledge representation methods, a unified knowledge representation model is necessary. Drawing on ontology, system theory, and control theory, a standard model of knowledge representation that reflects change in the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms, can express process knowledge, and at the same time has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle imprecise and inconsistent knowledge.
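The input-processing-output structure described above can be sketched as a minimal data structure. The class and field names below are illustrative assumptions, not the author's notation.

```python
# Minimal sketch of an input-processing-output knowledge unit, as described in
# the abstract above. Names here are illustrative assumptions only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class KnowledgeUnit:
    name: str
    process: Callable[[dict], dict]   # transformation from input state to output state

    def apply(self, inputs: dict) -> dict:
        return self.process(inputs)

# Example: process knowledge ("heating raises temperature") as a state transform.
heat = KnowledgeUnit("heat", lambda s: {**s, "temperature": s["temperature"] + 10})
print(heat.apply({"temperature": 20}))   # → {'temperature': 30}
```

Representing a piece of knowledge as a state transformation is one way to capture "process knowledge" alongside static facts.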
Model Standards Advance the Profession
ERIC Educational Resources Information Center
Journal of Staff Development, 2011
2011-01-01
Leadership by teachers is essential to serving the needs of students, schools, and the teaching profession. To that end, the Teacher Leadership Exploratory Consortium has developed Teacher Leader Model Standards to codify, promote, and support teacher leadership as a vehicle to transform schools for the needs of the 21st century. The Teacher…
Modular modelling with Physiome standards
Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.
2016-01-01
Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks, but relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples, including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole-cell models and linking such models in multi-scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set
Asymptotically safe standard model extensions?
NASA Astrophysics Data System (ADS)
Pelaggi, Giulio Maria; Plascencia, Alexis D.; Salvio, Alberto; Sannino, Francesco; Smirnov, Juri; Strumia, Alessandro
2018-05-01
We consider theories with a large number NF of charged fermions and compute the renormalization group equations for the gauge, Yukawa and quartic couplings, resummed at leading order in 1/NF. We construct extensions of the standard model where SU(2) and/or SU(3) are asymptotically safe. When the same procedure is applied to the Abelian U(1) factor, we find that the Higgs quartic coupling cannot be made asymptotically safe and stay perturbative at the same time.
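The notion of an interacting ultraviolet fixed point can be illustrated with a toy one-coupling beta function. This is a caricature under assumed coefficients b0 > 0, b1 < 0, not the paper's resummed 1/NF result: with β(g) = b0 g³ + b1 g⁵, the coupling flows in the UV to g* = √(−b0/b1), where β vanishes.

```python
import math

# Toy beta function beta(g) = b0*g**3 + b1*g**5 with b0 > 0, b1 < 0.
# An illustrative caricature of an asymptotically safe coupling,
# NOT the resummed 1/NF beta function of the paper above.
B0, B1 = 0.5, -0.1
G_STAR = math.sqrt(-B0 / B1)  # UV fixed point, where beta(g*) = 0

def beta(g):
    return B0 * g**3 + B1 * g**5

def run_coupling(g0, t_max=200.0, dt=0.01):
    """Integrate dg/dt = beta(g) toward the UV (t = ln mu), forward Euler."""
    g = g0
    for _ in range(int(t_max / dt)):
        g += dt * beta(g)
    return g

g_uv = run_coupling(0.5)
print(f"g* = {G_STAR:.4f}, g(UV) = {g_uv:.4f}")
```

Starting from a weak coupling, the flow is driven toward g* instead of diverging, which is the qualitative content of asymptotic safety.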
Extensions of the standard model
Ramond, P.
1983-01-01
In these lectures we focus on several issues that arise in theoretical extensions of the standard model. First we describe the kinds of fermions that can be added to the standard model without affecting known phenomenology. We focus in particular on three types: the vector-like completion of the existing fermions as would be predicted by a Kaluza-Klein type theory, which we find cannot be realistically achieved without some chiral symmetry; fermions which are vector-like by themselves, such as do appear in supersymmetric extensions, and finally anomaly-free chiral sets of fermions. We note that a chiral symmetry, such as the Peccei-Quinnmore » symmetry can be used to produce a vector-like theory which, at scales less than M/sub W/, appears to be chiral. Next, we turn to the analysis of the second hierarchy problem which arises in Grand Unified extensions of the standard model, and plays a crucial role in proton decay of supersymmetric extensions. We review the known mechanisms for avoiding this problem and present a new one which seems to lead to the (family) triplication of the gauge group. Finally, this being a summer school, we present a list of homework problems. 44 references.« less
Consistency Across Standards or Standards in a New Business Model
NASA Technical Reports Server (NTRS)
Russo, Dane M.
2010-01-01
Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality and appendices of government and non-governmental human factors standards.
NASA Astrophysics Data System (ADS)
Gunion, John F.; Han, Tao; Ohnemus, James
1995-08-01
The Table of Contents for the book is as follows: * Preface * Organizing and Advisory Committees * PLENARY SESSIONS * Looking Beyond the Standard Model from LEP1 and LEP2 * Virtual Effects of Physics Beyond the Standard Model * Extended Gauge Sectors * CLEO's Views Beyond the Standard Model * On Estimating Perturbative Coefficients in Quantum Field Theory and Statistical Physics * Perturbative Corrections to Inclusive Heavy Hadron Decay * Some Recent Developments in Sphalerons * Searching for New Matter Particles at Future Colliders * Issues in Dynamical Supersymmetry Breaking * Present Status of Fermilab Collider Accelerator Upgrades * The Extraordinary Scientific Opportunities from Upgrading Fermilab's Luminosity ≥ 1033 cm-2 sec-1 * Applications of Effective Lagrangians * Collider Phenomenology for Strongly Interacting Electroweak Sector * Physics of Self-Interacting Electroweak Bosons * Particle Physics at a TeV-Scale e+e- Linear Collider * Physics at γγ and eγ Colliders * Challenges for Non-Minimal Higgs Searchers at Future Colliders * Physics Potential and Development of μ+μ- Colliders * Beyond Standard Quantum Chromodynamics * Extracting Predictions from Supergravity/Superstrings for the Effective Theory Below the Planck Scale * Non-Universal SUSY Breaking, Hierarchy and Squark Degeneracy * Supersymmetric Phenomenology in the Light of Grand Unification * A Survey of Phenomenological Constraints on Supergravity Models * Precision Tests of the MSSM * The Search for Supersymmetry * Neutrino Physics * Neutrino Mass: Oscillations and Hot Dark Matter * Dark Matter and Large-Scale Structure * Electroweak Baryogenesis * Progress in Searches for Non-Baryonic Dark Matter * Big Bang Nucleosynthesis * Flavor Tests of Quark-Lepton * Where are We Coming from? What are We? Where are We Going? * Summary, Perspectives * PARALLEL SESSIONS * SUSY Phenomenology I * Is Rb Telling us that Superpartners will soon be Discovered? * Dark Matter in Constrained Minimal
Standard for Models and Simulations
NASA Technical Reports Server (NTRS)
Steele, Martin J.
2016-01-01
This NASA Technical Standard establishes uniform practices in modeling and simulation to ensure essential requirements are applied to the design, development, and use of models and simulations (MS), while ensuring acceptance criteria are defined by the program project and approved by the responsible Technical Authority. It also provides an approved set of requirements, recommendations, and criteria with which MS may be developed, accepted, and used in support of NASA activities. As the MS disciplines employed and application areas involved are broad, the common aspects of MS across all NASA activities are addressed. The discipline-specific details of a given MS should be obtained from relevant recommended practices. The primary purpose is to reduce the risks associated with MS-influenced decisions by ensuring the complete communication of the credibility of MS results.
Cosmology beyond the Standard Model
NASA Astrophysics Data System (ADS)
Wells, Christopher M.
The Standard Model of Cosmology, like its particle physics counterpart, is incomplete in its present form theoretically and observationally. Additional structure, in the form of an early period of accelerated expansion (inflation), is suggested by the special initial conditions required to produce the visible universe. Furthermore, a wide variety of indirect observations indicate that 80% of the mass in the universe is dark. In this thesis, we construct a class of inflation models free from the usual pathologies. In particular, we build a novel realization of hybrid inflation, in which both the inflaton and waterfall degrees of freedom are moduli of a higher dimensional compactification. Because the inflationary fields are realized as global degrees of freedom in the extra dimension, they are protected from the 4D quantum corrections that would otherwise spoil inflation. Via the Ads/CFT correspondence we can relate our construction to a dual theory of composite inflationary degrees of freedom. We then turn to studying the problem of missing matter in the Standard Cosmology. Despite an abundance of indirect observations of dark matter, direct detection experiments have produced conflicting results which seem to point to a more complicated dark sector. In this thesis, we propose that dark matter be made up primarily of non-relativistic bound states, i.e. dark atoms. We explore the atomic parameter space allowed by the demands that dark matter is predominantly atomic and that the dark atoms and ions satisfy observational bounds on dark matter self-interactions. We then study possible interactions between dark matter and normal matter such that dark atoms scatter inelastically from nuclei in direct detection experiments.
Constrained exceptional supersymmetric standard model
Athron, P.; King, S. F.; Miller, D. J.
2009-08-01
We propose and study a constrained version of the exceptional supersymmetric standard model (E{sub 6}SSM), which we call the cE{sub 6}SSM, based on a universal high energy scalar mass m{sub 0}, trilinear scalar coupling A{sub 0} and gaugino mass M{sub 1/2}. We derive the renormalization group (RG) Equations for the cE{sub 6}SSM, including the extra U(1){sub N} gauge factor and the low-energy matter content involving three 27 representations of E{sub 6}. We perform a numerical RG analysis for the cE{sub 6}SSM, imposing the usual low-energy experimental constraints and successful electroweak symmetry breaking. Our analysis reveals that the sparticle spectrum ofmore » the cE{sub 6}SSM involves a light gluino, two light neutralinos, and a light chargino. Furthermore, although the squarks, sleptons, and Z{sup '} boson are typically heavy, the exotic quarks and squarks can also be relatively light. We finally specify a set of benchmark points, which correspond to particle spectra, production modes, and decay patterns peculiar to the cE{sub 6}SSM, altogether leading to spectacular new physics signals at the Large Hadron Collider.« less
Colorado Model Content Standards: Economics.
ERIC Educational Resources Information Center
Colorado State Dept. of Education, Denver.
The goal of these Colorado state economics content standards is for students, by the time they graduate from high school, to understand economics well enough to make good judgments about personal economic questions and about economic policy in a complex and changing world. Students need an understanding of basic economic concepts to become…
A standard satellite control reference model
NASA Technical Reports Server (NTRS)
Golden, Constance
1994-01-01
This paper describes a Satellite Control Reference Model that provides the basis for an approach to identify where standards would be beneficial in supporting space operations functions. The background and context for the development of the model and the approach are described. A process for using this reference model to trace top level interoperability directives to specific sets of engineering interface standards that must be implemented to meet these directives is discussed. Issues in developing a 'universal' reference model are also identified.
Modeling in the Common Core State Standards
ERIC Educational Resources Information Center
Tam, Kai Chung
2011-01-01
The inclusion of modeling and applications into the mathematics curriculum has proven to be a challenging task over the last fifty years. The Common Core State Standards (CCSS) has made mathematical modeling both one of its Standards for Mathematical Practice and one of its Conceptual Categories. This article discusses the need for mathematical…
Physics beyond the Standard Model
NASA Astrophysics Data System (ADS)
Lach, Theodore
2011-04-01
Recent discoveries of the excited states of the Bs** meson along with the discovery of the omega-b-minus have brought into popular acceptance the concept of the orbiting quarks predicted by the Checker Board Model (CBM) 14 years ago. Back then the concept of orbiting quarks was not fashionable. Recent estimates of velocities of these quarks inside the proton and neutron are in excess of 90% the speed of light also in agreement with the CBM model. Still a 2D structure of the nucleus has not been accepted nor has it been proven wrong. The CBM predicts masses of the up and dn quarks are 237.31 MeV and 42.392 MeV respectively and suggests that a lighter generation of quarks u and d make up a different generation of quarks that make up light mesons. The CBM also predicts that the T' and B' quarks do exist and are not as massive as might be expected. (this would make it a 5G world in conflict with the SM) The details of the CB model and prediction of quark masses can be found at: http://checkerboard.dnsalias.net/ (1). T.M. Lach, Checkerboard Structure of the Nucleus, Infinite Energy, Vol. 5, issue 30, (2000). (2). T.M. Lach, Masses of the Sub-Nuclear Particles, nucl-th/0008026, @http://xxx.lanl.gov/.
Colorado Model Content Standards: Foreign Language.
ERIC Educational Resources Information Center
Colorado State Dept. of Education, Denver.
The model course content standards for foreign language instruction in Colorado's public schools, K-12, provide guidelines, not curriculum, for school districts to design language programs. An introductory section presents some basic considerations in program design. The two general standards for foreign language performance are that: (1) students…
Standardized Tests and Froebel's Original Kindergarten Model
ERIC Educational Resources Information Center
Jeynes, William H.
2006-01-01
The author argues that American educators rely on standardized tests at too early an age when administered in kindergarten, particularly given the original intent of kindergarten as envisioned by its founder, Friedrich Froebel. The author examines the current use of standardized tests in kindergarten and the Froebel model, including his emphasis…
Wisconsin's Model Academic Standards for Music.
ERIC Educational Resources Information Center
Nikolay, Pauli; Grady, Susan; Stefonek, Thomas
To assist parents and educators in preparing students for the 21st century, Wisconsin citizens have become involved in the development of challenging academic standards in 12 curricular areas. Having clear standards for students and teachers makes it possible to develop rigorous local curricula and valid, reliable assessments. This model of…
Threshold Effects Beyond the Standard Model
NASA Astrophysics Data System (ADS)
Taylor, T. R.
In this contribution to the Festschrift celebrating Gabriele Veneziano on his 65th birthday, I discuss the threshold effects of extra dimensions and their applications to physics beyond the standard model, focusing on superstring theory.
Standard Model Background of the Cosmological Collider.
Chen, Xingang; Wang, Yi; Xianyu, Zhong-Zhi
2017-06-30
The inflationary universe can be viewed as a "cosmological collider" with an energy of the Hubble scale, producing very massive particles and recording their characteristic signals in primordial non-Gaussianities. To utilize this collider to explore any new physics at very high scales, it is a prerequisite to understand the background signals from the particle physics standard model. In this Letter we describe the standard model background of the cosmological collider.
Supersymmetric preons and the standard model
NASA Astrophysics Data System (ADS)
Raitio, Risto
2018-06-01
The experimental fact that standard model superpartners have not been observed compels one to consider an alternative implementation for supersymmetry. The basic supermultiplet proposed here consists of a photon and a charged spin 1/2 preon field, and their superpartners. These fields are shown to yield the standard model fermions, Higgs fields and gauge symmetries. Supersymmetry is defined for unbound preons only. Quantum group SLq (2) representations are introduced to classify topologically scalars, preons, quarks and leptons.
Exploring the Standard Model of Particles
ERIC Educational Resources Information Center
Johansson, K. E.; Watkins, P. M.
2013-01-01
With the recent discovery of a new particle at the CERN Large Hadron Collider (LHC) the Higgs boson could be about to be discovered. This paper provides a brief summary of the standard model of particle physics and the importance of the Higgs boson and field in that model for non-specialists. The role of Feynman diagrams in making predictions for…
Estimating standard errors in feature network models.
Frank, Laurence E; Heiser, Willem J
2007-05-01
Feature network models are graphical structures that represent proximity data in a discrete space while using the same formalism that is the basis of least squares methods employed in multidimensional scaling. Existing methods to derive a network model from empirical data only give the best-fitting network and yield no standard errors for the parameter estimates. The additivity properties of networks make it possible to consider the model as a univariate (multiple) linear regression problem with positivity restrictions on the parameters. In the present study, both theoretical and empirical standard errors are obtained for the constrained regression parameters of a network model with known features. The performance of both types of standard error is evaluated using Monte Carlo techniques.
Development of NASA's Models and Simulations Standard
NASA Technical Reports Server (NTRS)
Bertch, William J.; Zang, Thomas A.; Steele, Martin J.
2008-01-01
From the Space Shuttle Columbia Accident Investigation, there were several NASA-wide actions that were initiated. One of these actions was to develop a standard for development, documentation, and operation of Models and Simulations. Over the course of two-and-a-half years, a team of NASA engineers, representing nine of the ten NASA Centers developed a Models and Simulation Standard to address this action. The standard consists of two parts. The first is the traditional requirements section addressing programmatics, development, documentation, verification, validation, and the reporting of results from both the M&S analysis and the examination of compliance with this standard. The second part is a scale for evaluating the credibility of model and simulation results using levels of merit associated with 8 key factors. This paper provides an historical account of the challenges faced by and the processes used in this committee-based development effort. This account provides insights into how other agencies might approach similar developments. Furthermore, we discuss some specific applications of models and simulations used to assess the impact of this standard on future model and simulation activities.
Neutrino in standard model and beyond
NASA Astrophysics Data System (ADS)
Bilenky, S. M.
2015-07-01
After discovery of the Higgs boson at CERN the Standard Model acquired a status of the theory of the elementary particles in the electroweak range (up to about 300 GeV). What general conclusions can be inferred from the Standard Model? It looks that the Standard Model teaches us that in the framework of such general principles as local gauge symmetry, unification of weak and electromagnetic interactions and Brout-Englert-Higgs spontaneous breaking of the electroweak symmetry nature chooses the simplest possibilities. Two-component left-handed massless neutrino fields play crucial role in the determination of the charged current structure of the Standard Model. The absence of the right-handed neutrino fields in the Standard Model is the simplest, most economical possibility. In such a scenario Majorana mass term is the only possibility for neutrinos to be massive and mixed. Such mass term is generated by the lepton-number violating Weinberg effective Lagrangian. In this approach three Majorana neutrino masses are suppressed with respect to the masses of other fundamental fermions by the ratio of the electroweak scale and a scale of a lepton-number violating physics. The discovery of the neutrinoless double β-decay and absence of transitions of flavor neutrinos into sterile states would be evidence in favor of the minimal scenario we advocate here.
The Standard Model and Higgs physics
NASA Astrophysics Data System (ADS)
Torassa, Ezio
2018-05-01
The Standard Model is a consistent and computable theory that successfully describes the elementary particle interactions. The strong, electromagnetic and weak interactions have been included in the theory exploiting the relation between group symmetries and group generators, in order to smartly introduce the force carriers. The group properties lead to constraints between boson masses and couplings. All the measurements performed at the LEP, Tevatron, LHC and other accelerators proved the consistency of the Standard Model. A key element of the theory is the Higgs field, which together with the spontaneous symmetry breaking, gives mass to the vector bosons and to the fermions. Unlike the case of vector bosons, the theory does not provide prediction for the Higgs boson mass. The LEP experiments, while providing very precise measurements of the Standard Model theory, searched for the evidence of the Higgs boson until the year 2000. The discovery of the top quark in 1994 by the Tevatron experiments and of the Higgs boson in 2012 by the LHC experiments were considered as the completion of the fundamental particles list of the Standard Model theory. Nevertheless the neutrino oscillations, the dark matter and the baryon asymmetry in the Universe evidence that we need a new extended model. In the Standard Model there are also some unattractive theoretical aspects like the divergent loop corrections to the Higgs boson mass and the very small Yukawa couplings needed to describe the neutrino masses. For all these reasons, the hunt of discrepancies between Standard Model and data is still going on with the aim to finally describe the new extended theory.
Standard Model as a Double Field Theory.
Choi, Kang-Sin; Park, Jeong-Hyuck
2015-10-23
We show that, without any extra physical degree introduced, the standard model can be readily reformulated as a double field theory. Consequently, the standard model can couple to an arbitrary stringy gravitational background in an O(4,4) T-duality covariant manner and manifest two independent local Lorentz symmetries, Spin(1,3)×Spin(3,1). While the diagonal gauge fixing of the twofold spin groups leads to the conventional formulation on the flat Minkowskian background, the enhanced symmetry makes the standard model more rigid, and also stringy, than it appeared. The CP violating θ term may no longer be allowed by the symmetry, and hence the strong CP problem can be solved. There are now stronger constraints imposed on the possible higher order corrections. We speculate that the quarks and the leptons may belong to the two different spin classes.
New Physics Beyond the Standard Model
NASA Astrophysics Data System (ADS)
Cai, Haiying
In this thesis we discuss several extensons of the standard model, with an emphasis on the hierarchy problem. The hierachy problem related to the Higgs boson mass is a strong indication of new physics beyond the Standard Model. In the literature, several mechanisms, e.g. , supersymmetry (SUSY), the little Higgs and extra dimensions, are proposed to explain why the Higgs mass can be stabilized to the electroweak scale. In the Standard Model, the largest quadratically divergent contribution to the Higgs mass-squared comes from the top quark loop. We consider a few novel possibilities on how this contribution is cancelled. In the standard SUSY scenario, the quadratic divergence from the fermion loops is cancelled by the scalar superpartners and the SUSY breaking scale determines the masses of the scalars. We propose a new SUSY model, where the superpartner of the top quark is spin-1 rather than spin-0. In little Higgs theories, the Higgs field is realized as a psudo goldstone boson in a nonlinear sigma model. The smallness of its mass is protected by the global symmetry. As a variation, we put the little Higgs into an extra dimensional model where the quadratically divergent top loop contribution to the Higgs mass is cancelled by an uncolored heavy "top quirk" charged under a different SU(3) gauge group. Finally, we consider a supersymmetric warped extra dimensional model where the superpartners have continuum mass spectra. We use the holographic boundary action to study how a mass gap can arise to separate the zero modes from continuum modes. Such extensions of the Standard Model have novel signatures at the Large Hadron Collider.
Asymptotically Safe Standard Model via Vectorlike Fermions.
Mann, R B; Meffe, J R; Sannino, F; Steele, T G; Wang, Z W; Zhang, C
2017-12-29
We construct asymptotically safe extensions of the standard model by adding gauged vectorlike fermions. Using large number-of-flavor techniques we argue that all gauge couplings, including the hypercharge and, under certain conditions, the Higgs coupling, can achieve an interacting ultraviolet fixed point.
Addressing Beyond Standard Model physics using cosmology
NASA Astrophysics Data System (ADS)
Ghalsasi, Akshay
We have consensus models for both particle physics (the standard model) and cosmology (ΛCDM). Given certain assumptions about the initial conditions of the universe, the marriage of the standard model (SM) of particle physics and ΛCDM cosmology has been phenomenally successful in describing the universe we live in. However, it is quite clear that all is not well. The three biggest problems the SM faces today are baryogenesis, dark matter, and dark energy. These problems, along with the problem of neutrino masses, indicate the existence of physics beyond the SM. Evidence for baryogenesis, dark matter, and dark energy all comes from astrophysical and cosmological observations, and cosmology also provides the best (model-dependent) constraints on neutrino masses. In this thesis I address the following problems: 1) the origin of dark energy (DE), using a non-standard neutrino cosmology, and the effects of that cosmology on terrestrial and cosmological experiments; and 2) the matter-antimatter asymmetry of the universe.
Inflation in the standard cosmological model
NASA Astrophysics Data System (ADS)
Uzan, Jean-Philippe
2015-12-01
The inflationary paradigm is now part of the standard cosmological model as a description of its primordial phase. While its original motivation was to solve the standard problems of the hot big bang model, it was soon understood that it offers a natural theory for the origin of the large-scale structure of the universe. Most models rely on a slowly rolling scalar field and enjoy very generic predictions. Moreover, all the matter of the universe is produced by the decay of the inflaton field at the end of inflation, during a phase of reheating. These predictions can be (and are) tested through their imprint on the large-scale structure and in particular the cosmic microwave background. Inflation stands as a window on physics where both general relativity and quantum field theory are at work and which can be observationally studied; it connects cosmology with high-energy physics. Today most models are constructed within extensions of the standard model, such as supersymmetry or string theory. Inflation has also disrupted our vision of the universe, in particular through the ideas of chaotic inflation and eternal inflation, which tend to promote the image of a very inhomogeneous universe with fractal structure on large scales. This idea is also at the heart of further speculations, such as the multiverse. This introduction summarizes the connections between inflation and the hot big bang model and details the basics of its dynamics and predictions.
Study on Standard Fatigue Vehicle Load Model
NASA Astrophysics Data System (ADS)
Huang, H. Y.; Zhang, J. P.; Li, Y. H.
2018-02-01
Based on measured truck data from three arterial expressways in Guangdong Province, a statistical analysis of truck weight was conducted according to axle number. A standard fatigue vehicle model applicable to industrial areas was obtained using the equivalence damage principle, Miner's linear accumulation law, the water discharge method, and damage ratio theory. Compared with the fatigue vehicle model specified by the current bridge design code, the proposed model has better applicability and is of reference value for the fatigue design of bridges in China.
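The damage-equivalence step above can be sketched numerically. Under Miner's linear accumulation law with an S-N curve of slope m, fatigue damage per vehicle passage scales as (vehicle weight)^m, so the standard fatigue vehicle is the constant weight that reproduces the damage of the measured weight spectrum. The exponent and the weight histogram below are illustrative assumptions, not the paper's measured data:

```python
# Minimal sketch of deriving a standard fatigue vehicle weight via Miner's rule.
# Damage per passage ~ W^m, so the equivalent constant-amplitude weight is the
# m-th root of the frequency-weighted mean of W^m.

def equivalent_fatigue_weight(weights, frequencies, m=3.0):
    """Constant vehicle weight producing the same Miner damage as the observed
    spectrum; m = 3 is a typical S-N slope for welded steel details (assumed)."""
    total = sum(frequencies)
    mean_damage = sum(f * w**m for w, f in zip(weights, frequencies)) / total
    return mean_damage ** (1.0 / m)

weights = [100, 250, 400, 550]    # gross vehicle weights in kN (illustrative)
freqs = [0.50, 0.30, 0.15, 0.05]  # relative frequencies (illustrative)
w_eq = equivalent_fatigue_weight(weights, freqs)
```

Because damage grows with the m-th power of the load, the equivalent weight lands well above the arithmetic mean: the rare heavy trucks dominate the fatigue damage.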
Electroweak standard model with very special relativity
NASA Astrophysics Data System (ADS)
Alfaro, Jorge; González, Pablo; Ávila, Ricardo
2015-05-01
The very special relativity electroweak Standard Model (VSR EW SM) is a theory with SU(2)_L × U(1)_R symmetry, with the same number of leptons and gauge fields as in the usual Weinberg-Salam model. No new particles are introduced. The model is renormalizable and unitarity is preserved. However, photons obtain mass, and the massive bosons obtain different masses for different polarizations. In addition, neutrino masses are generated: a VSR-invariant term will produce neutrino oscillations, and new processes are allowed. In particular, we compute the rate of the decay μ → e + γ. All these processes, which are forbidden in the electroweak Standard Model, put stringent bounds on the parameters of our model and measure the violation of Lorentz invariance. We investigate the canonical quantization of this nonlocal model. Second quantization is carried out, and we obtain a well-defined particle content. Additionally, we count the degrees of freedom associated with the gauge bosons involved in this work after spontaneous symmetry breaking has been realized. Violations of Lorentz invariance have been predicted by several theories of quantum gravity [J. Alfaro, H. Morales-Tecotl, and L. F. Urrutia, Phys. Rev. Lett. 84, 2318 (2000); Phys. Rev. D 65, 103509 (2002)]. It is a remarkable possibility that the low-energy effects of Lorentz violation induced by quantum gravity could be contained in the nonlocal terms of the VSR EW SM.
Temperature dependence of standard model CP violation.
Brauner, Tomáš; Taanila, Olli; Tranberg, Anders; Vuorinen, Aleksi
2012-01-27
We analyze the temperature dependence of CP violation effects in the standard model by determining the effective action of its bosonic fields, obtained after integrating out the fermions from the theory and performing a covariant gradient expansion. We find nonvanishing CP violating terms starting at the sixth order of the expansion, albeit only in the C-odd-P-even sector, with coefficients that depend on quark masses, Cabibbo-Kobayashi-Maskawa matrix elements, temperature and the magnitude of the Higgs field. The CP violating effects are observed to decrease rapidly with temperature, which has important implications for the generation of a matter-antimatter asymmetry in the early Universe. Our results suggest that the cold electroweak baryogenesis scenario may be viable within the standard model, provided the electroweak transition temperature is at most of order 1 GeV.
Indoorgml - a Standard for Indoor Spatial Modeling
NASA Astrophysics Data System (ADS)
Li, Ki-Joune
2016-06-01
With recent progress in mobile devices and indoor positioning technologies, it has become possible to provide location-based services in indoor space as well as outdoor space, either seamlessly across both or independently for indoor space alone. However, we cannot simply apply spatial models developed for outdoor space to indoor space, due to their differences. For example, coordinate reference systems are employed to indicate a specific position in outdoor space, while a location in indoor space is instead specified by a cell identifier, such as a room number. Unlike outdoor space, the distance between two points in indoor space is determined not by the length of the straight line between them but by the constraints imposed by indoor components such as walls, stairs, and doors. For this reason, we need to establish a new framework for indoor space, from fundamental theory through indoor spatial data models to information systems that store, manage, and analyse indoor spatial data. To provide this framework, an international standard called IndoorGML has been developed and published by the OGC (Open Geospatial Consortium). The standard is based on a cellular notion of space, which considers an indoor space as a set of non-overlapping cells. It consists of two types of modules: a core module and extension modules. The core module comprises four basic conceptual and implementation modeling components (a geometric model for cells, topology between cells, a semantic model of cells, and a multi-layered space model), while extension modules may be defined on top of the core module to support particular application areas. For the first version of the standard, we provide an extension for indoor navigation.
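The cellular model's notion of distance can be made concrete with a small sketch: cells become graph nodes, doors become weighted edges, and indoor distance is the shortest path through the connectivity graph rather than the straight line. The room names and distances below are invented for illustration; IndoorGML itself specifies the data model, not this particular algorithm:

```python
# Sketch of indoor distance over a cellular space model (invented layout):
# cells are nodes, doors are edges weighted by walking distance in metres.
import heapq

doors = {
    "room101": {"corridor": 3.0},
    "room102": {"corridor": 2.5},
    "corridor": {"room101": 3.0, "room102": 2.5, "stairs": 10.0},
    "stairs": {"corridor": 10.0},
}

def indoor_distance(graph, start, goal):
    """Dijkstra shortest path over the cell-connectivity graph."""
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, cell = heapq.heappop(queue)
        if cell == goal:
            return d
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(cell, {}).items():
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(queue, (nd, nxt))
    return float("inf")

d = indoor_distance(doors, "room101", "room102")  # 5.5 m via the corridor
```

Two adjacent rooms may be metres apart through a wall yet far apart along the graph, which is exactly the distinction the abstract draws between indoor and outdoor distance.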
Beyond standard model calculations with Sherpa
Höche, Stefan; Kuttimalai, Silvan; Schumann, Steffen; ...
2015-03-24
We present a fully automated framework as part of the Sherpa event generator for the computation of tree-level cross sections in beyond Standard Model scenarios, making use of model information given in the Universal FeynRules Output format. Elementary vertices are implemented into C++ code automatically and provided to the matrix-element generator Comix at runtime. Widths and branching ratios for unstable particles are computed from the same building blocks. The corresponding decays are simulated with spin correlations. Parton showers, QED radiation and hadronization are added by Sherpa, providing a full simulation of arbitrary BSM processes at the hadron level.
Extended spin symmetry and the standard model
NASA Astrophysics Data System (ADS)
Besprosvany, J.; Romero, R.
2010-12-01
We review unification ideas and explain the spin-extended model in this context. Its consideration is also motivated by the standard-model puzzles. With the aim of constructing a common description of discrete degrees of freedom, as spin and gauge quantum numbers, the model departs from q-bits and generalized Hilbert spaces. Physical requirements reduce the space to one that is represented by matrices. The classification of the representations is performed through Clifford algebras, with its generators associated with Lorentz and scalar symmetries. We study a reduced space with up to two spinor elements within a matrix direct product. At given dimension, the demand that Lorentz symmetry be maintained, determines the scalar symmetries, which connect to vector-and-chiral gauge-interacting fields; we review the standard-model information in each dimension. We obtain fermions and bosons, with matter fields in the fundamental representation, radiation fields in the adjoint, and scalar particles with the Higgs quantum numbers. We relate the fields' representation in such spaces to the quantum-field-theory one, and the Lagrangian. The model provides a coupling-constant definition.
Standard model EFT and extended scalar sectors
Dawson, Sally; Murphy, Christopher W.
2017-07-31
One of the simplest extensions of the Standard Model is the inclusion of an additional scalar multiplet, and we consider scalars in the SU(2)_L singlet, triplet, and quartet representations. Here, we examine models with heavy neutral scalars, m_H ~ 1-2 TeV, and the matching of the UV-complete theories to the low-energy effective field theory. We also demonstrate the agreement of the kinematic distributions obtained in the singlet models for the gluon fusion of a Higgs pair with the predictions of the effective field theory. Finally, the restrictions on the extended scalar sectors due to unitarity and precision electroweak measurements are summarized and lead to highly restricted regions of viable parameter space for the triplet and quartet models.
Conformal standard model, leptogenesis, and dark matter
NASA Astrophysics Data System (ADS)
Lewandowski, Adrian; Meissner, Krzysztof A.; Nicolai, Hermann
2018-02-01
The conformal standard model is a minimal extension of the Standard Model (SM) of particle physics based on the assumed absence of large intermediate scales between the TeV scale and the Planck scale, which incorporates only right-chiral neutrinos and a new complex scalar in addition to the usual SM degrees of freedom, but no other features such as supersymmetric partners. In this paper, we present a comprehensive quantitative analysis of this model, and show that all outstanding issues of particle physics proper can in principle be solved "in one go" within this framework. This includes in particular the stabilization of the electroweak scale, "minimal" leptogenesis and the explanation of dark matter, with a small mass and very weakly interacting Majoron as the dark matter candidate (for which we propose to use the name "minoron"). The main testable prediction of the model is a new and almost sterile scalar boson that would manifest itself as a narrow resonance in the TeV region. We give a representative range of parameter values consistent with our assumptions and with observation.
The computation of standard solar models
NASA Technical Reports Server (NTRS)
Ulrich, Roger K.; Cox, Arthur N.
1991-01-01
Procedures for calculating standard solar models with the usual simplifying approximations of spherical symmetry, no mixing except in the surface convection zone, no mass loss or gain during the solar lifetime, and no separation of elements by diffusion are described. The standard network of nuclear reactions among the light elements is discussed including rates, energy production and abundance changes. Several of the equation of state and opacity formulations required for the basic equations of mass, momentum and energy conservation are presented. The usual mixing-length convection theory is used for these results. Numerical procedures for calculating the solar evolution, and current evolution and oscillation frequency results for the present sun by some recent authors are given.
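As a cartoon of the structure calculation described above, the sketch below integrates the Lane-Emden equation for an n = 3 polytrope, a classic first approximation to a radiative stellar interior. Real solar-model codes replace this with the full equations of mass, momentum, and energy conservation coupled to nuclear networks, opacities, and an equation of state; the simple integrator and step size here are illustrative choices:

```python
# Lane-Emden equation: theta'' + (2/xi) theta' + theta^n = 0, theta(0)=1,
# theta'(0)=0. Its first zero xi_1 fixes the stellar radius in polytropic units.
import math

def lane_emden_first_zero(n=3.0, h=1e-3):
    """Integrate outward from the centre with semi-implicit Euler until theta
    crosses zero; returns the first zero xi_1 (known value for n=3: ~6.897)."""
    # Series expansion theta = 1 - xi^2/6 + ... provides a start away from xi=0.
    xi, theta, dtheta = h, 1.0 - h * h / 6.0, -h / 3.0
    while theta > 0.0:
        d2 = -theta**n - 2.0 / xi * dtheta
        dtheta += h * d2
        theta += h * dtheta
        xi += h
    return xi

xi1 = lane_emden_first_zero()
```

The same outward-integration pattern, with far richer physics per shell, underlies the evolution codes that the abstract surveys.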
42 CFR 403.210 - NAIC model standards.
Code of Federal Regulations, 2013 CFR
2013-10-01
(a) NAIC model standards means the National Association of Insurance Commissioners (NAIC) "Model Regulation to Implement the Individual Accident and Sickness Insurance Minimum Standards Act" (as...
Naturalness in the Standard Model and beyond
NASA Astrophysics Data System (ADS)
Papaioannou, Anastasios Yiannaki
After an introduction to the Standard Model of particle physics and the unresolved question of naturalness posed by its treatment of electroweak symmetry breaking, we consider several different theoretical approaches that attempt to answer this question. First, we present work in which we consider the possibility that the Higgs boson, the long-sought hypothetical particle intimately associated with electroweak symmetry breaking, has a much larger mass than is usually assumed. Despite the absence of direct experimental evidence for a light Higgs boson (m ~ O(100 GeV)), and although precision electroweak data are consistent with a light Higgs, we propose a heavier (m ~ O(500 GeV)), and thus more natural, Higgs boson. This heavy Higgs can be made consistent with the precision electroweak data if we also extend the Standard Model with new fermionic states near the weak scale. These new states, in addition to bringing the heavy Higgs boson in line with the precision data, also serve as a candidate for the elusive dark matter that pervades the universe. From there we go on to consider the problem of naturalness from the perspective of supersymmetry, one of the most popular candidates for physics beyond the Standard Model. In particular, the Next-to-Minimal Supersymmetric Standard Model (NMSSM) has found favor for its ability to solve the naturalness problem posed by the Standard Model, its hints at unification of the strong, weak, and electromagnetic interactions at high energies, and its ability to provide supersymmetric particles as dark matter candidates. The NMSSM, however, requires rather large superpartner masses in order to accommodate a Higgs boson heavier than current experimental bounds while still maintaining gauge unification at high energies. We explore the possibility of new supersymmetric states at intermediate energies between the weak scale and the unification scale, which preserve gauge unification and allow a heavier
Phenomenology of the utilitarian supersymmetric standard model
Fraser, Sean; Kownacki, Corey; Ma, Ernest; ...
2016-06-11
We study the 2010 specific version of the 2002 proposed U(1)_X extension of the supersymmetric standard model, which has no μ term and conserves baryon number and lepton number separately and automatically. We consider in detail the scalar sector as well as the extra Z_X gauge boson, and their interactions with the necessary extra color-triplet particles of this model, which behave as leptoquarks. We show how the diphoton excess at 750 GeV, recently observed at the LHC, may be explained within this context. We identify a new fermion dark-matter candidate and discuss its properties. An important byproduct of this study is the discovery of relaxed supersymmetric constraints on the Higgs boson's mass of 125 GeV.
RK and RK* beyond the standard model
NASA Astrophysics Data System (ADS)
Hiller, Gudrun; Nišandžić, Ivan
2017-08-01
Measurements by the LHCb Collaboration of R_K*, the ratio of B → K*μμ to B → K*ee branching fractions, strengthen the hints from previous studies with pseudoscalar kaons, R_K, for the breakdown of lepton universality, and therefore of the Standard Model (SM), to ~3.5σ. Complementarity between R_K and R_K* allows us to pin down the Dirac structure of the new contributions to be predominantly SM-like chiral, with a possible admixture of chirality-flipped contributions of up to O(few × 10%). Scalar and vector leptoquark representations (S_3, V_1, V_3), plus possible (S̃_2, V_2) admixture, can explain R_K and R_K* via tree-level exchange. Flavor models naturally predict leptoquark masses not exceeding a few TeV, with couplings to third-generation quarks at O(0.1), implying that this scenario can be directly tested at the LHC.
Sphaleron rate in the minimal standard model.
D'Onofrio, Michela; Rummukainen, Kari; Tranberg, Anders
2014-10-03
We use large-scale lattice simulations to compute the rate of baryon number violating processes (the sphaleron rate), the Higgs field expectation value, and the critical temperature in the standard model across the electroweak phase transition temperature. While there is no true phase transition between the high-temperature symmetric phase and the low-temperature broken phase, the crossover is sharp and located at temperature T_c = (159.5 ± 1.5) GeV. The sphaleron rate in the symmetric phase (T > T_c) is Γ/T⁴ = (18 ± 3)α_W⁵, and in the broken phase, in the physically interesting temperature range 130 GeV < T < T_c, it can be parametrized as log(Γ/T⁴) = (0.83 ± 0.01) T/GeV - (147.7 ± 1.9). The freeze-out temperature in the early Universe, where the Hubble rate wins over the baryon number violation rate, is T* = (131.7 ± 2.3) GeV. These values, beyond being intrinsic properties of the standard model, are relevant for, e.g., low-scale leptogenesis scenarios.
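The two quoted parametrizations can be checked against each other numerically. The sketch below, with an assumed value for the weak coupling α_W (not given in the abstract), evaluates the broken-phase fit at the crossover and compares it with the symmetric-phase rate; the steep exponential suppression toward the freeze-out temperature is also visible:

```python
# Numerical reading of the quoted sphaleron-rate fits (central values only).
import math

alpha_W = 0.0334   # weak coupling g^2/(4*pi); approximate value (assumption)
T_c = 159.5        # crossover temperature in GeV (from the abstract)

rate_symmetric = 18.0 * alpha_W**5  # Gamma/T^4 in the symmetric phase

def rate_broken(T):
    """Broken-phase fit Gamma/T^4 = exp(0.83*T/GeV - 147.7),
    quoted as valid for roughly 130 GeV < T < T_c."""
    return math.exp(0.83 * T - 147.7)

# The two expressions agree to within an order of magnitude at the crossover,
# and the rate falls by many orders of magnitude by the freeze-out temperature.
ratio_at_Tc = rate_symmetric / rate_broken(T_c)
suppression = rate_broken(131.7) / rate_broken(T_c)
```

That rough matching at T_c is a consistency check on the fits, not a new result; the uncertainties quoted in the abstract are ignored here.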
Scenarios of physics beyond the standard model
NASA Astrophysics Data System (ADS)
Fok, Ricky
This dissertation discusses three topics on scenarios beyond the Standard Model. Topic one is the effect of a fourth generation of quarks and leptons on electroweak baryogenesis in the early universe. The Standard Model is incapable of electroweak baryogenesis due to an insufficiently strong electroweak phase transition (EWPT) as well as insufficient CP violation. We show that the presence of heavy fourth-generation fermions solves the first problem but requires additional bosons to be included to stabilize the electroweak vacuum. Introducing supersymmetric partners of the heavy fermions, we find that the EWPT can be made strong enough and new sources of CP violation are present. Topic two relates to the lepton flavor problem in supersymmetry. In the Minimal Supersymmetric Standard Model (MSSM), the off-diagonal elements in the slepton mass matrix must be suppressed at the 10^-3 level to avoid experimental bounds from lepton flavor changing processes. This dissertation shows that an enlarged R-parity can alleviate the lepton flavor problem. An analysis of all sensitive parameters was performed in the mass range below 1 TeV, and we find that slepton maximal mixing is possible without violating bounds from the lepton flavor changing processes μ → eγ, μ → e conversion, and μ → 3e. Topic three is the collider phenomenology of quirky dark matter. In this model, quirks are particles that are gauged under the electroweak group, as well as a "dark" color SU(2) group. The hadronization scale of this color group is well below the quirk masses. As a result, the dark color strings never break. Quirk and anti-quirk pairs can be produced at the LHC. Once produced, they immediately form a bound state of high angular momentum. The quirk pair rapidly sheds angular momentum by emitting soft radiation before annihilating into observable signals. This dissertation presents the decay branching ratios of quirkonia where quirks obtain their masses through electroweak
The Standard Model Algebra - a summary
NASA Astrophysics Data System (ADS)
Cristinel Stoica, Ovidiu
2017-08-01
A generation of leptons and quarks and the gauge symmetries of the Standard Model can be obtained from the Clifford algebra ℂℓ(6). An instance of ℂℓ(6) is implicitly generated by the Dirac algebra combined with the electroweak symmetry, while the color symmetry gives another instance of ℂℓ(6) with a Witt decomposition. The minimal mathematical model proposed here results from identifying the two instances of ℂℓ(6). The left ideal decomposition generated by the Witt decomposition represents the leptons and quarks and their antiparticles. The SU(3)_c and U(1)_em symmetries of the SM are the symmetries of this ideal decomposition. The patterns of electric charges, colors, chirality, weak isospins, and hypercharges follow from this, without predicting additional particles or forces, or proton decay. The electroweak symmetry is present in its broken form, due to the geometry. The predicted Weinberg angle is given by sin²θ_W = 0.25. The model shares common features with previously known models, particularly those of Chisholm and Farwell (1996), Trayling and Baylis (2004), and Furey (2016).
[Standardization and modeling of surgical processes].
Strauss, G; Schmitz, P
2016-12-01
Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents the state of the art, as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. Building on continuous developments in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization of operating procedures, which assistance systems can use as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, detrimental consequences, such as loss of skills and overreliance on technology, must be avoided through adapted training concepts.
Standard model group: Survival of the fittest
NASA Astrophysics Data System (ADS)
Nielsen, H. B.; Brene, N.
1983-09-01
The essential content of this paper is related to random dynamics. We speculate that the world seen through a sub-Planck-scale microscope has a lattice structure and that the dynamics on this lattice is almost completely random, except for the requirement that the random (plaquette) action is invariant under some "world (gauge) group". We see that the randomness may lead to spontaneous symmetry breakdown in the vacuum (spontaneous collapse) without explicit appeal to any scalar field associated with the usual Higgs mechanism. We further argue that the subgroup which survives as the end product of a possible chain of collapses is likely to have certain properties; the most important is that it has a topologically connected center. The standard group, i.e. the group of the gauge theory which combines the Salam-Weinberg model with QCD, has this property.
Beyond the standard model of particle physics.
Virdee, T S
2016-08-28
The Large Hadron Collider (LHC) at CERN and its experiments were conceived to tackle open questions in particle physics. The mechanism of the generation of mass of fundamental particles has been elucidated with the discovery of the Higgs boson. It is clear that the standard model is not the final theory. The open questions still awaiting clues or answers, from the LHC and other experiments, include: What is the composition of dark matter and of dark energy? Why is there more matter than anti-matter? Are there more space dimensions than the familiar three? What is the path to the unification of all the fundamental forces? This talk will discuss the status of, and prospects for, the search for new particles, symmetries and forces in order to address the open questions. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. © 2016 The Author(s).
Sequestering the standard model vacuum energy.
Kaloper, Nemanja; Padilla, Antonio
2014-03-07
We propose a very simple reformulation of general relativity, which completely sequesters from gravity all of the vacuum energy from a matter sector, including all loop corrections, and renders all contributions from phase transitions automatically small. The idea is to make the dimensional parameters in the matter sector functionals of the 4-volume element of the Universe. For them to be nonzero, the Universe should be finite in spacetime. If this matter is the standard model of particle physics, our mechanism prevents any of its vacuum energy, classical or quantum, from sourcing the curvature of the Universe. The mechanism is consistent with the large hierarchy between the Planck scale, electroweak scale, and curvature scale, and with early Universe cosmology, including inflation. Consequences of our proposal are that the vacuum curvature of an old and large universe is not zero, but very small, that w_DE ≃ -1 is a transient, and that the Universe will collapse in the future.
Outstanding questions: physics beyond the Standard Model.
Ellis, John
2012-02-28
The Standard Model of particle physics agrees very well with experiment, but many important questions remain unanswered, among them are the following. What is the origin of particle masses and are they due to a Higgs boson? How does one understand the number of species of matter particles and how do they mix? What is the origin of the difference between matter and antimatter, and is it related to the origin of the matter in the Universe? What is the nature of the astrophysical dark matter? How does one unify the fundamental interactions? How does one quantize gravity? In this article, I introduce these questions and discuss how they may be addressed by experiments at the Large Hadron Collider, with particular attention to the search for the Higgs boson and supersymmetry.
Wisconsin's Model Academic Standards for Agricultural Education. Bulletin No. 9003.
ERIC Educational Resources Information Center
Fortier, John D.; Albrecht, Bryan D.; Grady, Susan M.; Gagnon, Dean P.; Wendt, Sharon, W.
These model academic standards for agricultural education in Wisconsin represent the work of a task force of educators, parents, and business people with input from the public. The introductory section of this bulletin defines the academic standards and discusses developing the standards, using the standards, relating the standards to all…
Integrated Personal Protective Equipment Standards Support Model
2008-04-01
traditional SCBA showed that the distribution of the weight is important as well. Twelve firefighters performed simulated fire-fighting and rescue exercises...respiratory equipment standards and five National Fire Protection Association (NFPA) protective suit, clothing, and respirator standards. The...respirators. The clothing standards were for protective ensembles for urban search and rescue operations, open-circuit SCBA for fire and emergency
Standard surface-reflectance model and illuminant estimation
NASA Technical Reports Server (NTRS)
Tominaga, Shoji; Wandell, Brian A.
1989-01-01
A vector analysis technique was adopted to test the standard reflectance model. A computational model was developed to determine the components of the observed spectra and an estimate of the illuminant was obtained without using a reference white standard. The accuracy of the standard model is evaluated.
Standard Model thermodynamics across the electroweak crossover
Laine, M.; Meyer, M., E-mail: laine@itp.unibe.ch, E-mail: meyer@itp.unibe.ch
Even though the Standard Model with a Higgs mass m_H = 125 GeV possesses no bulk phase transition, its thermodynamics still experiences a 'soft point' at temperatures around T = 160 GeV, with a deviation from ideal gas thermodynamics. Such a deviation may have an effect on precision computations of weakly interacting dark matter relic abundances if their mass is in the few TeV range, or on leptogenesis scenarios operating in this temperature range. By making use of results from lattice simulations based on a dimensionally reduced effective field theory, we estimate the relevant thermodynamic functions across the crossover. The results are tabulated in a numerical form permitting their insertion as a background equation of state into cosmological particle production/decoupling codes. We find that Higgs dynamics induces a non-trivial 'structure' visible e.g. in the heat capacity, but that in general the largest radiative corrections originate from QCD effects, reducing the energy density by a couple of percent from the free value even at T > 160 GeV.
Geometrical basis for the Standard Model
NASA Astrophysics Data System (ADS)
Potter, Franklin
1994-02-01
The robust character of the Standard Model is confirmed. Examination of its geometrical basis in three equivalent internal symmetry spaces (the unitary plane C², the quaternion space Q, and the real space R⁴), as well as the real space R³, uncovers mathematical properties that predict the physical properties of leptons and quarks. The finite rotational subgroups of the gauge group SU(2)_L × U(1)_Y generate exactly three lepton families and four quark families and reveal how quarks and leptons are related. Among the physical properties explained are the mass ratios of the six leptons and eight quarks, the origin of the left-handed preference by the weak interaction, the geometrical source of color symmetry, and the zero neutrino masses. The (u, d) and (c, s) quark families team together to satisfy the triangle anomaly cancellation with the electron family, while the other families pair one-to-one for cancellation. The spontaneously broken symmetry is discrete and needs no Higgs mechanism. Predictions include all massless neutrinos, the top quark at 160 GeV/c², the b' quark at 80 GeV/c², and the t' quark at 2600 GeV/c².
Augmented standard model and the simplest scenario
NASA Astrophysics Data System (ADS)
Wu, Tai Tsun; Wu, Sau Lan
2015-11-01
The experimental discovery of the Higgs particle in 2012 by the ATLAS Collaboration and the CMS Collaboration at CERN ushers in a new era of particle physics. On the basis of these data, scalar quarks and scalar leptons are added to each generation of quarks and leptons. The resulting augmented standard model has fermion-boson symmetry for each of the three generations, but only one Higgs doublet giving masses to all the elementary particles. A specific special case, the simplest scenario, is studied in detail. In this case, there are twenty-six quadratic divergences, and all these divergences are cancelled provided that one single relation between the masses is satisfied. This mass relation contains a great deal of information; in particular, it determines the masses of all the right-handed scalar quarks and scalar leptons, while giving relations for the masses of the left-handed ones. An alternative procedure is also given with a different starting point and less reliance on the experimental data. The result is of course the same.
Ellipsoidal geometry in asteroid thermal models - The standard radiometric model
NASA Technical Reports Server (NTRS)
Brown, R. H.
1985-01-01
The major consequences of ellipsoidal geometry in an otherwise standard radiometric model for asteroids are explored. It is shown that for small deviations from spherical shape, a spherical model of the same projected area gives a reasonable approximation to the thermal flux from an ellipsoidal body. It is suggested that large departures from spherical shape require that some correction be made for geometry. Systematic differences in the radii of asteroids derived radiometrically at 10 and 20 microns may result partly from nonspherical geometry. It is also suggested that extrapolations of the rotational variation of thermal flux from a nonspherical body based solely on the change in cross-sectional area are in error.
CP Asymmetries in B0 Decays Beyond the Standard Model
NASA Astrophysics Data System (ADS)
Dib, Claudio O.; London, David; Nir, Yosef
Of the many ingredients of the Standard Model that are relevant to the analysis of CP asymmetries in B0 decays, some are likely to hold even beyond the Standard Model while others are sensitive to new physics. Consequently, certain predictions are maintained while others may show dramatic deviations from the Standard Model. Many classes of models may show clear signatures when the asymmetries are measured: four quark generations, Z-mediated flavor-changing neutral currents, supersymmetry and “real superweak” models. On the other hand, models of left-right symmetry and multi-Higgs sectors with natural flavor conservation are unlikely to modify the Standard Model predictions.
A Five Stage Conceptual Model for Information Technology Standards.
ERIC Educational Resources Information Center
Cargill, Carl F.
The advent of anticipatory and boundary layer standards used in information technology standardization has created a need for a new base level theory that can be used to anticipate the problems that will be encountered in standards planning, creation, and implementation. To meet this need, a five-level model of standards has been developed. The…
Models of Teaching: Connecting Student Learning with Standards
ERIC Educational Resources Information Center
Dell'Olio, Jeanine M.; Donk, Tony
2007-01-01
"Models of Teaching: Connecting Student Learning with Standards" features classic and contemporary models of teaching appropriate to elementary and secondary settings. Authors Jeanine M. Dell'Olio and Tony Donk use detailed case studies to discuss 10 models of teaching and demonstrate how the models can incorporate state content standards and…
Tool for physics beyond the standard model
NASA Astrophysics Data System (ADS)
Newby, Christopher A.
The standard model (SM) of particle physics is a well studied theory, but there are hints that the SM is not the final story. What the full picture is, no one knows, but this thesis looks into three methods useful for exploring a few of the possibilities. To begin I present a paper by Spencer Chang, Nirmal Raj, Chaowaroj Wanotayaroj, and me, that studies the Higgs boson. The scalar particle first seen in 2012 may be the vanilla SM version, but there is some evidence that its couplings are different than predicted. By means of increasing the Higgs' coupling to vector bosons and fermions, we can be more consistent with the data. Next, in a paper by Spencer Chang, Gabriel Barello, and me, we elaborate on a tool created to study dark matter (DM) direct detection. The original work by Anand et al. focused on elastic dark matter, whereas we extended this work to include the inelastic case, where different DM mass states enter and leave the collision. We also examine several direct detection experiments with our new framework to see if DAMA's modulation can be explained while avoiding the strong constraints imposed by the other experiments. We find that there are several operators that can do this. Finally, in a paper by Spencer Chang, Gabriel Barello, and me, we study an interesting phenomenon known as kinetic mixing, where two gauge bosons can share interactions with particles even though these particles aren't charged under both gauge groups. This, in and of itself, is not new, but we discuss a different method of obtaining this mixing where, instead of mixing between two Abelian groups, one of the groups is non-Abelian. Using this we then see that there is an inherent mass scale in the mixing strength, something that is absent in the Abelian-Abelian case. Furthermore, if the non-Abelian symmetry is the SU(2)_L of the SM, then the mass scale of the physics responsible for the mixing is about 1 TeV, right around the sweet spot for detection at the LHC. This dissertation
Primordial lithium and the standard model(s)
NASA Technical Reports Server (NTRS)
Deliyannis, Constantine P.; Demarque, Pierre; Kawaler, Steven D.; Romanelli, Paul; Krauss, Lawrence M.
1989-01-01
The results of new theoretical work on surface Li-7 and Li-6 evolution in the oldest halo stars are presented, along with a new and refined analysis of the predicted primordial Li abundance resulting from big-bang nucleosynthesis. This makes it possible to determine the constraints which can be imposed on cosmology using primordial Li and both standard big-bang and stellar-evolution models. This leads to limits on the baryon density today of 0.0044-0.025 (where the Hubble constant is 100h km s⁻¹ Mpc⁻¹) and imposes limitations on alternative nucleosynthesis scenarios.
Mathematics Teacher TPACK Standards and Development Model
ERIC Educational Resources Information Center
Niess, Margaret L.; Ronau, Robert N.; Shafer, Kathryn G.; Driskell, Shannon O.; Harper, Suzanne R.; Johnston, Christopher; Browning, Christine; Ozgun-Koca, S. Asli; Kersaint, Gladis
2009-01-01
What knowledge is needed to teach mathematics with digital technologies? The overarching construct, called technology, pedagogy, and content knowledge (TPACK), has been proposed as the interconnection and intersection of technology, pedagogy, and content knowledge. Mathematics Teacher TPACK Standards offer guidelines for thinking about this…
Wisconsin Model Academic Standards for Social Studies.
ERIC Educational Resources Information Center
Wisconsin State Dept. of Public Instruction, Madison.
Students at all grade levels in Wisconsin are required to learn about the principles and ideals upon which the United States is founded, and understand the world in which they live. Students at all levels should develop skills and understandings in all five strands found in the Wisconsin standards for social studies. These skills and…
ERIC Educational Resources Information Center
Lee, Jaekyung; Liu, Xiaoyan; Amo, Laura Casey; Wang, Weichun Leilani
2014-01-01
Drawing on national and state assessment datasets in reading and math, this study tested "external" versus "internal" standards-based education models. The goal was to understand whether and how student performance standards work in multilayered school systems under No Child Left Behind Act of 2001 (NCLB). Under the…
Standard solar model. II - g-modes
NASA Technical Reports Server (NTRS)
Guenther, D. B.; Demarque, P.; Pinsonneault, M. H.; Kim, Y.-C.
1992-01-01
The paper presents g-mode oscillations for a set of modern solar models. Each solar model is based on a single modification or improvement to the physics of a reference solar model. Improvements were made to the nuclear reaction rates, the equation of state, the opacities, and the treatment of the atmosphere. The error in the predicted g-mode periods associated with the uncertainties in the model physics is estimated, and the specific sensitivities of the g-mode periods and their period spacings to the different model structures are described. In addition, these models are compared to a sample of published observations. A remarkably good agreement is found between the 'best' solar model and the observations of Hill and Gu (1990).
Standardized training in nurse model travel clinics.
Sofarelli, Theresa A; Ricks, Jane H; Anand, Rahul; Hale, Devon C
2011-01-01
International travel plays a significant role in the emergence and redistribution of major human diseases. The importance of travel medicine clinics for preventing morbidity and mortality has been increasingly appreciated, although few studies have thus far examined the management and staff training strategies that result in successful travel-clinic operations. Here, we describe an example of travel-clinic operation and management coordinated through the University of Utah School of Medicine, Division of Infectious Diseases. This program, which involves eight separate clinics distributed statewide, functions both to provide patient consult and care services and to provide medical provider training and continuing medical education (CME). Initial training, the use of standardized forms and protocols, routine chart reviews, and monthly continuing education meetings are the distinguishing attributes of this program. An Infectious Disease team consisting of one medical doctor (MD) and a physician assistant (PA) acts as consultants to the travel nurses who comprise the majority of clinic staff. Eight clinics distributed throughout the state of Utah serve approximately 6,000 travelers a year. Pre-travel medical services are provided by 11 nurses, including 10 registered nurses (RNs) and 1 licensed practical nurse (LPN). This trained nursing staff receives continuing travel medical education and participates in the training of new providers. All nurses have completed a full training program, and 7 of the 11 (64%) clinic nursing staff serve more than 10 patients a week. Quality assurance measures show that approximately 0.5% of charts reviewed contain a vaccine or prescription error requiring patient notification for correction. Using an initial training program, standardized patient intake forms, vaccine and prescription protocols, preprinted prescriptions, and regular CME, highly trained nurses at travel clinics are able to provide standardized pre-travel care to travelers.
The Earth's magnetosphere modeling and ISO standard
NASA Astrophysics Data System (ADS)
Alexeev, I.
The empirical model developed by Tsyganenko (T96) is constructed by minimizing the rms deviation from the large magnetospheric data base (Fairfield et al., 1994), which contains Earth's magnetospheric magnetic field measurements accumulated during many years. The applicability of the T96 model is limited mainly to quiet conditions in the solar wind along the Earth's orbit. But contrary to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. A reliable representation of the magnetic field is crucial in the framework of radiation belt modelling, especially for disturbed conditions. The last version of the Tsyganenko model has been constructed for a geomagnetic storm time interval. This version is based on a more accurate and physically consistent approach in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of the solar wind and IMF parameters. The same method has been used previously for paraboloid model construction. This method is based on a priori information about the structure of the global magnetospheric current systems. Each current system is included as a separate block (module) in the magnetospheric model. As shown by spacecraft magnetometer data, there are three current systems which are the main contributors to the external magnetospheric magnetic field: the magnetopause currents, the ring current, and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation.
NASREN: Standard reference model for telerobot control
NASA Technical Reports Server (NTRS)
Albus, J. S.; Lumia, R.; Mccain, H.
1987-01-01
A hierarchical architecture is described which supports space station telerobots in a variety of modes. The system is divided into three hierarchies: task decomposition, world model, and sensory processing. Goals at each level of the task decomposition hierarchy are divided both spatially and temporally into simpler commands for the next lower level. This decomposition is repeated until, at the lowest level, the drive signals to the robot actuators are generated. To accomplish its goals, task decomposition modules must often use information stored in the world model. The purpose of the sensory system is to update the world model as rapidly as possible to keep the model in registration with the physical world. The architecture of the entire control system hierarchy is described, along with how it can be applied to space telerobot applications.
Particle Physics Primer: Explaining the Standard Model of Matter.
ERIC Educational Resources Information Center
Vondracek, Mark
2002-01-01
Describes the Standard Model, a basic model of the universe that describes the electromagnetic force, the weak nuclear force responsible for radioactivity, and the strong nuclear force responsible for holding particles within the nucleus together. (YDS)
Creating Better School-Age Care Jobs: Model Work Standards.
ERIC Educational Resources Information Center
Haack, Peggy
Built on the premise that good school-age care jobs are the cornerstone of high-quality services for school-age youth and their families, this guide presents model work standards for school-age care providers. The guide begins with a description of the strengths and challenges of the school-age care profession. The model work standards are…
The Standard Model from LHC to future colliders.
Forte, S; Nisati, A; Passarino, G; Tenchini, R; Calame, C M Carloni; Chiesa, M; Cobal, M; Corcella, G; Degrassi, G; Ferrera, G; Magnea, L; Maltoni, F; Montagna, G; Nason, P; Nicrosini, O; Oleari, C; Piccinini, F; Riva, F; Vicini, A
This review summarizes the results of the activities which took place in 2014 within the Standard Model Working Group of the "What Next" Workshop organized by INFN, Italy. We present a framework, general questions, and some indications of possible answers on the main issues for Standard Model physics in the LHC era and in view of possible future accelerators.
Big bang nucleosynthesis - The standard model and alternatives
NASA Technical Reports Server (NTRS)
Schramm, David N.
1991-01-01
The standard homogeneous-isotropic calculation of the big bang cosmological model is reviewed, and alternate models are discussed. The standard model is shown to agree with the available light-element abundances of He-4, H-2, He-3, and Li-7. Improved observational data from recent LEP collider and SLC results are discussed. The data agree with the standard model in terms of the number of neutrinos and provide improved information regarding neutron lifetimes. Alternate models are reviewed which describe different scenarios for decaying matter or quark-hadron-induced inhomogeneities. The baryonic density relative to the critical density in the alternate models is similar to that of the standard model when they are made to fit the abundances. This reinforces the conclusion that the baryonic density relative to the critical density is about 0.06, and also reinforces the need for both nonbaryonic dark matter and dark baryonic matter.
ERIC Educational Resources Information Center
Levy, Roy; Xu, Yuning; Yel, Nedim; Svetina, Dubravka
2015-01-01
The standardized generalized dimensionality discrepancy measure and the standardized model-based covariance are introduced as tools to critique dimensionality assumptions in multidimensional item response models. These tools are grounded in a covariance theory perspective and associated connections between dimensionality and local independence.…
Sporulation in Bacteria: Beyond the Standard Model.
Hutchison, Elizabeth A; Miller, David A; Angert, Esther R
2014-10-01
Endospore formation follows a complex, highly regulated developmental pathway that occurs in a broad range of Firmicutes. Although Bacillus subtilis has served as a powerful model system to study the morphological, biochemical, and genetic determinants of sporulation, fundamental aspects of the program remain mysterious for other genera. For example, it is entirely unknown how most lineages within the Firmicutes regulate entry into sporulation. Additionally, little is known about how the sporulation pathway has evolved novel spore forms and reproductive schemes. Here, we describe endospore and internal offspring development in diverse Firmicutes and outline progress in characterizing these programs. Moreover, comparative genomics studies are identifying highly conserved sporulation genes, and predictions of sporulation potential in new isolates and uncultured bacteria can be made from these data. One surprising outcome of these comparative studies is that core regulatory and some structural aspects of the program appear to be universally conserved. This suggests that a robust and sophisticated developmental framework was already in place in the last common ancestor of all extant Firmicutes that produce internal offspring or endospores. The study of sporulation in model systems beyond B. subtilis will continue to provide key information on the flexibility of the program and provide insights into how changes in this developmental course may confer advantages to cells in diverse environments.
The MP (Materialization Pattern) Model for Representing Math Educational Standards
NASA Astrophysics Data System (ADS)
Choi, Namyoun; Song, Il-Yeol; An, Yuan
Representing natural languages with UML has been an important research issue for various reasons. Little work has been done for modeling imperative mood sentences which are the sentence structure of math educational standard statements. In this paper, we propose the MP (Materialization Pattern) model that captures the semantics of English sentences used in math educational standards. The MP model is based on the Reed-Kellogg sentence diagrams and creates MP schemas with the UML notation. The MP model explicitly represents the semantics of the sentences by extracting math concepts and the cognitive process of math concepts from math educational standard statements, and simplifies modeling. This MP model is also developed to be used for aligning math educational standard statements via schema matching.
Reference Architecture Model Enabling Standards Interoperability.
Blobel, Bernd
2017-01-01
Advanced health and social services paradigms are supported by a comprehensive set of domains managed by different scientific disciplines. Interoperability has to evolve beyond information and communication technology (ICT) concerns, including the real world business domains and their processes, but also the individual context of all actors involved. So, the system must properly reflect the environment in front and around the computer as essential and even defining part of the health system. This paper introduces an ICT-independent system-theoretical, ontology-driven reference architecture model allowing the representation and harmonization of all domains involved including the transformation into an appropriate ICT design and implementation. The entire process is completely formalized and can therefore be fully automated.
Standardized verification of fuel cycle modeling
Feng, B.; Dixon, B.; Sunny, E.; ...
2016-04-05
A nuclear fuel cycle systems modeling and code-to-code comparison effort was coordinated across multiple national laboratories to verify the tools needed to perform fuel cycle analyses of the transition from a once-through nuclear fuel cycle to a sustainable potential future fuel cycle. For this verification study, a simplified example transition scenario was developed to serve as a test case for the four systems codes involved (DYMOND, VISION, ORION, and MARKAL), each used by a different laboratory participant. In addition, all participants produced spreadsheet solutions for the test case to check all the mass flows and reactor/facility profiles on a year-by-year basis throughout the simulation period. The test case specifications describe a transition from the current US fleet of light water reactors to a future fleet of sodium-cooled fast reactors that continuously recycle transuranic elements as fuel. After several initial coordinated modeling and calculation attempts, it was revealed that most of the differences in code results were not due to different code algorithms or calculation approaches, but due to different interpretations of the input specifications among the analysts. Therefore, the specifications for the test case itself were iteratively updated to remove ambiguity and to help calibrate interpretations. In addition, a few corrections and modifications were made to the codes as well, which led to excellent agreement between all codes and spreadsheets for this test case. Although no fuel cycle transition analysis codes matched the spreadsheet results exactly, all remaining differences in the results were due to fundamental differences in code structure and/or were thoroughly explained. As a result, the specifications and example results are provided so that they can be used to verify additional codes in the future for such fuel cycle transition scenarios.
Simulation and Modeling Capability for Standard Modular Hydropower Technology
Stewart, Kevin M.; Smith, Brennan T.; Witt, Adam M.
Grounded in the stakeholder-validated framework established in Oak Ridge National Laboratory’s SMH Exemplary Design Envelope Specification, this report on Simulation and Modeling Capability for Standard Modular Hydropower (SMH) Technology provides insight into the concepts, use cases, needs, gaps, and challenges associated with modeling and simulating SMH technologies. The SMH concept envisions a network of generation, passage, and foundation modules that achieve environmentally compatible, cost-optimized hydropower using standardization and modularity. The development of standardized modeling approaches and simulation techniques for SMH (as described in this report) will pave the way for reliable, cost-effective methods for technology evaluation, optimization, and verification.
Air Quality Modeling | Air Quality Planning & Standards | US ...
2016-06-08
The basic mission of the Office of Air Quality Planning and Standards is to preserve and improve the quality of our nation's air. One facet of accomplishing this goal requires that new and existing air pollution sources be modeled for compliance with the National Ambient Air Quality Standards (NAAQS).
Distributed geospatial model sharing based on open interoperability standards
Feng, Min; Liu, Shuguang; Euliss, Ned H.; Fang, Yin
2009-01-01
Numerous geospatial computational models have been developed based on sound principles and published in journals or presented at conferences. However, modelers have made few advances in the development of computable modules that facilitate sharing during model development or utilization. Constraints hampering development of model sharing technology include limitations on computing, storage, and connectivity; traditional stand-alone and closed network systems cannot fully support sharing and integrating geospatial models. To address this need, we have identified methods for sharing geospatial computational models using Service Oriented Architecture (SOA) techniques and open geospatial standards. The service-oriented model sharing service is accessible using any tools or systems compliant with open geospatial standards, making it possible to utilize vast scientific resources available from around the world to solve highly sophisticated application problems. The methods also allow model services to be empowered by diverse computational devices and technologies, such as portable devices and GRID computing infrastructures. Based on the generic and abstract operations and data structures required by the Web Processing Service (WPS) standard, we developed an interactive interface for model sharing to help reduce interoperability problems for model use. Geospatial computational models are shared as model services, where the computational processes provided by models can be accessed through tools and systems compliant with WPS. We developed a platform to help modelers publish individual models in a simplified and efficient way. Finally, we illustrate our technique using wetland hydrological models we developed for the prairie pothole region of North America.
The standard data model approach to patient record transfer.
Canfield, K.; Silva, M.; Petrucci, K.
1994-01-01
This paper develops an approach to electronic data exchange of patient records from Ambulatory Encounter Systems (AESs). This approach assumes that the AES is based upon a standard data model. The data modeling standard used here is IDEF1X for Entity/Relationship (E/R) modeling. Each site that uses a relational database implementation of this standard data model (or a subset of it) can exchange very detailed patient data with other such sites using industry standard tools and without excessive programming efforts. This design is detailed below for a demonstration project between the research-oriented geriatric clinic at the Baltimore Veterans Affairs Medical Center (BVAMC) and the Laboratory for Healthcare Informatics (LHI) at the University of Maryland. PMID:7949973
A Sandbox Environment for the Community Sensor Model Standard
NASA Astrophysics Data System (ADS)
Hare, T. M.; Laura, J. R.; Humpreys, I. R.; Wilson, T. J.; Hahn, M. A.; Shepherd, M. R.; Sides, S. C.
2017-06-01
Here we present ongoing work Astrogeology is undertaking to provide a programming sandbox environment for the Community Sensor Model standard. We define a sandbox as a testing environment that allows programmers to experiment.
Holomorphy without supersymmetry in the Standard Model Effective Field Theory
Alonso, Rodrigo; Jenkins, Elizabeth E.; Manohar, Aneesh V.
2014-12-12
The anomalous dimensions of dimension-six operators in the Standard Model Effective Field Theory (SMEFT) respect holomorphy to a large extent. Holomorphy conditions are reminiscent of supersymmetry, even though the SMEFT is not a supersymmetric theory.
NASA Standard for Models and Simulations: Philosophy and Requirements Overview
NASA Technical Reports Server (NTRS)
Blattnig, Steve R.; Luckring, James M.; Morrison, Joseph H.; Sylvester, Andre J.; Tripathi, Ram K.; Zang, Thomas A.
2013-01-01
Following the Columbia Accident Investigation Board report, the NASA Administrator chartered an executive team (known as the Diaz Team) to identify those CAIB report elements with NASA-wide applicability and to develop corrective measures to address each element. One such measure was the development of a standard for the development, documentation, and operation of models and simulations. This report describes the philosophy and requirements overview of the resulting NASA Standard for Models and Simulations.
Standard Model of Particle Physics--a health physics perspective.
Bevelacqua, J J
2010-11-01
The Standard Model of Particle Physics is reviewed with an emphasis on its relationship to the physics supporting the health physics profession. Concepts important to health physics are emphasized and specific applications are presented. The capability of the Standard Model to provide health physics relevant information is illustrated with application of conservation laws to neutron and muon decay and in the calculation of the neutron mean lifetime.
A standard library for modeling satellite orbits on a microcomputer
NASA Astrophysics Data System (ADS)
Beutel, Kenneth L.
1988-03-01
Introductory students of astrodynamics and the space environment are required to have a fundamental understanding of the kinematic behavior of satellite orbits. This thesis develops a standard library that contains the basic formulas for modeling earth orbiting satellites. This library is used as a basis for implementing a satellite motion simulator that can be used to demonstrate orbital phenomena in the classroom. Surveyed are the equations of orbital elements, coordinate systems and analytic formulas, which are made into a standard method for modeling earth orbiting satellites. The standard library is written in the C programming language and is designed to be highly portable between a variety of computer environments. The simulation draws heavily on the standards established by the library to produce a graphics-based orbit simulation program written for the Apple Macintosh computer. The simulation demonstrates the utility of the standard library functions but, because of its extensive use of the Macintosh user interface, is not portable to other operating systems.
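The thesis's library is written in C and is not reproduced in this record; as an illustration of the kind of routine such a standard library contains, the following Python sketch solves Kepler's equation by Newton iteration and converts classical orbital elements to a perifocal-frame position (the function names are ours, not the thesis's):

```python
import math

def kepler_E(M, e, tol=1e-12):
    """Solve Kepler's equation M = E - e*sin(E) for the eccentric
    anomaly E via Newton's method (angles in radians)."""
    E = M if e < 0.8 else math.pi  # common starting guess
    for _ in range(50):
        dE = (E - e * math.sin(E) - M) / (1.0 - e * math.cos(E))
        E -= dE
        if abs(dE) < tol:
            break
    return E

def elements_to_perifocal(a, e, M):
    """Position in the perifocal frame from semi-major axis a,
    eccentricity e, and mean anomaly M (same length unit as a)."""
    E = kepler_E(M, e)
    x = a * (math.cos(E) - e)
    y = a * math.sqrt(1.0 - e * e) * math.sin(E)
    return x, y

# Circular orbit: eccentric anomaly equals mean anomaly.
print(kepler_E(1.0, 0.0))  # → 1.0
```

A portable, dependency-free core like this is what makes the same routines reusable across simulator front ends, which is the design goal the abstract describes.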
Bounce inflation cosmology with Standard Model Higgs boson
Wan, Youping; Huang, Fa Peng; Zhang, Xinmin
It is of great interest to connect cosmology in the early universe to the Standard Model of particle physics. In this paper, we try to construct a bounce inflation model with the Standard Model Higgs boson, where the one-loop correction is taken into account in the effective potential of the Higgs field. In this model, a Galileon term has been introduced to eliminate the ghost mode when the bounce happens. Moreover, due to the fact that the fermion loop correction can make part of the Higgs potential negative, one naturally obtains a large equation of state (EoS) parameter in the contracting phase, which can eliminate the anisotropy problem. After the bounce, the model can drive the universe into the standard Higgs inflation phase, which can generate a nearly scale-invariant power spectrum.
Conformal standard model with an extended scalar sector
NASA Astrophysics Data System (ADS)
Latosinski, Adam; Lewandowski, Adrian; Meissner, Krzysztof A.; Nicolai, Hermann
2015-10-01
We present an extended version of the Conformal Standard Model (characterized by the absence of any new intermediate scales between the electroweak scale and the Planck scale) with an enlarged scalar sector coupling to right-chiral neutrinos. The scalar potential and the Yukawa couplings involving only right-chiral neutrinos are invariant under a new global symmetry SU(3)_N that complements the standard U(1)_{B-L} symmetry, and is broken explicitly only by the Yukawa interaction, of order O(10^-6), coupling right-chiral neutrinos and the electroweak lepton doublets. We point out four main advantages of this enlargement, namely: (1) the economy of the (non-supersymmetric) Standard Model, and thus its observational success, is preserved; (2) thanks to the enlarged scalar sector the RG improved one-loop effective potential is everywhere positive with a stable global minimum, thereby avoiding the notorious instability of the Standard Model vacuum; (3) the pseudo-Goldstone bosons resulting from spontaneous breaking of the SU(3)_N symmetry are natural Dark Matter candidates with calculable small masses and couplings; and (4) the Majorana Yukawa coupling matrix acquires a form naturally adapted to leptogenesis. The model is made perturbatively consistent up to the Planck scale by imposing the vanishing of quadratic divergences at the Planck scale ('softly broken conformal symmetry'). Observable consequences of the model occur mainly via the mixing of the new scalars and the standard model Higgs boson.
Making Validated Educational Models Central in Preschool Standards.
ERIC Educational Resources Information Center
Schweinhart, Lawrence J.
This paper presents some ideas to preschool educators and policy makers about how to make validated educational models central in standards for preschool education and care programs that are available to all 3- and 4-year-olds. Defining an educational model as a coherent body of program practices, curriculum content, program and child, and teacher…
Informatics in radiology: an information model of the DICOM standard.
Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L
2011-01-01
The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010
Dark matter, constrained minimal supersymmetric standard model, and lattice QCD.
Giedt, Joel; Thomas, Anthony W; Young, Ross D
2009-11-13
Recent lattice measurements have given accurate estimates of the quark condensates in the proton. We use these results to significantly improve the dark matter predictions in benchmark models within the constrained minimal supersymmetric standard model. The predicted spin-independent cross sections are at least an order of magnitude smaller than previously suggested and our results have significant consequences for dark matter searches.
Higher Education Quality Assessment Model: Towards Achieving Educational Quality Standard
ERIC Educational Resources Information Center
Noaman, Amin Y.; Ragab, Abdul Hamid M.; Madbouly, Ayman I.; Khedra, Ahmed M.; Fayoumi, Ayman G.
2017-01-01
This paper presents a developed higher education quality assessment model (HEQAM) that can be applied for enhancement of university services. This is because there is no universal unified quality standard model that can be used to assess the quality criteria of higher education institutes. The analytical hierarchy process is used to identify the…
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on efficient modelling of the addressed systems. The work presented here stems from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support improvements in that consistency. The formalization of conceptual models and the subsequent writing of technical standards are analyzed simultaneously, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency-checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
A Standard Kinematic Model for Flight Simulation at NASA Ames
NASA Technical Reports Server (NTRS)
Mcfarland, R. E.
1975-01-01
A standard kinematic model for aircraft simulation exists at NASA-Ames on a variety of computer systems, one of which is used to control the flight simulator for advanced aircraft (FSAA). The derivation of the kinematic model is given and various mathematical relationships are presented as a guide. These include descriptions of standardized simulation subsystems such as the atmospheric turbulence model and the generalized six-degrees-of-freedom trim routine, as well as an introduction to the emulative batch-processing system which enables this facility to optimize its real-time environment.
Non-standard models and the sociology of cosmology
NASA Astrophysics Data System (ADS)
López-Corredoira, Martín
2014-05-01
I review some theoretical ideas in cosmology different from the standard "Big Bang": the quasi-steady state model, the plasma cosmology model, non-cosmological redshifts, alternatives to non-baryonic dark matter and/or dark energy, and others. Cosmologists do not usually work within the framework of alternative cosmologies because they feel that these are not at present as competitive as the standard model. Certainly, they are not so developed, and they are not so developed because cosmologists do not work on them. It is a vicious circle. The fact that most cosmologists do not pay them any attention and only dedicate their research time to the standard model is to a great extent due to a sociological phenomenon (the "snowball effect" or "groupthink"). We might well wonder whether cosmology, our knowledge of the Universe as a whole, is a science like other fields of physics or a predominant ideology.
NASA Standard for Models and Simulations: Credibility Assessment Scale
NASA Technical Reports Server (NTRS)
Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Moser, Gary E.; Steele, Martin J.; Sylvester, Andre; Woods, Jody
2008-01-01
As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008.
Modeling Standards of Care for an Online Environment
Jones-Schenk, Jan; Rossi, Julia
1998-01-01
At Intermountain Health Care in Salt Lake City a team was created to develop core standards for clinical practice that would enhance consistency of care across the care continuum. The newly developed Standards of Care had to meet the following criteria: electronic delivery, research-based, and support an interdisciplinary care environment along with an exception-based documentation system. The process has slowly evolved and the team has grown to include clinicians from multiple sites and disciplines who have met on a regular basis for over a year. The first challenge was to develop a model for the standards of care that would be suitable for an online environment.
The rare decay K⁺ → π⁺νν̄ as a standard model "standard"
NASA Astrophysics Data System (ADS)
Bigi, I. I.; Gabbiani, F.
1991-12-01
A priori one would expect that extensions of the Standard Model can significantly enhance (or suppress) BR(K⁺ → π⁺νν̄). We have analyzed many different classes of such extensions: models with a non-minimal Higgs sector or with right-handed currents; fourth-family extensions; SUSY in both the minimal and non-minimal version, and even with broken R-parity. We find that, apart from a few somewhat exotic exceptions, these extensions have little direct impact on K⁺ → π⁺νν̄ due to constraints that are inferred from B_d-B̄_d and K⁰-K̄⁰ mixing and upper bounds on B → K*γ. Accordingly K⁺ → π⁺νν̄ probes very cleanly the top mass and the KM parameter |V(td)V(ts)|, two fundamental parameters of the Standard Model.
NASA Standard for Models and Simulations: Credibility Assessment Scale
NASA Technical Reports Server (NTRS)
Babula, Maria; Bertch, William J.; Green, Lawrence L.; Hale, Joseph P.; Mosier, Gary E.; Steele, Martin J.; Woods, Jody
2009-01-01
As one of its many responses to the 2003 Space Shuttle Columbia accident, NASA decided to develop a formal standard for models and simulations (M&S). Work commenced in May 2005. An interim version was issued in late 2006. This interim version underwent considerable revision following an extensive Agency-wide review in 2007, along with some additional revisions as a result of the review by the NASA Engineering Management Board (EMB) in the first half of 2008. Issuance of the revised, permanent version, hereafter referred to as the M&S Standard or just the Standard, occurred in July 2008. Bertch, Zang and Steele provided a summary review of the development process of this standard up through the start of the review by the EMB. A thorough recount of the entire development process, major issues, key decisions, and all review processes is available in Ref. v. This is the second of a pair of papers providing a summary of the final version of the Standard. Its focus is the Credibility Assessment Scale, a key feature of the Standard, including an example of its application to a real-world M&S problem for the James Webb Space Telescope. The companion paper summarizes the overall philosophy of the Standard and an overview of the requirements. Verbatim quotes from the Standard are integrated into the text of this paper, and are indicated by quotation marks.
Lattice Gauge Theories Within and Beyond the Standard Model
Gelzer, Zechariah John
The Standard Model of particle physics has been very successful in describing fundamental interactions up to the highest energies currently probed in particle accelerator experiments. However, the Standard Model is incomplete and currently exhibits tension with experimental data for interactions involving B mesons. Consequently, B-meson physics is of great interest to both experimentalists and theorists. Experimentalists worldwide are studying the decay and mixing processes of B mesons in particle accelerators. Theorists are working to understand the data by employing lattice gauge theories within and beyond the Standard Model. This work addresses the theoretical effort and is divided into two main parts. In the first part, I present a lattice-QCD calculation of form factors for exclusive semileptonic decays of B mesons that are mediated by both charged currents ($B \to \pi \ell$ …
The standard model on non-commutative space-time
NASA Astrophysics Data System (ADS)
Calmet, X.; Jurčo, B.; Schupp, P.; Wess, J.; Wohlgenannt, M.
2002-03-01
We consider the standard model on a non-commutative space and expand the action in the non-commutativity parameter θ^{μ ν}. No new particles are introduced; the structure group is SU(3)× SU(2)× U(1). We derive the leading order action. At zeroth order the action coincides with the ordinary standard model. At leading order in θ^{μν} we find new vertices which are absent in the standard model on commutative space-time. The most striking features are couplings between quarks, gluons and electroweak bosons and many new vertices in the charged and neutral currents. We find that parity is violated in non-commutative QCD. The Higgs mechanism can be applied. QED is not deformed in the minimal version of the NCSM to the order considered.
Cosmological signatures of a UV-conformal standard model.
Dorsch, Glauber C; Huber, Stephan J; No, Jose Miguel
2014-09-19
Quantum scale invariance in the UV has been recently advocated as an attractive way of solving the gauge hierarchy problem arising in the standard model. We explore the cosmological signatures at the electroweak scale when the breaking of scale invariance originates from a hidden sector and is mediated to the standard model by gauge interactions (gauge mediation). These scenarios, while being hard to distinguish from the standard model at LHC, can give rise to a strong electroweak phase transition leading to the generation of a large stochastic gravitational wave signal in possible reach of future space-based detectors such as eLISA and BBO. This relic would be the cosmological imprint of the breaking of scale invariance in nature.
Solar Luminosity on the Main Sequence, Standard Model and Variations
NASA Astrophysics Data System (ADS)
Ayukov, S. V.; Baturin, V. A.; Gorshkov, A. B.; Oreshina, A. V.
2017-05-01
Our Sun became a main-sequence star 4.6 Gyr ago, according to the Standard Solar Model. At that time the solar luminosity was 30% lower than its current value. This conclusion is based on the assumption that the Sun is fueled by thermonuclear reactions. If the Earth's albedo and emissivity in the infrared are unchanged over Earth's history, the oceans had to be frozen 2.3 Gyr ago. This contradicts geological data: there was liquid water on Earth 3.6-3.8 Gyr ago. This problem is known as the Faint Young Sun Paradox. We analyze the luminosity change in standard solar evolution theory. The increase of mean molecular weight in the central part of the Sun, due to the conversion of hydrogen to helium, leads to a gradual increase of luminosity with time on the Main Sequence. We also consider several exotic models: a fully mixed Sun; a drastic change of the pp reaction rate; a Sun consisting of hydrogen and helium only. Solar neutrino observations, however, exclude most non-standard solar models.
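The ~30% lower zero-age luminosity quoted above can be reproduced with Gough's (1981) single-parameter fit to standard-model luminosity evolution; using this particular fit is our assumption for illustration, not something taken from the abstract:

```python
def relative_luminosity(t_gyr, t_now=4.6):
    """Approximate main-sequence solar luminosity L(t)/L_now via
    Gough's (1981) fit: L(t)/L_now = 1 / (1 + 0.4*(1 - t/t_now))."""
    return 1.0 / (1.0 + 0.4 * (1.0 - t_gyr / t_now))

print(relative_luminosity(0.0))  # zero-age Sun: ~0.71, i.e. ~30% fainter
print(relative_luminosity(2.3))  # ~0.83 at the "frozen ocean" epoch
```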
Test of a Power Transfer Model for Standardized Electrofishing
Miranda, L.E.; Dolan, C.R.
2003-01-01
Standardization of electrofishing in waters with differing conductivities is critical when monitoring temporal and spatial differences in fish assemblages. We tested a model that can help improve the consistency of electrofishing by allowing control over the amount of power that is transferred to the fish. The primary objective was to verify, under controlled laboratory conditions, whether the model adequately described fish immobilization responses elicited with various electrical settings over a range of water conductivities. We found that the model accurately described empirical observations over conductivities ranging from 12 to 1,030 μS/cm for DC and various pulsed-DC settings. Because the model requires knowledge of a fish's effective conductivity, an attribute that is likely to vary according to species, size, temperature, and other variables, a second objective was to gather available estimates of the effective conductivity of fish to examine the magnitude of variation and to assess whether in practical applications a standard effective conductivity value for fish may be assumed. We found that applying a standard fish effective conductivity of 115 μS/cm introduced relatively little error into the estimation of the peak power density required to immobilize fish with electrofishing. However, this standard was derived from few estimates of fish effective conductivity and a limited number of species; more estimates are needed to validate our working standard.
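The power transfer model itself is not written out in the abstract; a minimal sketch, assuming the impedance-matching form usually attributed to Kolz's power transfer theory (transfer ratio M = 4·Cf·Cw/(Cf+Cw)², maximal when fish and water conductivities match), would look like:

```python
def power_transfer_ratio(c_fish, c_water):
    """Assumed fraction of applied power density transferred to the
    fish: M = 4*Cf*Cw / (Cf + Cw)^2, equal to 1 when Cf == Cw.
    Conductivities in uS/cm (any consistent unit works)."""
    return 4.0 * c_fish * c_water / (c_fish + c_water) ** 2

def required_applied_power(d_threshold, c_fish, c_water):
    """Applied power density needed in the water so the fish receives
    the immobilization threshold d_threshold."""
    return d_threshold / power_transfer_ratio(c_fish, c_water)

# With the paper's working standard Cf = 115 uS/cm, more applied power
# is needed the further water conductivity is from 115 uS/cm.
print(power_transfer_ratio(115.0, 115.0))  # → 1.0 (matched)
```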
A standard telemental health evaluation model: the time is now.
Kramer, Greg M; Shore, Jay H; Mishkind, Matt C; Friedl, Karl E; Poropatich, Ronald K; Gahm, Gregory A
2012-05-01
The telehealth field has advanced historic promises to improve access, cost, and quality of care. However, the extent to which it is delivering on its promises is unclear as the scientific evidence needed to justify success is still emerging. Many have identified the need to advance the scientific knowledge base to better quantify success. One method for advancing that knowledge base is a standard telemental health evaluation model. Telemental health is defined here as the provision of mental health services using live, interactive video-teleconferencing technology. Evaluation in the telemental health field largely consists of descriptive and small pilot studies, is often defined by the individual goals of the specific programs, and is typically focused on only one outcome. The field should adopt new evaluation methods that consider the co-adaptive interaction between users (patients and providers), healthcare costs and savings, and the rapid evolution in communication technologies. Acceptance of a standard evaluation model will improve perceptions of telemental health as an established field, promote development of a sounder empirical base, promote interagency collaboration, and provide a framework for more multidisciplinary research that integrates measuring the impact of the technology and the overall healthcare aspect. We suggest that consideration of a standard model is timely given where telemental health is at in terms of its stage of scientific progress. We will broadly recommend some elements of what such a standard evaluation model might include for telemental health and suggest a way forward for adopting such a model.
Status of the AIAA Modeling and Simulation Format Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2008-01-01
The current draft AIAA Standard for flight simulation models represents an on-going effort to improve the productivity of practitioners of the art of digital flight simulation (one of the original digital computer applications). This initial release provides the capability for the efficient representation and exchange of an aerodynamic model in full fidelity; the DAVE-ML format can be easily imported (with development of site-specific import tools) in an unambiguous way with automatic verification. An attractive feature of the standard is the ability to coexist with existing legacy software or tools. The draft Standard is currently limited in scope to static elements of dynamic flight simulations; however, these static elements represent the bulk of typical flight simulation mathematical models. It is already seeing application within U.S. and Australian government agencies in an effort to improve productivity and reduce model rehosting overhead. An existing tool allows import of DAVE-ML models into a popular simulation modeling and analysis tool, and other community-contributed tools and libraries can simplify the use of DAVE-ML compliant models at compile- or run-time of high-fidelity flight simulation.
Joe H. Scott; Robert E. Burgan
2005-01-01
This report describes a new set of standard fire behavior fuel models for use with Rothermel's surface fire spread model and the relationship of the new set to the original set of 13 fire behavior fuel models. To assist with transition to using the new fuel models, a fuel model selection guide, fuel model crosswalk, and set of fuel model photos are provided.
Search for the standard model Higgs boson in $$l\
Li, Dikai
2013-01-01
Humans have always attempted to understand the mystery of Nature, and more recently physicists have established theories to describe the observed phenomena. The most recent theory is a gauge quantum field theory framework, called the Standard Model (SM), which proposes a model comprised of elementary matter particles and interaction particles that are fundamental force carriers in the most unified way. The Standard Model contains the internal symmetries of the unitary product group SU(3)_C × SU(2)_L × U(1)_Y and describes the electromagnetic, weak and strong interactions; the model also describes how quarks interact with each other through all three of these interactions, how leptons interact with each other through electromagnetic and weak forces, and how force carriers mediate the fundamental interactions.
Searching for Physics Beyond the Standard Model and Beyond
NASA Astrophysics Data System (ADS)
Abdullah, Mohammad
The hierarchy problem, convolved with the various known puzzles in particle physics, grants us a great outlook of new physics soon to be discovered. We present multiple approaches to searching for physics beyond the standard model. First, two models with a minimal amount of theoretical guidance are analyzed using existing or simulated LHC data. Then, an extension of the Minimal Supersymmetric Standard Model (MSSM) is studied with an emphasis on the cosmological implications as well as the current and future sensitivity of colliders, direct detection and indirect detection experiments. Finally, a more complete model of the MSSM is presented through which we attempt to resolve tension with observations within the context of gauge mediated supersymmetry breaking.
Ontology based standardization of Petri net modeling for signaling pathways.
Takai-Igarashi, Takako
2005-01-01
Taking account of the great availability of Petri nets in modeling and analyzing large complicated signaling networks, semantics of Petri nets is in need of systematization for the purpose of consistency and reusability of the models. This paper reports on standardization of units of Petri nets on the basis of an ontology that gives an intrinsic definition to the process of signaling in signaling pathways.
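A minimal sketch of the Petri net token semantics whose standardization is discussed above, with a hypothetical ligand-receptor binding transition (place and transition names are illustrative, not taken from the ontology):

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds at least
    the required number of tokens."""
    return all(marking.get(p, 0) >= n for p, n in transition["in"].items())

def fire(marking, transition):
    """Fire an enabled transition: consume input tokens, produce
    output tokens, and return the new marking."""
    if not enabled(marking, transition):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in transition["in"].items():
        m[p] -= n
    for p, n in transition["out"].items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical signaling step: ligand + receptor -> complex
binding = {"in": {"ligand": 1, "receptor": 1}, "out": {"complex": 1}}
print(fire({"ligand": 1, "receptor": 1}, binding))
# → {'ligand': 0, 'receptor': 0, 'complex': 1}
```

Giving each such transition an ontology-grounded definition of its signaling process is what makes models like this one reusable across laboratories, which is the consistency goal the abstract states.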
Teacher Leader Model Standards: Implications for Preparation, Policy, and Practice
ERIC Educational Resources Information Center
Berg, Jill Harrison; Carver, Cynthia L.; Mangin, Melinda M.
2014-01-01
Teacher leadership is increasingly recognized as a resource for instructional improvement. Consequently, teacher leader initiatives have expanded rapidly despite limited knowledge about how to prepare and support teacher leaders. In this context, the "Teacher Leader Model Standards" represent an important development in the field. In…
Stability and UV completion of the Standard Model
NASA Astrophysics Data System (ADS)
Branchina, Vincenzo; Messina, Emanuele
2017-03-01
The knowledge of the electroweak vacuum stability condition is of the greatest importance for our understanding of beyond Standard Model physics. It is widely believed that new physics that lives at very high-energy scales should have no impact on the stability analysis. This expectation has been recently challenged, but the results were controversial as new physics was given in terms of non-renormalizable higher-order operators. Here we consider for the first time new physics at extremely high-energy scales (say close to the Planck scale) in terms of renormalizable operators, in other words we consider a sort of toy UV completion of the Standard Model, and definitely show that its presence can be crucial in determining the vacuum stability condition. This result has important phenomenological consequences, as it provides useful guidance in studying beyond Standard Model theories. Moreover, it suggests that very popular speculations based on the so-called “criticality” of the Standard Model do not appear to be well founded.
View of a five inch standard Mark III model 1 ...
View of a five inch standard Mark III model 1, manufactured in 1916 at the Naval Gun Factory, Watervliet, NY; this is the only gun remaining on Olympia dating from the period when it was in commission; note ammunition lift at left side of photograph. (p36) - USS Olympia, Penn's Landing, 211 South Columbus Boulevard, Philadelphia, Philadelphia County, PA
Mathematical Modeling, Sense Making, and the Common Core State Standards
ERIC Educational Resources Information Center
Schoenfeld, Alan H.
2013-01-01
On October 14, 2013 the Mathematics Education Department at Teachers College hosted a full-day conference focused on the Common Core Standards Mathematical Modeling requirements to be implemented in September 2014 and in honor of Professor Henry Pollak's 25 years of service to the school. This article is adapted from my talk at this conference…
Home Economics Education Career Path Guide and Model Curriculum Standards.
ERIC Educational Resources Information Center
California State Univ., Northridge.
This curriculum guide developed in California and organized in 10 chapters, provides a home economics education career path guide and model curriculum standards for high school home economics programs. The first chapter contains information on the following: home economics education in California, home economics careers for the future, home…
An Exercise in Modelling Using the US Standard Atmosphere
ERIC Educational Resources Information Center
LoPresto, Michael C.; Jacobs, Diane A.
2007-01-01
In this exercise the US Standard Atmosphere is used as "data" that a student is asked to model by deriving equations to reproduce it with the help of spreadsheet and graphing software. The exercise can be used as a laboratory or an independent study for a student of introductory physics to provide an introduction to scientific research…
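As a sketch of the kind of model the exercise asks students to derive, the following Python snippet reproduces the troposphere (0-11 km) of the 1976 US Standard Atmosphere from the standard lapse-rate formula; the constants are the published 1976 values:

```python
# 1976 US Standard Atmosphere constants (troposphere layer, 0-11 km)
T0 = 288.15        # sea-level temperature, K
P0 = 101325.0      # sea-level pressure, Pa
LAPSE = 0.0065     # temperature lapse rate, K/m
G = 9.80665        # gravitational acceleration, m/s^2
M_AIR = 0.0289644  # molar mass of dry air, kg/mol
R = 8.31446        # universal gas constant, J/(mol K)

def troposphere(h_m):
    """Temperature (K) and pressure (Pa) at geopotential altitude h_m,
    from T = T0 - L*h and p = P0*(T/T0)^(g*M/(R*L))."""
    T = T0 - LAPSE * h_m
    p = P0 * (T / T0) ** (G * M_AIR / (R * LAPSE))
    return T, p

T, p = troposphere(11000.0)
print(round(T, 2), round(p))  # tropopause: 216.65 K, ~22630 Pa
```

Plotting this model against the tabulated Standard Atmosphere values in a spreadsheet is exactly the comparison the exercise has students make.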
Constructing exact perturbations of the standard cosmological models
NASA Astrophysics Data System (ADS)
Sopuerta, Carlos F.
1999-11-01
In this paper we show a procedure to construct cosmological models which, according to a covariant criterion, can be seen as exact (nonlinear) perturbations of the standard Friedmann-Lemaître-Robertson-Walker (FLRW) cosmological models. The special properties of this procedure will allow us to select some of the characteristics of the models and also to study in depth their main geometrical and physical features. In particular, the models are conformally stationary, which means that they are compatible with the existence of isotropic radiation, and the observers that would measure this isotropy are rotating. Moreover, these models have two arbitrary functions (one of them is a complex function) which control their main properties, and in general they do not have any isometry. We study two examples, focusing on the case when the underlying FLRW models are flat dust models. In these examples we compare our results with those of the linearized theory of perturbations about a FLRW background.
Prediction models for clustered data: comparison of a random intercept and standard regression model
2013-01-01
Background When study data are clustered, standard regression analysis is considered inappropriate and analytical techniques for clustered data need to be used. For prediction research in which interest centers on predictor effects at the patient level, random effect regression models are probably preferred over standard regression analysis. It is well known that the random effect parameter estimates and the standard logistic regression parameter estimates are different. Here, we compared random effect and standard logistic regression models for their ability to provide accurate predictions. Methods Using an empirical study on 1642 surgical patients at risk of postoperative nausea and vomiting, who were treated by one of 19 anesthesiologists (clusters), we developed prognostic models with either standard or random intercept logistic regression. External validity of these models was assessed in new patients from other anesthesiologists. We supported our results with simulation studies using intra-class correlation coefficients (ICC) of 5%, 15%, or 30%. Standard performance measures and measures adapted for the clustered data structure were estimated. Results The model developed with random effect analysis showed better discrimination than the standard approach if the cluster effects were used for risk prediction (standard c-index of 0.69 versus 0.66). In the external validation set, both models showed similar discrimination (standard c-index 0.68 versus 0.67). The simulation study confirmed these results. For datasets with a high ICC (≥15%), model calibration in external subjects was adequate only if the performance measure assumed the same data structure as the model development method: standard calibration measures showed good calibration for the model developed with standard regression, while calibration measures adapted to the clustered data structure showed good calibration for the prediction model with random intercept. Conclusion The models with random intercept discriminate
Bouwmeester, Walter; Twisk, Jos W R; Kappen, Teus H; van Klei, Wilton A; Moons, Karel G M; Vergouwe, Yvonne
2013-02-15
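The discrimination comparison in this study can be illustrated with a small simulation: clustered binary outcomes are generated with a per-cluster random intercept, and the c-index (concordance) is computed for predictions with and without the cluster effects. All numbers below are illustrative, not the study's data, and the cluster effects are treated as known rather than estimated:

```python
import math
import random

random.seed(42)

def c_index(scores, outcomes):
    """Concordance: fraction of event/non-event pairs ranked correctly."""
    conc = ties = pairs = 0
    for i in range(len(scores)):
        for j in range(i + 1, len(scores)):
            if outcomes[i] == outcomes[j]:
                continue
            pairs += 1
            hi = i if outcomes[i] == 1 else j   # index with the event
            lo = j if hi == i else i
            if scores[hi] > scores[lo]:
                conc += 1
            elif scores[hi] == scores[lo]:
                ties += 1
    return (conc + 0.5 * ties) / pairs

# Simulate 20 clusters of 50 patients, logit = -1 + x + u_cluster
sigma_u = 1.5
scores_fixed, scores_mixed, ys = [], [], []
for _ in range(20):
    u = random.gauss(0, sigma_u)
    for _ in range(50):
        x = random.gauss(0, 1)
        p = 1 / (1 + math.exp(-(-1.0 + x + u)))
        ys.append(1 if random.random() < p else 0)
        scores_fixed.append(x)        # ignores the cluster effect
        scores_mixed.append(x + u)    # uses the cluster effect

c_fixed = c_index(scores_fixed, ys)
c_mixed = c_index(scores_mixed, ys)
print(f"c-index without cluster effects: {c_fixed:.3f}")
print(f"c-index with cluster effects:    {c_mixed:.3f}")
```

As in the study, the score that incorporates cluster effects discriminates better in the development data; in truly external clusters the advantage disappears, because the new clusters' intercepts are unknown.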
Progress Toward a Format Standard for Flight Dynamics Models
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Hildreth, Bruce L.
2006-01-01
In the beginning, there was FORTRAN, and it was... not so good. But it was universal, and all flight simulator equations of motion were coded with it. Then came ACSL, C, Ada, C++, C#, Java, FORTRAN-90, Matlab/Simulink, and a number of other programming languages. Since the halcyon punch card days of 1968, models of aircraft flight dynamics have proliferated in training devices, desktop engineering and development computers, and control design textbooks. With the rise of industry teaming and increased reliance on simulation for procurement decisions, aircraft and missile simulation models are created, updated, and exchanged with increasing frequency. However, there is no real lingua franca to facilitate the exchange of models from one simulation user to another. The current state of the art is such that several staff-months if not staff-years are required to 'rehost' each release of a flight dynamics model from one simulation environment to another. If a standard data package or exchange format were to be universally adopted, the cost and time of sharing and updating aerodynamics, control laws, mass and inertia, and other flight dynamic components of the equations of motion of an aircraft or spacecraft simulation could be drastically reduced. A 2002 paper estimated over $6 million in savings could be realized for one military aircraft type alone. This paper describes the efforts of the American Institute of Aeronautics and Astronautics (AIAA) to develop a flight dynamics model exchange standard based on XML and HDF5 data formats.
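A flavour of what an XML-based exchange format buys can be sketched with a toy aerodynamic lookup table; the element and attribute names below are hypothetical illustrations, not the actual AIAA schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML layout for a breakpoint table (illustrative, not AIAA's)
MODEL = """<aeroModel name="exampleAircraft">
  <table var="CL" breakpoint="alpha_deg">
    <point alpha_deg="0" value="0.2"/>
    <point alpha_deg="5" value="0.6"/>
    <point alpha_deg="10" value="1.0"/>
  </table>
</aeroModel>"""

root = ET.fromstring(MODEL)
pts = sorted((float(p.get("alpha_deg")), float(p.get("value")))
             for p in root.iter("point"))

def lookup(alpha):
    """Piecewise-linear interpolation of CL over the breakpoint table."""
    for (a0, v0), (a1, v1) in zip(pts, pts[1:]):
        if a0 <= alpha <= a1:
            return v0 + (v1 - v0) * (alpha - a0) / (a1 - a0)
    raise ValueError("alpha outside table range")

print(lookup(2.5))
```

Because the table lives in declarative markup rather than in FORTRAN or C source, each simulation environment only needs one importer instead of a hand rehost per model release.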
Galactic chemical evolution and nucleocosmochronology - Standard model with terminated infall
NASA Technical Reports Server (NTRS)
Clayton, D. D.
1984-01-01
Some exactly soluble families of models for the chemical evolution of the Galaxy are presented. The parameters considered include gas mass, the age-metallicity relation, the star mass vs. metallicity, the age distribution, and the mean age of dwarfs. A short BASIC program for calculating these parameters is given. The calculation of metallicity gradients, nuclear cosmochronology, and extinct radioactivities is addressed. An especially simple, mathematically linear model is recommended as a standard model of galaxies with truncated infall due to its internal consistency and compact display of the physical effects of the parameters.
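The simplest member of this family of exactly soluble models can be sketched in a few lines (in Python rather than BASIC). This is the closed-box limit, not Clayton's exact linear-infall family, and the yield and star-formation efficiency are assumed values:

```python
import math

# Closed-box sketch: gas is consumed as M_g(t) = M_g(0) * exp(-omega * t)
# and the metallicity follows Z(t) = p * ln(M_g(0) / M_g(t)) with yield p.
p_yield = 0.01   # assumed net yield
omega = 0.3      # assumed star-formation efficiency per Gyr
M0 = 1.0         # initial gas mass (arbitrary units)

def gas_mass(t):
    return M0 * math.exp(-omega * t)

def metallicity(t):
    return p_yield * math.log(M0 / gas_mass(t))   # = p_yield * omega * t

for t in (0, 4, 8, 12):   # Gyr
    print(f"t = {t:2d} Gyr: gas fraction {gas_mass(t):.2f}, Z = {metallicity(t):.4f}")
```

Infall (truncated or otherwise) modifies both relations, which is exactly the freedom the paper's linear model exploits to fit the age-metallicity relation and the dwarf age distribution simultaneously.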
Rational models as theories - not standards - of behavior.
McKenzie, Craig R.M.
2003-09-01
When people's behavior in laboratory tasks systematically deviates from a rational model, the implication is that real-world performance could be improved by changing the behavior. However, recent studies suggest that behavioral violations of rational models are at least sometimes the result of strategies that are well adapted to the real world (and not necessarily to the laboratory task). Thus, even if one accepts that certain behavior in the laboratory is irrational, compelling evidence that real-world behavior ought to change accordingly is often lacking. It is suggested here that rational models be seen as theories, and not standards, of behavior.
Data Standardization for Carbon Cycle Modeling: Lessons Learned
NASA Astrophysics Data System (ADS)
Wei, Y.; Liu, S.; Cook, R. B.; Post, W. M.; Huntzinger, D. N.; Schwalm, C.; Schaefer, K. M.; Jacobson, A. R.; Michalak, A. M.
2012-12-01
Terrestrial biogeochemistry modeling is a crucial component of carbon cycle research and provides unique capabilities to understand terrestrial ecosystems. The Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP) aims to identify key differences in model formulation that drive observed differences in model predictions of biospheric carbon exchange. To do so, the MsTMIP framework provides standardized prescribed environmental driver data and a standard model protocol to facilitate comparisons of modeling results from nearly 30 teams. Model performance is then evaluated against a variety of carbon-cycle related observations (remote sensing, atmospheric, and flux tower-based observations) using quantitative performance measures and metrics in an integrated evaluation framework. As part of this effort, we have harmonized highly diverse and heterogeneous environmental driver data, model outputs, and observational benchmark data sets to facilitate use and analysis by the MsTMIP team. In this presentation, we will describe the lessons learned from this data-intensive carbon cycle research. The data harmonization activity itself can be made more efficient with proper tools, version control, workflow management, and collaboration within the whole team. The adoption of on-demand and interoperable protocols (e.g. OPeNDAP and Open Geospatial Consortium) makes data visualization and distribution more flexible. Users can customize and download data for a specific spatial extent, temporal period, and resolution. The effort to properly organize data in an open and standard format (e.g. Climate & Forecast compatible netCDF) allows the data to be analysed by a dispersed set of researchers more efficiently, and maximizes the longevity and utilization of the data. The lessons learned from this specific experience can benefit efforts by the broader community to leverage diverse data resources more efficiently in scientific research.
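The "standard format" step above boils down to attaching consistent metadata to every variable. A toy check in the spirit of the Climate & Forecast conventions (the required-attribute subset here is illustrative, not a full CF validator):

```python
# Toy check of CF-style variable metadata (illustrative subset of the
# Climate & Forecast conventions, not a complete validator).
REQUIRED = ("units", "long_name")

variables = {
    "gpp": {"units": "kg m-2 s-1",
            "long_name": "gross primary productivity"},
    "tas": {"units": "K"},   # missing long_name -> should be flagged
}

def missing_attrs(attrs):
    return [a for a in REQUIRED if a not in attrs]

problems = {name: missing_attrs(attrs)
            for name, attrs in variables.items() if missing_attrs(attrs)}
print(problems)
```

Running such checks before distributing driver data catches metadata gaps once, centrally, instead of in each of 30 modeling teams' import scripts.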
Simple standard model extension by heavy charged scalar
NASA Astrophysics Data System (ADS)
Boos, E.; Volobuev, I.
2018-05-01
We consider a Standard Model (SM) extension by a heavy charged scalar gauged only under the U(1)_Y weak hypercharge gauge group. Such an extension, being gauge invariant with respect to the SM gauge group, is a simple special case of the well-known Zee model. Since the interactions of the charged scalar with the Standard Model fermions turn out to be significantly suppressed compared to the Standard Model interactions, the charged scalar provides an example of a long-lived charged particle that is interesting to search for at the LHC. We present the pair and single production cross sections of the charged scalar at different colliders and the possible decay widths for various boson masses. It is shown that the current ATLAS and CMS searches at 8 and 13 TeV collision energy lead to bounds on the scalar boson mass of about 300-320 GeV. The limits are expected to be much larger for higher collision energies and, assuming 15 ab⁻¹ integrated luminosity, reach about 2.7 TeV at a future 27 TeV LHC, thus covering the most interesting mass region.
SLHAplus: A library for implementing extensions of the standard model
NASA Astrophysics Data System (ADS)
Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.
2011-03-01
We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules. Program summary: Program title: SLHAplus_1.3; Catalogue identifier: AEHX_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 6283; No. of bytes in distributed program, including test data, etc.: 52 119; Distribution format: tar.gz; Programming language: C; Computer: IBM PC, MAC; Operating system: UNIX (Linux, Darwin, Cygwin); RAM: 2000 MB; Classification: 11.1; Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations as well as routines for evaluating the running strong coupling constant, running quark masses and effective quark masses. Running time: 0.001 sec
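The core of an SLHA reader is a small state machine over BLOCK sections. A toy Python version handling only single-index numeric entries (the real accord also defines matrix-valued, unindexed, and DECAY entries):

```python
def read_slha(text):
    """Minimal SLHA-style reader: maps BLOCK name -> {index: value}.
    Handles only single-index numeric entries; '#' starts a comment."""
    blocks, current = {}, None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # strip comments and whitespace
        if not line:
            continue
        if line.upper().startswith("BLOCK"):
            current = line.split()[1].upper()
            blocks[current] = {}
        elif current is not None:
            fields = line.split()
            blocks[current][int(fields[0])] = float(fields[1])
    return blocks

sample = """
BLOCK MASS   # particle masses in GeV
   25   1.2500e+02   # h0
   23   9.1188e+01   # Z
"""
spectrum = read_slha(sample)
print(spectrum["MASS"][25])
```

Generalizing the value type per block (scalar, vector, matrix) is what lets the same reader serve non-supersymmetric models, as the paper describes.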
Extending the Standard Model with Confining and Conformal Dynamics
NASA Astrophysics Data System (ADS)
McRaven, John Emory
This dissertation will provide a survey of models that involve extending the standard model with confining and conformal dynamics. We will study a series of models, describe them in detail, outline their phenomenology, and provide some search strategies for finding them. The Gaugephobic Higgs model provides an interpolation between three different models of electroweak symmetry breaking: Higgsless models, Randall-Sundrum models, and the Standard Model. At parameter points between the extremes, Standard Model Higgs signals are present at reduced rates, and Higgsless Kaluza-Klein excitations are present with shifted masses and couplings, as well as signals from exotic quarks necessary to protect the Zbb coupling. Using a new implementation of the model in SHERPA, we show the LHC signals which differentiate the generic Gaugephobic Higgs model from its limiting cases. These are all signals involving a Higgs coupling to a Kaluza-Klein gauge boson or quark. We identify the clean signal pp → W^(i) → WH mediated by a Kaluza-Klein W, which can be present at large rates and is enhanced for even Kaluza-Klein numbers. Due to the very hard lepton coming from the W± decay, this signature has little background, and provides a better discovery channel for the Higgs than any of the Standard Model modes, over its entire mass range. A Higgs radiated from new heavy quarks also has large rates, but is much less promising due to very high multiplicity final states. The AdS/CFT correspondence conjectures a relation between extra-dimensional models in AdS5 space, such as the Gaugephobic Higgs Model, and 4D conformal field theories. The notion of conformality has found its way into several phenomenological models for TeV-scale physics extending the standard model. We proceed to explore the phenomenology of a new heavy quark that transforms under a hidden strongly coupled conformal gauge group in addition to transforming under QCD. This object would form states similar to R-Hadrons. The heavy state
Electroweak baryogenesis and the standard model effective field theory
NASA Astrophysics Data System (ADS)
de Vries, Jordy; Postma, Marieke; van de Vis, Jorinde; White, Graham
2018-01-01
We investigate electroweak baryogenesis within the framework of the Standard Model Effective Field Theory. The Standard Model Lagrangian is supplemented by dimension-six operators that facilitate a strong first-order electroweak phase transition and provide sufficient CP violation. Two explicit scenarios are studied that are related via the classical equations of motion and are therefore identical at leading order in the effective field theory expansion. We demonstrate that formally higher-order dimension-eight corrections lead to large modifications of the matter-antimatter asymmetry. The effective field theory expansion breaks down in the modified Higgs sector due to the requirement of a first-order phase transition. We investigate the source of the breakdown in detail and show how it is transferred to the CP-violating sector. We briefly discuss possible modifications of the effective field theory framework.
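A common minimal realization of the first ingredient is to supplement the Higgs potential with the dimension-six operator (H†H)³; the normalization below is a conventional sketch and not necessarily the paper's choice of operator basis:

```latex
V(H) \;=\; -\mu^{2}\,H^{\dagger}H \;+\; \lambda\,\bigl(H^{\dagger}H\bigr)^{2}
      \;+\; \frac{1}{\Lambda^{2}}\,\bigl(H^{\dagger}H\bigr)^{3}
```

For a sufficiently low cutoff Λ this term, together with thermal corrections, can separate the symmetric and broken minima with a barrier, making the electroweak phase transition first order; the same low Λ is why the EFT expansion is strained, as the abstract notes.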
Astrophysical neutrinos flavored with beyond the Standard Model physics
NASA Astrophysics Data System (ADS)
Rasmussen, Rasmus W.; Lechner, Lukas; Ackermann, Markus; Kowalski, Marek; Winter, Walter
2017-10-01
We systematically study the allowed parameter space for the flavor composition of astrophysical neutrinos measured at Earth, including beyond the Standard Model theories at production, during propagation, and at detection. One motivation is to illustrate the discrimination power of the next-generation neutrino telescopes such as IceCube-Gen2. We identify several examples that lead to potential deviations from the standard neutrino mixing expectation such as significant sterile neutrino production at the source, effective operators modifying the neutrino propagation at high energies, dark matter interactions in neutrino propagation, or nonstandard interactions in Earth matter. IceCube-Gen2 can exclude about 90% of the allowed parameter space in these cases, and hence will allow us to efficiently test and discriminate between models. More detailed information can be obtained from additional observables such as the energy dependence of the effect, fraction of electron antineutrinos at the Glashow resonance, or number of tau neutrino events.
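The standard-mixing baseline against which such deviations are measured is easy to reproduce: after averaging over oscillations, the flavor transition probabilities are P(α→β) = Σᵢ |U_αi|² |U_βi|². A sketch with illustrative mixing angles (CP phase set to zero for simplicity; not the paper's fit values):

```python
import math

# Illustrative PMNS mixing angles in degrees (CP phase taken as zero)
t12, t23, t13 = map(math.radians, (33.4, 49.2, 8.57))
s12, c12 = math.sin(t12), math.cos(t12)
s23, c23 = math.sin(t23), math.cos(t23)
s13, c13 = math.sin(t13), math.cos(t13)

# Real (delta = 0) PMNS matrix in the standard parametrization
U = [
    [c12 * c13,                     s12 * c13,                     s13],
    [-s12 * c23 - c12 * s23 * s13,  c12 * c23 - s12 * s23 * s13,   s23 * c13],
    [s12 * s23 - c12 * c23 * s13,   -c12 * s23 - s12 * c23 * s13,  c23 * c13],
]

# Oscillation-averaged flavor transition probabilities
P = [[sum(U[a][i] ** 2 * U[b][i] ** 2 for i in range(3)) for a in range(3)]
     for b in range(3)]

source = [1 / 3, 2 / 3, 0.0]   # standard pion-decay flavor composition
earth = [sum(P[b][a] * source[a] for a in range(3)) for b in range(3)]
print([round(f, 3) for f in earth])
```

The familiar result is that (1 : 2 : 0) at the source arrives near (1 : 1 : 1) at Earth; the beyond-Standard-Model scenarios in the paper populate the flavor triangle away from this point.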
Aspects of Particle Physics Beyond the Standard Model
NASA Astrophysics Data System (ADS)
Lu, Xiaochuan
This dissertation describes a few aspects of particle physics beyond the Standard Model, with a focus on the questions remaining after the discovery of a Standard Model-like Higgs boson. Specifically, three topics are discussed in sequence: neutrino mass and baryon asymmetry, the naturalness problem of the Higgs mass, and placing constraints on theoretical models from precision measurements. First, the consequence of neutrino mass anarchy for cosmology is studied. Attention is paid in particular to the total mass of neutrinos and to baryon asymmetry through leptogenesis. With the assumption of independence among mass matrix entries in addition to basis independence, the Gaussian measure is the only choice. On top of the Gaussian measure, a simple approximate U(1) flavor symmetry makes leptogenesis highly successful. Correlations between the baryon asymmetry and the light-neutrino quantities are investigated. Also discussed are possible implications of the large total neutrino mass recently suggested by the SDSS/BOSS data. Second, the Higgs mass implies fine-tuning for minimal theories of weak-scale supersymmetry (SUSY). Non-decoupling effects can boost the Higgs mass when new states interact with the Higgs, but new sources of SUSY breaking that accompany such extensions threaten naturalness. I will show that two singlets with a Dirac mass can increase the Higgs mass while maintaining naturalness in the presence of large SUSY breaking in the singlet sector. The modified Higgs phenomenology of this scenario, termed the "Dirac NMSSM", is also studied. Finally, the sensitivities of future precision measurements in probing physics beyond the Standard Model are studied. A practical three-step procedure is presented for using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak-scale precision observables. With this procedure, one can interpret precision measurements as constraints on the UV model concerned. A detailed explanation is
Discovery of the fourth quark in the Standard Model
In 1974, the discovery of charm, the fourth quark in the Standard Model, occurred simultaneously on the West Coast, at DOE's SLAC (Burton Richter, using the MARK I detector), and on the East Coast, at DOE's Brookhaven Laboratory (Sam Ting and team).
Electroweak baryogenesis in the exceptional supersymmetric standard model
Chao, Wei
2015-08-28
Here, we study electroweak baryogenesis in the E6-inspired exceptional supersymmetric standard model (E6SSM). The relaxation coefficients driven by singlinos and the new gaugino as well as the transport equation of the Higgs supermultiplet number density in the E6SSM are calculated. Our numerical simulation shows that both CP-violating source terms from singlinos and the new gaugino can solely give rise to a correct baryon asymmetry of the Universe via the electroweak baryogenesis mechanism.
Beyond-Standard-Model Tensor Interaction and Hadron Phenomenology.
Courtoy, Aurore; Baeßler, Stefan; González-Alonso, Martín; Liuti, Simonetta
2015-10-16
We evaluate the impact of recent developments in hadron phenomenology on extracting possible fundamental tensor interactions beyond the standard model. We show that a novel class of observables, including the chiral-odd generalized parton distributions and the transversity parton distribution function, can contribute to the constraints on this quantity. Experimental extractions of the tensor hadronic matrix elements, if sufficiently precise, will provide a so-far absent testing ground for lattice QCD calculations.
Phenomenology of the N = 3 Lee-Wick Standard Model
NASA Astrophysics Data System (ADS)
TerBeek, Russell Henry
With the discovery of the Higgs Boson in 2012, particle physics has decidedly moved beyond the Standard Model into a new epoch. Though the Standard Model particle content is now completely accounted for, there remain many theoretical issues about the structure of the theory in need of resolution. Among these is the hierarchy problem: since the renormalized Higgs mass receives quadratic corrections from a higher cutoff scale, what keeps the Higgs boson light? Many possible solutions to this problem have been advanced, such as supersymmetry, Randall-Sundrum models, or sub-millimeter corrections to gravity. One such solution has been advanced by the Lee-Wick Standard Model. In this theory, higher-derivative operators are added to the Lagrangian for each Standard Model field, which result in propagators that possess two physical poles and fall off more rapidly in the ultraviolet regime. It can be shown by an auxiliary field transformation that the higher-derivative theory is identical to positing a second, manifestly renormalizable theory in which new fields with opposite-sign kinetic and mass terms are found. These so-called Lee-Wick fields have opposite-sign propagators, and famously cancel off the quadratic divergences that plague the renormalized Higgs mass. The states in the Hilbert space corresponding to Lee-Wick particles have negative norm, and implications for causality and unitarity are examined. This dissertation explores a variant of the theory called the N = 3 Lee-Wick Standard Model. The Lagrangian of this theory features a yet-higher derivative operator, which produces a propagator with three physical poles and possesses even better high-energy behavior than the minimal Lee-Wick theory. An analogous auxiliary field transformation takes this higher-derivative theory into a renormalizable theory with states of alternating positive, negative, and positive norm. The phenomenology of this theory is examined in detail, with particular emphasis on the collider
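The mechanism at work can be seen already in the minimal (N = 2) case: the higher-derivative kinetic term yields a propagator with two poles, which partial-fractions into an ordinary propagator minus a wrong-sign one (massless case shown for simplicity):

```latex
\frac{1}{p^{2} - \dfrac{p^{4}}{M^{2}}}
  \;=\; \frac{1}{p^{2}} \;-\; \frac{1}{p^{2} - M^{2}}
```

The relative minus sign between the two terms is what cancels the quadratic divergences in the Higgs mass; in the N = 3 theory a p⁶ term produces a third pole, giving the alternating positive, negative, and positive norms described above.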
Impersonating the Standard Model Higgs boson: Alignment without decoupling
Carena, Marcela; Low, Ian; Shah, Nausheen R.; ...
2014-04-03
In models with an extended Higgs sector there exists an alignment limit, in which the lightest CP-even Higgs boson mimics the Standard Model Higgs. The alignment limit is commonly associated with the decoupling limit, where all non-standard scalars are significantly heavier than the Z boson. However, alignment can occur irrespective of the mass scale of the rest of the Higgs sector. In this work we discuss the general conditions that lead to “alignment without decoupling”, therefore allowing for the existence of additional non-standard Higgs bosons at the weak scale. The values of tan β for which this happens are derived in terms of the effective Higgs quartic couplings in general two-Higgs-doublet models as well as in supersymmetric theories, including the MSSM and the NMSSM. In addition, we study the information encoded in the variations of the SM Higgs-fermion couplings to explore regions in the m_A–tan β parameter space.
Neutrino masses in the Lee-Wick standard model
Espinosa, Jose Ramon; Grinstein, Benjamin; O'Connell, Donal
2008-04-15
Recently, an extension of the standard model based on ideas of Lee and Wick has been discussed. This theory is free of quadratic divergences and hence has a Higgs mass that is stable against radiative corrections. Here, we address the question of whether or not it is possible to couple very heavy particles, with masses much greater than the weak scale, to the Lee-Wick standard model degrees of freedom and still preserve the stability of the weak scale. We show that in the LW-standard model the familiar seesaw mechanism for generating neutrino masses preserves the solution to the hierarchy puzzle provided by the higher derivative terms. The very heavy right-handed neutrinos do not destabilize the Higgs mass. We give an example of new heavy degrees of freedom that would destabilize the hierarchy, and discuss a general mechanism for coupling other heavy degrees of freedom to the Higgs doublet while preserving the hierarchy.
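The seesaw scaling underlying the abstract is a one-line estimate: for one generation, m_ν ~ m_D²/M_R. The numbers below are illustrative order-of-magnitude choices, not values from the paper:

```python
# Order-of-magnitude seesaw estimate: m_nu ~ m_D**2 / M_R (one generation).
# Illustrative inputs, not the paper's.
m_D = 100.0    # Dirac mass at roughly the electroweak scale, in GeV
M_R = 1.0e14   # heavy right-handed Majorana mass, in GeV

m_nu_gev = m_D ** 2 / M_R
m_nu_ev = m_nu_gev * 1e9   # 1 GeV = 1e9 eV
print(f"light neutrino mass ~ {m_nu_ev:.2f} eV")
```

The point of the paper is that M_R this large would normally drag the Higgs mass up through loops; the Lee-Wick higher-derivative terms prevent exactly that.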
Domain walls in the extensions of the Standard Model
NASA Astrophysics Data System (ADS)
Krajewski, Tomasz; Lalak, Zygmunt; Lewicki, Marek; Olszewski, Paweł
2018-05-01
Our main interest is the evolution of domain walls of the Higgs field in the early Universe. The aim of this paper is to understand how dynamics of Higgs domain walls could be influenced by yet unknown interactions from beyond the Standard Model. We assume that the Standard Model is valid up to a certain high energy scale Λ and use the framework of the effective field theory to describe physics below that scale. Performing numerical simulations with different values of the scale Λ we are able to extend our previous analysis [1]. Our recent numerical simulations show that evolution of Higgs domain walls is rather insensitive to interactions beyond the Standard Model as long as masses of new particles are greater than 10¹² GeV. For lower values of Λ the RG improved effective potential is strongly modified at field strengths crucial to the evolution of domain walls. However, we find that even for low values of Λ, Higgs domain walls decayed shortly after their formation for generic initial conditions. On the other hand, in simulations with specifically chosen initial conditions Higgs domain walls can live longer and enter the scaling regime. We also determine the energy spectrum of gravitational waves produced by decaying domain walls of the Higgs field. For generic initial field configurations the amplitude of the signal is too small to be observed in planned detectors.
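The basic object simulated in such studies is a scalar field interpolating between two degenerate vacua. A minimal 1+1-dimensional sketch with a quartic double-well potential (a toy model, far simpler than the Higgs effective potential used in the paper):

```python
import math

# 1+1D phi^4 sketch: V(phi) = lam/4 * (phi^2 - v^2)^2 has two vacua +-v,
# so a field interpolating between them forms a domain wall (kink).
lam, v = 1.0, 1.0
N, dx, dt = 200, 0.1, 0.05            # dt < dx keeps the scheme stable
x = [(i - N // 2) * dx for i in range(N)]

# Static kink initial condition: phi = v * tanh(sqrt(lam/2) * v * x)
phi = [v * math.tanh(math.sqrt(lam / 2) * v * xi) for xi in x]
pi = [0.0] * N                         # field momentum

def force(f):
    """Laplacian minus dV/dphi; endpoints held fixed at the vacua."""
    out = [0.0] * N
    for i in range(1, N - 1):
        lap = (f[i + 1] - 2 * f[i] + f[i - 1]) / dx ** 2
        out[i] = lap - lam * f[i] * (f[i] ** 2 - v ** 2)
    return out

for _ in range(400):                   # leapfrog-style time stepping
    a = force(phi)
    pi = [p + dt * ai for p, ai in zip(pi, a)]
    phi = [f + dt * p for f, p in zip(phi, pi)]

print(round(phi[0], 2), round(phi[-1], 2))   # stays pinned near -v and +v
```

In the cosmological setting the lattice is 3D, the potential is the RG-improved Higgs potential, and the expansion of the Universe damps the evolution; the wall's persistence or decay is then a quantitative question, as the paper explores.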
No Evidence for Extensions to the Standard Cosmological Model
NASA Astrophysics Data System (ADS)
Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna
2017-09-01
We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (ln B = -7.8), nonzero scalar-to-tensor ratio (ln B = -4.3), running of the spectral index (ln B = -4.7), curvature (ln B = -3.6), nonstandard numbers of neutrinos (ln B = -3.1), nonstandard neutrino masses (ln B = -3.2), nonstandard lensing potential (ln B = -4.6), evolving dark energy (ln B = -3.2), sterile neutrinos (ln B = -6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (ln B = -10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (ln B ∼ +2).
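The quoted log-Bayes-factors translate directly into betting odds against each extension. A short sketch using a subset of the values above (odds are relative to flat ΛCDM; the interpretation as odds is standard, not the paper's wording):

```python
import math

# ln B < 0 disfavors the extension relative to flat LCDM
ln_b = {
    "isocurvature": -7.8,
    "nonzero tensor ratio": -4.3,
    "running spectral index": -4.7,
    "curvature": -3.6,
    "sterile neutrinos": -6.9,
}

# exp(-ln B) gives the odds in favour of LCDM over each extension
odds = {model: math.exp(-lnb) for model, lnb in ln_b.items()}
for model, o in odds.items():
    print(f"{model:24s} ln B = {ln_b[model]:5.1f}  ->  {o:8.1f} : 1 for LCDM")
```

Even the mildest entry here (curvature) corresponds to odds of several tens to one, which is why the abstract describes these extensions as strongly disfavored.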
Elementary particles, dark matter candidate and new extended standard model
NASA Astrophysics Data System (ADS)
Hwang, Jaekwang
2017-01-01
Elementary particle decays and reactions are discussed in terms of the three-dimensional quantized space model beyond the standard model. Three generations of the leptons and quarks correspond to the lepton charges. Three heavy leptons and three heavy quarks are introduced, and the bastons (new particles) are proposed as possible dark matter candidates. Dark matter force, weak force and strong force are explained consistently. Possible rest masses of the new particles are tentatively proposed for experimental searches. For more details, see the conference paper at https://www.researchgate.net/publication/308723916.
A physiome standards-based model publication paradigm.
Nickerson, David P; Buist, Martin L
2009-05-28
In this era of widespread broadband Internet penetration and powerful Web browsers on most desktops, a shift in the publication paradigm for physiome-style models is envisaged. No longer will model authors simply submit an essentially textual description of the development and behaviour of their model. Rather, they will submit a complete working implementation of the model encoded and annotated according to the various standards adopted by the physiome project, accompanied by a traditional human-readable summary of the key scientific goals and outcomes of the work. While the final published, peer-reviewed article will look little different to the reader, in this new paradigm, both reviewers and readers will be able to interact with, use and extend the models in ways that are not currently possible. Here, we review recent developments that are laying the foundations for this new model publication paradigm. Initial developments have focused on the publication of mathematical models of cellular electrophysiology, using technology based on a CellML- or Systems Biology Markup Language (SBML)-encoded implementation of the mathematical models. Here, we review the current state of the art and what needs to be done before such a model publication becomes commonplace.
BiGG Models: A platform for integrating, standardizing and sharing genome-scale models
King, Zachary A.; Lu, Justin; Dräger, Andreas; Miller, Philip; Federowicz, Stephen; Lerman, Joshua A.; Ebrahim, Ali; Palsson, Bernhard O.; Lewis, Nathan E.
2016-01-01
Genome-scale metabolic models are mathematically-structured knowledge bases that can be used to predict metabolic pathway usage and growth phenotypes. Furthermore, they can generate and test hypotheses when integrated with experimental data. To maximize the value of these models, centralized repositories of high-quality models must be established, models must adhere to established standards and model components must be linked to relevant databases. Tools for model visualization further enhance their utility. To meet these needs, we present BiGG Models (http://bigg.ucsd.edu), a completely redesigned Biochemical, Genetic and Genomic knowledge base. BiGG Models contains more than 75 high-quality, manually-curated genome-scale metabolic models. On the website, users can browse, search and visualize models. BiGG Models connects genome-scale models to genome annotations and external databases. Reaction and metabolite identifiers have been standardized across models to conform to community standards and enable rapid comparison across models. Furthermore, BiGG Models provides a comprehensive application programming interface for accessing BiGG Models with modeling and analysis tools. As a resource for highly curated, standardized and accessible models of metabolism, BiGG Models will facilitate diverse systems biology studies and support knowledge-based analysis of diverse experimental data. PMID:26476456
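The application programming interface mentioned above can be exercised with plain HTTP GET requests against the site's API root. The sketch below only builds a model URL and summarizes a canned JSON record so it runs offline; the `/api/v2/` path and the field names (`model_bigg_id`, `reaction_count`, ...) are the author's recollection of BiGG's response shape and should be verified against the live documentation.

```python
import json
from urllib.parse import urljoin

BIGG_ROOT = "http://bigg.ucsd.edu/api/v2/"  # assumed API root

def model_url(model_id):
    """URL of the metadata record for one genome-scale model."""
    return urljoin(BIGG_ROOT, "models/" + model_id)

def summarize(record):
    """Pull a few fields from a model record. The field names are the
    author's recollection of BiGG's JSON shape; check the API docs."""
    keys = ("model_bigg_id", "organism", "reaction_count", "metabolite_count")
    return {k: record.get(k) for k in keys}

# A canned response stands in for an HTTP GET so the example runs offline.
sample = json.loads('{"model_bigg_id": "e_coli_core", '
                    '"organism": "Escherichia coli", '
                    '"reaction_count": 95, "metabolite_count": 72}')
summary = summarize(sample)
```

In a live setting the same URL would be fetched with any HTTP client and the response body passed to `summarize` unchanged.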
ERIC Educational Resources Information Center
Wisconsin Department of Public Instruction, 2011
2011-01-01
Wisconsin's adoption of the Common Core State Standards provides an excellent opportunity for Wisconsin school districts and communities to define expectations from birth through preparation for college and work. By aligning the existing Wisconsin Model Early Learning Standards with the Wisconsin Common Core State Standards, expectations can be…
29 CFR 1990.151 - Model standard pursuant to section 6(b) of the Act.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 9 2014-07-01 2014-07-01 false Model standard pursuant to section 6(b) of the Act. 1990... OCCUPATIONAL CARCINOGENS Model Standards § 1990.151 Model standard pursuant to section 6(b) of the Act. Occupational Exposure to ________ Permanent Standard (insert section number of standard) (a) Scope and...
29 CFR 1990.151 - Model standard pursuant to section 6(b) of the Act.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 9 2013-07-01 2013-07-01 false Model standard pursuant to section 6(b) of the Act. 1990... OCCUPATIONAL CARCINOGENS Model Standards § 1990.151 Model standard pursuant to section 6(b) of the Act. Occupational Exposure to ________ Permanent Standard (insert section number of standard) (a) Scope and...
29 CFR 1990.151 - Model standard pursuant to section 6(b) of the Act.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 9 2012-07-01 2012-07-01 false Model standard pursuant to section 6(b) of the Act. 1990... OCCUPATIONAL CARCINOGENS Model Standards § 1990.151 Model standard pursuant to section 6(b) of the Act. Occupational Exposure to ________ Permanent Standard (insert section number of standard) (a) Scope and...
A unified model of the standard genetic code.
José, Marco V; Zamudio, Gabriel S; Morgado, Eberto R
2017-03-01
The Rodin-Ohno (RO) and the Delarue models divide the table of the genetic code into two classes of aminoacyl-tRNA synthetases (aaRSs I and II) with recognition from the minor or major groove sides of the tRNA acceptor stem, respectively. These models are asymmetric but biologically meaningful. On the other hand, the standard genetic code (SGC) can be derived from the primeval RNY code (R stands for purines, Y for pyrimidines and N for either). In this work, the RO-model is derived by means of group actions, namely, symmetries represented by automorphisms, assuming that the SGC originated from a primeval RNY code. It turns out that the RO-model is symmetric in a six-dimensional (6D) hypercube. Conversely, using the same automorphisms, we show that the RO-model can lead to the SGC. In addition, the asymmetric Delarue model becomes symmetric by means of quotient group operations. We formulate isometric functions that convert the class aaRS I into the class aaRS II and vice versa. We show that the four polar requirement categories display a symmetrical arrangement in our 6D hypercube. Altogether, these results cannot be attained in either two or three dimensions. We discuss the present unified 6D algebraic model, which is compatible with both the SGC (based upon the primeval RNY code) and the RO-model.
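The six-dimensional hypercube picture can be made concrete: encoding each of a codon's three nucleotides with two bits places every codon on a vertex of a 6D cube, and point mutations become short walks along its edges. The bit assignment below is an illustrative assumption, not necessarily the encoding used by José et al.

```python
# Encode each nucleotide with two bits; the first bit separates
# purines (R) from pyrimidines (Y). This particular assignment is an
# illustrative assumption, not the paper's.
NUCLEOTIDE_BITS = {
    "C": (0, 0),  # pyrimidine (Y)
    "U": (0, 1),  # pyrimidine (Y)
    "G": (1, 0),  # purine (R)
    "A": (1, 1),  # purine (R)
}

def codon_to_vertex(codon):
    """Return the 6D hypercube vertex (tuple of 6 bits) for an RNA codon."""
    bits = []
    for base in codon:
        bits.extend(NUCLEOTIDE_BITS[base])
    return tuple(bits)

def hamming_distance(v, w):
    """Number of coordinates in which two vertices differ."""
    return sum(a != b for a, b in zip(v, w))

# All 64 codons land on distinct vertices of the 6D cube, and a
# single-nucleotide substitution moves at most two steps along edges.
start = codon_to_vertex("AUG")   # (1, 1, 0, 1, 1, 0)
```

Group actions on the code then become symmetries (automorphisms) of this cube, which is the setting in which the RO-model turns out to be symmetric.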
Toward Standardizing a Lexicon of Infectious Disease Modeling Terms.
Milwid, Rachael; Steriu, Andreea; Arino, Julien; Heffernan, Jane; Hyder, Ayaz; Schanzer, Dena; Gardner, Emma; Haworth-Brockman, Margaret; Isfeld-Kiely, Harpa; Langley, Joanne M; Moghadas, Seyed M
2016-01-01
Disease modeling is increasingly being used to evaluate the effect of health intervention strategies, particularly for infectious diseases. However, the utility and application of such models are hampered by the inconsistent use of infectious disease modeling terms between and within disciplines. We sought to standardize the lexicon of infectious disease modeling terms and develop a glossary of terms commonly used in describing models' assumptions, parameters, variables, and outcomes. We combined a comprehensive literature review of relevant terms with an online forum discussion in a virtual community of practice, mod4PH (Modeling for Public Health). Using a convergent discussion process and consensus amongst the members of mod4PH, a glossary of terms was developed as an online resource. We anticipate that the glossary will improve inter- and intradisciplinary communication and will result in a greater uptake and understanding of disease modeling outcomes in health policy decision-making. We highlight the role of the mod4PH community of practice and the methodologies used in this endeavor to link theory, policy, and practice in the public health domain.
On a radiative origin of the Standard Model from trinification
NASA Astrophysics Data System (ADS)
Camargo-Molina, José Eliel; Morais, António P.; Pasechnik, Roman; Wessén, Jonas
2016-09-01
In this work, we present a trinification-based grand unified theory incorporating a global SU(3) family symmetry that after a spontaneous breaking leads to a left-right symmetric model. Already at the classical level, this model can accommodate the matter content and the quark Cabibbo mixing in the Standard Model (SM) with only one Yukawa coupling at the unification scale. Considering the minimal low-energy scenario with the least amount of light states, we show that the resulting effective theory enables dynamical breaking of its gauge group down to that of the SM by means of radiative corrections accounted for by the renormalisation group evolution at one loop. This result paves the way for a consistent explanation of the SM breaking scale and fermion mass hierarchies.
Electroweak precision data and the Lee-Wick standard model
Underwood, Thomas E. J.; Zwicky, Roman
2009-02-01
We investigate the electroweak precision constraints on the recently proposed Lee-Wick standard model at tree level. We analyze low-energy, Z-pole (LEP1/SLC) and LEP2 data separately. We derive the exact tree-level low-energy and Z-pole effective Lagrangians from both the auxiliary field and higher derivative formulations of the theory. For the LEP2 data we use the fact that the Lee-Wick standard model belongs to the class of models that assumes a so-called 'universal' form which can be described by seven oblique parameters at leading order in m_W^2/M_{1,2}^2. At tree level we find that Y = -m_W^2/M_1^2 and W = -m_W^2/M_2^2, where the negative sign is due to the presence of the negative-norm states. All other oblique parameters (S, X) and (T, U, V) are found to be zero. In the addendum we show how our results differ from previous investigations, where contact terms, which are found to be of leading order, have been neglected. The LEP1/SLC constraints are slightly stronger than LEP2 and much stronger than the low-energy ones. The LEP1/SLC results exclude gauge boson masses of M_1 ≈ M_2 ≈ 3 TeV at the 99% confidence level. Somewhat lower masses are possible when one of the masses assumes a large value. Loop corrections to the electroweak observables are suppressed by the standard ~1/(4π)^2 factor and are therefore not expected to change the constraints on M_1 and M_2. This assertion is most transparent from the higher derivative formulation of the theory.
Standard Model parton distributions at very high energies
Bauer, Christian W.; Ferland, Nicolas; Webber, Bryan R.
2017-08-09
We compute the leading-order evolution of parton distribution functions for all the Standard Model fermions and bosons up to energy scales far above the electroweak scale, where electroweak symmetry is restored. Our results include the 52 PDFs of the unpolarized proton, evolving according to the SU(3), SU(2), U(1), mixed SU(2)×U(1) and Yukawa interactions. We illustrate the numerical effects on parton distributions at large energies, and show that this can lead to important corrections to parton luminosities at a future 100 TeV collider.
Dark Matter and Color Octets Beyond the Standard Model
Krnjaic, Gordan Zdenko
2012-07-01
Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues.
What is special about the group of the standard model?
NASA Astrophysics Data System (ADS)
Nielsen, H. B.; Brene, N.
1989-06-01
The standard model is based on the algebra of U(1)×SU(2)×SU(3). The systematics of charges of the fundamental fermions seems to suggest the importance of a particular group having this algebra, viz. S(U(2)×U(3)). This group is distinguished from all other connected compact non-semisimple groups with dimensionality up to 12 by a characteristic property: it is very “skew”. By this we mean that the group has relatively few “generalised outer automorphisms”. One may speculate about physical reasons for this fact.
Electroweak baryogenesis in the exceptional supersymmetric standard model
Chao, Wei, E-mail: chao@physics.umass.edu
2015-08-01
We study electroweak baryogenesis in the E_6 inspired exceptional supersymmetric standard model (E_6SSM). The relaxation coefficients driven by singlinos and the new gaugino, as well as the transport equation of the Higgs supermultiplet number density in the E_6SSM, are calculated. Our numerical simulation shows that both CP-violating source terms from singlinos and the new gaugino can solely give rise to a correct baryon asymmetry of the Universe via the electroweak baryogenesis mechanism.
Big bang nucleosynthesis: The standard model and alternatives
NASA Technical Reports Server (NTRS)
Schramm, David N.
1991-01-01
Big bang nucleosynthesis provides (with the microwave background radiation) one of the two quantitative experimental tests of the big bang cosmological model. This paper reviews the standard homogeneous-isotropic calculation and shows how it fits the light element abundances, ranging from He-4 at 24% by mass through H-2 and He-3 at parts in 10^5 down to Li-7 at parts in 10^10. Furthermore, the recent Large Electron-Positron collider (LEP) (and Stanford Linear Collider (SLC)) results on the number of neutrinos are discussed as a positive laboratory test of the standard scenario. Discussion is presented on the improved observational data as well as the improved neutron lifetime data. Alternate scenarios of decaying matter or of quark-hadron induced inhomogeneities are discussed. It is shown that when these scenarios are made to fit the observed abundances accurately, the resulting conclusions on the baryonic density relative to the critical density, Ω_b, remain approximately the same as in the standard homogeneous case, thus adding to the robustness of the conclusion that Ω_b ≈ 0.06. This latter point is the driving force behind the need for non-baryonic dark matter (assuming Ω_total = 1) and the need for dark baryonic matter, since Ω_visible is less than Ω_b.
How to use the Standard Model effective field theory
Henning, Brian; Lu, Xiaochuan; Murayama, Hitoshi
2016-01-06
Here, we present a practical three-step procedure for using the Standard Model effective field theory (SM EFT) to connect ultraviolet (UV) models of new physics with weak scale precision observables. With this procedure, one can interpret precision measurements as constraints on a given UV model. We give a detailed explanation for calculating the effective action up to one-loop order in a manifestly gauge covariant fashion. This covariant derivative expansion method dramatically simplifies the process of matching a UV model with the SM EFT, and also makes available a universal formalism that is easy to use for a variety of UV models. A few general aspects of RG running effects and choosing operator bases are discussed. Finally, we provide mapping results between the bosonic sector of the SM EFT and a complete set of precision electroweak and Higgs observables to which present and near future experiments are sensitive. Many results and tools which should prove useful to those wishing to use the SM EFT are detailed in several appendices.
Modeling the wet bulb globe temperature using standard meteorological measurements.
Liljegren, James C; Carhart, Richard A; Lawday, Philip; Tschopp, Stephen; Sharp, Robert
2008-10-01
The U.S. Army has a need for continuous, accurate estimates of the wet bulb globe temperature to protect soldiers and civilian workers from heat-related injuries, including those involved in the storage and destruction of aging chemical munitions at depots across the United States. At these depots, workers must don protective clothing that increases their risk of heat-related injury. Because of the difficulty in making continuous, accurate measurements of wet bulb globe temperature outdoors, the authors have developed a model of the wet bulb globe temperature that relies only on standard meteorological data available at each storage depot for input. The model is composed of separate submodels of the natural wet bulb and globe temperatures that are based on fundamental principles of heat and mass transfer, has no site-dependent parameters, and achieves an accuracy of better than 1 degree C based on comparisons with wet bulb globe temperature measurements at all depots.
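The final index combines the submodel outputs with the standard outdoor weights (0.7 for the natural wet bulb, 0.2 for the globe, 0.1 for the dry bulb air temperature). A minimal sketch of that last step, with the natural wet bulb and globe temperatures taken as given rather than derived from meteorological inputs as in the paper:

```python
def wbgt_outdoor(t_air_c, t_nwb_c, t_globe_c):
    """Outdoor wet bulb globe temperature from its components (deg C):
    WBGT = 0.7*T_nwb + 0.2*T_globe + 0.1*T_air (standard outdoor weights).
    The paper's contribution is modelling T_nwb and T_globe from standard
    meteorological measurements; here they are simply inputs."""
    return 0.7 * t_nwb_c + 0.2 * t_globe_c + 0.1 * t_air_c

# Hot, sunny afternoon (component values are made up for illustration):
wbgt = wbgt_outdoor(t_air_c=32.0, t_nwb_c=25.0, t_globe_c=45.0)  # 29.7 deg C
```

The hard part, and the subject of the paper, is the pair of heat- and mass-transfer submodels that supply `t_nwb_c` and `t_globe_c` when only standard meteorological data are available.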
Long-term archiving and data access: modelling and standardization
NASA Technical Reports Server (NTRS)
Hoc, Claude; Levoir, Thierry; Nonon-Latapie, Michel
1996-01-01
This paper reports on the multiple difficulties inherent in the long-term archiving of digital data, and in particular on the different possible causes of definitive data loss. It defines the basic principles which must be respected when creating long-term archives. Such principles concern both the archival systems and the data. The archival systems should have two primary qualities: independence of architecture with respect to technological evolution, and generic-ness, i.e., the capability of ensuring identical service for heterogeneous data. These characteristics are implicit in the Reference Model for Archival Services, currently being designed within an ISO-CCSDS framework. A system prototype has been developed at the French Space Agency (CNES) in conformance with these principles, and its main characteristics will be discussed in this paper. Moreover, the data archived should be capable of abstract representation regardless of the technology used, and should, to the extent that it is possible, be organized, structured and described with the help of existing standards. The immediate advantage of standardization is illustrated by several concrete examples. Both the positive facets and the limitations of this approach are analyzed. The advantages of developing an object-oriented data model within this context are then examined.
Penguin-like diagrams from the standard model
Ping, Chia Swee
2015-04-24
The Standard Model is highly successful in describing the interactions of leptons and quarks. There are, however, rare processes that involve higher order effects in electroweak interactions. One specific class of processes is the penguin-like diagram. This class of diagrams involves the neutral change of quark flavours accompanied by the emission of a gluon (gluon penguin), a photon (photon penguin), a gluon and a photon (gluon-photon penguin), a Z-boson (Z penguin), or a Higgs-boson (Higgs penguin). Such diagrams do not arise at the tree level in the Standard Model. They are, however, induced by one-loop effects. In this paper, we present an exact calculation of the penguin diagram vertices in the 't Hooft-Feynman gauge. Renormalization of the vertex is effected by a prescription by Chia and Chong which gives an expression for the counter term identical to that obtained by employing the Ward-Takahashi identity. The on-shell vertex functions for the penguin diagram vertices are obtained. The various penguin diagram vertex functions are related to one another via the Ward-Takahashi identity. From these, a set of relations is obtained connecting the vertex form factors of various penguin diagrams. Explicit expressions for the gluon-photon penguin vertex form factors are obtained, and their contributions to the flavour changing processes estimated.
40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...-1—Class I and II Motorcycle Emission Standards Model year Emission standards(g/km) HC CO 2006 and... the following table: Table E2006-2—Class III Motorcycle Emission Standards Tier Model year Emission...
40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.
Code of Federal Regulations, 2011 CFR
2011-07-01
...-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...-1—Class I and II Motorcycle Emission Standards Model year Emission standards(g/km) HC CO 2006 and... the following table: Table E2006-2—Class III Motorcycle Emission Standards Tier Model year Emission...
40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.
Code of Federal Regulations, 2012 CFR
2012-07-01
...-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...-1—Class I and II Motorcycle Emission Standards Model year Emission standards(g/km) HC CO 2006 and... the following table: Table E2006-2—Class III Motorcycle Emission Standards Tier Model year Emission...
40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.
Code of Federal Regulations, 2014 CFR
2014-07-01
...-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...-1—Class I and II Motorcycle Emission Standards Model year Emission standards(g/km) HC CO 2006 and... the following table: Table E2006-2—Class III Motorcycle Emission Standards Tier Model year Emission...
40 CFR 86.410-2006 - Emission standards for 2006 and later model year motorcycles.
Code of Federal Regulations, 2013 CFR
2013-07-01
...-2006 Emission standards for 2006 and later model year motorcycles. (a)(1) Exhaust emissions from Class...-1—Class I and II Motorcycle Emission Standards Model year Emission standards(g/km) HC CO 2006 and... the following table: Table E2006-2—Class III Motorcycle Emission Standards Tier Model year Emission...
Experimental validation of Swy-2 clay standard's PHREEQC model
NASA Astrophysics Data System (ADS)
Szabó, Zsuzsanna; Hegyfalvi, Csaba; Freiler, Ágnes; Udvardi, Beatrix; Kónya, Péter; Székely, Edit; Falus, György
2017-04-01
One of the challenges of the present century is to limit greenhouse gas emissions to mitigate climate change, which is possible, for example, through a transitional technology, CCS (Carbon Capture and Storage), and, among others, by increasing the nuclear proportion in the energy mix. Clay minerals are considered to be responsible for the low permeability and sealing capacity of caprocks sealing off stored CO2, and they are also the main constituents of bentonite in high-level radioactive waste disposal facilities. The understanding of clay behaviour in these deep geological environments is possible through laboratory batch experiments of well-known standards and coupled geochemical models. Such experimentally validated models are scarce, even though they allow deriving more precise long-term predictions of mineral reactions and rock and bentonite degradation underground, therefore ensuring the safety of the above technologies and increasing their public acceptance. This ongoing work aims to create a kinetic geochemical model of the Na-montmorillonite standard Swy-2 in the widely used PHREEQC code, supported by solution and mineral composition results from batch experiments. Several four-day experiments have been carried out at a 1:35 rock-to-water ratio at atmospheric conditions, and with inert and CO2 supercritical phase at 100 bar and 80 °C, relevant for the potential Hungarian CO2 reservoir complex. Solution samples have been taken during and after experiments and their compositions were measured by ICP-OES. The treated solid phase has been analysed by XRD and ATR-FTIR and compared to in-parallel measured references (dried Swy-2). Kinetic geochemical modelling of the experimental conditions has been performed by PHREEQC version 3 using equations and kinetic rate parameters from the USGS report of Palandri and Kharaka (2004). The visualization of experimental and numerous modelling results has been automatized in R. Experiments and models show very fast
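The Palandri and Kharaka (2004) report cited above parameterizes mineral dissolution with an Arrhenius-type rate law per reaction mechanism. A sketch of one such mechanism follows; the numerical parameter values used in the example are placeholders for illustration, not the report's fitted values for montmorillonite.

```python
import math

R_GAS = 8.314462618  # J/(mol*K), molar gas constant

def mechanism_rate(k25, e_a, temp_k, a_h=1.0, n=0.0, omega=0.0, p=1.0, q=1.0):
    """One mechanism of the Palandri-Kharaka rate law (mol / m^2 / s):
        r = k25 * exp(-Ea/R * (1/T - 1/298.15)) * a_H+^n * (1 - Omega^p)^q
    k25: rate constant at 25 degC; e_a: activation energy (J/mol);
    a_h: H+ activity; omega: saturation state (omega = 1 gives r = 0,
    i.e. equilibrium); p, q: empirical exponents."""
    arrhenius = math.exp(-e_a / R_GAS * (1.0 / temp_k - 1.0 / 298.15))
    return k25 * arrhenius * (a_h ** n) * (1.0 - omega ** p) ** q

# Far from equilibrium at the experimental 80 degC (placeholder k25, Ea):
r80 = mechanism_rate(k25=1e-13, e_a=48000.0, temp_k=353.15)
```

PHREEQC sums such terms over the acid, neutral and base mechanisms and multiplies by reactive surface area; this sketch shows only the temperature and saturation dependence of a single term.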
Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation
NASA Astrophysics Data System (ADS)
Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.
2006-12-01
An increased number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency management. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate the evolution of the tsunami wave from the deep ocean to its target site, numerically. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have been validated themselves with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes
Subsonic roll oscillation experiments on the Standard Dynamics Model
NASA Technical Reports Server (NTRS)
Beyers, M. E.
1983-01-01
The experimental determination of the subsonic roll derivatives of the Standard Dynamics Model, which is representative of a current fighter aircraft configuration, is described. The direct, cross and cross-coupling derivatives are presented for angles of attack up to 41 deg and sideslip angles in the range from -5 deg to 5 deg, as functions of oscillation frequency. The derivatives exhibited significant nonlinear trends at high incidences and were found to be extremely sensitive to sideslip angle at angles of attack near 36 deg. The roll damping and dynamic cross derivatives were highly frequency dependent at angles of attack above 30 deg. The highest values measured for the dynamic cross and cross-coupling derivatives were comparable in magnitude with the maximum roll damping. The effects of oscillation amplitude and Mach number were also investigated, and the direct derivatives were correlated with data from another facility.
Sakurai Prize: Beyond the Standard Model Higgs Boson
NASA Astrophysics Data System (ADS)
Haber, Howard
2017-01-01
The discovery of the Higgs boson strongly suggests that the first elementary spin 0 particle has been observed. Is the Higgs boson a solo act, or are there additional Higgs bosons to be discovered? Given that there are three generations of fundamental fermions, one might also expect the sector of fundamental scalars of nature to be non-minimal. However, there are already strong constraints on the possible structure of an extended Higgs sector. In this talk, I review the theoretical motivations that have been put forward for an extended Higgs sector and discuss its implications in light of the observation that the properties of the observed Higgs boson are close to those predicted by the Standard Model. supported in part by the U.S. Department of Energy Grant Number DE-SC0010107.
Error modelling of quantum Hall array resistance standards
NASA Astrophysics Data System (ADS)
Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa
2018-04-01
Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques were presented to efficiently design QHARS networks. An open problem is that of the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, which is based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, this method of analysis is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
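A crude version of the Monte Carlo step can be sketched by perturbing each series element with a random wire/contact resistance and examining the relative deviation from the nominal value. This ignores the multiple-connection techniques that real QHARSs use to suppress contact-resistance errors, and all magnitudes below are assumed for illustration, not taken from the NMIJ device.

```python
import random

R_K = 25812.807          # ohm, conventional von Klitzing constant
R_H = R_K / 2.0          # quantized Hall resistance on the i = 2 plateau

def array_resistance(n_series, rng, r_wire_mean=1e-3, r_wire_sd=2e-4):
    """Resistance of n_series Hall elements in series, each perturbed by
    a random wire/contact resistance (illustrative magnitudes, assumed)."""
    return sum(R_H + rng.gauss(r_wire_mean, r_wire_sd)
               for _ in range(n_series))

def monte_carlo_relative_error(n_series, trials=2000, seed=1):
    """Mean relative deviation of the array from its nominal n_series*R_H."""
    rng = random.Random(seed)
    nominal = n_series * R_H
    devs = [(array_resistance(n_series, rng) - nominal) / nominal
            for _ in range(trials)]
    return sum(devs) / len(devs)

# With milliohm-level wires, a plain series chain is biased high, at
# roughly the 1e-7 relative level for this element count.
bias = monte_carlo_relative_error(78)
```

The paper's systematic procedure replaces this naive series chain with a full circuit-analysis model of the interconnection network before applying the Monte Carlo evaluation.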
Consistent use of the standard model effective potential.
Andreassen, Anders; Frost, William; Schwartz, Matthew D
2014-12-12
The stability of the standard model is determined by the true minimum of the effective Higgs potential. We show that the potential at its minimum, when computed by the traditional method, is strongly dependent on the gauge parameter. It moreover depends on the scale at which the potential is calculated. We provide a consistent method for determining absolute stability, independent of both gauge and calculation scale, order by order in perturbation theory. This leads to revised stability bounds m_h^pole > (129.4 ± 2.3) GeV and m_t^pole < (171.2 ± 0.3) GeV. We also show how to evaluate the effect of new physics on the stability bound without resorting to unphysical field values.
Image contrast enhancement based on a local standard deviation model
Chang, Dah-Chung; Wu, Wen-Rong
1996-12-31
The adaptive contrast enhancement (ACE) algorithm is a widely used image enhancement method, which needs a contrast gain to adjust high frequency components of an image. In the literature, the gain is usually either inversely proportional to the local standard deviation (LSD) or constant. Both choices cause problems in practical applications: noise overenhancement and ringing artifacts. In this paper a new gain is developed based on Hunt's Gaussian image model to prevent these two defects. The new gain is a nonlinear function of the LSD and has the desired characteristic of emphasizing the LSD regions in which details are concentrated. We have applied the new ACE algorithm to chest x-ray images, and simulations show the effectiveness of the proposed algorithm.
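The gain structure described above can be sketched in NumPy. The specific nonlinear gain function below is an illustrative assumption (a bounded, increasing function of the LSD), not the authors' exact formula.

```python
import numpy as np

def local_stats(img, win=7):
    # Local mean and standard deviation over a square window
    # (simple O(N * win^2) sliding window; fine for a sketch).
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode='reflect')
    h, w = img.shape
    mean = np.empty((h, w))
    std = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            block = padded[i:i + win, j:j + win]
            mean[i, j] = block.mean()
            std[i, j] = block.std()
    return mean, std

def ace_enhance(img, gain_fn, win=7):
    # ACE: output = local mean + gain * (image - local mean),
    # i.e. the gain amplifies the high-frequency residual.
    mean, std = local_stats(img, win)
    return mean + gain_fn(std) * (img - mean)

# Constant gain (prone to ringing and noise overenhancement) versus a
# hedged nonlinear gain that grows with LSD and saturates, emphasizing
# detail-rich regions. Both are assumptions for illustration.
constant_gain = lambda s: 3.0
nonlinear_gain = lambda s: 1.0 + 2.0 * (s / (s + 10.0))
```

Swapping `gain_fn` makes the trade-off the abstract describes directly visible on any test image.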
Standard model CP violation and cold electroweak baryogenesis
Tranberg, Anders
2011-10-15
Using large-scale real-time lattice simulations, we calculate the baryon asymmetry generated at a fast, cold electroweak symmetry breaking transition. CP-violation is provided by the leading effective bosonic term resulting from integrating out the fermions in the Minimal Standard Model at zero temperature and performing a covariant gradient expansion [A. Hernandez, T. Konstandin, and M. G. Schmidt, Nucl. Phys. B812, 290 (2009)]. This is an extension of the work presented in [A. Tranberg, A. Hernandez, T. Konstandin, and M. G. Schmidt, Phys. Lett. B 690, 207 (2010)]. The numerical implementation is described in detail, and we address issues specifically related to using this CP-violating term in the context of cold electroweak baryogenesis.
Standard Model in multiscale theories and observational constraints
NASA Astrophysics Data System (ADS)
Calcagni, Gianluca; Nardelli, Giuseppe; Rodríguez-Fernández, David
2016-08-01
We construct and analyze the Standard Model of electroweak and strong interactions in multiscale spacetimes with (i) weighted derivatives and (ii) q-derivatives. Both theories can be formulated in two different frames, called the fractional and integer picture. By definition, the fractional picture is where physical predictions should be made. (i) In the theory with weighted derivatives, it is shown that gauge invariance and the requirement of having constant masses in all reference frames make the Standard Model in the integer picture indistinguishable from the ordinary one. Experiments involving only weak and strong forces are insensitive to a change of spacetime dimensionality also in the fractional picture, and only the electromagnetic and gravitational sectors can break the degeneracy. For the simplest multiscale measures with only one characteristic time, length and energy scale t*, ℓ* and E*, we compute the Lamb shift in the hydrogen atom and constrain the multiscale correction to the ordinary result, getting the absolute upper bound t* < 10^-23 s. For the natural choice α_0 = 1/2 of the fractional exponent in the measure, this bound is strengthened to t* < 10^-29 s, corresponding to ℓ* < 10^-20 m and E* > 28 TeV. Stronger bounds are obtained from the measurement of the fine-structure constant. (ii) In the theory with q-derivatives, considering the muon decay rate and the Lamb shift in light atoms, we obtain the independent absolute upper bounds t* < 10^-13 s and E* > 35 MeV. For α_0 = 1/2, the Lamb shift alone yields t* < 10^-27 s, ℓ* < 10^-19 m and E* > 450 GeV.
Probing the exotic particle content beyond the standard model
NASA Astrophysics Data System (ADS)
Ma, E.; Raidal, M.; Sarkar, U.
1999-04-01
We explore the possible exotic particle content beyond the standard model by examining all its scalar bilinear combinations. We categorize these exotic scalar fields and show that, without the suppression of (A) their Yukawa couplings with the known quarks and leptons and (B) the trilinear couplings among themselves, most are already constrained to be very heavy by the nonobservation of proton decay and neutron-antineutron oscillations, the smallness of K^0-K̄^0, D^0-D̄^0 and B_d^0-B̄_d^0 mixing, as well as the requirement of a nonzero baryon asymmetry of the universe. On the other hand, assumption (B) may be naturally violated in many models, especially in supersymmetry; hence certain exotic scalars are allowed to be below a few TeV in mass and would be easily detectable at planned future hadron colliders. In particular, large cross sections for distinctive processes like p̄p → tt, t̄c and pp → tt, bb would be expected at the Fermilab Tevatron and CERN LHC, respectively.
Phenomenological Consequences of the Constrained Exceptional Supersymmetric Standard Model
Athron, Peter; King, S. F.; Miller, D. J.
2010-02-10
The Exceptional Supersymmetric Standard Model (E_6SSM) provides a low energy alternative to the MSSM, with an extra gauged U(1)_N symmetry solving the mu-problem of the MSSM. Inspired by the possible embedding into an E_6 GUT, the matter content fills three generations of E_6 multiplets, thus predicting exciting exotic matter such as diquarks or leptoquarks. We present predictions from a constrained version of the model (cE_6SSM), with a universal scalar mass m_0, trilinear mass A and gaugino mass M_1/2. We reveal a large volume of the cE_6SSM parameter space where the correct breakdown of the gauge symmetry is achieved and all experimental constraints are satisfied. We predict a hierarchical particle spectrum with heavy scalars and light gauginos, while the new exotic matter can be light or heavy depending on parameters. We present representative cE_6SSM scenarios, demonstrating that there could be light exotic particles, like leptoquarks and a U(1)_N Z' boson, with spectacular signals at the LHC.
Models of Standards Implementation: Implications for the Classroom.
ERIC Educational Resources Information Center
Marzano, Robert J.
The various ways that standards and standards-based education are being addressed around the United States are described. The education community can trace the start of the modern standards movement to the publication of "A Nation at Risk" in 1983 by the National Commission on Excellence in Education. The first education summit in 1987 then became a…
Setting, Evaluating, and Maintaining Certification Standards with the Rasch Model.
ERIC Educational Resources Information Center
Grosse, Martin E.; Wright, Benjamin D.
1986-01-01
Based on the standard setting procedures of the American Board of Preventive Medicine for their Core Test, this article describes how Rasch measurement can facilitate using test content judgments in setting a standard. Rasch measurement can then be used to evaluate and improve the precision of the standard and to hold it constant across time.…
The Risk GP Model: the standard model of prediction in medicine.
Fuller, Jonathan; Flores, Luis J
2015-12-01
With the ascent of modern epidemiology in the twentieth century came a new standard model of prediction in public health and clinical medicine. In this article, we describe the structure of the model. The standard model uses epidemiological measures (most commonly, risk measures) to predict outcomes (prognosis) and effect sizes (treatment) in a patient population that can then be transformed into probabilities for individual patients. In the first step, a risk measure in a study population is generalized or extrapolated to a target population. In the second step, the risk measure is particularized or transformed to yield probabilistic information relevant to a patient from the target population. Hence, we call the approach the Risk Generalization-Particularization (Risk GP) Model. There are serious problems at both stages, especially with the extent to which the required assumptions will hold and the extent to which we have evidence for the assumptions. Given that there are other models of prediction that use different assumptions, we should not inflexibly commit ourselves to one standard model. Instead, model pluralism should be standard in medical prediction. Copyright © 2015 Elsevier Ltd. All rights reserved.
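The two-step Risk GP structure can be made concrete with a toy calculation: first generalize a study risk measure to a target population, then particularize it via a patient's baseline risk. All numbers and the simple risk-ratio arithmetic below are invented for illustration; they are not from the article.

```python
def generalize(study_risk_ratio, transportable=True):
    # Step 1 (generalization): extrapolate the study risk ratio to the
    # target population. Here we simply assume transportability holds;
    # that assumption is exactly what the authors flag as problematic.
    if not transportable:
        raise ValueError("risk measure may not transport to this population")
    return study_risk_ratio

def particularize(risk_ratio, baseline_risk):
    # Step 2 (particularization): turn the population-level measure into
    # an individual probability, assuming the patient is exchangeable
    # with members of the target population.
    return min(1.0, risk_ratio * baseline_risk)

rr = generalize(0.8)          # e.g., a treatment that reduces risk by 20%
p = particularize(rr, 0.10)   # patient's assumed 10% baseline risk
# p ≈ 0.08: the predicted individual probability under the stated assumptions
```

Each `assume`-flavored step is where the article locates the serious problems: whether the assumption holds, and whether there is evidence for it.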
Dark matter and color octets beyond the Standard Model
NASA Astrophysics Data System (ADS)
Krnjaic, Gordan Z.
Although the Standard Model (SM) of particles and interactions has survived forty years of experimental tests, it does not provide a complete description of nature. From cosmological and astrophysical observations, it is now clear that the majority of matter in the universe is not baryonic and interacts very weakly (if at all) via non-gravitational forces. The SM does not provide a dark matter candidate, so new particles must be introduced. Furthermore, recent Tevatron results suggest that SM predictions for benchmark collider observables are in tension with experimental observations. In this thesis, we will propose extensions to the SM that address each of these issues. Although there is abundant indirect evidence for the existence of dark matter, terrestrial efforts to observe its interactions have yielded conflicting results. We address this situation with a simple model of dark matter that features hydrogen-like bound states that scatter off SM nuclei by undergoing inelastic hyperfine transitions. We explore the available parameter space that results from demanding that DM self-interactions satisfy experimental bounds and ameliorate the tension between positive and null signals at the DAMA and CDMS experiments respectively. However, this simple model does not explain the cosmological abundance of dark matter and also encounters a Landau pole at a low energy scale. We, therefore, extend the field content and gauge group of the dark sector to resolve these issues with a renormalizable UV completion. We also explore the galactic dynamics of unbound dark matter and find that "dark ions" settle into a diffuse isothermal halo that differs from that of the bound states. This suppresses the local dark-ion density and expands the model's viable parameter space. We also consider the > 3σ excess in W plus dijet events recently observed at the Tevatron collider. We show that decays of a color-octet, electroweak-triplet scalar particle ("octo-triplet") can yield the
The pion: an enigma within the Standard Model
NASA Astrophysics Data System (ADS)
Horn, Tanja; Roberts, Craig D.
2016-07-01
Quantum chromodynamics (QCD) is the strongly interacting part of the Standard Model. It is supposed to describe all of nuclear physics; and yet, almost 50 years after the discovery of gluons and quarks, we are only just beginning to understand how QCD builds the basic bricks for nuclei: neutrons and protons, and the pions that bind them together. QCD is characterised by two emergent phenomena: confinement and dynamical chiral symmetry breaking (DCSB). They have far-reaching consequences, expressed with great force in the character of the pion; and pion properties, in turn, suggest that confinement and DCSB are intimately connected. Indeed, since the pion is both a Nambu-Goldstone boson and a quark-antiquark bound-state, it holds a unique position in nature and, consequently, developing an understanding of its properties is critical to revealing some very basic features of the Standard Model. We describe experimental progress toward meeting this challenge that has been made using electromagnetic probes, highlighting both dramatic improvements in the precision of charged-pion form factor data that have been achieved in the past decade and new results on the neutral-pion transition form factor, both of which challenge existing notions of pion structure. We also provide a theoretical context for these empirical advances, which begins with an explanation of how DCSB works to guarantee that the pion is unnaturally light; but also, nevertheless, ensures that the pion is the best object to study in order to reveal the mechanisms that generate nearly all the mass of hadrons. In canvassing advances in these areas, our discussion unifies many aspects of pion structure and interactions, connecting the charged-pion elastic form factor, the neutral-pion transition form factor and the pion's leading-twist parton distribution amplitude. It also sketches novel ways in which experimental and theoretical studies of the charged-kaon electromagnetic form factor can provide
29 CFR 1990.151 - Model standard pursuant to section 6(b) of the Act.
Code of Federal Regulations, 2011 CFR
2011-07-01
29 Labor 9 (2011-07-01). Occupational Carcinogens, Model Standards, § 1990.151: Model standard pursuant to section 6(b) of the Act… an action level as a limitation on requirements for periodic monitoring (para. (e)(3)), medical…
Custodial isospin violation in the Lee-Wick standard model
Chivukula, R. Sekhar; Farzinnia, Arsham; Foadi, Roshan
2010-05-01
We analyze the tension between naturalness and isospin violation in the Lee-Wick standard model (LW SM) by computing tree-level and fermionic one-loop contributions to the post-LEP electroweak parameters (Ŝ, T̂, W, and Y) and the Z b_L b̄_L coupling. The model is most natural when the LW partners of the gauge bosons and fermions are light, but small partner masses can lead to large isospin violation. The post-LEP parameters yield a simple picture in the LW SM: the gauge sector contributes to Y and W only, with leading contributions arising at tree level, while the fermion sector contributes to Ŝ and T̂ only, with leading corrections arising at one loop. Hence, W and Y constrain the masses of the LW gauge bosons to satisfy M_1, M_2 ≳ 2.4 TeV at 95% C.L. Likewise, experimental limits on T̂ reveal that the masses of the LW fermions must satisfy M_q, M_t ≳ 1.6 TeV at 95% C.L. if the Higgs mass is light, and tend to exclude the LW SM for any LW fermion masses if the Higgs mass is heavy. Contributions from the top-quark sector to the Z b_L b̄_L coupling can be even more stringent, placing a lower bound of 4 TeV on the LW fermion masses at 95% C.L.
Is the standard model saved asymptotically by conformal symmetry?
NASA Astrophysics Data System (ADS)
Gorsky, A.; Mironov, A.; Morozov, A.; Tomaras, T. N.
2015-03-01
It is pointed out that the top-quark and Higgs masses and the Higgs VEV satisfy, with great accuracy, the relations 4m_H^2 = 2m_t^2 = v^2, which are very special and reminiscent of analogous ones at Argyres-Douglas points with enhanced conformal symmetry. Furthermore, the RG evolution of the corresponding Higgs self-interaction and Yukawa couplings λ(0) = 1/8 and y(0) = 1 leads to the free-field stable point in the pure scalar sector at the Planck scale, also suggesting enhanced conformal symmetry. Thus, it is conceivable that the Standard Model is the low-energy limit of a distinct special theory with (super?)conformal symmetry at the Planck scale. In the context of such a "scenario," one may further speculate that the Higgs particle is the Goldstone boson of (partly) spontaneously broken conformal symmetry. This would simultaneously resolve the hierarchy and Landau pole problems in the scalar sector and would provide a nearly flat potential with two almost degenerate minima at the electroweak and Planck scales.
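The quoted relations are easy to check numerically against approximate measured values, m_H ≈ 125.25 GeV, m_t ≈ 172.69 GeV and v ≈ 246.22 GeV (rounded central values, used here only to illustrate the level of accuracy the abstract claims):

```python
# Approximate measured values in GeV (illustrative, rounded).
m_H, m_t, v = 125.25, 172.69, 246.22

lhs_H = 4 * m_H**2   # should be near v^2 if 4 m_H^2 = v^2
lhs_t = 2 * m_t**2   # should be near v^2 if 2 m_t^2 = v^2
v2 = v**2

rel_err_H = abs(lhs_H - v2) / v2   # ~3.5%
rel_err_t = abs(lhs_t - v2) / v2   # ~1.6%
# Both relations hold at the few-percent level, which is the
# "great accuracy" the abstract refers to.
```

Equivalently, since m_H^2 = 2λv^2 and m_t = y v/√2 in the Standard Model, these relations are the λ = 1/8 and y = 1 boundary conditions quoted in the abstract.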
On push-forward representations in the standard gyrokinetic model
Miyato, N., E-mail: miyato.naoaki@jaea.go.jp; Yagi, M.; Scott, B. D.
2015-01-15
Two representations of fluid moments in terms of a gyro-center distribution function and gyro-center coordinates, which are called push-forward representations, are compared in the standard electrostatic gyrokinetic model. In the representation conventionally used to derive the gyrokinetic Poisson equation, the pull-back transformation of the gyro-center distribution function contains effects of the gyro-center transformation and therefore electrostatic potential fluctuations, which is described by the Poisson brackets between the distribution function and scalar functions generating the gyro-center transformation. Usually, only the lowest order solution of the generating function at first order is considered to explicitly derive the gyrokinetic Poisson equation. This is true in explicitly deriving representations of scalar fluid moments with polarization terms. One also recovers the particle diamagnetic flux at this order because it is associated with the guiding-center transformation. However, higher-order solutions are needed to derive finite Larmor radius terms of particle flux, including the polarization drift flux, from the conventional representation. On the other hand, the lowest order solution is sufficient for the other representation, in which the gyro-center transformation part is combined with the guiding-center one and the pull-back transformation of the distribution function does not appear.
Affine group formulation of the Standard Model coupled to gravity
Chou, Ching-Yi, E-mail: l2897107@mail.ncku.edu.tw; Ita, Eyo, E-mail: ita@usna.edu; Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw
In this work we apply the affine group formalism for four dimensional gravity of Lorentzian signature, which is based on Klauder's affine algebraic program, to the formulation of the Hamiltonian constraint of the interaction of matter and all forces, including gravity with non-vanishing cosmological constant Λ, as an affine Lie algebra. We use the hermitian action of fermions coupled to gravitation and Yang–Mills theory to find the density weight one fermionic super-Hamiltonian constraint. This term, combined with the Yang–Mills and Higgs energy densities, is composed with York's integrated time functional. The result, when combined with the imaginary part of the Chern–Simons functional Q, forms the affine commutation relation with the volume element V(x). Affine algebraic quantization of gravitation and matter on equal footing implies a fundamental uncertainty relation which is predicated upon a non-vanishing cosmological constant.
Highlights:
• Wheeler–DeWitt equation (WDW) quantized as affine algebra, realizing Klauder's program.
• WDW formulated for interaction of matter and all forces, including gravity, as affine algebra.
• WDW features Hermitian generators in spite of fermionic content: Standard Model addressed.
• Constructed a family of physical states for the full, coupled theory via affine coherent states.
• Fundamental uncertainty relation, predicated on non-vanishing cosmological constant.
Passive Membrane Permeability: Beyond the Standard Solubility-Diffusion Model.
Parisio, Giulia; Stocchero, Matteo; Ferrarini, Alberta
2013-12-10
The spontaneous diffusion of solutes through lipid bilayers is still a challenge for theoretical predictions. Since permeation processes remain beyond the capabilities of unbiased molecular dynamics simulations, an alternative strategy is currently adopted to gain insight into their mechanism and time scale. This is based on a monodimensional description of the translocation process only in terms of the position of the solute along the normal to the lipid bilayer, which is formally expressed in the solubility-diffusion model. Actually, a role of orientational and conformational motions has been pointed out, and the use of advanced simulation techniques has been proposed to take into account their effect. Here, we discuss the limitations of the standard solubility-diffusion approach and propose a more general description of membrane translocation as a diffusion process on a free energy surface, which is a function of the translational and rotational degrees of freedom of the molecule. Simple expressions for the permeability coefficient are obtained under suitable conditions. For fast solute reorientation, the classical solubility-diffusion equation is recovered. Under the assumption that well-defined minima can be identified on the free energy landscape, a mechanistic interpretation of the permeability coefficient in terms of all possible permeation paths is given.
Implications of Higgs’ universality for physics beyond the Standard Model
NASA Astrophysics Data System (ADS)
Goldman, T.; Stephenson, G. J.
2017-06-01
We emulate Cabibbo by assuming a kind of universality for fermion mass terms in the Standard Model. We show that this is consistent with all current data and with the concept that deviations from what we term Higgs’ universality are due to corrections from currently unknown physics of nonetheless conventional form. The application to quarks is straightforward, while the application to leptons makes use of the recognition that Dark Matter can provide the “sterile” neutrinos needed for the seesaw mechanism. Requiring agreement with neutrino oscillation results leads to the prediction that the mass eigenstates of the sterile neutrinos are separated by quadratically larger ratios than for the charged fermions. Using consistency with the global fit to LSND-like, short-baseline oscillations to determine the scale of the lowest mass sterile neutrino strongly suggests that the recently observed astrophysical 3.55 keV γ-ray line is also consistent with the mass expected for the second most massive sterile neutrino in our analysis.
Using polarized positrons to probe physics beyond the standard model
NASA Astrophysics Data System (ADS)
Furletova, Yulia; Mantry, Sonny
2018-05-01
A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. A comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C_3u - C_3d and would complement the proposed plan for a precision extraction of the combination 2C_2u - C_2d at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including Leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe to test the chiral structure of the electroweak interactions. A polarized positron can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.
Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.
The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. This Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed operation implementation. In fact, active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are included in the paper.
Judgmental Standard Setting Using a Cognitive Components Model.
ERIC Educational Resources Information Center
McGinty, Dixie; Neel, John H.
A new standard setting approach is introduced, called the cognitive components approach. Like the Angoff method, the cognitive components method generates minimum pass levels (MPLs) for each item. In both approaches, the item MPLs are summed for each judge, then averaged across judges to yield the standard. In the cognitive components approach,…
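The aggregation step shared by the Angoff and cognitive components methods, as described above (item minimum pass levels summed within each judge, then averaged across judges), can be sketched directly. The ratings below are invented for illustration.

```python
def standard_from_mpls(mpls_by_judge):
    # mpls_by_judge: one inner list of item MPLs per judge.
    # Sum the item MPLs within each judge, then average the judge
    # totals across judges to yield the standard (cut score).
    judge_totals = [sum(item_mpls) for item_mpls in mpls_by_judge]
    return sum(judge_totals) / len(judge_totals)

# Three hypothetical judges rating a four-item test.
judges = [
    [0.6, 0.7, 0.5, 0.8],   # judge totals: 2.6
    [0.5, 0.6, 0.6, 0.7],   #               2.4
    [0.7, 0.8, 0.4, 0.9],   #               2.8
]
cut_score = standard_from_mpls(judges)  # (2.6 + 2.4 + 2.8) / 3 = 2.6
```

Where the two methods differ is upstream of this calculation: how each item MPL is generated in the first place.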
Wisconsin's Model Academic Standards for Marketing Education. Bulletin No. 9005.
ERIC Educational Resources Information Center
Wisconsin State Dept. of Public Instruction, Madison.
This document contains standards for the academic content of the Wisconsin K-12 curriculum in the area of marketing education. Developed by task forces of educators, parents, board of education members, and employers and employees, the standards cover content, performance, and proficiency areas. The first part of the guide is an introduction that…
A Visual Model for the Variance and Standard Deviation
ERIC Educational Resources Information Center
Orris, J. B.
2011-01-01
This paper shows how the variance and standard deviation can be represented graphically by looking at each squared deviation as a graphical object--in particular, as a square. A series of displays show how the standard deviation is the size of the average square.
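The "average square" reading of the variance is easy to verify numerically: each squared deviation is the area of a square whose side is the deviation from the mean, the variance is the mean of those areas, and the standard deviation is the side length of the average square. The data set below is an invented example.

```python
def variance_as_average_square(data):
    # Population variance viewed geometrically: the average area of the
    # squares built on each deviation from the mean.
    mean = sum(data) / len(data)
    square_areas = [(x - mean) ** 2 for x in data]
    return sum(square_areas) / len(square_areas)

data = [2, 4, 4, 4, 5, 5, 7, 9]          # mean = 5
var = variance_as_average_square(data)   # average square area = 4.0
std = var ** 0.5                         # side of the average square = 2.0
```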
The pion: an enigma within the Standard Model
Horn, Tanja; Roberts, Craig D.
2016-05-27
Almost 50 years after the discovery of gluons and quarks, we are only just beginning to understand how QCD builds the basic bricks for nuclei: neutrons, protons, and the pions that bind them. QCD is characterised by two emergent phenomena: confinement and dynamical chiral symmetry breaking (DCSB). They are expressed with great force in the character of the pion. In turn, pion properties suggest that confinement and DCSB are closely connected. As both a Nambu-Goldstone boson and a quark-antiquark bound-state, the pion is unique in Nature. Developing an understanding of its properties is thus critical to revealing basic features of the Standard Model. We describe experimental progress in this direction, made using electromagnetic probes, highlighting both improvements in the precision of charged-pion form factor data, achieved in the past decade, and new results on the neutral-pion transition form factor. Both challenge existing notions of pion structure. We also provide a theoretical context for these empirical advances, first explaining how DCSB works to guarantee that the pion is unnaturally light but also, nevertheless, ensures the pion is key to revealing the mechanisms that generate nearly all the mass of hadrons. Our discussion unifies the charged-pion elastic and neutral-pion transition form factors, and the pion's twist-2 parton distribution amplitude. It also indicates how studies of the charged-kaon form factor can provide significant contributions. Importantly, recent predictions for the large-$Q^2$ behaviour of the pion form factor can be tested by experiments planned at JLab 12. Those experiments will extend precise charged-pion form factor data to momenta that can potentially serve in validating factorisation theorems in QCD, exposing the transition between the nonperturbative and perturbative domains, and thereby reaching a goal that has long driven hadro-particle physics.
Using polarized positrons to probe physics beyond the standard model
Furletova, Yulia; Mantry, Sonny
A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. Here, a comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C_3u - C_3d and would complement the proposed plan for a precision extraction of the combination 2C_2u - C_2d at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including Leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe to test the chiral structure of the electroweak interactions. A polarized positron can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.
Fourth standard model family neutrino at future linear colliders
Ciftci, A.K.; Ciftci, R.; Sultansoy, S.
2005-09-01
It is known that flavor democracy favors the existence of the fourth standard model (SM) family. In order to give nonzero masses to the first three family fermions, flavor democracy has to be slightly broken. A parametrization for democracy breaking, which gives the correct values for fundamental fermion masses and, at the same time, predicts quark and lepton Cabibbo-Kobayashi-Maskawa (CKM) matrices in good agreement with the experimental data, is proposed. The pair production of the fourth SM family Dirac (ν_4) and Majorana (N_1) neutrinos at future linear colliders with √s = 500 GeV, 1 TeV, and 3 TeV is considered. The cross section for the process e+e- → ν_4ν_4 (N_1N_1) and the branching ratios for possible decay modes of both neutrinos are determined. The decays of the fourth family neutrinos into muon channels (ν_4(N_1) → μ±W±) provide the cleanest signature at e+e- colliders; meanwhile, in our parametrization this channel is dominant. W bosons produced in decays of the fourth family neutrinos will be seen in the detector as either dijets or isolated leptons. As an example, we consider the production of 200 GeV mass fourth family neutrinos at √s = 500 GeV linear colliders, taking into account dimuon plus four-jet events as signatures.
Using polarized positrons to probe physics beyond the standard model
Furletova, Yulia; Mantry, Sonny
2018-05-25
A high intensity polarized positron beam, as part of the JLAB 12 GeV program and the proposed electron-ion collider (EIC), can provide a unique opportunity for testing the Standard Model (SM) and probing for new physics. The combination of high luminosity with polarized electrons and positrons incident on protons and deuterons can isolate important effects and distinguish between possible new physics scenarios in a manner that will complement current experimental efforts. Here, a comparison of cross sections between polarized electron and positron beams will allow for an extraction of the poorly known weak neutral current coupling combination 2C_3u − C_3d and would complement the proposed plan for a precision extraction of the combination 2C_2u − C_2d at the EIC. Precision measurements of these neutral weak couplings would constrain new physics scenarios including leptoquarks, R-parity violating supersymmetry, and electron and quark compositeness. The dependence of the charged current cross section on the longitudinal polarization of the positron beam will provide an independent probe of the chiral structure of the electroweak interactions. A polarized positron beam can probe charged lepton flavor violation (CLFV) through a search for e+ → τ+ transitions in a manner that is independent of and complementary to the proposed e- → τ- search at the EIC. A positron beam incident on an electron in a stationary nuclear target will also allow for a dark-photon (A') search via the annihilation process e+ + e- → A' + γ.
Park, Yu Rang; Yoon, Young Jo; Jang, Tae Hun; Seo, Hwa Jeong; Kim, Ju Han
2014-01-01
Extension of the standard model while retaining compliance with it is a challenging issue because there is currently no method for semantically or syntactically verifying an extended data model. A metadata-based extended model, named CCR+, was designed and implemented to achieve interoperability between standard and extended models. Furthermore, a multilayered validation method was devised to validate the standard and extended models. The American Society for Testing and Materials (ASTM) Continuity of Care Record (CCR) standard was selected to evaluate the CCR+ model; two CCR and one CCR+ XML files were evaluated. In total, 188 metadata elements were extracted from the ASTM CCR standard; these metadata are semantically interconnected and registered in the metadata registry. An extended-data-model-specific validation file was generated from these metadata. This file can be used in a smartphone application (Health Avatar CCR+) as part of a multilayered validation. The new CCR+ model was successfully evaluated via a patient-centric exchange scenario involving multiple hospitals, with the results supporting both syntactic and semantic interoperability between the standard CCR and the extended CCR+ model. A feasible method for delivering an extended model that complies with the standard model is presented herein. There is a great need to extend static standard models such as the ASTM CCR in various domains; the methods presented here represent an important reference for achieving interoperability between standard and extended models.
Null tests of the standard model using the linear model formalism
NASA Astrophysics Data System (ADS)
Marra, Valerio; Sapone, Domenico
2018-04-01
We test both the Friedmann-Lemaître-Robertson-Walker geometry and ΛCDM cosmology in a model-independent way by reconstructing the Hubble function H(z), the comoving distance D(z), and the growth of structure fσ8(z) using the most recent data available. We use the linear model formalism to optimally reconstruct the above cosmological functions, together with their derivatives and integrals. We then evaluate four of the null tests available in the literature that probe both background and perturbation assumptions. For all four tests, we find agreement, within the errors, with the standard cosmological model.
NASA Astrophysics Data System (ADS)
Peckham, Scott
2016-04-01
Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its time-stepping scheme. If the caller is a modeling framework, it can use the self-description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
Topics in physics beyond the standard model with strong interactions
NASA Astrophysics Data System (ADS)
Gomez Sanchez, Catalina
In this thesis we study a few complementary topics related to some of the open questions in the Standard Model (SM). We first consider the scalar spectrum of gauge theories with walking dynamics. The question of whether or not a light pseudo-Nambu-Goldstone boson associated with the spontaneous breaking of approximate dilatation symmetry appears in these theories has long been outstanding. We derive an effective action for the scalars, including new terms not previously considered in the literature, and obtain solutions for the lightest scalar's momentum-dependent form factor that determines the value of its pole mass. Our results for the lowest-lying state suggest that this scalar is never expected to be light, but it can have some properties that closely resemble the SM Higgs boson. We then propose a new leptonic charge-asymmetry observable well suited for the study of some Beyond the SM (BSM) physics objects at the LHC. New resonances decaying to one or many leptons could constitute the first signs of BSM physics that we observe at the LHC; if these new objects carry QCD charge, they may have an associated charge asymmetry in their daughter leptons. Our observable can be used in events with single or multiple leptons in the final state. We discuss this measurement in the context of coloured scalar diquarks, as well as that of top-antitop pairs. We argue that, although a fainter signal is expected relative to other charge-asymmetry observables, the low systematic uncertainties keep this particular observable relevant, especially in cases where reconstruction of the parent particle is not a viable strategy. Finally, we propose a simple dark-sector extension to the SM that communicates with ordinary quarks and leptons only through a small kinetic mixing of the dark photon and the photon. The dark sector is assumed to undergo a series of phase transitions such that monopoles and strings arise. These objects form long-lived states that eventually decay and can
An extension of the standard model with a single coupling parameter
NASA Astrophysics Data System (ADS)
Atance, Mario; Cortés, José Luis; Irastorza, Igor G.
1997-02-01
We show that it is possible to find an extension of the matter content of the standard model with a unification of gauge and Yukawa couplings that reproduces their known values. The perturbative renormalizability of the model with a single coupling and the requirement to accommodate the known properties of the standard model fix the masses and couplings of the additional particles. The implications for the parameters of the standard model are discussed.
Prototyping an online wetland ecosystem services model using open model sharing standards
Feng, M.; Liu, S.; Euliss, N.H.; Young, Caitlin; Mushet, D.M.
2011-01-01
Great interest currently exists in developing ecosystem models to forecast how ecosystem services may change under alternative land use and climate futures. Ecosystem services are diverse and include supporting services or functions (e.g., primary production, nutrient cycling), provisioning services (e.g., wildlife, groundwater), regulating services (e.g., water purification, floodwater retention), and even cultural services (e.g., ecotourism, cultural heritage). Hence, the knowledge base necessary to quantify ecosystem services is broad and derived from many diverse scientific disciplines. Building the required interdisciplinary models is especially challenging, as modelers from different locations and times may develop the disciplinary models needed for ecosystem simulations, and these models must be identified and made accessible to the interdisciplinary simulation. Additional difficulties include inconsistent data structures, formats, and metadata required by geospatial models, as well as limitations on computing, storage, and connectivity. Traditional standalone and closed network systems cannot fully support sharing and integrating interdisciplinary geospatial models from varied sources. To address this need, we developed an approach to openly share and access geospatial computational models using distributed Geographic Information System (GIS) techniques and open geospatial standards. We included a means to share computational models compliant with the Open Geospatial Consortium (OGC) Web Processing Service (WPS) standard to ensure modelers have an efficient and simplified means to publish new models. To demonstrate our approach, we developed five disciplinary models that can be integrated and shared to simulate a few of the ecosystem services (e.g., water storage, waterfowl breeding) that are provided by wetlands in the Prairie Pothole Region (PPR) of North America.
A Journey in Standard Development: The Core Manufacturing Simulation Data (CMSD) Information Model.
Lee, Yung-Tsun Tina
2015-01-01
This report documents a journey "from research to an approved standard" of a NIST-led standard development activity. That standard, Core Manufacturing Simulation Data (CMSD) information model, provides neutral structures for the efficient exchange of manufacturing data in a simulation environment. The model was standardized under the auspices of the international Simulation Interoperability Standards Organization (SISO). NIST started the research in 2001 and initiated the standardization effort in 2004. The CMSD standard was published in two SISO Products. In the first Product, the information model was defined in the Unified Modeling Language (UML) and published in 2010 as SISO-STD-008-2010. In the second Product, the information model was defined in Extensible Markup Language (XML) and published in 2013 as SISO-STD-008-01-2012. Both SISO-STD-008-2010 and SISO-STD-008-01-2012 are intended to be used together.
MUSiC - Model-independent search for deviations from Standard Model predictions in CMS
NASA Astrophysics Data System (ADS)
Pieta, Holger
2010-02-01
We present an approach for a model-independent search in CMS. Systematically scanning the data for deviations from the standard model Monte Carlo expectations, such an analysis can help to understand the detector and tune event generators. By minimizing the theoretical bias, the analysis is furthermore sensitive to a wide range of models for new physics, including the countless models not yet thought of. After sorting the events into classes defined by their particle content (leptons, photons, jets, and missing transverse energy), a minimally prejudiced scan is performed on a number of distributions. Advanced statistical methods are used to determine the significance of the deviating regions, rigorously taking systematic uncertainties into account. A number of benchmark scenarios, including common models of new physics and possible detector effects, have been used to gauge the power of such a method.
Physical Education Model Curriculum Standards. Grades Nine through Twelve.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
These physical education standards were designed to ensure that each student achieve the following goals: (1) physical activity--students develop interest and proficiency in movement skills and understand the importance of lifelong participation in daily physical activity; (2) physical fitness and wellness--students increase understanding of basic…
Addressing Standardized Testing through a Novel Assessment Model
ERIC Educational Resources Information Center
Schifter, Catherine C.; Carey, Martha
2014-01-01
The No Child Left Behind (NCLB) legislation spawned a plethora of standardized testing services for all the high-stakes testing required by the law. We argue that one-size-fits-all assessments disadvantage students who are English Language Learners in the USA, as well as students with limited economic resources, special needs, and not reading on…
ISO 9000 quality standards: a model for blood banking?
Nevalainen, D E; Lloyd, H L
1995-06-01
The recent American Association of Blood Banks publications Quality Program and Quality Systems in the Blood Bank and Laboratory Environment, the FDA's draft guidelines, and recent changes in the GMP regulations all discuss the benefits of implementing quality systems in blood center and/or manufacturing operations. While the medical device GMPs in the United States have been rewritten to accommodate a quality system approach similar to ISO 9000, the Center for Biologics Evaluation and Research of the FDA is also beginning to move toward adopting "quality systems audits" as an inspection process, rather than using the historical approach of record reviews. The approach is one of prevention of errors rather than detection after the fact (Tourault MA, oral communication, November 1994). The ISO 9000 series of standards is a quality system that has worldwide scope and can be applied in any industry or service. The use of such international standards in blood banking should raise the level of quality within an organization, among organizations on a regional level, within a country, and among nations on a worldwide basis. Whether or not an organization wishes to become registered to a voluntary standard, the use of such standards to become ISO 9000-compliant would be a move in the right direction and a positive sign to the regulatory authorities and the public that blood banking is making a visible effort to implement world-class quality systems in its operations. Implementation of quality system standards such as the ISO 9000 series will provide an organized approach for blood banks and blood bank testing operations. With the continued trend toward consolidation and mergers, resulting in larger operational units with more complexity, quality systems will become even more important as the industry moves into the future.
40 CFR 1039.101 - What exhaust emission standards must my engines meet after the 2014 model year?
Code of Federal Regulations, 2013 CFR
2013-07-01
... emission standards must my engines meet after the 2014 model year? The exhaust emission standards of this section apply after the 2014 model year. Certain of these standards also apply for model year 2014 and... emission standards that apply to 2014 and earlier model years. Section 1039.105 specifies smoke standards...
40 CFR 1039.101 - What exhaust emission standards must my engines meet after the 2014 model year?
Code of Federal Regulations, 2014 CFR
2014-07-01
... emission standards must my engines meet after the 2014 model year? The exhaust emission standards of this section apply after the 2014 model year. Certain of these standards also apply for model year 2014 and... emission standards that apply to 2014 and earlier model years. Section 1039.105 specifies smoke standards...
40 CFR 1039.101 - What exhaust emission standards must my engines meet after the 2014 model year?
Code of Federal Regulations, 2012 CFR
2012-07-01
... emission standards must my engines meet after the 2014 model year? The exhaust emission standards of this section apply after the 2014 model year. Certain of these standards also apply for model year 2014 and... emission standards that apply to 2014 and earlier model years. Section 1039.105 specifies smoke standards...
40 CFR 1039.101 - What exhaust emission standards must my engines meet after the 2014 model year?
Code of Federal Regulations, 2011 CFR
2011-07-01
... emission standards must my engines meet after the 2014 model year? The exhaust emission standards of this section apply after the 2014 model year. Certain of these standards also apply for model year 2014 and... emission standards that apply to 2014 and earlier model years. Section 1039.105 specifies smoke standards...
Plot Scale Factor Models for Standard Orthographic Views
ERIC Educational Resources Information Center
Osakue, Edward E.
2007-01-01
Geometric modeling provides graphic representations of real or abstract objects. Realistic representation requires three dimensional (3D) attributes since natural objects have three principal dimensions. CAD software gives the user the ability to construct realistic 3D models of objects, but often prints of these models must be generated on two…
Physics at a 100 TeV pp Collider: Standard Model Processes
Mangano, M. L.; Zanderighi, G.; Aguilar Saavedra, J. A.
This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.
Battery Ownership Model - Medium Duty HEV Battery Leasing & Standardization
Kelly, Ken; Smith, Kandler; Cosgrove, Jon
2015-12-01
Prepared for the U.S. Department of Energy, this milestone report focuses on the economics of leasing versus owning batteries for medium-duty hybrid electric vehicles as well as various battery standardization scenarios. The work described in this report was performed by members of the Energy Storage Team and the Vehicle Simulation Team in NREL's Transportation and Hydrogen Systems Center along with members of the Vehicles Analysis Team at Ricardo.
Error model for the SAO 1969 standard earth.
NASA Technical Reports Server (NTRS)
Martin, C. F.; Roy, N. A.
1972-01-01
A method is developed for estimating an error model for geopotential coefficients using satellite tracking data. A single station's apparent timing error for each pass is attributed to geopotential errors. The root sum of the residuals for each station also depends on the geopotential errors, and these are used to select an error model. The model chosen is 1/4 of the difference between the SAO M1 and the APL 3.5 geopotential.
Specification for a standard radar sea clutter model
NASA Astrophysics Data System (ADS)
Paulus, Richard A.
1990-09-01
A model for the average sea clutter radar cross section is proposed for the Oceanographic and Atmospheric Master Library. This model is a function of wind speed (or sea state), wind direction relative to the antenna, refractive conditions, radar antenna height, frequency, polarization, horizontal beamwidth, and compressed pulse length. The model is fully described, a FORTRAN 77 computer listing is provided, and test cases are given to demonstrate the proper operation of the program.
On the Estimation of Standard Errors in Cognitive Diagnosis Models
ERIC Educational Resources Information Center
Philipp, Michel; Strobl, Carolin; de la Torre, Jimmy; Zeileis, Achim
2018-01-01
Cognitive diagnosis models (CDMs) are an increasingly popular method to assess mastery or nonmastery of a set of fine-grained abilities in educational or psychological assessments. Several inference techniques are available to quantify the uncertainty of model parameter estimates, to compare different versions of CDMs, or to check model…
ERIC Educational Resources Information Center
Newton, Jill A.; Kasten, Sarah E.
2013-01-01
The release of the Common Core State Standards for Mathematics and their adoption across the United States calls for careful attention to the alignment between mathematics standards and assessments. This study investigates 2 models that measure alignment between standards and assessments, the Surveys of Enacted Curriculum (SEC) and the Webb…
Realizing three generations of the Standard Model fermions in the type IIB matrix model
NASA Astrophysics Data System (ADS)
Aoki, Hajime; Nishimura, Jun; Tsuchiya, Asato
2014-05-01
We discuss how the Standard Model particles appear from the type IIB matrix model, which is considered to be a nonperturbative formulation of superstring theory. In particular, we are concerned with a constructive definition of the theory, in which we start with finite-N matrices and take the large-N limit afterwards. In that case, it was pointed out recently that realizing chiral fermions in the model is more difficult than had been thought from formal arguments at N = ∞, and that the introduction of a matrix version of the warp factor is necessary. Based on this new insight, we show that two generations of the Standard Model fermions can be realized by considering a rather generic configuration of fuzzy S2 and fuzzy S2 × S2 in the extra dimensions. We also show that three generations can be obtained by squashing one of the S2's that appear in the configuration. Chiral fermions appear at the intersections of the fuzzy manifolds, with nontrivial Yukawa couplings to the Higgs field, which can be calculated from the overlap of their wave functions.
Symmetry Breaking, Unification, and Theories Beyond the Standard Model
Nomura, Yasunori
2009-07-31
A model was constructed in which the supersymmetric fine-tuning problem is solved without extending the Higgs sector at the weak scale. We have demonstrated that the model can avoid all the phenomenological constraints, while avoiding excessive fine-tuning. We have also studied the implications of the model for dark matter physics and collider physics. I have proposed an extremely simple construction for models of gauge mediation. We found that the μ problem can be simply and elegantly solved in a class of models where the Higgs fields couple directly to the supersymmetry breaking sector. We proposed a new way of addressing the flavor problem of supersymmetric theories. We have proposed a new framework for constructing theories of grand unification. We constructed a simple and elegant model of dark matter which explains the excess flux of electrons/positrons. We constructed a model of dark energy in which evolving quintessence-type dark energy is naturally obtained. We studied whether we can find evidence of the multiverse.
Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140
Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.
This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.
Testing the Standard Model with the Primordial Inflation Explorer
NASA Technical Reports Server (NTRS)
Kogut, Alan J.
2011-01-01
The Primordial Inflation Explorer (PIXIE) is an Explorer-class mission to measure the gravity-wave signature of primordial inflation through its distinctive imprint on the linear polarization of the cosmic microwave background. PIXIE uses an innovative optical design to achieve background-limited sensitivity in 400 spectral channels spanning 2.5 decades in frequency from 30 GHz to 6 THz (1 cm to 50 micron wavelength). The principal science goal is the detection and characterization of linear polarization from an inflationary epoch in the early universe, with tensor-to-scalar ratio r < 10^-3 at 5 standard deviations. The rich PIXIE data set will also constrain physical processes ranging from Big Bang cosmology to the nature of the first stars to physical conditions within the interstellar medium of the Galaxy. I describe the PIXIE instrument and mission architecture needed to detect the inflationary signature using only 4 semiconductor bolometers.
Higgs boson mass in the standard model at two-loop order and beyond
Martin, Stephen P.; Robertson, David G.
2014-10-01
We calculate the mass of the Higgs boson in the standard model in terms of the underlying Lagrangian parameters at complete 2-loop order with leading 3-loop corrections. A computer program implementing the results is provided. The program also computes and minimizes the standard model effective potential in Landau gauge at 2-loop order with leading 3-loop corrections.
Needed: A Standard Information Processing Model of Learning and Learning Processes.
ERIC Educational Resources Information Center
Carifio, James
One strategy to prevent confusion as new paradigms emerge is to have professionals in the area develop and use a standard model of the phenomenon in question. The development and use of standard models in physics, genetics, archaeology, and cosmology have been very productive. The cognitive revolution in psychology and education has produced a…
Ex-Nihilo: Obstacles Surrounding Teaching the Standard Model
ERIC Educational Resources Information Center
Pimbblet, Kevin A.
2002-01-01
The model of the Big Bang is an integral part of the national curricula in England and Wales. Previous work (e.g. Baxter 1989) has shown that pupils often come into education with many and varied prior misconceptions emanating from both internal and external sources. Whilst virtually all of these misconceptions can be remedied, there will remain…
A review of standardized metabolic phenotyping of animal models.
Rozman, Jan; Klingenspor, Martin; Hrabě de Angelis, Martin
2014-10-01
Metabolic phenotyping of genetically modified animals aims to detect new candidate genes and related metabolic pathways that result in dysfunctional energy balance regulation and predispose for diseases such as obesity or type 2 diabetes mellitus. In this review, we provide a comprehensive overview on the technologies available to monitor energy flux (food uptake, bomb calorimetry of feces and food, and indirect calorimetry) and body composition (qNMR, DXA, and MRI) in animal models for human diseases with a special focus on phenotyping methods established in genetically engineered mice. We use an energy flux model to illustrate the principles of energy allocation, describe methodological aspects how to monitor energy balance, and introduce strategies for data analysis and presentation.
Metabolomics, Standards, and Metabolic Modeling for Synthetic Biology in Plants
Hill, Camilla Beate; Czauderna, Tobias; Klapperstück, Matthias; Roessner, Ute; Schreiber, Falk
2015-01-01
Life on earth depends on dynamic chemical transformations that enable cellular functions, including electron transfer reactions, as well as synthesis and degradation of biomolecules. Biochemical reactions are coordinated in metabolic pathways that interact in a complex way to allow adequate regulation. Biotechnology, food, biofuel, agricultural, and pharmaceutical industries are highly interested in metabolic engineering as an enabling technology of synthetic biology to exploit cells for the controlled production of metabolites of interest. These approaches have only recently been extended to plants due to their greater metabolic complexity (such as primary and secondary metabolism) and highly compartmentalized cellular structures and functions (including plant-specific organelles) compared with bacteria and other microorganisms. Technological advances in analytical instrumentation in combination with advances in data analysis and modeling have opened up new approaches to engineer plant metabolic pathways and allow the impact of modifications to be predicted more accurately. In this article, we review challenges in the integration and analysis of large-scale metabolic data, present an overview of current bioinformatics methods for the modeling and visualization of metabolic networks, and discuss approaches for interfacing bioinformatics approaches with metabolic models of cellular processes and flux distributions in order to predict phenotypes derived from specific genetic modifications or subjected to different environmental conditions.
Dimensional reduction of the Standard Model coupled to a new singlet scalar field
NASA Astrophysics Data System (ADS)
Brauner, Tomáš; Tenkanen, Tuomas V. I.; Tranberg, Anders; Vuorinen, Aleksi; Weir, David J.
2017-03-01
We derive an effective dimensionally reduced theory for the Standard Model augmented by a real singlet scalar. We treat the singlet as a superheavy field and integrate it out, leaving an effective theory involving only the Higgs and SU(2)_L × U(1)_Y gauge fields, identical to the one studied previously for the Standard Model. This opens up the possibility of efficiently computing the order and strength of the electroweak phase transition, numerically and nonperturbatively, in this extension of the Standard Model. Understanding the phase diagram is crucial for models of electroweak baryogenesis and for studying the production of gravitational waves at thermal phase transitions.
ERIC Educational Resources Information Center
Tumthong, Suwut; Piriyasurawong, Pullop; Jeerangsuwan, Namon
2016-01-01
This research proposes a functional competency development model for academic personnel based on international professional qualification standards in computing field and examines the appropriateness of the model. Specifically, the model consists of three key components which are: 1) functional competency development model, 2) blended training…
Exploring New Physics Beyond the Standard Model: Final Technical Report
Wang, Liantao
This grant in 2015 to 2016 was for support in the area of theoretical High Energy Physics. The research supported focused mainly on the energy frontier, but it also has connections to both the cosmic and intensity frontiers. Lian-Tao Wang (PI) focused mainly on signals of new physics at colliders. The year 2015-2016, covered by this grant, was an exciting period of digesting the influx of LHC data, understanding its meaning, and using it to refine strategies for deeper exploration. The PI proposed new methods of searching for new physics at the LHC, such as for the compressed stops. He also investigated in detail the signals of composite Higgs models, focusing on spin-1 composite resonances in the di-boson channel. He has also considered di-photon as a probe for such models. He has also made contributions in formulating search strategies for dark matter at the LHC, resulting in two documents with recommendations. The PI has also been active in studying the physics potential of future colliders, including Higgs factories and 100 TeV pp colliders. He has given a comprehensive overview of the physics potential of the high-energy proton collider and outlined its luminosity targets. He has also studied the use of lepton colliders to probe the fermionic Higgs portal and bottom quark couplings to the Z boson.
The 1991 October 24 flare: A challenge for standard models
NASA Technical Reports Server (NTRS)
Beaujardiere, J.-F. De LA; Canfield, R. C.; Hudson, H. S.; Wulser, J.-P.; Acton, L.; Kosugi, T.; Masuda, S.
1995-01-01
The M9.8 solar flare of 1991 October 24 22:30 UT presents several interesting characteristics: (1) energy release starts high in the corona; (2) the primary chromospheric ribbons are initially well separated and do not move apart at an observable rate; (3) no evidence is found for an erupting filament or other driver. To explain this flare, we consider several canonical flare models, including a filament eruption, a confined filament eruption, current interruption, and interacting loops. We conclude that none of these scenarios unequivocally explains this flare. Two possibilities which cannot be ruled out are (1) the eruption of a filament unobservable in H-alpha which starts high in the corona and produces ribbon motions smaller than our detection threshold and no perceptible expansion of the coronal X-ray source, and (2) energy release due to spontaneous, propagating reconnection which allows the system to essentially brighten in place.
Noncommutative GUTs, Standard Model and C, P, T
NASA Astrophysics Data System (ADS)
Aschieri, P.; Jurčo, B.; Schupp, P.; Wess, J.
2003-02-01
Noncommutative Yang-Mills theories are sensitive to the choice of the representation that enters in the gauge kinetic term. We constrain this ambiguity by considering grand unified theories. We find that at first order in the noncommutativity parameter θ, SU(5) is not truly a unified theory, while SO(10) has a unique noncommutative generalization. In view of these results we discuss the noncommutative SM theory that is compatible with SO(10) GUT and find that there are no modifications to the SM gauge kinetic term at lowest order in θ. We study in detail the reality, Hermiticity and C, P, T properties of the Seiberg-Witten map and of the resulting effective actions expanded in ordinary fields. We find that in models of GUTs (or compatible with GUTs) right-handed fermions and left-handed ones appear with opposite Seiberg-Witten map.
40 CFR 86.096-8 - Emission standards for 1996 and later model year light-duty vehicles.
Code of Federal Regulations, 2012 CFR
2012-07-01
....096-8 Emission standards for 1996 and later model year light-duty vehicles. (a)(1) Standards. (i... tested with the procedures in subpart B indicated for 1996 model year, and shall not exceed the standards... subpart B of this part for 1995 model year light-duty vehicles and be subject to the standards described...
40 CFR 86.096-8 - Emission standards for 1996 and later model year light-duty vehicles.
Code of Federal Regulations, 2013 CFR
2013-07-01
....096-8 Emission standards for 1996 and later model year light-duty vehicles. (a)(1) Standards. (i... tested with the procedures in subpart B indicated for 1996 model year, and shall not exceed the standards... subpart B of this part for 1995 model year light-duty vehicles and be subject to the standards described...
A New Proof of the Expected Frequency Spectrum under the Standard Neutral Model.
Hudson, Richard R
2015-01-01
The sample frequency spectrum is an informative and frequently employed approach for summarizing DNA variation data. Under the standard neutral model, the expectation of the sample frequency spectrum has been derived by at least two distinct approaches. One relies on results from diffusion approximations to the Wright-Fisher model. The other is based on Pólya urn models that correspond to the standard coalescent model. A new proof of the expected frequency spectrum is presented here. It is a proof by induction that requires neither diffusion results nor the somewhat complex sums and combinatorics of the urn-model derivations.
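The result being proved can be stated compactly: under the standard neutral model, the expected number of segregating sites at which the derived allele appears i times in a sample of n sequences is E[ξ_i] = θ/i. A minimal sketch of this expectation (the value θ = 10 is an illustrative choice, not from the paper):

```python
# Expected site frequency spectrum under the standard neutral model:
# a sample of n sequences is expected to contain theta / i segregating
# sites whose derived allele appears exactly i times (i = 1, ..., n-1).

def expected_sfs(theta, n):
    """Expected counts E[xi_i] = theta / i for i = 1, ..., n - 1."""
    return [theta / i for i in range(1, n)]

def normalized_sfs(n):
    """Probability that a random segregating site has derived count i."""
    total = sum(1.0 / i for i in range(1, n))
    return [(1.0 / i) / total for i in range(1, n)]

# With an illustrative theta = 10 and n = 5:
# expected_sfs(10.0, 5) -> [10.0, 5.0, 10/3, 2.5]
```

The normalized form shows the familiar 1/i shape of the neutral spectrum independent of θ.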
Diagnostic Profiles: A Standard Setting Method for Use with a Cognitive Diagnostic Model
ERIC Educational Resources Information Center
Skaggs, Gary; Hein, Serge F.; Wilkins, Jesse L. M.
2016-01-01
This article introduces the Diagnostic Profiles (DP) standard setting method for setting a performance standard on a test developed from a cognitive diagnostic model (CDM), the outcome of which is a profile of mastered and not-mastered skills or attributes rather than a single test score. In the DP method, the key judgment task for panelists is a…
Can Cognitive Writing Models Inform the Design of the Common Core State Standards?
ERIC Educational Resources Information Center
Hayes, John R.; Olinghouse, Natalie G.
2015-01-01
In this article, we compare the Common Core State Standards in Writing to the Hayes cognitive model of writing, adapted to describe the performance of young and developing writers. Based on the comparison, we propose the inclusion of standards for motivation, goal setting, writing strategies, and attention by writers to the text they have just…
A Sandwich-Type Standard Error Estimator of SEM Models with Multivariate Time Series
ERIC Educational Resources Information Center
Zhang, Guangjian; Chow, Sy-Miin; Ong, Anthony D.
2011-01-01
Structural equation models are increasingly used as a modeling tool for multivariate time series data in the social and behavioral sciences. Standard error estimators of SEM models, originally developed for independent data, require modifications to accommodate the fact that time series data are inherently dependent. In this article, we extend a…
Using the Modification Index and Standardized Expected Parameter Change for Model Modification
ERIC Educational Resources Information Center
Whittaker, Tiffany A.
2012-01-01
Model modification is oftentimes conducted after discovering a badly fitting structural equation model. During the modification process, the modification index (MI) and the standardized expected parameter change (SEPC) are 2 statistics that may be used to aid in the selection of parameters to add to a model to improve the fit. The purpose of this…
Classical conformality in the Standard Model from Coleman’s theory
NASA Astrophysics Data System (ADS)
Kawana, Kiyoharu
2016-09-01
The classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of baby universes.
Tests of local Lorentz invariance violation of gravity in the standard model extension with pulsars.
Shao, Lijing
2014-03-21
The standard model extension is an effective field theory introducing all possible Lorentz-violating (LV) operators to the standard model and general relativity (GR). In the pure-gravity sector of the minimal standard model extension, nine coefficients describe dominant observable deviations from GR. We systematically implemented 27 tests from 13 pulsar systems to tightly constrain eight linear combinations of these coefficients with extensive Monte Carlo simulations. This constitutes the first detailed and systematic test of the pure-gravity sector of the minimal standard model extension with state-of-the-art pulsar observations. No deviation from GR was detected. The limits of LV coefficients are expressed in the canonical Sun-centered celestial-equatorial frame for the convenience of further studies. They are all improved by significant factors of tens to hundreds over existing ones. As a consequence, Einstein's equivalence principle is verified substantially further by pulsar experiments in terms of local Lorentz invariance in gravity.
The Standard Model in noncommutative geometry: fundamental fermions as internal forms
NASA Astrophysics Data System (ADS)
Dąbrowski, Ludwik; D'Andrea, Francesco; Sitarz, Andrzej
2018-05-01
Given the algebra, Hilbert space H, grading and real structure of the finite spectral triple of the Standard Model, we classify all possible Dirac operators such that H is a self-Morita equivalence bimodule for the associated Clifford algebra.
NASA Astrophysics Data System (ADS)
Espinosa, J. R.; Racco, D.; Riotto, A.
2018-03-01
For the current central values of the Higgs boson and top quark masses, the standard model Higgs potential develops an instability at a scale of the order of 10^11 GeV. We show that a cosmological signature of such instability could be dark matter in the form of primordial black holes seeded by Higgs fluctuations during inflation. The existence of dark matter might not require physics beyond the standard model.
From many body wee partons dynamics to perfect fluid: a standard model for heavy ion collisions
Venugopalan, R.
2010-07-22
We discuss a standard model of heavy ion collisions that has emerged both from experimental results of the RHIC program and associated theoretical developments. We comment briefly on the impact of early results of the LHC program on this picture. We consider how this standard model of heavy ion collisions could be solidified or falsified in future experiments at RHIC, the LHC and a future Electron-Ion Collider.
Espinosa, J R; Racco, D; Riotto, A
2018-03-23
For the current central values of the Higgs boson and top quark masses, the standard model Higgs potential develops an instability at a scale of the order of 10^{11} GeV. We show that a cosmological signature of such instability could be dark matter in the form of primordial black holes seeded by Higgs fluctuations during inflation. The existence of dark matter might not require physics beyond the standard model.
A modeling analysis of alternative primary and secondary US ozone standards in urban and rural areas
NASA Astrophysics Data System (ADS)
Nopmongcol, Uarporn; Emery, Chris; Sakulyanontvittaya, Tanarit; Jung, Jaegun; Knipping, Eladio; Yarwood, Greg
2014-12-01
This study employed the High-Order Decoupled Direct Method (HDDM) of sensitivity analysis in a photochemical grid model to determine US anthropogenic emissions reductions required from 2006 levels to meet alternative US primary (health-based) and secondary (welfare-based) ozone (O3) standards. Applying the modeling techniques developed by Yarwood et al. (2013), we specifically evaluated sector-wide emission reductions needed to meet primary standards in the range of 60-75 ppb, and secondary standards in the range of 7-15 ppm-h, in 22 cities and at 20 rural sites across the US for NOx-only, combined NOx and VOC, and VOC-only scenarios. Site-specific model biases were taken into account by applying adjustment factors separately for the primary and secondary standard metrics, analogous to the US Environmental Protection Agency's (EPA) relative response factor technique. Both bias-adjusted and unadjusted results are presented and analyzed. We found that the secondary metric does not necessarily respond to emission reductions the same way the primary metric does, indicating sensitivity to their different forms. Combined NOx and VOC reductions are most effective for cities, whereas NOx-only reductions are sufficient at rural sites. Most cities we examined require more than 50% US anthropogenic emission reductions from 2006 levels to meet the current primary 75 ppb US standard and secondary 15 ppm-h target. Most rural sites require less than 20% reductions to meet the primary 75 ppb standard and less than 40% reductions to meet the secondary 15 ppm-h target. Whether the primary standard is protective of the secondary standard depends on the combination of alternative standard levels. Our modeling suggests that the current 75 ppb standard achieves a 15 ppm-h secondary target in most (17 of 22) cities, but only half of the rural sites; the inability for several western cities and rural areas to achieve the seasonally-summed secondary 15 ppm-h target while meeting the 75 ppb
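HDDM-based projections of this kind typically express the ozone response to a fractional emissions perturbation x as a second-order Taylor expansion in precomputed sensitivity coefficients, O3(x) ≈ O3(0) + x·S1 + (x²/2)·S2. A minimal sketch of that projection form, with a hypothetical baseline and coefficients (none of these values are taken from the study):

```python
# Second-order HDDM-style projection: ozone response to a fractional
# change x in emissions, using precomputed sensitivity coefficients.
# Baseline and coefficients below are hypothetical, not the study's values.

def o3_response(o3_base, s1, s2, x):
    """O3(x) ~= O3(0) + x*S1 + (x**2 / 2)*S2, all in ppb."""
    return o3_base + x * s1 + 0.5 * x ** 2 * s2

def reduction_to_meet(o3_base, s1, s2, target, tol=1e-6):
    """Bisect for the emission-reduction fraction that meets a standard."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if o3_response(o3_base, s1, s2, -mid) > target:
            lo = mid  # still above the standard: reduce more
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical city: 80 ppb baseline, S1 = 25 ppb, S2 = -10 ppb;
# meeting a 75 ppb standard then requires roughly a 19% reduction.
frac = reduction_to_meet(80.0, 25.0, -10.0, 75.0)
```

Bias adjustment in the study's relative-response-factor style would scale the projected design values before comparing them to the standard.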
[Comparison of Flu Outbreak Reporting Standards Based on Transmission Dynamics Model].
Yang, Guo-jing; Yi, Qing-jie; Li, Qin; Zeng, Qing
2016-05-01
To compare the current two flu outbreak reporting standards for the purpose of better prevention and control of flu outbreaks. A susceptible-exposed-infectious/asymptomatic-removed (SEIAR) model without interventions was set up first, followed by a model with interventions based on the real situation. Simulated interventions were developed based on the two reporting standards, and evaluated by estimated duration of outbreaks, cumulative new cases, cumulative morbidity rates, decline in percentage of morbidity rates, and cumulative secondary cases. The basic reproductive number of the outbreak was estimated as 8.2. The simulation produced results similar to the real situation. The effect of interventions based on reporting standard one (10 accumulated new cases in a week) was better than that of interventions based on reporting standard two (30 accumulated new cases in a week). Reporting standard one (10 accumulated new cases in a week) is more effective for the prevention and control of flu outbreaks.
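A discrete-time SEIAR model of the kind described can be sketched as follows; every parameter value here is a hypothetical illustration, not a fitted value from the study:

```python
# Minimal discrete-time SEIAR sketch (susceptible-exposed-infectious/
# asymptomatic-removed), one step per day. Every parameter value here is
# a hypothetical illustration, not a fitted value from the study.

def simulate_seiar(days=120, n=1000.0, i0=1.0, beta=0.6, kappa=0.5,
                   p_sym=0.7, rel_inf=0.5, gamma=0.25):
    s, e, i, a, r = n - i0, 0.0, i0, 0.0, 0.0
    cum_symptomatic = i0
    for _ in range(days):
        new_e = beta * (i + rel_inf * a) / n * s  # S -> E infections
        onset = kappa * e                         # E -> I or A
        rec_i, rec_a = gamma * i, gamma * a       # removal
        s -= new_e
        e += new_e - onset
        i += p_sym * onset - rec_i
        a += (1.0 - p_sym) * onset - rec_a
        r += rec_i + rec_a
        cum_symptomatic += p_sym * onset
    return {"cum_symptomatic": cum_symptomatic, "total": s + e + i + a + r}
```

Evaluating a reporting standard then amounts to triggering simulated interventions (e.g. a reduced beta) once cumulative new cases cross that standard's threshold, and comparing outbreak duration and cumulative cases across thresholds.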
Mazumdar, Anupam; Nadathur, Seshadri
2012-03-16
We provide a model in which both the inflaton and the curvaton are obtained from within the minimal supersymmetric standard model, with known gauge and Yukawa interactions. Since both the inflaton and curvaton fields are successfully embedded within the same sector, their decay products thermalize very quickly before the electroweak scale. This results in two important features of the model: first, there will be no residual isocurvature perturbations, and second, observable non-Gaussianities can be generated with the non-Gaussianity parameter f_NL ~ O(5-1000) being determined solely by the combination of weak-scale physics and the standard model Yukawa interactions.
A Study on Standard Competition with Network Effect Based on Evolutionary Game Model
NASA Astrophysics Data System (ADS)
Wang, Ye; Wang, Bingdong; Li, Kangning
Owing to the prevalence of networks in modern society, standard competition with network effects has taken on new connotations. This paper aims to study the impact of network effects on standard competition; it is organized in the mode of "introduction-model setup-equilibrium analysis-conclusion". Starting from a well-structured evolutionary game model, the analysis is then extended to dynamics. The article argues, both theoretically and empirically, that whether or not a standard can lead the market depends on the utility it would bring, and also discusses advisable strategies revolving around the two factors of initial position and border break.
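The dependence on initial position described above is the hallmark of replicator dynamics with network effects: each standard's payoff grows with its own adoption share, so the market is driven toward whichever monopoly basin it starts in. A minimal sketch (the stand-alone utilities and network strengths are hypothetical, not the paper's model):

```python
# Replicator-dynamics sketch of two standards A and B whose payoffs grow
# with their own adoption share (the network effect). The stand-alone
# utilities (u_a, u_b) and network strengths (net_a, net_b) are
# hypothetical illustrations.

def evolve_share(x0, u_a, u_b, net_a, net_b, steps=2000, dt=0.01):
    """Final adoption share of standard A under replicator dynamics."""
    x = x0
    for _ in range(steps):
        payoff_a = u_a + net_a * x
        payoff_b = u_b + net_b * (1.0 - x)
        x += dt * x * (1.0 - x) * (payoff_a - payoff_b)
        x = min(max(x, 0.0), 1.0)  # keep the share a valid fraction
    return x

# With identical standards (u = 1, net = 2), the initial position decides
# the winner: starting above a 50% share, A takes essentially the whole
# market; starting below, it vanishes.
```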
40 CFR 1039.101 - What exhaust emission standards must my engines meet after the 2014 model year?
Code of Federal Regulations, 2010 CFR
2010-07-01
... my engines meet after the 2014 model year? 1039.101 Section 1039.101 Protection of Environment... emission standards must my engines meet after the 2014 model year? The exhaust emission standards of this section apply after the 2014 model year. Certain of these standards also apply for model year 2014 and...
NASA Astrophysics Data System (ADS)
Quast, T.; Schirmacher, A.; Hauer, K.-O.; Koo, A.
2018-02-01
To elucidate the influence of polarization in diffuse reflectometry, we performed a series of measurements in several bidirectional geometries and determined the Stokes parameters of the diffusely reflected radiation. Different types of matte reflection standards were used, including several common white standards and ceramic colour standards. The dependence of the polarization on sample type, wavelength and geometry has been studied systematically, and the main influence factors have been identified: the effect is largest at large angles of incidence or detection and at wavelengths where the magnitude of the reflectance is small. The results for the colour standards have been modelled using a microfacet-based reflection theory derived from the well-known model of Torrance and Sparrow. Although the theory is very simple and has only three free parameters, the agreement with the measured data is very good; all essential features of the data can be reproduced by the model.
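A Torrance-Sparrow-style microfacet model combines a diffuse term with a specular lobe governed by a facet-angle distribution. The sketch below is an illustrative three-parameter simplification for in-plane geometry (k_d, k_s, sigma are assumed names; the Fresnel and shadowing factors of the full model, and hence polarization itself, are omitted), not the authors' implementation:

```python
import math

# Illustrative Torrance-Sparrow-style reflection model for in-plane
# bidirectional geometry: a Lambertian diffuse term plus a specular
# microfacet lobe with a Gaussian facet-angle distribution. Fresnel and
# shadowing factors are omitted, so this returns intensity only.

def reflectance(theta_i, theta_r, k_d, k_s, sigma):
    """theta_i, theta_r: incidence and detection angles in radians."""
    alpha = 0.5 * abs(theta_r - theta_i)      # facet tilt off the specular direction
    facets = math.exp(-(alpha / sigma) ** 2)  # Gaussian facet distribution
    specular = k_s * facets / max(math.cos(theta_r), 1e-6)
    diffuse = k_d * math.cos(theta_i)
    return diffuse + specular
```

The 1/cos(theta_r) factor makes the specular lobe dominate at large detection angles; a polarization-aware version would reinstate the Fresnel term, which is where the angle- and wavelength-dependent polarization effects described above enter.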
He, Li; Xu, Zongda; Fan, Xing; Li, Jing; Lu, Hongwei
2017-05-01
This study develops a meta-modeling based mathematical programming approach with flexibility in environmental standards. It integrates numerical simulation, meta-modeling analysis, and fuzzy programming within a general framework. A set of meta-models relating remediation strategies to remediation performance substantially reduces the computational effort of the simulation and optimization process. To prevent over-optimistic or over-pessimistic optimization strategies, the satisfaction level resulting from the implementation of a flexible standard indicates the degree to which the environmental standard is met. The proposed approach is applied to a naphthalene-contaminated site in China. Results show that a longer remediation period corresponds to a lower total pumping rate and a stringent risk standard implies a high total pumping rate. The wells located near or in the down-gradient direction of the contaminant sources have the most significant efficiency among all remediation schemes.
A Standard-Based Model for Adaptive E-Learning Platform for Mauritian Academic Institutions
ERIC Educational Resources Information Center
Kanaksabee, P.; Odit, M. P.; Ramdoyal, A.
2011-01-01
The key aim of this paper is to introduce a standard-based model for adaptive e-learning platform for Mauritian academic institutions and to investigate the conditions and tools required to implement this model. The main forces of the system are that it allows collaborative learning, communication among user, and reduce considerable paper work.…
System Dynamics in Distance Education and a Call to Develop a Standard Model
ERIC Educational Resources Information Center
Shaffer, Steven C.
2005-01-01
This paper describes systems dynamics, reviews the literature of uses of systems concepts in distance education (DE), presents a preliminary model, and ends in a call to researchers to contribute to the building of a standard model of DE. (Contains 4 figures.)
ERIC Educational Resources Information Center
Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.
2016-01-01
Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…
NASA Technical Reports Server (NTRS)
1981-01-01
The use of the International Standards Organization (ISO) Open Systems Interconnection (OSI) Reference Model and its relevance to interconnecting an Applications Data Service (ADS) pilot program for data sharing is discussed. A top-level mapping between the conjectured ADS requirements and identified layers within the OSI Reference Model was performed. It was concluded that the OSI model represents an orderly architecture for ADS network planning and that the protocols being developed by the National Bureau of Standards offer the best available implementation approach.
Implications of Higgs searches on the four-generation standard model.
Kuflik, Eric; Nir, Yosef; Volansky, Tomer
2013-03-01
Within the four-generation standard model, the Higgs couplings to gluons and to photons deviate in a significant way from the predictions of the three-generation standard model. As a consequence, large departures in several Higgs production and decay channels are expected. Recent Higgs search results, presented by ATLAS, CMS, and CDF, hint at the existence of a Higgs boson with a mass around 125 GeV. Using these results and assuming such a Higgs boson, we derive exclusion limits on the four-generation standard model. For m(H)=125 GeV, the model is excluded above 99.95% confidence level. For 124.5 GeV≤m(H)≤127.5 GeV, an exclusion limit above 99% confidence level is found.
Standardization Process for Space Radiation Models Used for Space System Design
NASA Technical Reports Server (NTRS)
Barth, Janet; Daly, Eamonn; Brautigam, Donald
2005-01-01
The space system design community has three concerns related to models of the radiation belts and plasma: 1) AP-8 and AE-8 models are not adequate for modern applications; 2) Data that have become available since the creation of AP-8 and AE-8 are not being fully exploited for modeling purposes; 3) When new models are produced, there is no authorizing organization identified to evaluate the models or their datasets for accuracy and robustness. This viewgraph presentation provided an overview of the roadmap adopted by the Working Group Meeting on New Standard Radiation Belt and Space Plasma Models.
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Butleris, Rimantas; Kapocius, Kestutis
2013-10-01
Approaches for the analysis and specification of business vocabularies and rules are highly relevant topics in both the Business Process Management and Information Systems Development disciplines. However, in common practice of Information Systems Development, business modeling activities are still mostly empirical in nature. In this paper, basic aspects of an approach for the semi-automated extraction of business vocabularies from business process models are presented. The approach is based on the novel business modeling-level OMG standards "Business Process Model and Notation" (BPMN) and "Semantics of Business Vocabulary and Business Rules" (SBVR), thus contributing to OMG's vision of Model-Driven Architecture (MDA) and to model-driven development in general.
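A first step in such semi-automated extraction is harvesting candidate vocabulary terms from the names of BPMN model elements. A minimal sketch (the sample process and the simple "verb + noun phrase" naming heuristic are illustrative assumptions, not the approach's actual extraction rules):

```python
import xml.etree.ElementTree as ET

# Harvest candidate vocabulary terms from BPMN 2.0 task names. The
# sample process and the "Verb Noun-Phrase" naming heuristic are
# illustrative assumptions only.

BPMN_NS = "{http://www.omg.org/spec/BPMN/20100524/MODEL}"

SAMPLE = """<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <process id="p1">
    <task name="Register Customer Order"/>
    <task name="Check Customer Credit"/>
  </process>
</definitions>"""

def candidate_terms(bpmn_xml):
    """Return the noun phrases (everything after the leading verb) of task names."""
    root = ET.fromstring(bpmn_xml)
    terms = set()
    for task in root.iter(BPMN_NS + "task"):
        words = task.get("name", "").split()
        if len(words) >= 2:  # assumes the "Verb Noun-Phrase" naming convention
            terms.add(" ".join(words[1:]).lower())
    return sorted(terms)

# candidate_terms(SAMPLE) -> ["customer credit", "customer order"]
```

In a full pipeline, such noun phrases would seed SBVR vocabulary entries, with the verbs contributing candidate fact types.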
Secluded and Putative Flipped Dark Matter and Stueckelberg Extensions of the Standard Model
NASA Technical Reports Server (NTRS)
Fortes, E. C. F. S.; Pleitez, V.; Stecker, F. W.
2018-01-01
We consider here three dark matter models with the gauge symmetry of the standard model plus an additional local U(1)D factor. One model is truly secluded and the other two models begin flipped, but end up secluded. All of these models include one dark fermion and one vector boson that gains mass via the Stueckelberg mechanism. We show that the would-be flipped models provide an example of dark matter composed of "almost least interacting particles" (ALIPs). Such particles are therefore compatible with the constraints obtained from both laboratory measurements and astrophysical observations.
Secluded and putative flipped dark matter and Stueckelberg extensions of the standard model
NASA Astrophysics Data System (ADS)
Fortes, E. C. F. S.; Pleitez, V.; Stecker, F. W.
2018-02-01
We consider here three dark matter models with the gauge symmetry of the standard model plus an additional local U(1)D factor. One model is truly secluded and the other two models begin flipped, but end up secluded. All of these models include one dark fermion and one vector boson that gains mass via the Stueckelberg mechanism. We show that the would-be flipped models provide an example of dark matter composed of "almost least interacting particles" (ALIPs). Such particles are therefore compatible with the constraints obtained from both laboratory measurements and astrophysical observations.
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
40 CFR 86.097-9 - Emission standards for 1997 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
....097-9 Emission standards for 1997 and later model year light-duty trucks. (a)(1) Standards—(i) Light... standards. (ii) Heavy light-duty trucks. (A) Exhaust emissions from 1997 and later model year heavy light... model year light-duty trucks from compliance at low altitude with the emission standards set forth in...
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
40 CFR 1036.620 - Alternate CO2 standards based on model year 2011 compression-ignition engines.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Alternate CO2 standards based on model... the following criteria: (1) It must have been certified to all applicable emission standards in model... set and model year in which you certify engines to the standards of this section. You may not bank any...
Assessment of the Draft AIAA S-119 Flight Dynamic Model Exchange Standard
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Murri, Daniel G.; Hill, Melissa A.; Jessick, Matthew V.; Penn, John M.; Hasan, David A.; Crues, Edwin Z.; Falck, Robert D.; McCarthy, Thomas G.; Vuong, Nghia;
2011-01-01
An assessment of a draft AIAA standard for flight dynamics model exchange, ANSI/AIAA S-119-2011, was conducted on behalf of NASA by a team from the NASA Engineering and Safety Center. The assessment included adding the capability of importing standard models into real-time simulation facilities at several NASA Centers as well as into analysis simulation tools. All participants were successful at importing two example models into their respective simulation frameworks by using existing software libraries or by writing new import tools. Deficiencies in the libraries and format documentation were identified and fixed; suggestions for improvements to the standard were provided to the AIAA. An innovative tool to generate C code directly from such a model was developed. Performance of the software libraries compared favorably with compiled code. As a result of this assessment, several NASA Centers can now import standard models directly into their simulations. NASA is considering adopting the now-published S-119 standard as an internal recommended practice.
A Lifecycle Approach to Brokered Data Management for Hydrologic Modeling Data Using Open Standards.
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Booth, N.; Kunicki, T.; Walker, J.
2012-12-01
The U.S. Geological Survey Center for Integrated Data Analytics has formalized an information-management architecture to facilitate hydrologic modeling and subsequent decision support throughout a project's lifecycle. The architecture is based on open standards and open source software to decrease the adoption barrier and to build on existing, community supported software. The components of this system have been developed and evaluated to support data management activities of the interagency Great Lakes Restoration Initiative, Department of Interior's Climate Science Centers and WaterSmart National Water Census. Much of the research and development of this system has been in cooperation with international interoperability experiments conducted within the Open Geospatial Consortium. Community-developed standards and software, implemented to meet the unique requirements of specific disciplines, are used as a system of interoperable, discipline specific, data types and interfaces. This approach has allowed adoption of existing software that satisfies the majority of system requirements. Four major features of the system include: 1) assistance in model parameter and forcing creation from large enterprise data sources; 2) conversion of model results and calibrated parameters to standard formats, making them available via standard web services; 3) tracking a model's processes, inputs, and outputs as a cohesive metadata record, allowing provenance tracking via reference to web services; and 4) generalized decision support tools which rely on a suite of standard data types and interfaces, rather than particular manually curated model-derived datasets. Recent progress made in data and web service standards related to sensor and/or model derived station time series, dynamic web processing, and metadata management are central to this system's function and will be presented briefly along with a functional overview of the applications that make up the system. As the separate
V3885 Sagittarius: A Comparison With a Range of Standard Model Accretion Disks
2009-10-01
is greater than that in previous models. Blaes et al. (2006) show that magnetic support has a significant effect on synthetic spectra of black hole … Albert P. Linnell, Patrick Godon, Ivan … Ultraviolet Spectroscopic Explorer and Space Telescope Imaging Spectrograph spectra of V3885 Sagittarius, on an absolute flux basis, selects a model that
Adventures in model-building beyond the Standard Model and esoterica in six dimensions
NASA Astrophysics Data System (ADS)
Stone, David C.
This dissertation is most easily understood as two distinct periods of research. The first three chapters are dedicated to phenomenological interests in physics. An anomalous measurement of the top quark forward-backward asymmetry in both detectors at the Tevatron collider is explained by particle content from beyond the Standard Model. The extra field content is assumed to have originated from a grand unified group SU(5), and so only specific content may be added. Methods for spontaneously breaking the R-symmetry of supersymmetric theories, of phenomenological interest for any realistic supersymmetric model, are studied in the context of two-loop Coleman-Weinberg potentials. For a superpotential with a certain structure, which must include two different couplings, a robust method of spontaneously breaking the R-symmetry is established. The phenomenological studies conclude with an isospin analysis of B decays to kaons and pions. When the parameters of the analysis are fit to data, an enhancement of matrix elements in certain representations of isospin emerges. This is highly reminiscent of the infamous and unexplained enhancements seen in the K → ππ system. We conjecture that this enhancement may be a universal feature of the flavor group, isospin in this case, rather than of just the K → ππ system. The final two chapters approach the problem of counting degrees of freedom in quantum field theories. We examine the form of the Weyl anomaly in six dimensions with the Weyl consistency conditions. These consistency conditions impose constraints that lead to a candidate for the alpha-theorem in six dimensions. This candidate has all the properties that the equivalent theorems in two and four dimensions did, and, in fact, we show that in an even number of dimensions the form of the Euler density, the generalized Einstein tensor, and the Weyl transformations guarantee such a candidate exists. We go on to show that, unlike in two and four dimensions
Connecting dark matter annihilation to the vertex functions of Standard Model fermions
Kumar, Jason; Light, Christopher, E-mail: jkumar@hawaii.edu, E-mail: lightc@hawaii.edu
We consider scenarios in which dark matter is a Majorana fermion which couples to Standard Model fermions through the exchange of charged mediating particles. The matrix elements for various dark matter annihilation processes are then related to one-loop corrections to the fermion-photon vertex, where dark matter and the charged mediators run in the loop. In particular, in the limit where Standard Model fermion helicity mixing is suppressed, the cross section for dark matter annihilation to various final states is related to corrections to the Standard Model fermion charge form factor. These corrections can be extracted in a gauge-invariant manner from collider cross sections. Although current measurements from colliders are not precise enough to provide useful constraints on dark matter annihilation, improved measurements at future experiments, such as the International Linear Collider, could improve these constraints by several orders of magnitude, allowing them to surpass the limits obtainable by direct observation.
Parameter recovery, bias and standard errors in the linear ballistic accumulator model.
Visser, Ingmar; Poessé, Rens
2017-05-01
The linear ballistic accumulator (LBA) model (Brown & Heathcote, Cogn. Psychol., 57, 153) is increasingly popular in modelling response times from experimental data. An R package, glba, has been developed to fit the LBA model using maximum likelihood estimation, which is validated by means of a parameter recovery study. At sufficient sample sizes parameter recovery is good, whereas at smaller sample sizes there can be large bias in parameters. In a second simulation study, two methods for computing parameter standard errors are compared. The Hessian-based method is found to be adequate and is (much) faster than the alternative bootstrap method. The use of parameter standard errors in model selection and inference is illustrated in an example using data from an implicit learning experiment (Visser et al., Mem. Cogn., 35, 1502). It is shown that typical implicit learning effects are captured by different parameters of the LBA model. © 2017 The British Psychological Society.
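The race architecture the abstract describes is straightforward to simulate directly. Below is a minimal, hypothetical Python sketch of a two-accumulator LBA (illustrative only; it is not the glba package's implementation, and all parameter values are made up): each accumulator starts at a uniform point in [0, A], rises linearly at a normally distributed drift rate, and the first to reach threshold b determines the choice and response time.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_lba(n_trials, b=1.0, A=0.5, drifts=(1.0, 0.7), s=0.25, t0=0.2):
    """Simulate choices and response times from a two-accumulator LBA.

    Each accumulator starts at a point drawn uniformly from [0, A] and
    rises linearly at a drift rate drawn from N(v, s); the first to reach
    the threshold b wins. t0 is the non-decision time added to each RT.
    """
    n_acc = len(drifts)
    starts = rng.uniform(0.0, A, size=(n_trials, n_acc))
    rates = rng.normal(drifts, s, size=(n_trials, n_acc))
    rates = np.clip(rates, 1e-6, None)   # avoid non-terminating accumulators
    times = (b - starts) / rates         # time for each accumulator to reach b
    choice = times.argmin(axis=1)        # index of the winning accumulator
    rt = times.min(axis=1) + t0
    return choice, rt

choice, rt = simulate_lba(20000)
```

With the larger mean drift on the first accumulator, it should win the race on the majority of trials; simulated data of this kind is what a parameter recovery study would feed back into the fitting routine.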
Les Houches 2015: Physics at TeV Colliders Standard Model Working Group Report
Andersen, J.R.; et al.
This Report summarizes the proceedings of the 2015 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) the new PDF4LHC parton distributions, (III) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (IV) a host of phenomenological studies essential for comparing LHC data from Run I with theoretical predictions and projections for future measurements in Run II, and (V) new developments in Monte Carlo event generators.
Flavour-changing neutral currents making and breaking the standard model.
Archilli, F; Bettler, M-O; Owen, P; Petridis, K A
2017-06-07
The standard model of particle physics is our best description yet of fundamental particles and their interactions, but it is known to be incomplete. As yet undiscovered particles and interactions might exist. One of the most powerful ways to search for new particles is by studying processes known as flavour-changing neutral current decays, whereby a quark changes its flavour without altering its electric charge. One example of such a transition is the decay of a beauty quark into a strange quark. Here we review some intriguing anomalies in these decays, which have revealed potential cracks in the standard model, hinting at the existence of new phenomena.
Performance of preproduction model cesium beam frequency standards for spacecraft applications
NASA Technical Reports Server (NTRS)
Levine, M. W.
1978-01-01
A cesium beam frequency standard for spaceflight application on Navigation Development Satellites was designed and fabricated, and preliminary testing was completed. The cesium standard evolved from an earlier prototype model launched aboard NTS-2 and from the engineering development model to be launched aboard NTS satellites during 1979. A number of design innovations, including a hybrid analog/digital integrator and the replacement of analog filters and phase detectors by clocked digital sampling techniques, are discussed. Thermal and thermal-vacuum testing was concluded, and test data are presented. Stability data for averaging intervals of 10 to 10,000 seconds, measured under laboratory conditions, are shown.
Les Houches 2017: Physics at TeV Colliders Standard Model Working Group Report
Andersen, J.R.; et al.
This Report summarizes the proceedings of the 2017 Les Houches workshop on Physics at TeV Colliders. Session 1 dealt with (I) new developments relevant for high precision Standard Model calculations, (II) theoretical uncertainties and dataset dependence of parton distribution functions, (III) new developments in jet substructure techniques, (IV) issues in the theoretical description of the production of Standard Model Higgs bosons and how to relate experimental measurements, (V) phenomenological studies essential for comparing LHC data from Run II with theoretical predictions and projections for future measurements, and (VI) new developments in Monte Carlo event generators.
Blowout Jets: Hinode X-Ray Jets that Don't Fit the Standard Model
NASA Technical Reports Server (NTRS)
Moore, Ronald L.; Cirtain, Jonathan W.; Sterling, Alphonse C.; Falconer, David A.
2010-01-01
Nearly half of all H-alpha macrospicules in polar coronal holes appear to be miniature filament eruptions. This suggests that there is a large class of X-ray jets in which the jet-base magnetic arcade undergoes a blowout eruption as in a CME, instead of remaining static as in most solar X-ray jets, the standard jets that fit the model advocated by Shibata. Along with a cartoon depicting the standard model, we present a cartoon depicting the signatures expected of blowout jets in coronal X-ray images. From Hinode/XRT movies and STEREO/EUVI snapshots in polar coronal holes, we present examples of (1) X-ray jets that fit the standard model, and (2) X-ray jets that do not fit the standard model but do have features appropriate for blowout jets. These features are (1) a flare arcade inside the jet-base arcade in addition to the small flare arcade (bright point) outside that standard jets have, (2) a filament of cool (T ≈ 80,000 K) plasma that erupts from the core of the jet-base arcade, and (3) an extra jet strand that should not be made by the reconnection for standard jets but could be made by reconnection between the ambient unipolar open field and the opposite-polarity leg of the filament-carrying flux-rope core field of the erupting jet-base arcade. We therefore infer that these non-standard jets are blowout jets, jets made by miniature versions of the sheared-core-arcade eruptions that make CMEs.
[Research model on commodity specification standard of radix Chinese materia medica].
Kang, Chuan-Zhi; Zhou, Tao; Jiang, Wei-Ke; Huang, Lu-Qi; Guo, Lan-Ping
2016-03-01
As an important part of market commodity circulation, the commodity grade standard of traditional Chinese medicine is vital for regulating market order and guaranteeing the quality of medicinal materials. The State Council's plan for the "protection and development of Chinese herbal medicine (2015-2020)" also identifies as an important task improving circulation norms for the Chinese herbal medicine industry and the commodity specification standards of common traditional Chinese medicinal materials. However, for radix herbs, a large class of Chinese herbal medicines, the grade standards in market circulation are particularly confused, and the development of such standards lacks a sound research model. This paper therefore summarizes the research background, the present situation and its problems, and several key points of commodity specification and grade standards for radix herbs. A research model is then introduced, using Pseudostellariae Radix as an example, to provide technical support and a reference for formulating commodity specification and grade standards for other radix traditional Chinese medicinal materials. Copyright© by the Chinese Pharmaceutical Association.
Visually guided tube thoracostomy insertion comparison to standard of care in a large animal model.
Hernandez, Matthew C; Vogelsang, David; Anderson, Jeff R; Thiels, Cornelius A; Beilman, Gregory; Zielinski, Martin D; Aho, Johnathon M
2017-04-01
Tube thoracostomy (TT) is a lifesaving procedure for a variety of thoracic pathologies. The most commonly utilized method for placement involves open dissection and blind insertion. Image-guided placement is commonly utilized but is limited by an inability to see the distal placement location. Unfortunately, TT is not without complications. We aim to demonstrate the feasibility of a disposable device that allows visually directed TT placement, compared to the standard of care, in a large animal model. Three swine were sequentially orotracheally intubated and anesthetized. TT was conducted utilizing a novel visualization device, the tube thoracostomy visual trocar (TTVT), and the standard of care (open technique). The position of the TT in the chest cavity was recorded using direct thoracoscopic inspection and radiographic imaging, with the operator blinded to results. Complications were evaluated using a validated complication grading system. Standard descriptive statistical analyses were performed. Thirty TT were placed, 15 using the TTVT technique and 15 using the standard of care open technique. All of the TT placed using TTVT were without complication and in optimal position. Conversely, 27% of TT placed using the standard of care open technique resulted in complications. Necropsy revealed no injury to intrathoracic organs. Visually directed TT placement using TTVT is feasible and non-inferior to the standard of care in a large animal model. This improvement in instrumentation has the potential to greatly improve the safety of TT. Further study in humans is required. Therapeutic Level II. Copyright © 2017 Elsevier Ltd. All rights reserved.
230Th-234U Model-Ages of Some Uranium Standard Reference Materials
Williams, R W; Gaffney, A M; Kristo, M J
The 'age' of a sample of uranium is an important aspect of a nuclear forensic investigation and of the attribution of the material to its source. To the extent that the sample obeys the standard rules of radiochronometry, the production ages of even very recent material can be determined using the 230Th-234U chronometer. These standard rules may be summarized as (a) the daughter/parent ratio at time zero must be known, and (b) there has been no daughter/parent fractionation since production. For most samples of uranium, the 'ages' determined using this chronometer are semantically 'model-ages' because (a) some assumption of the initial 230Th content in the sample is required and (b) closed-system behavior is assumed. The uranium standard reference materials originally prepared and distributed by the former US National Bureau of Standards and now distributed by New Brunswick Laboratory as certified reference materials (NBS SRM = NBL CRM) are good candidates for samples where both rules are met. The U isotopic standards have known purification and production dates, and closed-system behavior in the solid form (U3O8) may be assumed with confidence. We present here 230Th-234U model-ages for several of these standards, determined by isotope dilution mass spectrometry using a multicollector ICP-MS, and compare these ages with their known production history.
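Under the two standard rules stated above (zero initial 230Th and closed-system behavior), the model age follows in closed form from the Bateman ingrowth equation, which can be inverted directly. The sketch below uses approximate half-lives (~75.6 kyr for 230Th, ~245.5 kyr for 234U); the exact decay constants adopted by a laboratory would differ slightly, so treat the numbers as illustrative.

```python
import math

# Approximate decay constants (1/yr) from half-lives of ~75,584 yr (230Th)
# and ~245,500 yr (234U); illustrative values, not evaluated standards.
LAMBDA_230 = math.log(2) / 75_584.0
LAMBDA_234 = math.log(2) / 245_500.0

def atom_ratio_at(t_years):
    """Forward model: 230Th/234U atom ratio after t years of ingrowth,
    assuming zero initial 230Th (Bateman equation)."""
    dl = LAMBDA_230 - LAMBDA_234
    return LAMBDA_234 / dl * (1.0 - math.exp(-dl * t_years))

def th230_u234_model_age(atom_ratio):
    """Model age (years) from a measured 230Th/234U atom ratio, inverting
        N230/N234 = lam234/(lam230-lam234) * (1 - exp(-(lam230-lam234)*t))
    under the two 'standard rules' of radiochronometry."""
    dl = LAMBDA_230 - LAMBDA_234
    return -math.log(1.0 - atom_ratio * dl / LAMBDA_234) / dl
```

A round trip through the forward model and its inverse recovers the input age, which is the basic consistency check before applying the chronometer to measured isotope-dilution data.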
Geo3DML: A standard-based exchange format for 3D geological models
NASA Astrophysics Data System (ADS)
Wang, Zhangang; Qu, Honggang; Wu, Zixing; Wang, Xianghong
2018-01-01
A geological model (geomodel) in three-dimensional (3D) space is a digital representation of the Earth's subsurface, recognized by geologists and stored in resultant geological data (geodata). The increasing demand for data management and interoperable applications of geomodels can be addressed by developing standard-based exchange formats for the representation of not only a single geological object, but also holistic geomodels. However, current standards such as GeoSciML cannot incorporate all the geomodel-related information. This paper presents Geo3DML for the exchange of 3D geomodels based on the existing Open Geospatial Consortium (OGC) standards. Geo3DML is based on a unified and formal representation of structural models, attribute models and hierarchical structures of interpreted resultant geodata in different dimensional views, including drills, cross-sections/geomaps and 3D models, which is compatible with the conceptual model of GeoSciML. Geo3DML aims to encode all geomodel-related information integrally in one framework, including the semantic and geometric information of geoobjects and their relationships, as well as visual information. At present, Geo3DML and some supporting tools have been released as a data-exchange standard by the China Geological Survey (CGS).
NASA Technical Reports Server (NTRS)
Avila, Arturo
2011-01-01
Standard JPL thermal engineering practice prescribes worst-case methodologies for design. In this process, environmental and key uncertain thermal parameters (e.g., thermal blanket performance, interface conductance, optical properties) are stacked in a worst-case fashion to yield the most hot- or cold-biased temperature. These simulations thus represent the upper and lower bounds, and this effectively constitutes JPL's thermal design margin philosophy. Uncertainty in the margins and the absolute temperatures is usually estimated by sensitivity analyses and/or by comparing the worst-case results with "expected" results. Applicability of the analytical model for specific design purposes, along with any temperature requirement violations, is documented in peer and project design review material. In 2008, NASA released NASA-STD-7009, Standard for Models and Simulations. The scope of this standard covers the development and maintenance of models, the operation of simulations, the analysis of the results, training, recommended practices, the assessment of Modeling and Simulation (M&S) credibility, and the reporting of M&S results. The Mars Exploration Rover (MER) project thermal control system M&S activity was chosen as a case study to determine whether JPL practice is in line with the standard and to identify areas of non-compliance. This paper summarizes the results and makes recommendations regarding the application of this standard to JPL thermal M&S practices.
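The worst-case stacking idea can be sketched as evaluating a thermal model at every corner of the uncertain-parameter box and taking the extremes as the cold- and hot-biased bounds. The following is a hypothetical toy illustration of that idea, not JPL's actual tooling; the model, parameter names, and ranges are all invented.

```python
from itertools import product

def worst_case_bounds(model, param_ranges):
    """Evaluate `model` at every corner of the uncertain-parameter box and
    return (min, max) predicted temperatures, i.e. the cold- and hot-biased
    cases.  `model` is any callable taking the parameters as keyword
    arguments; `param_ranges` maps parameter name -> (low, high)."""
    names = list(param_ranges)
    corners = product(*(param_ranges[n] for n in names))
    temps = [model(**dict(zip(names, c))) for c in corners]
    return min(temps), max(temps)

# Toy linear model: temperature rises with interface conductance and
# falls with surface emissivity (made-up coefficients).
toy = lambda conductance, emissivity: 20.0 + 30.0 * conductance - 25.0 * emissivity

cold, hot = worst_case_bounds(toy, {"conductance": (0.5, 1.5),
                                    "emissivity": (0.8, 0.9)})
```

For a monotonic model the corners are sufficient; for a non-monotonic model the true extremes can lie inside the box, which is one reason sensitivity analyses complement the corner stacking.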
Synthesis of β-Peptide Standards for Use in Model Prebiotic Reactions
NASA Astrophysics Data System (ADS)
Forsythe, Jay G.; English, Sloane L.; Simoneaux, Rachel E.; Weber, Arthur L.
2018-05-01
A one-pot method was developed for the preparation of a series of β-alanine standards of moderate size (2 to ≥12 residues) for studies concerning the prebiotic origins of peptides. The one-pot synthesis involved two sequential reactions: (1) dry-down self-condensation of β-alanine methyl ester, yielding β-alanine peptide methyl ester oligomers, and (2) subsequent hydrolysis of β-alanine peptide methyl ester oligomers, producing a series of β-alanine peptide standards. These standards were then spiked into a model prebiotic product mixture to confirm by HPLC the formation of β-alanine peptides under plausible reaction conditions. The simplicity of this approach suggests it can be used to prepare a variety of β-peptide standards for investigating differences between α- and β-peptides in the context of prebiotic chemistry.
Standards in Modeling and Simulation: The Next Ten Years MODSIM World Paper 2010
NASA Technical Reports Server (NTRS)
Collins, Andrew J.; Diallo, Saikou; Sherfey, Solomon R.; Tolk, Andreas; Turnitsa, Charles D.; Petty, Mikel; Wiesel, Eric
2011-01-01
The world has moved on since the introduction of the Distributed Interactive Simulation (DIS) standard in the early 1980s. The Cold War may be over, but there is still a requirement to train for and analyze the next generation of threats that face the free world. The emergence of new and more powerful computer technology and techniques means that modeling and simulation (M&S) has become an important and growing part of satisfying this requirement. As an industry grows, the benefits from standardization within that industry grow with it. For example, it is difficult to imagine what the USA would be like without the 110-volt standard for domestic electricity supply. This paper contains an overview of the outcomes from a recent workshop to investigate the possible future of M&S standards within the federal government.
NASA Astrophysics Data System (ADS)
Naggary, Schabnam; Brinkmann, Ralf Peter
2015-09-01
The characteristics of radio frequency (RF) modulated plasma boundary sheaths are studied on the basis of the so-called "standard sheath model." This model assumes that the applied radio frequency ωRF is larger than the plasma frequency of the ions but smaller than that of the electrons. It comprises a phase-averaged ion model, consisting of an equation of continuity (with ionization neglected) and an equation of motion (with collisional ion-neutral interaction taken into account); a phase-resolved electron model, consisting of an equation of continuity and the assumption of Boltzmann equilibrium; and Poisson's equation for the electric field. Previous investigations have studied the standard sheath model under additional approximations, most notably the assumption of a step-like electron front. This contribution presents an investigation and parameter study of the standard sheath model which avoids any further assumptions. The resulting density profiles and overall charge-voltage characteristics are compared with those of the step-model based theories. The authors gratefully acknowledge Efe Kemaneci for helpful comments and fruitful discussions.
Personalized-detailed clinical model for data interoperability among clinical standards.
Khan, Wajahat Ali; Hussain, Maqbool; Afzal, Muhammad; Amin, Muhammad Bilal; Saleem, Muhammad Aamir; Lee, Sungyoung
2013-08-01
Data interoperability among health information exchange (HIE) systems is a major concern for healthcare practitioners to enable provisioning of telemedicine-related services. Heterogeneity exists in these systems not only at the data level but also among different heterogeneous healthcare standards with which these are compliant. The relationship between healthcare organization data and different heterogeneous standards is necessary to achieve the goal of data level interoperability. We propose a personalized-detailed clinical model (P-DCM) approach for the generation of customized mappings that creates the necessary linkage between organization-conformed healthcare standards concepts and clinical model concepts to ensure data interoperability among HIE systems. We consider electronic health record (EHR) standards, openEHR, and HL7 CDA instances transformation using P-DCM. P-DCM concepts associated with openEHR and HL7 CDA help in transformation of instances among these standards. We investigated two datasets: (1) data of 100 diabetic patients, including 50 each of type 1 and type 2, from a local hospital in Korea and (2) data of a single Alzheimer's disease patient. P-DCMs were created for both scenarios, which provided the basis for deriving instances for HL7 CDA and openEHR standards. For proof of concept, we present case studies of encounter information for type 2 diabetes mellitus patients and monitoring of daily routine activities of an Alzheimer's disease patient. These reflect P-DCM-based customized mappings generation with openEHR and HL7 CDA standards. Customized mappings are generated based on the relationship of P-DCM concepts with CDA and openEHR concepts. The objective of this work is to achieve semantic data interoperability among heterogeneous standards. This would lead to effective utilization of resources and allow timely information exchange among healthcare systems.
Robust geographically weighted regression of modeling the Air Polluter Standard Index (APSI)
NASA Astrophysics Data System (ADS)
Warsito, Budi; Yasin, Hasbi; Ispriyanti, Dwi; Hoyyi, Abdul
2018-05-01
The Geographically Weighted Regression (GWR) model has been widely applied in many practical fields to explore spatial heterogeneity of a regression model. However, the method is inherently not robust to outliers. Outliers commonly exist in data sets and may lead to a distorted estimate of the underlying regression model. One solution for handling outliers in a regression model is to use a robust model, here called Robust Geographically Weighted Regression (RGWR). This research aims to aid the government in policy making related to air pollution mitigation by developing a standard index model for air pollution (Air Polluter Standard Index - APSI) based on the RGWR approach. We also consider seven variables that are directly related to the air pollution level: traffic velocity, population density, the business center aspect, air humidity, wind velocity, air temperature, and the area of urban forest. The best model is determined by the smallest AIC value. There are significant differences between ordinary regression and RGWR in this case, but basic GWR using the Gaussian kernel is the best model for the APSI because it has the smallest AIC.
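Basic GWR, the baseline the abstract compares against, fits a separate weighted least-squares regression at each location, with Gaussian-kernel weights that decay with distance. Below is a minimal sketch of that procedure (the synthetic data, bandwidth, and variable count are invented for illustration; the paper's robust variant would additionally downweight outlying observations).

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Basic GWR with a Gaussian kernel: at each location i, solve a
    weighted least-squares problem with weights
        w_ij = exp(-0.5 * (d_ij / bandwidth)**2),
    where d_ij is the distance between locations i and j.
    Returns one (intercept, slope, ...) vector per location."""
    Xd = np.column_stack([np.ones(len(X)), X])   # add intercept column
    betas = np.empty((len(coords), Xd.shape[1]))
    for i, ci in enumerate(coords):
        d = np.linalg.norm(coords - ci, axis=1)  # distances to location i
        w = np.exp(-0.5 * (d / bandwidth) ** 2)  # Gaussian kernel weights
        W = np.diag(w)
        betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return betas

# Synthetic check: a spatially constant relationship y = 2 + 3x + noise
# should be recovered at every location.
rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(50, 2))
x = rng.normal(size=50)
y = 2.0 + 3.0 * x + 0.01 * rng.normal(size=50)
betas = gwr_coefficients(coords, x, y, bandwidth=5.0)
```

When the data contain outliers, these local least-squares fits are distorted, which is the motivation for the robust (RGWR) estimator the abstract advocates.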
H∞ output tracking control of discrete-time nonlinear systems via standard neural network models.
Liu, Meiqin; Zhang, Senlin; Chen, Haiyang; Sheng, Weihua
2014-10-01
This brief proposes an output tracking control for a class of discrete-time nonlinear systems with disturbances. A standard neural network model is used to represent discrete-time nonlinear systems whose nonlinearity satisfies the sector conditions. H∞ control performance for the closed-loop system, comprising the standard neural network model, the reference model, and the state feedback controller, is analyzed using the Lyapunov-Krasovskii stability theorem and the linear matrix inequality (LMI) approach. The H∞ controller, whose parameters are obtained by solving LMIs, guarantees that the output of the closed-loop system closely tracks the output of a given reference model and reduces the influence of disturbances on the tracking error. Three numerical examples are provided to show the effectiveness of the proposed H∞ output tracking design approach.
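The Lyapunov machinery underlying such LMI designs can be illustrated on a toy linear stand-in (not the paper's neural-network model): a state-feedback gain stabilizes x[k+1] = (A - BK)x[k] exactly when the discrete Lyapunov equation Acl' P Acl - P = -I admits a positive definite solution P. The matrices and gain below are invented for illustration.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def closed_loop_stable(A, B, K):
    """Check discrete-time closed-loop stability of x[k+1] = (A - B K) x[k]
    by solving Acl' P Acl - P = -I and testing whether P > 0.  This is the
    Lyapunov-equation analogue of the LMI feasibility test."""
    Acl = A - B @ K
    P = solve_discrete_lyapunov(Acl.T, np.eye(A.shape[0]))
    # P is symmetric up to round-off; symmetrize before the eigen test.
    return bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))

A = np.array([[1.2, 0.1],
              [0.0, 0.9]])   # open-loop unstable (eigenvalue 1.2)
B = np.array([[1.0],
              [0.5]])
K = np.array([[0.5, 0.1]])   # hypothetical stabilizing feedback gain
```

With this gain the closed-loop eigenvalues move inside the unit circle, so the test succeeds; with K = 0 the open-loop instability makes it fail. The full H∞ synthesis replaces the identity right-hand side with performance-weighted LMI terms.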
Johnsen, David C; Williams, John N; Baughman, Pauletta Gay; Roesch, Darren M; Feldman, Cecile A
2015-10-01
This opinion article applauds the recent introduction of a new dental accreditation standard addressing critical thinking and problem-solving, but expresses a need for additional means for dental schools to demonstrate they are meeting the new standard because articulated outcomes, learning models, and assessments of competence are still being developed. Validated, research-based learning models are needed to define reference points against which schools can design and assess the education they provide to their students. This article presents one possible learning model for this purpose and calls for national experts from within and outside dental education to develop models that will help schools define outcomes and assess performance in educating their students to become practitioners who are effective critical thinkers and problem-solvers.
Sairam, K; Dorababu, M; Goel, R K; Bhattacharya, S K
2002-04-01
Bacopa monniera Wettst. (syn. Herpestis monniera L.; Scrophulariaceae) is a commonly used Ayurvedic drug for mental disorders. The standardized extract was reported earlier to have a significant anti-oxidant effect and anxiolytic activity and to improve memory retention in Alzheimer's disease. Presently, the standardized methanolic extract of Bacopa monniera (bacoside A - 38.0+/-0.9) was investigated for potential antidepressant activity in rodent models of depression. The effect was compared with the standard antidepressant drug imipramine (15 mg/kg, ip). The extract, given at doses of 20 and 40 mg/kg orally once daily for 5 days, was found to have significant antidepressant activity in the forced swim and learned helplessness models of depression, comparable to that of imipramine.
40 CFR 86.099-8 - Emission standards for 1999 and later model year light-duty vehicles.
Code of Federal Regulations, 2013 CFR
2013-07-01
....099-8 Emission standards for 1999 and later model year light-duty vehicles. (a)(1)(i)-(ii) [Reserved]... schedule of table A99-08 of this section for model year 1999. For small volume manufacturers, the standards...
40 CFR 85.2203 - Short test standards for 1981 and later model year light-duty vehicles.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Control System Performance Warranty Short Tests § 85.2203 Short test standards for 1981 and later model... 1982 and later model year vehicles at high altitude to which high altitude certification standards of 1...
40 CFR 85.2203 - Short test standards for 1981 and later model year light-duty vehicles.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Control System Performance Warranty Short Tests § 85.2203 Short test standards for 1981 and later model... 1982 and later model year vehicles at high altitude to which high altitude certification standards of 1...
40 CFR 85.2203 - Short test standards for 1981 and later model year light-duty vehicles.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Control System Performance Warranty Short Tests § 85.2203 Short test standards for 1981 and later model... 1982 and later model year vehicles at high altitude to which high altitude certification standards of 1...
40 CFR 86.001-9 - Emission standards for 2001 and later model year light-duty trucks
Code of Federal Regulations, 2010 CFR
2010-07-01
....001-9 Emission standards for 2001 and later model year light-duty trucks. Section 86.001-9 includes... for 2001 and later model years, and shall not exceed the standards described in paragraph (d)(1) of...
40 CFR 86.001-9 - Emission standards for 2001 and later model year light-duty trucks
Code of Federal Regulations, 2012 CFR
2012-07-01
....001-9 Emission standards for 2001 and later model year light-duty trucks. Section 86.001-9 includes... for 2001 and later model years, and shall not exceed the standards described in paragraph (d)(1) of...
40 CFR 86.001-9 - Emission standards for 2001 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
....001-9 Emission standards for 2001 and later model year light-duty trucks. Section 86.001-9 includes... for 2001 and later model years, and shall not exceed the standards described in paragraph (d)(1) of...
40 CFR 85.2204 - Short test standards for 1981 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Control System Performance Warranty Short Tests § 85.2204 Short test standards for 1981 and later model... later model year trucks at high altitude to which high altitude certification standards of 2.0 g/mile HC... 40 Protection of Environment 19 2013-07-01 2013-07-01 false Short test standards for 1981 and...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Emission Standards for 2008 Model Year..., Subpt. IIII, Table 2 Table 2 to Subpart IIII of Part 60—Emission Standards for 2008 Model Year and Later... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY...
40 CFR 85.2204 - Short test standards for 1981 and later model year light-duty trucks.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Control System Performance Warranty Short Tests § 85.2204 Short test standards for 1981 and later model... later model year trucks at high altitude to which high altitude certification standards of 2.0 g/mile HC... 40 Protection of Environment 18 2011-07-01 2011-07-01 false Short test standards for 1981 and...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 7 2013-07-01 2013-07-01 false Emission Standards for 2008 Model Year..., Subpt. IIII, Table 2 Table 2 to Subpart IIII of Part 60—Emission Standards for 2008 Model Year and Later... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Emission Standards for 2008 Model Year..., Subpt. IIII, Table 2 Table 2 to Subpart IIII of Part 60—Emission Standards for 2008 Model Year and Later... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY...
40 CFR 85.2204 - Short test standards for 1981 and later model year light-duty trucks.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Control System Performance Warranty Short Tests § 85.2204 Short test standards for 1981 and later model... later model year trucks at high altitude to which high altitude certification standards of 2.0 g/mile HC... 40 Protection of Environment 19 2012-07-01 2012-07-01 false Short test standards for 1981 and...
40 CFR 86.099-8 - Emission standards for 1999 and later model year light-duty vehicles.
Code of Federal Regulations, 2011 CFR
2011-07-01
....099-8 Emission standards for 1999 and later model year light-duty vehicles. (a)(1)(i)-(ii) [Reserved... schedule of table A99-08 of this section for model year 1999. For small volume manufacturers, the standards... 40 Protection of Environment 18 2011-07-01 2011-07-01 false Emission standards for 1999 and later...
40 CFR 86.001-9 - Emission standards for 2001 and later model year light-duty trucks
Code of Federal Regulations, 2011 CFR
2011-07-01
....001-9 Emission standards for 2001 and later model year light-duty trucks Section 86.001-9 includes... for 2001 and later model years, and shall not exceed the standards described in paragraph (d)(1) of... 40 Protection of Environment 18 2011-07-01 2011-07-01 false Emission standards for 2001 and later...
40 CFR 85.2203 - Short test standards for 1981 and later model year light-duty vehicles.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Control System Performance Warranty Short Tests § 85.2203 Short test standards for 1981 and later model... 1982 and later model year vehicles at high altitude to which high altitude certification standards of 1... 40 Protection of Environment 19 2013-07-01 2013-07-01 false Short test standards for 1981 and...
40 CFR 86.099-8 - Emission standards for 1999 and later model year light-duty vehicles.
Code of Federal Regulations, 2012 CFR
2012-07-01
....099-8 Emission standards for 1999 and later model year light-duty vehicles. (a)(1)(i)-(ii) [Reserved... schedule of table A99-08 of this section for model year 1999. For small volume manufacturers, the standards... 40 Protection of Environment 19 2012-07-01 2012-07-01 false Emission standards for 1999 and later...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 7 2012-07-01 2012-07-01 false Emission Standards for 2008 Model Year..., Subpt. IIII, Table 2 Table 2 to Subpart IIII of Part 60—Emission Standards for 2008 Model Year and Later... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY...
40 CFR 85.2204 - Short test standards for 1981 and later model year light-duty trucks.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Control System Performance Warranty Short Tests § 85.2204 Short test standards for 1981 and later model... later model year trucks at high altitude to which high altitude certification standards of 2.0 g/mile HC... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Short test standards for 1981 and...
40 CFR 86.099-8 - Emission standards for 1999 and later model year light-duty vehicles.
Code of Federal Regulations, 2010 CFR
2010-07-01
....099-8 Emission standards for 1999 and later model year light-duty vehicles. (a)(1)(i)-(ii) [Reserved... schedule of table A99-08 of this section for model year 1999. For small volume manufacturers, the standards... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Emission standards for 1999 and later...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 7 2014-07-01 2014-07-01 false Emission Standards for 2008 Model Year..., Subpt. IIII, Table 2 Table 2 to Subpart IIII of Part 60—Emission Standards for 2008 Model Year and Later... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) STANDARDS OF PERFORMANCE FOR NEW STATIONARY...
Hydrogen maser frequency standard computer model for automatic cavity tuning servo simulations
NASA Technical Reports Server (NTRS)
Potter, P. D.; Finnie, C.
1978-01-01
A computer model of the JPL hydrogen maser frequency standard was developed. This model allows frequency stability data to be generated, as a function of various maser parameters, many orders of magnitude faster than these data can be obtained by experimental test. In particular, the maser performance as a function of the various automatic tuning servo parameters may be readily determined. Areas of discussion include noise sources, first-order autotuner loop, second-order autotuner loop, and a comparison of the loops.
Teacher Leader Model Standards and the Functions Assumed by National Board Certified Teachers
ERIC Educational Resources Information Center
Swan Dagen, Allison; Morewood, Aimee; Smith, Megan L.
2017-01-01
The Teacher Leader Model Standards (TLMS) were created to stimulate discussion around the leadership responsibilities teachers assume in schools. This study used the TLMS to gauge the self-reported leadership responsibilities of National Board Certified Teachers (NBCTs). The NBCTs reported engaging in all domains of the TLMS, most frequently with…
Beyond standard model searches in the MiniBooNE experiment
Katori, Teppei; Conrad, Janet M.
2014-08-05
The MiniBooNE experiment has contributed substantially to beyond standard model searches in the neutrino sector. The experiment was originally designed to test the Δm² ~ 1 eV² region of the sterile neutrino hypothesis by observing νe (ν̄e) charged current quasielastic signals from a νμ (ν̄μ) beam. MiniBooNE observed excesses of νe and ν̄e candidate events in neutrino and antineutrino mode, respectively. To date, these excesses have not been explained within the neutrino standard model (νSM), the standard model extended for three massive neutrinos. Confirmation is required by future experiments such as MicroBooNE. MiniBooNE also provided an opportunity for precision studies of Lorentz violation. The results set strict limits for the first time on several parameters of the standard-model extension, the generic formalism for considering Lorentz violation. Most recently, an extension to MiniBooNE running, with a beam tuned in beam-dump mode, is being performed to search for dark sector particles. This review describes these studies, demonstrating that short baseline neutrino experiments are rich environments for new physics searches.
Existence of standard models of conic fibrations over non-algebraically-closed fields
Avilov, A A
2014-12-31
We prove an analogue of Sarkisov's theorem on the existence of a standard model of a conic fibration over an algebraically closed field of characteristic different from two for three-dimensional conic fibrations over an arbitrary field of characteristic zero with an action of a finite group. Bibliography: 16 titles.
Value Added Models and the Implementation of the National Standards of K-12 Physical Education
ERIC Educational Resources Information Center
Seymour, Clancy M.; Garrison, Mark J.
2017-01-01
The implementation of value-added models of teacher evaluation continues to expand in public education, but the effects of using student test scores to evaluate K-12 physical educators necessitate further discussion. Using the five National Standards for K-12 Physical Education from the Society of Health and Physical Educators America (SHAPE),…
Use of Standard Deviations as Predictors in Models Using Large-Scale International Data Sets
ERIC Educational Resources Information Center
Austin, Bruce; French, Brian; Adesope, Olusola; Gotch, Chad
2017-01-01
Measures of variability are successfully used in predictive modeling in research areas outside of education. This study examined how standard deviations can be used to address research questions not easily addressed using traditional measures such as group means based on index variables. Student survey data were obtained from the Organisation for…
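As a minimal sketch of the idea described above, a group-level standard deviation can sit alongside the group mean as a predictor in an ordinary regression. Everything below (data, weights, group sizes) is invented for illustration and is not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up example: predict a group-level outcome from the mean AND the
# standard deviation of a within-group survey score.
n_groups, n_students = 200, 30
true_spread = rng.uniform(0.5, 2.0, n_groups)     # within-group SDs
true_mean = rng.normal(50.0, 5.0, n_groups)       # within-group means
scores = rng.normal(true_mean[:, None], true_spread[:, None],
                    (n_groups, n_students))

# Outcome depends on both level and variability (weights are invented)
outcome = 0.4 * true_mean + 1.5 * true_spread + rng.normal(0.0, 0.1, n_groups)

# Design matrix: intercept, observed group mean, observed group SD
X = np.column_stack([np.ones(n_groups),
                     scores.mean(axis=1),
                     scores.std(axis=1, ddof=1)])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(beta)   # intercept, mean coefficient, SD coefficient
```

Note that because the observed SDs are noisy estimates of the true spreads, the fitted SD coefficient is somewhat attenuated toward zero relative to the generating weight.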
Application of a Mixed Consequential Ethical Model to a Problem Regarding Test Standards.
ERIC Educational Resources Information Center
Busch, John Christian
The work of the ethicist Charles Curran and the problem-solving strategy of the mixed consequentialist ethical model are applied to a traditional social science measurement problem--that of how to adjust a recommended standard in order to be fair to the test-taker and society. The focus is on criterion-referenced teacher certification tests.…
ERIC Educational Resources Information Center
Chou, Yeh-Tai; Wang, Wen-Chung
2010-01-01
Dimensionality is an important assumption in item response theory (IRT). Principal component analysis on standardized residuals has been used to check dimensionality, especially under the family of Rasch models. It has been suggested that a first eigenvalue greater than 1.5 signifies a violation of unidimensionality when there…
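A hedged sketch of the check described here: in real use the residual matrix would come from a fitted Rasch model, but synthetic noise (with and without a nuisance dimension) is enough to illustrate the first-eigenvalue criterion:

```python
import numpy as np

rng = np.random.default_rng(1)
n_persons, n_items = 2000, 20

# Case 1: unidimensional data -> residuals behave like pure noise
noise = rng.normal(size=(n_persons, n_items))

# Case 2: a nuisance dimension contaminates half the items
factor = rng.normal(size=(n_persons, 1))
contaminated = noise.copy()
contaminated[:, 10:] += 0.5 * factor

def first_eigenvalue(residuals):
    # PCA on standardized residuals amounts to an eigendecomposition
    # of their correlation matrix
    corr = np.corrcoef(residuals, rowvar=False)
    return np.linalg.eigvalsh(corr)[-1]

print(first_eigenvalue(noise))         # below 1.5: no violation flagged
print(first_eigenvalue(contaminated))  # well above 1.5: multidimensionality
```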
ERIC Educational Resources Information Center
Kulgemeyer, Christoph; Schecker, Horst
2014-01-01
This paper gives an overview of research on modelling science competence in German science education. Since the first national German educational standards for physics, chemistry and biology education were released in 2004 research projects dealing with competences have become prominent strands. Most of this research is about the structure of…
Gauge coupling beta functions in the standard model to three loops.
Mihaila, Luminita N; Salomon, Jens; Steinhauser, Matthias
2012-04-13
In this Letter, we compute the three-loop corrections to the beta functions of the three gauge couplings in the standard model of particle physics using the minimal subtraction scheme and taking into account Yukawa and Higgs self-couplings.
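For context, the one-loop running that the three-loop calculation refines can be sketched in a few lines. The coefficients below are the standard one-loop SM values (with g1 in GUT normalization); the inputs at the Z mass are approximate, and this is of course a one-loop sketch, not the paper's three-loop result:

```python
import numpy as np

# One-loop SM beta-function coefficients (g1 in SU(5)/GUT normalization)
b = np.array([41 / 10, -19 / 6, -7])

# Approximate inverse couplings alpha_i^{-1} at the Z mass (illustrative inputs)
alpha_inv_mz = np.array([59.0, 29.6, 8.5])
MZ = 91.19  # GeV

def alpha_inv(mu_gev):
    """One-loop running: d(alpha_i^-1)/d ln(mu) = -b_i / (2 pi)."""
    return alpha_inv_mz - b / (2 * np.pi) * np.log(mu_gev / MZ)

for mu in (1e3, 1e10, 1e16):
    print(mu, alpha_inv(mu))
```

The signs of the coefficients reproduce the familiar qualitative behavior: the strong coupling shrinks with energy (asymptotic freedom) while the U(1) coupling grows.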
Progress in the improved lattice calculation of direct CP-violation in the Standard Model
NASA Astrophysics Data System (ADS)
Kelly, Christopher
2018-03-01
We discuss the ongoing effort by the RBC & UKQCD collaborations to improve our lattice calculation of the measure of Standard Model direct CP violation, ɛ', with physical kinematics. We present our progress in decreasing the (dominant) statistical error and discuss other related activities aimed at reducing the systematic errors.
Let's Have a Coffee with the Standard Model of Particle Physics!
ERIC Educational Resources Information Center
Woithe, Julia; Wiener, Gerfried J.; Van der Veken, Frederik F.
2017-01-01
The Standard Model of particle physics is one of the most successful theories in physics and describes the fundamental interactions between elementary particles. It is encoded in a compact description, the so-called "Lagrangian," which even fits on t-shirts and coffee mugs. This mathematical formulation, however, is complex and only…
Models of information exchange between radio interfaces of Wi-Fi group of standards
NASA Astrophysics Data System (ADS)
Litvinskaya, O. S.
2018-05-01
This paper offers models of information exchange between radio interfaces of the Wi-Fi group of standards, using the example of a real facility management system for the oil and gas industry. The interaction between the MU-MIMO and MIMO technologies is analyzed, and an optimal variant of information exchange is proposed.
Search for a Standard Model Higgs Boson with a Dilepton and Missing Energy Signature
Gerbaudo, Davide
2011-09-01
The subject of this thesis is the search for a standard model Higgs boson decaying to a pair of W bosons that in turn decay leptonically, H → W⁺W⁻ → ℓ̄ν ℓν̄. This search is performed considering events produced in pp̄ collisions at √s = 1.96 TeV, where two oppositely charged lepton candidates (e⁺e⁻, e±μ∓, or μ⁺μ⁻), and missing transverse energy, have been reconstructed. The data were collected with the D0 detector at the Fermilab Tevatron collider, and are tested against the standard model predictions computed for a Higgs boson with mass in the range 115-200 GeV. No excess of events over background is observed, and limits on standard model Higgs boson production are determined. An interpretation of these limits within the hypothesis of a fourth-generation extension to the standard model is also given. The overall analysis scheme is the same for the three dilepton pairs being considered (e⁺e⁻, e±μ∓, or μ⁺μ⁻); this thesis, however, describes in detail the study of the dimuon final state.
The Standard Model in the history of the Natural Sciences, Econometrics, and the social sciences
NASA Astrophysics Data System (ADS)
Fisher, W. P., Jr.
2010-07-01
In the late 18th and early 19th centuries, scientists appropriated Newton's laws of motion as a model for the conduct of any other field of investigation that would purport to be a science. This early form of a Standard Model eventually informed the basis of analogies for the mathematical expression of phenomena previously studied qualitatively, such as cohesion, affinity, heat, light, electricity, and magnetism. James Clerk Maxwell is known for his repeated use of a formalized version of this method of analogy in lectures, teaching, and the design of experiments. Economists transferring skills learned in physics made use of the Standard Model, especially after Maxwell demonstrated the value of conceiving it in abstract mathematics instead of as a concrete and literal mechanical analogy. Haavelmo's probability approach in econometrics and R. Fisher's Statistical Methods for Research Workers brought a statistical approach to bear on the Standard Model, quietly reversing the perspective of economics and the social sciences relative to that of physics. Where physicists, and Maxwell in particular, intuited scientific method as imposing stringent demands on the quality and interrelations of data, instruments, and theory in the name of inferential and comparative stability, statistical models and methods disconnected theory from data by removing the instrument as an essential component. New possibilities for reconnecting economics and the social sciences to Maxwell's sense of the method of analogy are found in Rasch's probabilistic models for measurement.
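The dichotomous Rasch model mentioned at the end of this abstract has a compact closed form, sketched here:

```python
import math

def rasch_probability(theta, b):
    """Dichotomous Rasch model: probability of a correct response given
    person ability theta and item difficulty b (both on the logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, the success probability is exactly 1/2
print(rasch_probability(0.7, 0.7))  # -> 0.5
```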
SMI Compatible Simulation Scheduler Design for Reuse of Model Complying with Smp Standard
NASA Astrophysics Data System (ADS)
Koo, Cheol-Hea; Lee, Hoon-Hee; Cheon, Yee-Jin
2010-12-01
Software reusability is one of the key factors affecting cost and schedule in a software development project. It is also crucial in satellite simulator development, since there are many commercial simulator models related to satellites and dynamics. If these models can be used on another simulator platform, a great deal of confidence and cost/schedule reduction can be achieved. Simulation model portability (SMP) is maintained by the European Space Agency, and many models compatible with the SMP/simulation model interface (SMI) are available. The Korea Aerospace Research Institute (KARI) is developing a hardware abstraction layer (HAL) supported satellite simulator to verify the on-board software of a satellite. For these reasons, KARI wants to port SMI compatible models to the HAL supported satellite simulator. To do so, a simulation scheduler has been preliminarily designed according to the SMI standard.
NASA Astrophysics Data System (ADS)
Mirvis, E.; Iredell, M.
2015-12-01
The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of 20+ in-house developed shared libraries (NCEPLIBS), certain versions of 3rd-party libraries (netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven, large-scale, real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The NOAA/NCEP Environmental Modeling Center's (EMC) mission is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improving the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC has developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches to collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the
Rübel, Oliver; Dougherty, Max; Prabhat; Denes, Peter; Conant, David; Chang, Edward F.; Bouchard, Kristofer
2016-01-01
Neuroscience continues to experience a tremendous growth in data: in terms of the volume and variety of data, the velocity at which data is acquired, and in turn the veracity of data. These challenges are a serious impediment to sharing of data, analyses, and tools within and across labs. Here, we introduce BRAINformat, a novel data standardization framework for the design and management of scientific data formats. The BRAINformat library defines application-independent design concepts and modules that together create a general framework for standardization of scientific data. We describe the formal specification of scientific data standards, which facilitates sharing and verification of data and formats. We introduce the concept of Managed Objects, enabling semantic components of data formats to be specified as self-contained units, supporting modular and reusable design of data format components and file storage. We also introduce the novel concept of Relationship Attributes for modeling and use of semantic relationships between data objects. Based on these concepts we demonstrate the application of our framework to design and implement a standard format for electrophysiology data and show how data standardization and relationship-modeling facilitate data analysis and sharing. The format uses HDF5, enabling portable, scalable, and self-describing data storage and integration with modern high-performance computing for data-driven discovery. The BRAINformat library is open source, easy to use, provides detailed user and developer documentation, and is freely available at: https://bitbucket.org/oruebel/brainformat. PMID:27867355
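A minimal sketch of the Managed Object and Relationship Attribute ideas described above, using illustrative names only (this is not the actual BRAINformat API):

```python
from dataclasses import dataclass, field

# Sketch of the "Managed Object" idea: a self-contained unit that bundles
# data with the metadata describing it, plus named semantic relationships
# to other objects ("Relationship Attributes"). Names are hypothetical.

@dataclass
class ManagedObject:
    name: str
    data: list
    attributes: dict = field(default_factory=dict)
    relationships: dict = field(default_factory=dict)

    def link(self, kind: str, other: "ManagedObject") -> None:
        # Record a semantic relationship to another object by name
        self.relationships[kind] = other.name

# An electrophysiology recording that points back at its electrode description
electrode = ManagedObject("electrode_01", data=[],
                          attributes={"impedance_kohm": 500})
recording = ManagedObject("ephys_sweep_01", data=[0.1, 0.2, 0.15],
                          attributes={"sampling_rate_hz": 30000})
recording.link("recorded_with", electrode)
print(recording.relationships)  # {'recorded_with': 'electrode_01'}
```

In the real framework these units map onto self-describing HDF5 groups and attributes rather than in-memory Python objects.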
Song, Mi; Chen, Zeng-Ping; Chen, Yao; Jin, Jing-Wen
2014-07-01
Liquid chromatography-mass spectrometry assays suffer from signal instability caused by the gradual fouling of the ion source, vacuum instability, aging of the ion multiplier, etc. To address this issue, in this contribution, an internal standard was added to the mobile phase. The internal standard was therefore ionized and detected together with the analytes of interest by the mass spectrometer, ensuring that variations in measurement conditions and/or instrument have similar effects on the signal contributions of both the analytes of interest and the internal standard. Subsequently, based on this strategy of adding an internal standard to the mobile phase, a multiplicative effects model was developed for quantitative LC-MS assays and tested on a proof-of-concept model system: the determination of amino acids in water by LC-MS. The experimental results demonstrated that the proposed method could efficiently mitigate the detrimental effects of continuous signal variation, and achieved quantitative results with average relative predictive error values in the range of 8.0-15.0%, which were much more accurate than the corresponding results of the conventional internal standard method based on the peak height ratio and the partial least squares method (whose average relative predictive error values were as high as 66.3% and 64.8%, respectively). Therefore, it is expected that the proposed method can be developed and extended to the quantitative LC-MS analysis of more complex systems. Copyright © 2014 Elsevier B.V. All rights reserved.
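A simplified sketch of why an internal standard in the mobile phase can cancel run-to-run drift: if the drift acts multiplicatively on both the analyte and internal-standard signals, it divides out of their ratio. The response factors and drift values below are invented, and this toy ratio model is far simpler than the paper's multiplicative effects model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated LC-MS runs: the true response is proportional to concentration,
# but each run has an unknown multiplicative drift (source fouling, etc.)
conc = np.linspace(1.0, 10.0, 8)              # calibration concentrations
drift = rng.uniform(0.6, 1.4, conc.size)      # run-to-run sensitivity change
k_analyte, k_is = 100.0, 250.0                # response factors (made up)
c_is = 5.0                                    # internal standard in mobile phase

analyte_signal = k_analyte * conc * drift
is_signal = k_is * c_is * drift               # IS sees the same drift

# The multiplicative drift cancels in the ratio
ratio = analyte_signal / is_signal

# A calibration line through the ratios recovers the concentrations
slope = np.dot(ratio, conc) / np.dot(conc, conc)
pred = ratio / slope
print(np.max(np.abs(pred - conc)))   # near zero despite ~40% drift
```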
DOT National Transportation Integrated Search
1982-01-01
This report presents the user instructions and data requirements for SIMCO, a combined simulation and probability computer model developed to quantify and evaluate carbon monoxide in roadside environments. The model permits direct determinations of t...
Gratacós, Jordi; Luelmo, Jesús; Rodríguez, Jesús; Notario, Jaume; Marco, Teresa Navío; de la Cueva, Pablo; Busquets, Manel Pujol; Font, Mercè García; Joven, Beatriz; Rivera, Raquel; Vega, Jose Luis Alvarez; Álvarez, Antonio Javier Chaves; Parera, Ricardo Sánchez; Carrascosa, Jose Carlos Ruiz; Martínez, Fernando José Rodríguez; Sánchez, José Pardo; Olmos, Carlos Feced; Pujol, Conrad; Galindez, Eva; Barrio, Silvia Pérez; Arana, Ana Urruticoechea; Hergueta, Mercedes; Coto, Pablo; Queiro, Rubén
2018-06-01
To define and give priority to standards of care and quality indicators of multidisciplinary care for patients with psoriatic arthritis (PsA). A systematic literature review on PsA standards of care and quality indicators was performed. An expert panel of rheumatologists and dermatologists who provide multidisciplinary care was established. In a consensus meeting, the group discussed and developed the standards of care and quality indicators and graded their priority, agreement and also the feasibility (only for quality indicators), following qualitative methodology and a Delphi process. Afterwards, these results were discussed with 2 focus groups, 1 with patients and another with health managers. A descriptive analysis is presented. We obtained 25 standards of care (9 of structure, 9 of process, 7 of results) and 24 quality indicators (2 of structure, 5 of process, 17 of results). Standards of care include relevant aspects of the multidisciplinary care of PsA patients, such as an appropriate physical infrastructure and technical equipment, access to nursing care, labs and imaging techniques, other health professionals and treatments, and the development of care plans. Regarding quality indicators, the definition of multidisciplinary care model objectives and referral criteria, the establishment of responsibilities and coordination among professionals, and the active evaluation of patients and data collection were given a high priority. Patients considered all of them important. This set of standards of care and quality indicators for the multidisciplinary care of patients with PsA should help improve the quality of care in these patients.
NASA Astrophysics Data System (ADS)
Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.
2018-04-01
Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.
Ecotoxicological models generally have large data requirements and are frequently based on existing information from diverse sources. Standardizing data for toxicological models may be necessary to reduce extraneous variation and to ensure models reflect intrinsic relationships. ...
Laomettachit, Teeraphan; Chen, Katherine C.; Baumann, William T.; Tyson, John J.
2016-01-01
To understand the molecular mechanisms that regulate cell cycle progression in eukaryotes, a variety of mathematical modeling approaches have been employed, ranging from Boolean networks and differential equations to stochastic simulations. Each approach has its own characteristic strengths and weaknesses. In this paper, we propose a "standard component" modeling strategy that combines advantageous features of Boolean networks, differential equations and stochastic simulations in a framework that acknowledges the typical sorts of reactions found in protein regulatory networks. Applying this strategy to a comprehensive mechanism of the budding yeast cell cycle, we illustrate the potential value of standard component modeling. The deterministic version of our model reproduces the phenotypic properties of wild-type cells and of 125 mutant strains. The stochastic version of our model reproduces the cell-to-cell variability of wild-type cells and the partial viability of the CLB2-dbΔ clb5Δ mutant strain. Our simulations show that mathematical modeling with "standard components" can capture in quantitative detail many essential properties of cell cycle control in budding yeast. PMID:27187804
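A toy illustration of the "standard component" idea, writing one switch-like component both as an ODE and as a Boolean rule; the rates and the crude Boolean approximation are invented, not taken from the paper:

```python
# Toy "standard component": a protein that is switched on by a signal S and
# off at a constant rate, written once as an ODE and once as a Boolean rule.

def ode_step(x, s, k_on=2.0, k_off=1.0, dt=0.01):
    # Forward-Euler step of dx/dt = k_on * s * (1 - x) - k_off * x
    return x + dt * (k_on * s * (1.0 - x) - k_off * x)

def boolean_rule(x_on, s_on):
    # Crude logic approximation: the protein is active iff the signal is on
    return s_on

x = 0.0
for _ in range(2000):          # integrate to steady state with S = 1
    x = ode_step(x, s=1.0)
print(round(x, 3))             # -> k_on / (k_on + k_off) = 0.667
print(boolean_rule(False, True))
```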
Liu, Yan; Cai, Wensheng; Shao, Xueguang
2016-12-05
Calibration transfer is essential for practical applications of near infrared (NIR) spectroscopy because spectra may be measured on different instruments, and the differences between the instruments must be corrected. Most calibration transfer methods require standard samples, measured on both instruments (termed the master and slave instruments), to construct the transfer model. In this work, a method named linear model correction (LMC) is proposed for calibration transfer without standard samples. The method is based on the fact that, for samples with similar physical and chemical properties, the spectra measured on different instruments are linearly correlated; as a result, the coefficients of the linear models constructed from spectra measured on different instruments are similar in profile. Therefore, by using a constrained optimization method, the coefficients of the master model can be transferred into those of the slave model with only a few spectra measured on the slave instrument. Two NIR datasets of corn and plant leaf samples measured with different instruments were used to test the performance of the method. The results show that, for both datasets, the spectra can be correctly predicted using the transferred partial least squares (PLS) models. Because standard samples are not needed, the method may be more useful in practice. Copyright © 2016 Elsevier B.V. All rights reserved.
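The coefficient-transfer idea described above can be sketched as a penalized least-squares fit: keep the slave coefficients close in profile to the master coefficients while fitting a few slave-instrument spectra. This is a minimal illustration only; the function name and the ridge-style penalty form are assumptions, not the paper's actual constrained optimization.

```python
import numpy as np

def lmc_transfer(b_master, X_slave, y_slave, lam=1.0):
    """Sketch of linear-model-correction-style transfer (hypothetical form):
    estimate slave-model regression coefficients that fit a few
    slave-instrument spectra while staying close in profile to the
    master-model coefficients.

    b_master : (p,) master-model coefficients
    X_slave  : (n, p) spectra measured on the slave instrument
    y_slave  : (n,) reference values for those samples
    lam      : weight of the closeness-to-master penalty (illustrative)
    """
    p = b_master.size
    # Closed-form solution of min ||X b - y||^2 + lam * ||b - b_master||^2
    A = X_slave.T @ X_slave + lam * np.eye(p)
    rhs = X_slave.T @ y_slave + lam * b_master
    return np.linalg.solve(A, rhs)

# Toy check: if the slave data are generated by the master model exactly,
# the transferred coefficients recover the master profile.
rng = np.random.default_rng(0)
b_m = rng.normal(size=5)
X = rng.normal(size=(8, 5))
y = X @ b_m
b_s = lmc_transfer(b_m, X, y, lam=10.0)
print(np.allclose(b_s, b_m, atol=1e-6))  # → True
```

With noisy slave data, `lam` trades off fidelity to the slave measurements against the master-coefficient profile, which is the essential tension any such transfer method must balance.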
NASA Technical Reports Server (NTRS)
Zang, Thomas A.; Luckring, James M.; Morrison, Joseph H.; Blattnig, Steve R.; Green, Lawrence L.; Tripathi, Ram K.
2007-01-01
The National Aeronautics and Space Administration (NASA) recently issued an interim version of the Standard for Models and Simulations (M&S Standard) [1]. The action to develop the M&S Standard was identified in an internal assessment [2] of agency-wide changes needed in the wake of the Columbia Accident [3]. The primary goal of this standard is to ensure that the credibility of M&S results is properly conveyed to those making decisions affecting human safety or mission success criteria. The secondary goal is to assure that the credibility of the results from models and simulations meets the project requirements (for credibility). This presentation explains the motivation and key aspects of the M&S Standard, with a special focus on the requirements for verification, validation and uncertainty quantification. Some pilot applications of this standard to computational fluid dynamics applications will be provided as illustrations. The authors of this paper are the members of the team that developed the initial three drafts of the standard, the last of which benefited from extensive comments from most of the NASA Centers. The current version (number 4) incorporates modifications made by a team representing 9 of the 10 NASA Centers. A permanent version of the M&S Standard is expected by December 2007. The scope of the M&S Standard is confined to those uses of M&S that support program and project decisions that may affect human safety or mission success criteria. Such decisions occur, in decreasing order of importance, in the operations, the test & evaluation, and the design & analysis phases. Requirements are placed on (1) program and project management, (2) models, (3) simulations and analyses, (4) verification, validation and uncertainty quantification (VV&UQ), (5) recommended practices, (6) training, (7) credibility assessment, and (8) reporting results to decision makers. A key component of (7) and (8) is the use of a Credibility Assessment Scale, some of the details
Germovsek, Eva; Barker, Charlotte I S; Sharland, Mike; Standing, Joseph F
2018-04-19
Pharmacokinetic/pharmacodynamic (PKPD) modeling is important in the design and conduct of clinical pharmacology research in children. During drug development, PKPD modeling and simulation should underpin rational trial design and facilitate extrapolation to investigate efficacy and safety. The application of PKPD modeling to optimize dosing recommendations and therapeutic drug monitoring is also increasing, and PKPD model-based dose individualization will become a core feature of personalized medicine. Following extensive progress on pediatric PK modeling, a greater emphasis now needs to be placed on PD modeling to understand age-related changes in drug effects. This paper discusses the principles of PKPD modeling in the context of pediatric drug development, summarizing how important PK parameters, such as clearance (CL), are scaled with size and age, and highlights a standardized method for CL scaling in children. One standard scaling method would facilitate comparison of PK parameters across multiple studies, thus increasing the utility of existing PK models and facilitating optimal design of new studies.
Abbas, Ismail; Rovira, Joan; Casanovas, Josep
2006-12-01
To develop and validate a model of a clinical trial that evaluates changes in cholesterol level as a surrogate marker for lipodystrophy in HIV subjects under alternative antiretroviral regimes, i.e., treatment with protease inhibitors vs. a combination of nevirapine and other antiretroviral drugs. Five simulation models were developed, based on different assumptions about treatment variability and the pattern of cholesterol reduction over time. The considered endpoints were the last recorded cholesterol level, the difference from baseline, the average difference from baseline, and the evolution of the level. Specific validation criteria, based on a standardized distance in means and variances within ±10%, were used to compare the real and the simulated data. The validity criterion was met by all models for the considered endpoints; however, only two models met the validity criterion when all endpoints were considered jointly. The model based on the assumption that the within-subject variability of cholesterol levels changes over time is the one that minimizes the validity criterion, with a standardized distance within ±1%. Simulation is a useful technique for the calibration, estimation, and evaluation of models, which allows us to relax the often overly restrictive assumptions regarding parameters required by analytical approaches. The validity criterion can also be used to select the preferred model for design optimization until additional data are obtained, allowing an external validation of the model.
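The ±10% validity criterion described above can be sketched as a simple check on the relative distances between real and simulated data in both mean and variance. The exact standardized-distance formula is not given in the abstract, so a relative-difference form is assumed here; the numbers are illustrative cholesterol-like values, not trial data.

```python
import statistics

def within_tolerance(real, simulated, tol=0.10):
    """Sketch of a validity criterion of the kind described: the
    relative (standardized) distance between real and simulated data
    must fall within +/- tol for both the mean and the variance.
    The precise formula used in the paper is an assumption here.
    """
    d_mean = (statistics.mean(simulated) - statistics.mean(real)) / statistics.mean(real)
    d_var = (statistics.variance(simulated) - statistics.variance(real)) / statistics.variance(real)
    return abs(d_mean) <= tol and abs(d_var) <= tol

# Simulated data closely matching the real data in mean and variance pass;
# data shifted by more than 10% in mean would fail.
real = [200, 210, 190, 205, 195]
sim = [201, 211, 191, 206, 196]
print(within_tolerance(real, sim))  # → True
```

Applying such a check per endpoint, and then requiring it to hold for all endpoints jointly, reproduces the two-stage selection the abstract describes.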
Standard Information Models for Representing Adverse Sensitivity Information in Clinical Documents.
Topaz, M; Seger, D L; Goss, F; Lai, K; Slight, S P; Lau, J J; Nandigam, H; Zhou, L
2016-01-01
Adverse sensitivity (e.g., allergy and intolerance) information is a critical component of any electronic health record system. While several standards exist for structured entry of adverse sensitivity information, many clinicians record this data as free text. This study aimed to 1) identify and compare the existing common adverse sensitivity information models, and 2) evaluate the coverage of these models for representing allergy information on a subset of inpatient and outpatient adverse sensitivity clinical notes. We compared four common adverse sensitivity information models: the Health Level 7 Allergy and Intolerance Domain Analysis Model (HL7-DAM), the Fast Healthcare Interoperability Resources (FHIR), the Consolidated Continuity of Care Document (C-CDA), and openEHR, and evaluated their coverage on a corpus of inpatient and outpatient notes (n = 120). We found that allergy specialists' notes had the highest frequency of adverse sensitivity attributes per note, whereas emergency department notes had the fewest attributes. Overall, the models had many similarities in the central attributes, which covered between 75% and 95% of the adverse sensitivity information contained within the notes. However, representations of some attributes (especially the value sets) were not well aligned between the models, which is likely to present an obstacle to achieving data interoperability. Also, adverse sensitivity exceptions were not well represented among the information models. Although we found that common adverse sensitivity models cover a significant portion of the relevant information in clinical notes, our results highlight areas that need to be reconciled between the standards for data interoperability.
Much ado about mice: Standard-setting in model organism research.
Hardesty, Rebecca A
2018-04-11
Recently there has been a practice turn in the philosophy of science that has called for analyses to be grounded in the actual doings of everyday science. This paper furthers that call by employing participant-observation ethnographic methods as a tool for discovering epistemological features of scientific practice in a neuroscience lab. The case I present focuses on a group of neurobiologists researching the genetic underpinnings of cognition in Down syndrome (DS) and how they have developed a new mouse model which they argue should be regarded as the "gold standard" for all DS mouse research. Through use of ethnographic methods, interviews, and analyses of publications, I uncover how the lab constructed their new mouse model. Additionally, I describe how model organisms can serve as abstract standards for scientific work that impact the epistemic value of scientific claims, regulate practice, and constrain future work. Copyright © 2018 Elsevier Ltd. All rights reserved.
A mid-latitude ozone model for the 1976 U.S. standard atmosphere
NASA Technical Reports Server (NTRS)
Krueger, A. J.; Minzner, R. A.
1976-01-01
A mid-latitude northern hemisphere model of the daytime ozone distribution in the troposphere, stratosphere, and lower mesosphere has been constructed. Data from rocket soundings in the latitude range of 45 deg N + or - 15 deg N, results of balloon soundings at latitudes from 41 to 47 deg N, and latitude gradients from satellite ozone observations have been combined to produce estimates of the annual mean ozone concentration and its variability at heights up to 74 km for an effective latitude of 45 deg N. This model is a revision for heights above 26 km of the tentative mid-latitude ozone model, included in the U.S. Standard Atmosphere Supplements, 1966, and has been adopted for use in the U.S. Standard Atmosphere, 1976.
A search for non-standard model W helicity in top quark decays
NASA Astrophysics Data System (ADS)
Kilminster, Benjamin John
The structure of the tbW vertex is probed by measuring the polarization of the W in t → W + b → l + ν + b. The invariant mass of the lepton and b quark measures the W decay angle, which in turn allows a comparison with the polarizations expected from different possible models of the spin properties of the tbW interaction. We measure the fraction by rate of Ws produced with a V + A coupling in lieu of the Standard Model V − A to be fV+A = −0.21 +0.42/−0.24 (stat) ± 0.21 (sys). We assign a limit of fV+A < 0.80 at 95% confidence level (CL). By combining this result with a complementary observable in the same data, we assign a limit of fV+A < 0.61 at 95% CL. We find no evidence for a non-Standard Model tbW vertex.
Single Top Production at Next-to-Leading Order in the Standard Model Effective Field Theory.
Zhang, Cen
2016-04-22
Single top production processes at hadron colliders provide information on the relation between the top quark and the electroweak sector of the standard model. We compute the next-to-leading order QCD corrections to the three main production channels: t-channel, s-channel, and tW associated production, in the standard model including operators up to dimension six. The calculation can be matched to parton shower programs and can therefore be directly used in experimental analyses. The QCD corrections are found to significantly impact the extraction of the current limits on the operators, because of both the improved accuracy and the better precision of the theoretical predictions. In addition, the distributions of some of the key discriminating observables are modified in a nontrivial way, which could change the interpretation of measurements in terms of UV complete models.
NASA Standard for Models and Simulations (M and S): Development Process and Rationale
NASA Technical Reports Server (NTRS)
Zang, Thomas A.; Blattnig, Steve R.; Green, Lawrence L.; Hemsch, Michael J.; Luckring, James M.; Morison, Joseph H.; Tripathi, Ram K.
2009-01-01
After the Columbia Accident Investigation Board (CAIB) report, the NASA Administrator at that time chartered an executive team (known as the Diaz Team) to identify the CAIB report elements with Agency-wide applicability, and to develop corrective measures to address each element. This report documents the chronological development and release of an Agency-wide Standard for Models and Simulations (M&S) (NASA Standard 7009) in response to Action #4 of the report "A Renewed Commitment to Excellence: An Assessment of the NASA Agency-wide Applicability of the Columbia Accident Investigation Board Report, January 30, 2004".
ERIC Educational Resources Information Center
Kaliski, Pamela; Wind, Stefanie A.; Engelhard, George, Jr.; Morgan, Deanna; Plake, Barbara; Reshetar, Rosemary
2012-01-01
The Many-Facet Rasch (MFR) Model is traditionally used to evaluate the quality of ratings on constructed response assessments; however, it can also be used to evaluate the quality of judgments from panel-based standard setting procedures. The current study illustrates the use of the MFR Model by examining the quality of ratings obtained from a…
NASA Astrophysics Data System (ADS)
Cao, Zhenggang; Ding, Zengqian; Hu, Zhixiong; Wen, Tao; Qiao, Wen; Liu, Wenli
2016-10-01
Optical coherence tomography (OCT) has been widely applied in the diagnosis of eye diseases during the last 20 years. Unlike traditional two-dimensional imaging technologies, OCT can also provide cross-sectional information about target tissues simultaneously and precisely. Axial resolution is one of the most critical parameters affecting OCT image quality and determines whether an accurate diagnosis can be obtained, so it is important to evaluate the axial resolution of an OCT instrument. Phantoms play an important role in the standardization and validation process. Here, a standard model eye with a micro-scale multilayer structure was custom designed and manufactured. To mimic a real human eye, the physical characteristics of the layered structures of the retina and cornea were analyzed in depth, and appropriate materials were selected by testing the scattering coefficients of PDMS phantoms with different concentrations of TiO2 or BaSO4 particles. An artificial retina and cornea with multilayer films, each layer 10 to 60 micrometers thick, were fabricated using spin-coating technology. Because the key parameters of the standard model eye need to be traceable as well as accurate, the optical refractive index and layer thicknesses of the phantoms were verified using a thickness monitoring system. A standard OCT model eye was then obtained by embedding the retinal or corneal phantom into a water-filled model eye, fabricated by 3D printing, to simulate ocular dispersion and emmetropic refraction. The eye model was manufactured from a transparent resin to simulate a realistic ophthalmic testing environment, and most key optical elements, including the cornea, lens, and vitreous body, were realized. Investigations with a research and a clinical OCT system, respectively, demonstrated that the OCT model eye has physical properties similar to those of a natural eye, and the multilayer film measurement
2003-07-01
standard release with the publicly available "mod" interface allows us to avoid purchasing a game engine license (approximate cost $350,000) from Epic...depletion is accurately simulated for ammunition * Both contain target detection, target identification, target selection, and collision avoidance and...into other game genres such as Real-Time Strategy (RTS) games and Massively Multiplayer Online Role- Playing Games ( MMORPG ). Unfortunately these game
Hayashi, Kazuo; Chung, Onejune; Park, Seojung; Lee, Seung-Pyo; Sachdeva, Rohit C L; Mizoguchi, Itaru
2015-03-01
Virtual 3-dimensional (3D) models obtained by scanning of physical casts have become an alternative to conventional dental cast analysis in orthodontic treatment. If the precision (reproducibility) of virtual 3D model analysis can be further improved, digital orthodontics could be even more widely accepted. The purpose of this study was to clarify the influence of "standardization" of the target points for dental cast analysis using virtual 3D models. Physical plaster models were also measured to obtain additional information. Five sets of dental casts were used. The dental casts were scanned with R700 (3Shape, Copenhagen, Denmark) and REXCAN DS2 3D (Solutionix, Seoul, Korea) scanners. In this study, 3 systems and software packages were used: SureSmile (OraMetrix, Richardson, Tex), Rapidform (Inus, Seoul, Korea), and I-DEAS (SDRC, Milford, Conn). Without standardization, the maximum differences were observed between the SureSmile software and the Rapidform software (0.39 mm ± 0.07). With standardization, the maximum differences were observed between the SureSmile software and measurements with a digital caliper (0.099 mm ± 0.01), and this difference was significantly greater (P < 0.05) than the 2 other mean difference values. Furthermore, the results of this study showed that the mean differences "WITH" standardization were significantly lower than those "WITHOUT" standardization for all systems, software packages, or methods. The results showed that elimination of the influence of usability or habituation is important for improving the reproducibility of dental cast analysis. Copyright © 2015 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
Beyond the standard of care: a new model to judge medical negligence.
Brenner, Lawrence H; Brenner, Alison Tytell; Awerbuch, Eric J; Horwitz, Daniel
2012-05-01
The term "standard of care" has been used in law and medicine to determine whether medical care is negligent. However, the precise meaning of this concept is often unclear for both medical and legal professionals. Our purposes are to (1) examine the limitations of using standard of care as a measure of negligence, (2) propose the use of the legal concepts of justification and excuse in developing a new model of examining medical conduct, and (3) outline the framework of this model. We applied the principles of tort liability set forth in the clinical and legal literature to describe the difficulty in applying standard of care in medical negligence cases. Using the concepts of justification and excuse, we propose a judicial model that may promote fair and just jury verdicts in medical negligence cases. Contrary to conventional understanding, medical negligence is not simply nonconformity to norms. Two additional concepts of legal liability, ie, justification and excuse, must also be considered to properly judge medical conduct. Medical conduct is justified when the benefits outweigh the risks; the law sanctions the conduct and encourages future conduct under similar circumstances. Excuse, on the other hand, relieves a doctor of legal liability under specific circumstances even though his/her conduct was not justified. Standard of care is an inaccurate measure of medical negligence because it is premised on the faulty notion of conformity to norms. An alternative judicial model to determine medical negligence would (1) eliminate standard of care in medical malpractice law, (2) reframe the court instruction to jurors, and (3) establish an ongoing consensus committee on orthopaedic principles of negligence.
Efforts to integrate CMIP metadata and standards into NOAA-GFDL's climate model workflow
NASA Astrophysics Data System (ADS)
Blanton, C.; Lee, M.; Mason, E. E.; Radhakrishnan, A.
2017-12-01
Modeling centers participating in CMIP6 run model simulations, publish requested model output (conforming to community data standards), and document models and simulations using ES-DOC. GFDL developed workflow software implementing some best practices to meet these metadata and documentation requirements. The CMIP6 Data Request defines the variables that should be archived for each experiment and specifies their spatial and temporal structure. We used the Data Request's dreqPy python library to write GFDL model configuration files as an alternative to hand-crafted tables. There was also a largely successful effort to standardize variable names within the model to reduce the additional overhead of translating "GFDL to CMOR" variables at a later stage in the pipeline. The ES-DOC ecosystem provides tools and standards to create, publish, and view various types of community-defined CIM documents, most notably model and simulation documents. Although ES-DOC will automatically create simulation documents during publishing by harvesting NetCDF global attributes, the information must be collected, stored, and placed in the NetCDF files by the workflow. We propose to develop a GUI to collect the simulation document precursors. In addition, a new MIP for CMIP6 (CPMIP, a comparison of the computational performance of climate models) is documented using machine and performance CIM documents. We used ES-DOC's pyesdoc python library to automatically create these machine and performance documents. We hope that these and similar efforts will become permanent features of the GFDL workflow to facilitate future participation in CMIP-like activities.
A standardization model based on image recognition for performance evaluation of an oral scanner.
Seo, Sang-Wan; Lee, Wan-Sun; Byun, Jae-Young; Lee, Kyu-Bok
2017-12-01
Accurate information is essential in dentistry. Image information about missing teeth is used by optically based medical equipment in prosthodontic treatment. To evaluate oral scanners, a standardized model was designed by examining cases of image recognition errors identified with linear discriminant analysis (LDA) and combining the relevant variables with reference to ISO 12836:2015. The basic model was fabricated by applying 4 factors to the tooth profile (chamfer, groove, curve, and square) and to the bottom surface. Photo-type and video-type scanners were used to analyze the 3D images after capture. The scans were performed several times in the prescribed sequence to distinguish models that formed a 3D image from those that did not, and the results identified the best-performing design. With the initial basic model, a 3D shape could not be obtained by scanning even when several shots were taken. The recognition rate subsequently improved with each variable factor, with differences depending on the tooth profile and the pattern of the bottom surface. Consistent with the LDA recognition errors, the recognition rate decreases when the model has similar patterns. Therefore, to obtain accurate 3D data, sufficient differences between classes need to be provided when developing a standardized model.
A standard protocol for describing individual-based and agent-based models
Grimm, Volker; Berger, Uta; Bastiansen, Finn; Eliassen, Sigrunn; Ginot, Vincent; Giske, Jarl; Goss-Custard, John; Grand, Tamara; Heinz, Simone K.; Huse, Geir; Huth, Andreas; Jepsen, Jane U.; Jorgensen, Christian; Mooij, Wolf M.; Muller, Birgit; Pe'er, Guy; Piou, Cyril; Railsback, Steven F.; Robbins, Andrew M.; Robbins, Martha M.; Rossmanith, Eva; Ruger, Nadja; Strand, Espen; Souissi, Sami; Stillman, Richard A.; Vabo, Rune; Visser, Ute; DeAngelis, Donald L.
2006-01-01
Simulation models that describe autonomous individual organisms (individual based models, IBM) or agents (agent-based models, ABM) have become a widely used tool, not only in ecology, but also in many other disciplines dealing with complex systems made up of autonomous entities. However, there is no standard protocol for describing such simulation models, which can make them difficult to understand and to duplicate. This paper presents a proposed standard protocol, ODD, for describing IBMs and ABMs, developed and tested by 28 modellers who cover a wide range of fields within ecology. This protocol consists of three blocks (Overview, Design concepts, and Details), which are subdivided into seven elements: Purpose, State variables and scales, Process overview and scheduling, Design concepts, Initialization, Input, and Submodels. We explain which aspects of a model should be described in each element, and we present an example to illustrate the protocol in use. In addition, 19 examples are available in an Online Appendix. We consider ODD as a first step for establishing a more detailed common format of the description of IBMs and ABMs. Once initiated, the protocol will hopefully evolve as it becomes used by a sufficiently large proportion of modellers.
Secular changes in standards of bodily attractiveness in women: tests of a reproductive model.
Barber, N
1998-05-01
Since success at work is favored by a more slender body build while reproduction is favored by curvaceousness, standards of women's bodily attractiveness should be predictable from economic and reproductive variables. This hypothesis was tested in a replication and extension of a study by Silverstein, Perdue, Peterson, Vogel, and Fantini (1986) which looked at correlates of curvaceousness of Vogue models over time. As economic prosperity increased, and as women's participation in the economy, and higher education, increased, curvaceousness of the standards declined. As the proportion of single women to men, both aged 20-24 years, increased, and as the birth rate declined, curvaceousness was reduced. Results suggest that cultural standards of attractiveness are influenced by an evolved psychology of mate selection.
Boullata, Joseph I; Holcombe, Beverly; Sacks, Gordon; Gervasio, Jane; Adams, Stephen C; Christensen, Michael; Durfee, Sharon; Ayers, Phil; Marshall, Neil; Guenter, Peggi
2016-08-01
Parenteral nutrition (PN) is a high-alert medication with a complex drug use process. Key steps in the process include the review of each PN prescription followed by the preparation of the formulation. The preparation step includes compounding the PN or activating a standardized commercially available PN product. The verification and review, as well as preparation of this complex therapy, require competency that may be determined by using a standardized process for pharmacists and for pharmacy technicians involved with PN. An American Society for Parenteral and Enteral Nutrition (ASPEN) standardized model for PN order review and PN preparation competencies is proposed based on a competency framework, the ASPEN-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines, and is intended for institutions and agencies to use with their staff. © 2016 American Society for Parenteral and Enteral Nutrition.
Low temperature electroweak phase transition in the Standard Model with hidden scale invariance
NASA Astrophysics Data System (ADS)
Arunasalam, Suntharan; Kobakhidze, Archil; Lagger, Cyril; Liang, Shelley; Zhou, Albert
2018-01-01
We discuss a cosmological phase transition within the Standard Model which incorporates spontaneously broken scale invariance as a low-energy theory. In addition to the Standard Model fields, the minimal model involves a light dilaton, which acquires a large vacuum expectation value (VEV) through the mechanism of dimensional transmutation. Under the assumption of the cancellation of the vacuum energy, the dilaton develops a very small mass at 2-loop order. As a result, a flat direction is present in the classical dilaton-Higgs potential at zero temperature while the quantum potential admits two (almost) degenerate local minima with unbroken and broken electroweak symmetry. We found that the cosmological electroweak phase transition in this model can only be triggered by a QCD chiral symmetry breaking phase transition at low temperatures, T ≲ 132 MeV. Furthermore, unlike the standard case, the universe settles into the chiral symmetry breaking vacuum via a first-order phase transition which gives rise to a stochastic gravitational-wave background with a peak frequency ∼10⁻⁸ Hz as well as triggering the production of approximately solar mass primordial black holes. The observation of these signatures of cosmological phase transitions together with the detection of a light dilaton would provide a strong hint of the fundamental role of scale invariance in particle physics.
Standardized 3D Bioprinting of Soft Tissue Models with Human Primary Cells.
Rimann, Markus; Bono, Epifania; Annaheim, Helene; Bleisch, Matthias; Graf-Hausner, Ursula
2016-08-01
Cells grown in 3D are more physiologically relevant than cells cultured in 2D. To use 3D models in substance testing and regenerative medicine, reproducibility and standardization are important. Bioprinting offers not only automated standardizable processes but also the production of complex tissue-like structures in an additive manner. We developed an all-in-one bioprinting solution to produce soft tissue models. The holistic approach included (1) a bioprinter in a sterile environment, (2) a light-induced bioink polymerization unit, (3) a user-friendly software, (4) the capability to print in standard labware for high-throughput screening, (5) cell-compatible inkjet-based printheads, (6) a cell-compatible ready-to-use BioInk, and (7) standard operating procedures. In a proof-of-concept study, skin as a reference soft tissue model was printed. To produce dermal equivalents, primary human dermal fibroblasts were printed in alternating layers with BioInk and cultured for up to 7 weeks. During long-term cultures, the models were remodeled and fully populated with viable, spread fibroblasts. Primary human dermal keratinocytes were seeded on top of dermal equivalents, and epidermis-like structures were formed as verified with hematoxylin and eosin staining and immunostaining. However, a fully stratified epidermis was not achieved. Nevertheless, this is one of the first reports of an integrative bioprinting strategy for industrial routine application. © 2015 Society for Laboratory Automation and Screening.
Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael
2017-09-01
The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of < 0.1% applying current EUCAST CBPs. Error rates were > 0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to < 0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society
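The error-rate model described above (true inhibition zone diameters from a normal mixture, observed diameters adding methodological noise, errors when the two fall on opposite sides of the breakpoint) can be illustrated with a small Monte Carlo sketch. All parameter values and the function itself are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def error_rate(weights, means, sds, sd_method, cbp, n=200_000, seed=1):
    """Monte Carlo sketch of the categorization-error model: true
    inhibition zone diameters follow a normal mixture; observed
    diameters add methodological (measurement) noise. An error occurs
    when the true and observed diameters fall on opposite sides of the
    clinical breakpoint (CBP). All numbers here are illustrative.
    """
    rng = np.random.default_rng(seed)
    comp = rng.choice(len(weights), size=n, p=weights)       # mixture component
    true = rng.normal(np.asarray(means)[comp], np.asarray(sds)[comp])
    observed = true + rng.normal(0.0, sd_method, size=n)     # methodological variation
    return np.mean((true >= cbp) != (observed >= cbp))

# Two subpopulations (e.g. susceptible and resistant) fairly well separated
# from an illustrative CBP of 20 mm: the predicted error rate is small.
r = error_rate([0.7, 0.3], [26.0, 12.0], [2.0, 2.0], sd_method=1.2, cbp=20.0)
print(r < 0.02)
```

Sweeping `cbp` over candidate diameters, or excluding a zone of methodological uncertainty around it before computing the rate, mimics the CBP-optimization and ZMU steps the abstract describes.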
40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 86.000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes ... and CO — Model year / Percentage: 2002, 40; 2003, 80; 2004, 100. Table A00-6, Useful Life Standards (g/mi), for ... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6 ...
40 CFR 86.000-9 - Emission standards for 2000 and later model year light-duty trucks.
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 86.000-9 Emission standards for 2000 and later model year light-duty trucks. Section 86.000-9 includes ... and CO — Model year / Percentage: 2002, 40; 2003, 80; 2004, 100. Table A00-6, Useful Life Standards (g/mi), for ... applicable model year's heavy light-duty trucks shall not exceed the applicable SFTP standards in table A00-6 ...
40 CFR 86.000-8 - Emission standards for 2000 and later model year light-duty vehicles.
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 86.000-8 Emission standards for 2000 and later model year light-duty vehicles. Section 86.000-8 includes ... later model year light-duty vehicles shall meet the additional SFTP standards of table A00-2 (defined by ... = NOX) and CO — Model year / Percentage: 2000, 40; 2001, 80; 2002, 100. Table A00-2, Useful Life Standards (g/mi ...
40 CFR 86.000-8 - Emission standards for 2000 and later model year light-duty vehicles.
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 86.000-8 Emission standards for 2000 and later model year light-duty vehicles. Section 86.000-8 includes ... later model year light-duty vehicles shall meet the additional SFTP standards of table A00-2 (defined by ... = NOX) and CO — Model year / Percentage: 2000, 40; 2001, 80; 2002, 100. Table A00-2, Useful Life Standards (g/mi ...
Cai, Longyan; He, Hong S.; Wu, Zhiwei; Lewis, Benard L.; Liang, Yu
2014-01-01
Understanding the fire prediction capabilities of fuel models is vital to forest fire management. Various fuel models have been developed in the Great Xing'an Mountains in Northeast China. However, the performances of these fuel models have not been tested for historical occurrences of wildfires. Consequently, the applicability of these models requires further investigation. Thus, this paper aims to develop standard fuel models. Seven vegetation types were combined into three fuel models according to potential fire behaviors, which were clustered using Euclidean distance algorithms. Fuel model parameter sensitivity was analyzed by the Morris screening method. Results showed that the fuel model parameters 1-hour time-lag loading, dead heat content, live heat content, 1-hour time-lag SAV (surface-area-to-volume ratio), live shrub SAV, and fuel bed depth have high sensitivity. The two most sensitive fuel parameters, 1-hour time-lag loading and fuel bed depth, were chosen as adjustment parameters because of their high spatio-temporal variability. The FARSITE model was then used to test the fire prediction capabilities of the combined fuel models (uncalibrated fuel models). FARSITE was shown to yield an unrealistic prediction of the historical fire. However, the calibrated fuel models significantly improved the capabilities of the fuel models to predict the actual fire with an accuracy of 89%. Validation results also showed that the model can estimate the actual fires with an accuracy exceeding 56% by using the calibrated fuel models. Therefore, these fuel models can be efficiently used to calculate fire behaviors, which can be helpful in forest fire management. PMID:24714164
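A minimal sketch of the Morris elementary-effects screening mentioned above, applied to a toy fire-behavior surrogate (the function and its inputs are hypothetical stand-ins; the study itself drives FARSITE, not a closed-form expression):

```python
import numpy as np

# Toy surrogate for fire intensity as a function of three fuel-model
# parameters (hypothetical; chosen only to demonstrate the screening).
rng = np.random.default_rng(0)

def fire_intensity(x):
    load_1h, bed_depth, dead_heat = x
    return 3.0 * load_1h * bed_depth + 0.5 * dead_heat + load_1h ** 2

def morris_mu_star(f, k=3, trajectories=50, delta=0.1):
    """Mean absolute elementary effect of each of the k inputs,
    estimated from random one-at-a-time perturbations on [0, 1]."""
    mu_star = np.zeros(k)
    for _ in range(trajectories):
        x = rng.uniform(0.0, 1.0 - delta, size=k)
        base = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta                       # perturb one input
            mu_star[i] += abs(f(xp) - base) / delta
    return mu_star / trajectories

sensitivity = morris_mu_star(fire_intensity)
# Larger entries flag the more sensitive parameters; here the fuel-load
# and bed-depth terms dominate the weak heat-content term.
```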
The stage-value model: Implications for the changing standards of care.
Görtz, Daniel Patrik; Commons, Michael Lamport
2015-01-01
The standard of care is a legal and professional notion against which doctors and other medical personnel are held liable. The standard of care changes as new scientific findings and technological innovations within medicine, pharmacology, nursing and public health are developed and adopted. This study consists of four parts. Part 1 describes the problem and gives concrete examples of its occurrence. The second part discusses the application of the Model of Hierarchical Complexity to the field, giving examples of how standards of care are understood at different behavioral developmental stages. It presents the solution to the problem of standards of care at Paradigmatic Stage 14. The solution at this stage is a deliberative, communicative process based on why certain norms should or should not apply in each specific case, through the use of "meta-norms". Part 3 proposes a Cross-Paradigmatic Stage 15 view of how the problem of changing standards of care can be solved. The proposed solution is to ground the legal procedure in each case on well-established behavioral laws. We maintain that such a behavioristic, scientifically based justice would be much more proficient at effecting restorative legal interventions that create desired behaviors. Copyright © 2015 Elsevier Ltd. All rights reserved.
Multiple-point principle with a scalar singlet extension of the standard model
Haba, Naoyuki; Ishida, Hiroyuki; Okada, Nobuchika; ...
2017-01-21
Here, we suggest a scalar singlet extension of the standard model in which the multiple-point principle (MPP) condition of a vanishing Higgs potential at the Planck scale is realized. Although there have been many attempts to realize the MPP at the Planck scale, doing so while maintaining naturalness is quite difficult. This model can easily achieve the MPP at the Planck scale without large Higgs mass corrections. It is worth noting that the electroweak symmetry can be radiatively broken in our model. From the naturalness point of view, the singlet scalar mass should be of O(1 TeV) or less. We also consider a right-handed neutrino extension of the model for neutrino mass generation. The extension does not affect the MPP scenario, and might keep naturalness with the new particle mass scale beyond a TeV, thanks to an accidental cancellation of Higgs mass corrections.
Accurate Modeling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model
NASA Astrophysics Data System (ADS)
Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron; Scoccimarro, Roman
2015-01-01
The large-scale distribution of galaxies can be explained fairly simply by assuming (i) a cosmological model, which determines the dark matter halo distribution, and (ii) a simple connection between galaxies and the halos they inhabit. This conceptually simple framework, called the halo model, has been remarkably successful at reproducing the clustering of galaxies on all scales, as observed in various galaxy redshift surveys. However, none of these previous studies have carefully modeled the systematics and thus truly tested the halo model in a statistically rigorous sense. We present a new accurate and fully numerical halo model framework and test it against clustering measurements from two luminosity samples of galaxies drawn from the SDSS DR7. We show that the simple ΛCDM cosmology + halo model is not able to simultaneously reproduce the galaxy projected correlation function and the group multiplicity function. In particular, the more luminous sample shows significant tension with theory. We discuss the implications of our findings and how this work paves the way for constraining galaxy formation by accurate simultaneous modeling of multiple galaxy clustering statistics.
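The "simple connection between galaxies and the halos they inhabit" is usually parameterized by a halo occupation distribution (HOD); a minimal sketch using a standard five-parameter functional form (all parameter values are illustrative, not fits to the SDSS samples discussed above):

```python
import numpy as np
from math import erf

# Hypothetical HOD parameters (standard 5-parameter form; values are
# illustrative only).
log_Mmin, sigma_logM = 12.0, 0.3   # central-galaxy threshold and softness
M0, M1, alpha = 1e12, 1e13, 1.0    # satellite cutoff, scale and slope

def n_central(M):
    """Mean number of central galaxies in a halo of mass M (Msun/h):
    a smoothed step function in log10(M)."""
    return 0.5 * (1.0 + erf((np.log10(M) - log_Mmin) / sigma_logM))

def n_satellite(M):
    """Mean number of satellites; a power law above the cutoff M0."""
    if M <= M0:
        return 0.0
    return n_central(M) * ((M - M0) / M1) ** alpha

def mean_occupation(M):
    """Total mean occupation <N(M)> = N_cen + N_sat."""
    return n_central(M) + n_satellite(M)
```

Convolving `mean_occupation` with a simulated halo catalog yields the model galaxy clustering statistics that the paper tests against the measured projected correlation function and group multiplicity function.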
Paraboloid magnetospheric magnetic field model and the status of the model as an ISO standard
NASA Astrophysics Data System (ADS)
Alexeev, I.
A reliable representation of the magnetic field is crucial for radiation belt modelling, especially under disturbed conditions. The empirical T96 model developed by Tsyganenko is constructed by minimizing the rms deviation from a large magnetospheric data base, and its applicability is limited mainly to quiet conditions in the solar wind along the Earth's orbit. In contrast to the internal planetary field, the external magnetospheric magnetic field sources are much more time-dependent. For this reason, the paraboloid magnetospheric model is constructed with a more accurate and physically consistent approach, in which each source of the magnetic field has its own relaxation timescale and a driving function based on an individual best-fit combination of solar wind and IMF parameters. This approach rests on a priori information about the structure of the global magnetospheric current systems. Each current system is included as a separate block module in the magnetospheric model. As shown by spacecraft magnetometer data, three current systems are the main contributors to the external magnetospheric magnetic field: the magnetopause currents, the ring current and the tail current sheet. The paraboloid model is based on an analytical solution of the Laplace equation for each of these large-scale current systems in the magnetosphere.
NASA Astrophysics Data System (ADS)
Bermudez, L. E.; Percivall, G.; Idol, T. A.
2015-12-01
Experts in climate modeling, remote sensing of the Earth, and cyber infrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included identification of standards and best practices that help researchers and cities deal with climate-related issues.
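A WCS GetCoverage request of the kind used to pull flood-model output might be assembled as follows (the endpoint and coverage identifier are hypothetical; the parameter names follow the OGC WCS 2.0 KVP binding):

```python
from urllib.parse import urlencode

# Hypothetical server and coverage id; parameter names follow the
# OGC WCS 2.0 key-value-pair (KVP) binding.
endpoint = "https://example.org/wcs"
params = [
    ("service", "WCS"),
    ("version", "2.0.1"),
    ("request", "GetCoverage"),
    ("coverageId", "flood_depth_sfbay"),       # hypothetical coverage
    ("subset", "Lat(37.4,38.0)"),              # trim to the Bay Area
    ("subset", "Long(-122.6,-121.8)"),
    ("format", "application/netcdf"),          # NetCDF output, as in the testbed
]
url = endpoint + "?" + urlencode(params)
# The resulting GET URL asks the server for a NetCDF subset of the
# flood-depth coverage over the requested latitude/longitude window.
```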
Search for the standard model Higgs boson in tau final states.
Abazov, V M; Abbott, B; Abolins, M; Acharya, B S; Adams, M; Adams, T; Aguilo, E; Ahsan, M; Alexeev, G D; Alkhazov, G; Alton, A; Alverson, G; Alves, G A; Ancu, L S; Andeen, T; Anzelc, M S; Aoki, M; Arnoud, Y; Arov, M; Arthaud, M; Askew, A; Asman, B; Atramentov, O; Avila, C; Backusmayes, J; Badaud, F; Bagby, L; Baldin, B; Bandurin, D V; Banerjee, S; Barberis, E; Barfuss, A-F; Bargassa, P; Baringer, P; Barreto, J; Bartlett, J F; Bassler, U; Bauer, D; Beale, S; Bean, A; Begalli, M; Begel, M; Belanger-Champagne, C; Bellantoni, L; Bellavance, A; Benitez, J A; Beri, S B; Bernardi, G; Bernhard, R; Bertram, I; Besançon, M; Beuselinck, R; Bezzubov, V A; Bhat, P C; Bhatnagar, V; Blazey, G; Blessing, S; Bloom, K; Boehnlein, A; Boline, D; Bolton, T A; Boos, E E; Borissov, G; Bose, T; Brandt, A; Brock, R; Brooijmans, G; Bross, A; Brown, D; Bu, X B; Buchholz, D; Buehler, M; Buescher, V; Bunichev, V; Burdin, S; Burnett, T H; Buszello, C P; Calfayan, P; Calpas, B; Calvet, S; Cammin, J; Carrasco-Lizarraga, M A; Carrera, E; Carvalho, W; Casey, B C K; Castilla-Valdez, H; Chakrabarti, S; Chakraborty, D; Chan, K M; Chandra, A; Cheu, E; Cho, D K; Choi, S; Choudhary, B; Christoudias, T; Cihangir, S; Claes, D; Clutter, J; Cooke, M; Cooper, W E; Corcoran, M; Couderc, F; Cousinou, M-C; Crépé-Renaudin, S; Cuplov, V; Cutts, D; Cwiok, M; Das, A; Davies, G; De, K; de Jong, S J; De La Cruz-Burelo, E; DeVaughan, K; Déliot, F; Demarteau, M; Demina, R; Denisov, D; Denisov, S P; Desai, S; Diehl, H T; Diesburg, M; Dominguez, A; Dorland, T; Dubey, A; Dudko, L V; Duflot, L; Duggan, D; Duperrin, A; Dutt, S; Dyshkant, A; Eads, M; Edmunds, D; Ellison, J; Elvira, V D; Enari, Y; Eno, S; Ermolov, P; Escalier, M; Evans, H; Evdokimov, A; Evdokimov, V N; Facini, G; Ferapontov, A V; Ferbel, T; Fiedler, F; Filthaut, F; Fisher, W; Fisk, H E; Fortner, M; Fox, H; Fu, S; Fuess, S; Gadfort, T; Galea, C F; Garcia-Bellido, A; Gavrilov, V; Gay, P; Geist, W; Geng, W; Gerber, C E; Gershtein, Y; Gillberg, D; Ginther, G; 
Gómez, B; Goussiou, A; Grannis, P D; Greder, S; Greenlee, H; Greenwood, Z D; Gregores, E M; Grenier, G; Gris, Ph; Grivaz, J-F; Grohsjean, A; Grünendahl, S; Grünewald, M W; Guo, F; Guo, J; Gutierrez, G; Gutierrez, P; Haas, A; Hadley, N J; Haefner, P; Hagopian, S; Haley, J; Hall, I; Hall, R E; Han, L; Harder, K; Harel, A; Hauptman, J M; Hays, J; Hebbeker, T; Hedin, D; Hegeman, J G; Heinson, A P; Heintz, U; Hensel, C; Heredia-De La Cruz, I; Herner, K; Hesketh, G; Hildreth, M D; Hirosky, R; Hoang, T; Hobbs, J D; Hoeneisen, B; Hohlfeld, M; Hossain, S; Houben, P; Hu, Y; Hubacek, Z; Huske, N; Hynek, V; Iashvili, I; Illingworth, R; Ito, A S; Jabeen, S; Jaffré, M; Jain, S; Jakobs, K; Jamin, D; Jarvis, C; Jesik, R; Johns, K; Johnson, C; Johnson, M; Johnston, D; Jonckheere, A; Jonsson, P; Juste, A; Kajfasz, E; Karmanov, D; Kasper, P A; Katsanos, I; Kaushik, V; Kehoe, R; Kermiche, S; Khalatyan, N; Khanov, A; Kharchilava, A; Kharzheev, Y N; Khatidze, D; Kim, T J; Kirby, M H; Kirsch, M; Klima, B; Kohli, J M; Konrath, J-P; Kozelov, A V; Kraus, J; Kuhl, T; Kumar, A; Kupco, A; Kurca, T; Kuzmin, V A; Kvita, J; Lacroix, F; Lam, D; Lammers, S; Landsberg, G; Lebrun, P; Lee, W M; Leflat, A; Lellouch, J; Li, J; Li, L; Li, Q Z; Lietti, S M; Lim, J K; Lincoln, D; Linnemann, J; Lipaev, V V; Lipton, R; Liu, Y; Liu, Z; Lobodenko, A; Lokajicek, M; Love, P; Lubatti, H J; Luna-Garcia, R; Lyon, A L; Maciel, A K A; Mackin, D; Mättig, P; Magerkurth, A; Mal, P K; Malbouisson, H B; Malik, S; Malyshev, V L; Maravin, Y; Martin, B; McCarthy, R; McGivern, C L; Meijer, M M; Melnitchouk, A; Mendoza, L; Menezes, D; Mercadante, P G; Merkin, M; Merritt, K W; Meyer, A; Meyer, J; Mitrevski, J; Mommsen, R K; Mondal, N K; Moore, R W; Moulik, T; Muanza, G S; Mulhearn, M; Mundal, O; Mundim, L; Nagy, E; Naimuddin, M; Narain, M; Neal, H A; Negret, J P; Neustroev, P; Nilsen, H; Nogima, H; Novaes, S F; Nunnemann, T; Obrant, G; Ochando, C; Onoprienko, D; Orduna, J; Oshima, N; Osman, N; Osta, J; Otec, R; Otero Y Garzón, 
G J; Owen, M; Padilla, M; Padley, P; Pangilinan, M; Parashar, N; Park, S-J; Park, S K; Parsons, J; Partridge, R; Parua, N; Patwa, A; Pawloski, G; Penning, B; Perfilov, M; Peters, K; Peters, Y; Pétroff, P; Piegaia, R; Piper, J; Pleier, M-A; Podesta-Lerma, P L M; Podstavkov, V M; Pogorelov, Y; Pol, M-E; Polozov, P; Popov, A V; Potter, C; Prado da Silva, W L; Protopopescu, S; Qian, J; Quadt, A; Quinn, B; Rakitine, A; Rangel, M S; Ranjan, K; Ratoff, P N; Renkel, P; Rich, P; Rijssenbeek, M; Ripp-Baudot, I; Rizatdinova, F; Robinson, S; Rodrigues, R F; Rominsky, M; Royon, C; Rubinov, P; Ruchti, R; Safronov, G; Sajot, G; Sánchez-Hernández, A; Sanders, M P; Sanghi, B; Savage, G; Sawyer, L; Scanlon, T; Schaile, D; Schamberger, R D; Scheglov, Y; Schellman, H; Schliephake, T; Schlobohm, S; Schwanenberger, C; Schwienhorst, R; Sekaric, J; Severini, H; Shabalina, E; Shamim, M; Shary, V; Shchukin, A A; Shivpuri, R K; Siccardi, V; Simak, V; Sirotenko, V; Skubic, P; Slattery, P; Smirnov, D; Snow, G R; Snow, J; Snyder, S; Söldner-Rembold, S; Sonnenschein, L; Sopczak, A; Sosebee, M; Soustruznik, K; Spurlock, B; Stark, J; Stolin, V; Stoyanova, D A; Strandberg, J; Strandberg, S; Strang, M A; Strauss, E; Strauss, M; Ströhmer, R; Strom, D; Stutte, L; Sumowidagdo, S; Svoisky, P; Takahashi, M; Tanasijczuk, A; Taylor, W; Tiller, B; Tissandier, F; Titov, M; Tokmenin, V V; Torchiani, I; Tsybychev, D; Tuchming, B; Tully, C; Tuts, P M; Unalan, R; Uvarov, L; Uvarov, S; Uzunyan, S; Vachon, B; van den Berg, P J; Van Kooten, R; van Leeuwen, W M; Varelas, N; Varnes, E W; Vasilyev, I A; Verdier, P; Vertogradov, L S; Verzocchi, M; Vilanova, D; Vint, P; Vokac, P; Voutilainen, M; Wagner, R; Wahl, H D; Wang, M H L S; Warchol, J; Watts, G; Wayne, M; Weber, G; Weber, M; Welty-Rieger, L; Wenger, A; Wetstein, M; White, A; Wicke, D; Williams, M R J; Wilson, G W; Wimpenny, S J; Wobisch, M; Wood, D R; Wyatt, T R; Xie, Y; Xu, C; Yacoob, S; Yamada, R; Yang, W-C; Yasuda, T; Yatsunenko, Y A; Ye, Z; Yin, H; Yip, K; 
Yoo, H D; Youn, S W; Yu, J; Zeitnitz, C; Zelitch, S; Zhao, T; Zhou, B; Zhu, J; Zielinski, M; Zieminska, D; Zivkovic, L; Zutshi, V; Zverev, E G
2009-06-26
We present a search for the standard model Higgs boson using hadronically decaying tau leptons, in 1 fb^-1 of data collected with the D0 detector at the Fermilab Tevatron pp̄ collider. We select two final states: tau+/- plus missing transverse energy and b jets, and tau+ tau- plus jets. These final states are sensitive to a combination of associated W/Z boson plus Higgs boson, vector boson fusion, and gluon-gluon fusion production processes. The observed ratio of the combined limit on the Higgs production cross section at the 95% C.L. to the standard model expectation is 29 for a Higgs boson mass of 115 GeV.
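A limit quoted as "N times the standard model expectation" can be illustrated with a toy one-bin Poisson counting experiment (the numbers below are hypothetical; the real D0 analysis combines many channels with far more elaborate statistical methods):

```python
from scipy.stats import poisson

# Toy one-bin counting experiment (hypothetical numbers, not the D0
# analysis): n_obs observed events, b expected background, s_sm the
# signal yield predicted by the standard model at the tested mass.
n_obs, b, s_sm = 8, 7.2, 0.25

def upper_limit(n, bkg, cl=0.95, step=0.01):
    """Smallest signal s with P(N <= n | bkg + s) < 1 - cl: a classical
    one-sided Poisson upper limit with the background taken as known."""
    s = 0.0
    while poisson.cdf(n, bkg + s) >= 1.0 - cl:
        s += step
    return s

s95 = upper_limit(n_obs, b)   # 95% CL limit on the signal yield
ratio = s95 / s_sm            # the limit in units of the SM expectation
```

A ratio well above 1 means the experiment is not yet sensitive to a standard-model-rate signal at that mass.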
Search for the Standard Model Higgs boson decay to μ +μ - with the ATLAS detector
Aad, G.
2014-09-08
In this study, a search is reported for Higgs boson decay to μ+μ- using data with an integrated luminosity of 24.8 fb^-1 collected with the ATLAS detector in pp collisions at √s = 7 and 8 TeV at the CERN Large Hadron Collider. The observed dimuon invariant mass distribution is consistent with the Standard Model background-only hypothesis in the 120–150 GeV search range. For a Higgs boson with a mass of 125.5 GeV, the observed (expected) upper limit at the 95% confidence level is 7.0 (7.2) times the Standard Model expectation. This corresponds to an upper limit on the branching ratio BR(H→μ+μ-) of 1.5×10^-3.
Search for the minimal standard model Higgs boson in e +e - collisions at LEP
NASA Astrophysics Data System (ADS)
Akrawy, M. Z.; Alexander, G.; Allison, J.; Allport, P. P.; Anderson, K. J.; Armitage, J. C.; Arnison, G. T. J.; Ashton, P.; Azuelos, G.; Baines, J. T. M.; Ball, A. H.; Banks, J.; Barker, G. J.; Barlow, R. J.; Batley, J. R.; Beck, A.; Becker, J.; Behnke, T.; Bell, K. W.; Bella, G.; Bethke, S.; Biebel, O.; Binder, U.; Bloodworth, I. J.; Bock, P.; Breuker, H.; Brown, R. M.; Brun, R.; Buijs, A.; Burckhart, H. J.; Capiluppi, P.; Carnegie, R. K.; Carter, A. A.; Carter, J. R.; Chang, C. Y.; Charlton, D. G.; Chrin, J. T. M.; Clarke, P. E. L.; Cohen, I.; Collins, W. J.; Conboy, J. E.; Couch, M.; Coupland, M.; Cuffiani, M.; Dado, S.; Dallavalle, G. M.; Debu, P.; Deninno, M. M.; Dieckman, A.; Dittmar, M.; Dixit, M. S.; Duchovni, E.; Duerdoth, I. P.; Dumas, D. J. P.; Elcombe, P. A.; Estabrooks, P. G.; Etzion, E.; Fabbri, F.; Farthouat, P.; Fischer, H. M.; Fong, D. G.; French, M. T.; Fukunaga, C.; Gaidot, A.; Ganel, O.; Gary, J. W.; Gascon, J.; Geddes, N. I.; Gee, C. N. P.; Geich-Gimbel, C.; Gensler, S. W.; Gentit, F. X.; Giacomelli, G.; Gibson, V.; Gibson, W. R.; Gillies, J. D.; Goldberg, J.; Goodrick, M. J.; Gorn, W.; Granite, D.; Gross, E.; Grunhaus, J.; Hagedorn, H.; Hagemann, J.; Hansroul, M.; Hargrove, C. K.; Harrus, I.; Hart, J.; Hattersley, P. M.; Hauschild, M.; Hawkes, C. M.; Heflin, E.; Hemingway, R. J.; Heuer, R. D.; Hill, J. C.; Hillier, S. J.; Ho, C.; Hobbs, J. D.; Hobson, P. R.; Hochman, D.; Holl, B.; Homer, R. J.; Hou, S. R.; Howarth, C. P.; Hughes-Jones, R. E.; Humbert, R.; Igo-Kemenes, P.; Ihssen, H.; Imrie, D. C.; Janissen, L.; Jawahery, A.; Jeffreys, P. W.; Jeremie, H.; Jimack, M.; Jobes, M.; Jones, R. W. L.; Jovanovic, P.; Karlen, D.; Kawagoe, K.; Kawamoto, T.; Kellogg, R. G.; Kennedy, B. W.; Kleinwort, C.; Klem, D. E.; Knop, G.; Kobayashi, T.; Kokott, T. P.; Köpke, L.; Kowalewski, R.; Kreutzmann, H.; Kroll, J.; Kuwano, M.; Kyberd, P.; Lafferty, G. D.; Lamarche, F.; Larson, W. J.; Layter, J. G.; Le Du, P.; Leblanc, P.; Lee, A. M.; Lehto, M. 
H.; Lellouch, D.; Lennert, P.; Lessard, L.; Levinson, L.; Lloyd, S. L.; Loebinger, F. K.; Lorah, J. M.; Lorazo, B.; Losty, M. J.; Ludwig, J.; Ma, J.; Macbeth, A. A.; Mannelli, M.; Marcellini, S.; Maringer, G.; Martin, A. J.; Martin, J. P.; Mashimo, T.; Mättig, P.; Maur, U.; McMahon, T. J.; McNutt, J. R.; Meijers, F.; Menszner, D.; Merritt, F. S.; Mes, H.; Michelini, A.; Middleton, R. P.; Mikenberg, G.; Mildenberger, J.; Miller, D. J.; Milstene, C.; Minowa, M.; Mohr, W.; Montanari, A.; Mori, T.; Moss, M. W.; Murphy, P. G.; Murray, W. J.; Nellen, B.; Nguyen, H. H.; Nozaki, M.; O'Dowd, A. J. P.; O'Neale, S. W.; O'Neill, B. P.; Oakham, F. G.; Odorici, F.; Ogg, M.; Oh, H.; Oreglia, M. J.; Orito, S.; Pansart, J. P.; Patrick, G. N.; Pawley, S. J.; Pfister, P.; Pilcher, J. E.; Pinfold, J. L.; Plane, D. E.; Poli, B.; Pouladdej, A.; Prebys, E.; Pritchard, T. W.; Quast, G.; Raab, J.; Redmond, M. W.; Rees, D. L.; Regimbald, M.; Riles, K.; Roach, C. M.; Robins, S. A.; Rollnik, A.; Roney, J. M.; Rossberg, S.; Rossi, A. M.; Routenburg, P.; Runge, K.; Runolfsson, O.; Sanghera, S.; Sansum, R. A.; Sasaki, M.; Saunders, B. J.; Schaile, A. D.; Schaile, O.; Schappert, W.; Scharff-Hansen, P.; Schreiber, S.; Schwarz, J.; Shapira, A.; Shen, B. C.; Sherwood, P.; Simon, A.; Singh, P.; Siroli, G. P.; Skuja, A.; Smith, A. M.; Smith, T. J.; Snow, G. A.; Springer, R. W.; Sproston, M.; Stephens, K.; Stier, H. E.; Stroehmer, R.; Strom, D.; Takeda, H.; Takeshita, T.; Taras, P.; Thackray, N. J.; Tsukamoto, T.; Turner, M. F.; Tysarczyk-Niemeyer, G.; Van den plas, D.; VanDalen, G. J.; Van Kooten, R.; Vasseur, G.; Virtue, C. J.; von der Schmitt, H.; von Krogh, J.; Wagner, A.; Wahl, C.; Walker, J. P.; Ward, C. P.; Ward, D. R.; Watkins, P. M.; Watson, A. T.; Watson, N. K.; Weber, M.; Weisz, S.; Wells, P. S.; Wermes, N.; Weymann, M.; Wilson, G. W.; Wilson, J. A.; Wingerter, I.; Winterer, V.-H.; Wood, N. C.; Wotton, S.; Wuensch, B.; Wyatt, T. 
R.; Yaari, R.; Yang, Y.; Yekutieli, G.; Yoshida, T.; Zeuner, W.; Zorn, G. T.; OPAL Collaboration
1991-01-01
A search for the minimal standard model Higgs boson (H0) has been performed with data from e+e- collisions in the OPAL detector at LEP. The analysis is based on approximately 8 pb^-1 of data taken at centre-of-mass energies between 88.2 and 95.0 GeV. The search concentrated on the reaction e+e- → (e+e-, μ+μ-, νν̄ or τ+τ-) H0, with H0 → (qq̄ or τ+τ-), for Higgs boson masses above 25 GeV/c2. No Higgs boson candidates have been observed. The present study, combined with previous OPAL publications, excludes the existence of a standard model Higgs boson with mass in the range 3 < mH0 < 44 GeV/c2 at the 95% confidence level.
Mass limits for a standard model Higgs Boson in e+e- collisions at LEP
NASA Astrophysics Data System (ADS)
Akrawy, M. Z.; Alexander, G.; Allison, J.; Allport, P. P.; Anderson, K. J.; Armitage, J. C.; Arnison, G. T. J.; Ashton, P.; Azuelos, G.; Baines, J. T. M.; Ball, A. H.; Banks, J.; Barker, G. J.; Barlow, R. J.; Batley, J. R.; Bavaria, G.; Beck, F.; Bell, K. W.; Bella, G.; Bethke, S.; Biebel, O.; Bloodworth, I. J.; Bock, P.; Breuker, H.; Brown, R. M.; Brun, R.; Buijs, A.; Burckhart, H. J.; Capiluppi, P.; Carnegie, R. K.; Carter, A. A.; Carter, J. R.; Chang, C. Y.; Charlton, D. G.; Chrin, J. T. M.; Cohen, I.; Conboy, J. E.; Couch, M.; Coupland, M.; Cuffiani, M.; Dado, S.; Dallavalle, G. M.; Davies, O. W.; Deninno, M. M.; Dieckmann, A.; Dittmar, M.; Dixit, M. S.; Duchesneau, D.; Duchovni, E.; Duerdoth, I. P.; Dumas, D.; El Mamouni, H.; Elcombe, P. A.; Estabrooks, P. G.; Etzion, E.; Fabbri, F.; Farthouat, P.; Fischer, H. M.; Fong, D. G.; French, M. T.; Fukunaga, C.; Gandois, B.; Ganel, O.; Gary, J. W.; Geddes, N. I.; Gee, C. N. P.; Geich-Gimbel, C.; Gensler, S. W.; Gentit, F. X.; Giacomelli, G.; Gibson, W. R.; Gillies, J. D.; Goldberg, J.; Goodrick, M. J.; Gorn, W.; Granite, D.; Gross, E.; Grosse-Wiesmann, P.; Grunhaus, J.; Hagedorn, H.; Hagemann, J.; Hansroul, M.; Hargrove, C. K.; Hart, J.; Hattersley, P. M.; Hatzifotiadou, D.; Hauschild, M.; Hawkes, C. M.; Heflin, E.; Heintze, J.; Hemingway, R. J.; Heuer, R. D.; Hill, J. C.; Hillier, S. J.; Hinde, P. S.; Ho, C.; Hobbs, J. D.; Hobson, P. R.; Hochman, D.; Holl, B.; Homer, R. J.; Hou, S. R.; Howarth, C. P.; Hughes-Jones, R. E.; Igo-Kemenes, P.; Imori, M.; Imrie, D. C.; Jawahery, A.; Jeffreys, P. W.; Jeremie, H.; Jimack, M.; Jin, E.; Jobes, M.; Jones, R. W. L.; Jovanovic, P.; Karlen, D.; Kawagoe, K.; Kawamoto, T.; Kellogg, R. G.; Kennedy, B. W.; Kleinwort, C.; Klem, D. E.; Knop, G.; Kobayashi, T.; Köpke, L.; Kokott, T. P.; Koshiba, M.; Kowalewski, R.; Kreutzmann, H.; von Krogh, J.; Kroll, J.; Kyberd, P.; Lafferty, G. D.; Lamarche, F.; Larson, W. J.; Lasota, M. M. B.; Layter, J. 
G.; Le Du, P.; Leblanc, P.; Lellouch, D.; Lennert, P.; Lessard, L.; Levinson, L.; Lloyd, S. L.; Loebinger, F. K.; Lorah, J. M.; Lorazo, B.; Losty, M. J.; Ludwig, J.; Ma, J.; MacBeth, A. A.; Mannelli, M.; Marcellini, S.; Maringer, G.; Martin, J. P.; Mashimo, T.; Mättig, P.; Maur, U.; McMahon, T. J.; McPherson, A. C.; Meijers, F.; Menszner, D.; Merritt, F. S.; Mes, H.; Michelini, A.; Middleton, R. P.; Mikenberg, G.; Miller, D. J.; Milstene, C.; Minowa, M.; Mohr, W.; Montanari, A.; Mori, T.; Moss, M. W.; Muller, A.; Murphy, P. G.; Murray, W. J.; Nellen, B.; Nguyen, H. H.; Nozaki, M.; O'Dowd, A. J. P.; O'Neale, S. W.; O'Neill, B.; Oakham, F. G.; Odorici, F.; Ogg, M.; Oh, H.; Oreglia, M. J.; Orito, S.; Patrick, G. N.; Pawley, S. J.; Pilcher, J. E.; Pinfold, J. L.; Plane, D. E.; Poli, B.; Possoz, A.; Pouladdej, A.; Pritchard, T. W.; Quast, G.; Raab, J.; Redmond, M. W.; Rees, D. L.; Regimbald, M.; Riles, K.; Roach, C. M.; Roehner, F.; Rollnik, A.; Roney, J. M.; Rossi, A. M.; Routenburg, P.; Runge, K.; Runolfsson, O.; Sanghera, S.; Sansum, R. A.; Sasaki, M.; Saunders, B. J.; Schaile, A. D.; Schaile, O.; Schappert, W.; Scharff-Hansen, P.; von der Schmitt, H.; Schreiber, S.; Schwarz, J.; Shapira, A.; Shen, B. C.; Sherwood, P.; Simon, A.; Siroli, G. P.; Skuja, A.; Smith, A. M.; Smith, T. J.; Snow, G. A.; Spreadbury, E. J.; Springer, R. W.; Sproston, M.; Stephens, K.; Steuerer, J.; Stier, H. E.; Ströhmer, R.; Strom, D.; Takeda, H.; Takeshita, T.; Tsukamoto, T.; Turner, M. F.; Tysarczyk, G.; van den Plas, D.; Vandalen, G. J.; Virtue, C. J.; Wagner, A.; Wahl, C.; Wang, H.; Ward, C. P.; Ward, D. R.; Waterhouse, J.; Watkins, P. M.; Watson, A. T.; Watson, N. K.; Weber, M.; Weisz, S.; Wermes, N.; Weymann, M.; Wilson, G. W.; Wilson, J. A.; Wingerter, I.; Winterer, V.-H.; Wood, N. C.; Wotton, S.; Wuensch, B.; Wyatt, T. R.; Yaari, R.; Yamashita, H.; Yang, Y.; Yekutieli, G.; Zeuner, W.; Zorn, G. T.; Zylberajch, S.
1990-02-01
A search for the minimal standard model Higgs boson has been performed with data from e+e- collisions in the OPAL detector at LEP. The analysis is based on 825 nb^-1 of data taken at centre-of-mass energies between 88.3 and 95.0 GeV. The search concentrated on the reactions e+e- → (e+e-, μ+μ- or νν̄) H0, with H0 → (qq̄ or τ+τ-), for Higgs masses above 3 GeV/c2. No Higgs boson candidates have been observed. The present study excludes the existence of a standard model H0 with mass in the range 3.0 ≤ mH0 ≤ 19.3 GeV/c2 at the 95% confidence level.
Form Factor Measurements at BESIII for an Improved Standard Model Prediction of the Muon g-2
NASA Astrophysics Data System (ADS)
Destefanis, Marco
The anomalous part of the magnetic moment of the muon, (g-2)μ, allows for one of the most precise tests of the Standard Model of particle physics. We report on recent results by the BESIII Collaboration on exclusive hadronic cross section channels, such as the 2π, 3π, and 4π final states. These measurements are of utmost importance for an improved calculation of the hadronic vacuum polarization contribution to (g-2)μ, which currently limits the overall Standard Model prediction of this quantity. BESIII has also initiated a programme of spacelike transition form factor measurements, which can be used for a determination of the hadronic light-by-light contribution to (g-2)μ in a data-driven approach. These results are of particular relevance in view of the new direct measurements of (g-2)μ foreseen at Fermilab (USA) and J-PARC (Japan).
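The role of the measured hadronic cross sections can be seen in the leading-order dispersion relation for the hadronic vacuum polarization contribution (written here in one common convention; K(s) is a known QED kernel function and the integral starts at the hadronic threshold):

```latex
a_\mu^{\mathrm{HVP,LO}}
  = \frac{\alpha^2}{3\pi^2}
    \int_{s_{\mathrm{thr}}}^{\infty} \frac{\mathrm{d}s}{s}\, K(s)\, R(s),
\qquad
R(s) = \frac{\sigma(e^+e^- \to \mathrm{hadrons})}
            {\sigma(e^+e^- \to \mu^+\mu^-)}
```

Because K(s)/s weights low energies heavily, the exclusive channels BESIII measures (2π, 3π, 4π) dominate the uncertainty of the integral.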
NASA Astrophysics Data System (ADS)
Chiang, Cheng-Wei; Ramsey-Musolf, Michael J.; Senaha, Eibun
2018-01-01
We analyze the theoretical and phenomenological considerations for the electroweak phase transition and dark matter in an extension of the standard model with a complex scalar singlet (cxSM). In contrast with earlier studies, we use a renormalization group improved scalar potential and treat its thermal history in a gauge-invariant manner. We find that the parameter space consistent with a strong first-order electroweak phase transition (SFOEWPT) and present dark matter phenomenological constraints is significantly restricted compared to results of a conventional, gauge-noninvariant analysis. In the simplest variant of the cxSM, recent LUX data and a SFOEWPT require a dark matter mass close to half the mass of the standard model-like Higgs boson. We also comment on various caveats regarding the perturbative treatment of the phase transition dynamics.
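The "strong first-order" requirement is conventionally expressed through the washout criterion below (a common but gauge-dependent condition; the gauge-invariant treatment used in the paper recasts it in terms of gauge-invariant quantities):

```latex
% Conventional strength criterion for a strong first-order EWPT:
% the Higgs vacuum expectation value at the critical temperature
% must satisfy
\frac{v(T_c)}{T_c} \gtrsim 1,
% which suppresses the sphaleron rate in the broken phase enough to
% preserve a generated baryon asymmetry. Since v(T_c) is gauge
% dependent, gauge-invariant analyses replace this ratio with
% gauge-invariant order parameters.
```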
Lepton number violation in theories with a large number of standard model copies
Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich
2011-03-01
We examine lepton number violation (LNV) in theories with a saturated black hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, lepton number violation can be a potential phenomenological problem of this N-copy extension of the standard model, since, due to the low quantum gravity scale, black holes may induce TeV-scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U(1)_{B-L}. Then, due to a specific compensation mechanism between contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.
Guenter, Peggi; Boullata, Joseph I; Ayers, Phil; Gervasio, Jane; Malone, Ainsley; Raymond, Erica; Holcombe, Beverly; Kraft, Michael; Sacks, Gordon; Seres, David
2015-08-01
Parenteral nutrition (PN) provision is complex, as it is a high-alert medication and prone to a variety of potential errors. With changes in clinical practice models and recent federal rulings, the number of PN prescribers may be increasing. Safe prescribing of this therapy requires that competency for prescribers from all disciplines be demonstrated using a standardized process. A standardized model for PN prescribing competency is proposed based on a competency framework, the American Society for Parenteral and Enteral Nutrition (A.S.P.E.N.)-published interdisciplinary core competencies, safe practice recommendations, and clinical guidelines. This framework will guide institutions and agencies in developing and maintaining competency for safe PN prescription by their staff. © 2015 American Society for Parenteral and Enteral Nutrition.
Aerodynamic characteristics of the standard dynamics model in coning motion at Mach 0.6
NASA Technical Reports Server (NTRS)
Jermey, C.; Schiff, L. B.
1985-01-01
A wind tunnel test was conducted on the Standard Dynamics Model (a simplified generic fighter aircraft shape) undergoing coning motion at Mach 0.6. Six-component force and moment data are presented for a range of angles of attack, sideslip, and coning rates. At the relatively low non-dimensional coning rate employed (ωb/2V ≤ 0.04), the lateral aerodynamic characteristics generally show a linear variation with coning rate.
Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories
Wells, James
The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop an even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations will be presented to the community of Higgs boson observables that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at the LHC, and discern viable theories of physics beyond the Standard Model that unify physics at a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data become incompatible with the Standard Model's low-energy effective Lagrangian, new physics theories will be developed that explain the anomaly and put it into a more unified
Bouwman, R W; van Engen, R E; Young, K C; Veldkamp, W J H; Dance, D R
2015-01-07
Slabs of polymethyl methacrylate (PMMA) or a combination of PMMA and polyethylene (PE) slabs are used to simulate standard model breasts for the evaluation of the average glandular dose (AGD) in digital mammography (DM) and digital breast tomosynthesis (DBT). These phantoms are optimized for the energy spectra used in DM and DBT, which normally have a lower average energy than those used in contrast enhanced digital mammography (CEDM). In this study we have investigated whether these phantoms can be used for the evaluation of AGD with the high-energy x-ray spectra used in CEDM. For this purpose the calculated values of the incident air kerma for dosimetry phantoms and standard model breasts were compared in a zero degree projection with the use of an anti-scatter grid. It was found that the difference in incident air kerma compared to standard model breasts ranges from -10% to +4% for PMMA slabs and from 6% to 15% for PMMA-PE slabs. The estimated systematic errors in the measured AGD for both sets of phantoms were considered sufficiently small for the evaluation of AGD in quality control procedures for CEDM. However, the systematic error can be substantial if AGD values from different phantoms are compared.
Standard model effective field theory: Integrating out neutralinos and charginos in the MSSM
NASA Astrophysics Data System (ADS)
Han, Huayong; Huo, Ran; Jiang, Minyuan; Shu, Jing
2018-05-01
We apply the covariant derivative expansion method to integrate out the neutralinos and charginos in the minimal supersymmetric Standard Model. The results are presented as a set of purely bosonic dimension-six operators in the Standard Model effective field theory. Nontrivial chirality dependence in the fermionic covariant derivative expansion is discussed carefully. The results are checked by computing the hγγ effective coupling and the electroweak oblique parameters, both using the Standard Model effective field theory with our effective operators and by direct loop calculation. In global fitting with the proposed lepton collider constraint projections, special phenomenological emphasis is placed on the gaugino mass unification scenario (M2 ≃ 2M1) and the anomaly mediation scenario (M1 ≃ 3.3M2). These results show that precision measurement experiments at future lepton colliders will provide a very useful complementary role in probing the electroweakino sector, in particular filling the gap left by traditional collider searches in the soft-lepton-plus-missing-ET channel, where the neutralino, as the lightest supersymmetric particle, is nearly degenerate with the next-to-lightest chargino/neutralino.
Fermionic extensions of the Standard Model in light of the Higgs couplings
NASA Astrophysics Data System (ADS)
Bizot, Nicolas; Frigerio, Michele
2016-01-01
As the Higgs boson properties settle, the constraints on Standard Model extensions tighten. We consider all possible new fermions that can couple to the Higgs, inspecting sets of up to four chiral multiplets. We confront them with direct collider searches, electroweak precision tests, and current knowledge of the Higgs couplings. The focus is on scenarios that may depart from the decoupling limit of very large masses and vanishing mixing, as they offer the best prospects for detection. We identify exotic chiral families that may receive a mass from the Higgs only, still in agreement with the hγγ signal strength. A mixing θ between the Standard Model and non-chiral fermions induces order-θ² deviations in the Higgs couplings. The mixing can be as large as θ ∼ 0.5 in case of custodial protection of the Z couplings or accidental cancellation in the oblique parameters. We also notice some intriguing effects for much smaller values of θ, especially in the lepton sector. Our survey includes a number of unconventional pairs of vector-like and Majorana fermions coupled through the Higgs, which may induce order-one corrections to the Higgs radiative couplings. We single out the regions of parameters where hγγ and hgg are unaffected, while the hγZ signal strength is significantly modified, becoming a few times larger than in the Standard Model in two cases. The second run of the LHC will effectively test most of these scenarios.
Creating Better Child Care Jobs: Model Work Standards for Teaching Staff in Center-Based Child Care.
ERIC Educational Resources Information Center
Center for the Child Care Workforce, Washington, DC.
This document presents model work standards articulating components of the child care center-based work environment that enable teachers to do their jobs well. These standards establish criteria to assess child care work environments and identify areas to improve in order to assure good jobs for adults and good care for children. The standards are…
A refined 'standard' thermal model for asteroids based on observations of 1 Ceres and 2 Pallas
NASA Technical Reports Server (NTRS)
Lebofsky, Larry A.; Sykes, Mark V.; Tedesco, Edward F.; Veeder, Glenn J.; Matson, Dennis L.
1986-01-01
An analysis of ground-based thermal IR observations of 1 Ceres and 2 Pallas in light of their recently determined occultation diameters and small amplitude light curves has yielded a new value for the IR beaming parameter employed in the standard asteroid thermal emission model which is significantly lower than the previous one. When applied to the reduction of thermal IR observations of other asteroids, this new value is expected to yield model diameters closer to actual values. The present formulation incorporates the IAU magnitude convention for asteroids that employs zero-phase magnitudes, including the opposition effect.
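For context, the beaming parameter enters the standard thermal model through the subsolar temperature. Below is a minimal sketch of that relation, not the authors' code; the albedo and emissivity values are illustrative assumptions, with η = 0.756 the refined value quoted from this work and η ≈ 0.9 the earlier one:

```python
# Subsolar temperature in the standard thermal model (STM):
#   T_ss = [ (1 - A) * S / (eta * eps * sigma) ]^(1/4)
# where A is the bolometric albedo, S the solar flux at the asteroid,
# eta the IR beaming parameter, eps the emissivity, sigma Stefan-Boltzmann.

SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0           # solar constant at 1 au, W m^-2

def subsolar_temp(r_au, albedo, eta, emissivity=0.9):
    """STM subsolar temperature (K) at heliocentric distance r_au."""
    flux = S0 / r_au**2
    return ((1.0 - albedo) * flux / (eta * emissivity * SIGMA)) ** 0.25

# Illustrative values for a Ceres-like body at 2.77 au with A ~ 0.09:
t_old = subsolar_temp(2.77, 0.09, eta=0.9)    # earlier beaming value
t_new = subsolar_temp(2.77, 0.09, eta=0.756)  # refined (lower) value
# A lower eta implies a hotter model surface, so a smaller diameter is
# needed to reproduce a given observed thermal flux.
```

This makes the stated consequence concrete: lowering η raises the model temperature, which is why the revised value yields model diameters closer to the occultation values.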
Testing the Standard Model by precision measurement of the weak charges of quarks
Ross Young; Roger Carlini; Anthony Thomas
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, limits the magnitude of possible contributions from physics beyond the Standard Model, setting a model-independent lower bound on the scale of new physics at ~1 TeV.
Imagining the future, or how the Standard Model may survive the attacks
NASA Astrophysics Data System (ADS)
't Hooft, Gerard
2016-06-01
After the last missing piece, the Higgs particle, has probably been identified, the Standard Model of the subatomic particles appears to be a quite robust structure, that can survive on its own for a long time to come. Most researchers expect considerable modifications and improvements to come in the near future, but it could also be that the Model will stay essentially as it is. This, however, would also require a change in our thinking, and the question remains whether and how it can be reconciled with our desire for our theories to be “natural”.
runDM: Running couplings of Dark Matter to the Standard Model
NASA Astrophysics Data System (ADS)
D'Eramo, Francesco; Kavanagh, Bradley J.; Panci, Paolo
2018-02-01
runDM calculates the running of the couplings of Dark Matter (DM) to the Standard Model (SM) in simplified models with vector mediators. By specifying the mass of the mediator and the couplings of the mediator to SM fields at high energy, the code can calculate the couplings at low energy, taking into account the mixing of all dimension-6 operators. runDM can also extract the operator coefficients relevant for direct detection, namely low energy couplings to up, down and strange quarks and to protons and neutrons.
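The quoted description is of operator mixing under renormalization group running. The sketch below is not runDM's actual API; it is a hedged, self-contained toy showing the underlying mechanism, with a hypothetical 2×2 anomalous-dimension matrix (the real SM mixing matrices are larger and fixed by the theory):

```python
import math

def run_couplings(c_high, gamma, mu_high, mu_low, steps=10000):
    """Euler-integrate dc/dln(mu) = gamma @ c from mu_high down to mu_low.

    c_high: coupling vector at the high scale
    gamma:  anomalous-dimension matrix (rows); toy values here
    """
    t0, t1 = math.log(mu_high), math.log(mu_low)
    dt = (t1 - t0) / steps          # negative when running down in scale
    c = list(c_high)
    for _ in range(steps):
        dc = [sum(g * cj for g, cj in zip(row, c)) for row in gamma]
        c = [ci + dt * dci for ci, dci in zip(c, dc)]
    return c

# Toy example: operator 2 starts with zero coefficient at the mediator
# scale but is generated at low energy through mixing with operator 1.
gamma = [[0.01, 0.00],
         [0.02, 0.01]]              # hypothetical mixing matrix
c_low = run_couplings([1.0, 0.0], gamma, mu_high=1000.0, mu_low=1.0)
```

This is the effect the abstract refers to: a coupling absent at high energy can still be relevant for direct detection once mixing is taken into account.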
NASA Astrophysics Data System (ADS)
de Blas, J.; Criado, J. C.; Pérez-Victoria, M.; Santiago, J.
2018-03-01
We compute all the tree-level contributions to the Wilson coefficients of the dimension-six Standard-Model effective theory in ultraviolet completions with general scalar, spinor and vector field content and arbitrary interactions. No assumption about the renormalizability of the high-energy theory is made. This provides a complete ultraviolet/infrared dictionary at the classical level, which can be used to study the low-energy implications of any model of interest, and also to look for explicit completions consistent with low-energy data.
Application of TDCR-Geant4 modeling to standardization of 63Ni.
Thiam, C; Bobin, C; Chauvenet, B; Bouchard, J
2012-09-01
As an alternative to the classical TDCR model applied to liquid scintillation (LS) counting, a stochastic approach based on the Geant4 toolkit is presented for the simulation of light emission inside the dedicated three-photomultiplier detection system. To this end, the Geant4 modeling includes a comprehensive description of the optical properties associated with each material constituting the optical chamber. The objective is to simulate the propagation of optical photons from their creation in the LS cocktail to the production of photoelectrons in the photomultipliers. First validated for the case of radionuclide standardization based on Cerenkov emission, the scintillation process has been added to the TDCR-Geant4 modeling using the Birks expression in order to account for the light-emission nonlinearity owing to ionization quenching. The scintillation yield of the commercial Ultima Gold LS cocktail has been determined from double-coincidence detection efficiencies obtained for ⁶⁰Co and ⁵⁴Mn with the 4π(LS)β-γ coincidence method. In this paper, the stochastic TDCR modeling is applied to the standardization of ⁶³Ni (a pure β⁻ emitter; E_max = 66.98 keV) and the activity concentration is compared with the result given by the classical model. Copyright © 2012 Elsevier Ltd. All rights reserved.
Comparison of distributed acceleration and standard models of cosmic-ray transport
NASA Technical Reports Server (NTRS)
Letaw, J. R.; Silberberg, R.; Tsao, C. H.
1995-01-01
Recent cosmic-ray abundance measurements for elements in the range 3 ≤ Z ≤ 28 and energies 10 MeV/n ≤ E ≤ 1 TeV/n have been analyzed with computer transport modeling. About 500 elemental and isotopic measurements have been explored in this analysis. The transport code includes the effects of ionization losses, nuclear spallation reactions (including those of secondaries), all nuclear decay modes, stripping and attachment of electrons, escape from the Galaxy, weak reacceleration, and solar modulation. Four models of reacceleration (with several submodels of various reacceleration strengths) were explored. A χ² analysis shows that the reacceleration models yield at least equally good fits to the data as the standard propagation model. However, with reacceleration, the ad hoc assumptions of the standard model regarding discontinuities in the energy dependence of the mean path length traversed by cosmic rays, and in the momentum spectrum of the cosmic-ray source, are eliminated. Furthermore, the tension between rigidity-dependent leakage and energy-independent anisotropy below energies of 10¹⁴ eV is alleviated.
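The model comparison described rests on a χ² goodness-of-fit statistic. A minimal sketch (the abundance numbers are invented for illustration, not the paper's 500 measurements):

```python
def chi_squared(observed, model, sigma):
    """chi^2 = sum over data points of ((obs - model) / sigma)^2."""
    return sum(((o - m) / s) ** 2 for o, m, s in zip(observed, model, sigma))

# Toy secondary/primary abundance ratios vs. two candidate models:
obs      = [0.32, 0.28, 0.21, 0.15]
sigma    = [0.03, 0.03, 0.02, 0.02]
standard = [0.30, 0.27, 0.22, 0.16]   # hypothetical standard-model prediction
reaccel  = [0.31, 0.28, 0.21, 0.15]   # hypothetical reacceleration prediction

chi2_std = chi_squared(obs, standard, sigma)
chi2_rea = chi_squared(obs, reaccel, sigma)
# The model with the smaller chi^2 per degree of freedom fits at least
# as well, which is the sense of the comparison in the abstract.
```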
Standard plane localization in ultrasound by radial component model and selective search.
Ni, Dong; Yang, Xin; Chen, Xin; Chin, Chien-Ting; Chen, Siping; Heng, Pheng Ann; Li, Shengli; Qin, Jing; Wang, Tianfu
2014-11-01
Acquisition of the standard plane is crucial for medical ultrasound diagnosis. However, this process requires substantial experience and a thorough knowledge of human anatomy. Therefore, it is very challenging for novices and even time consuming for experienced examiners. We proposed a hierarchical, supervised learning framework for automatically detecting the standard plane from consecutive 2-D ultrasound images. We tested this technique by developing a system that localizes the fetal abdominal standard plane from ultrasound video by detecting three key anatomical structures: the stomach bubble, umbilical vein and spine. We first proposed a novel radial component-based model to describe the geometric constraints of these key anatomical structures. We then introduced a novel selective search method which exploits the vessel probability algorithm to produce probable locations for the spine and umbilical vein. Next, using component classifiers trained by random forests, we detected the key anatomical structures at their probable locations within the regions constrained by the radial component-based model. Finally, a second-level classifier combined the results from the component detection to identify an ultrasound image as either a "fetal abdominal standard plane" or a "non-fetal abdominal standard plane." Experimental results on 223 fetal abdomen videos showed that the detection accuracy of our method was as high as 85.6% and significantly outperformed both the full abdomen and the separate anatomy detection methods without geometric constraints. The experimental results demonstrated that our system shows great promise for application to clinical practice. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
Baker, Jay B; Maskell, Kevin F; Matlock, Aaron G; Walsh, Ryan M; Skinner, Carl G
2015-07-01
We compared intubating with a preloaded bougie (PB) against standard bougie technique in terms of success rates, time to successful intubation and provider preference on a cadaveric airway model. In this prospective, crossover study, healthcare providers intubated a cadaver using the PB technique and the standard bougie technique. Participants were randomly assigned to start with either technique. Following standardized training and practice, procedural success and time for each technique was recorded for each participant. Subsequently, participants were asked to rate their perceived ease of intubation on a visual analogue scale of 1 to 10 (1=difficult and 10=easy) and to select which technique they preferred. 47 participants with variable experience intubating were enrolled at an emergency medicine intern airway course. The success rate of all groups for both techniques was equal (95.7%). The range of times to completion for the standard bougie technique was 16.0-70.2 seconds, with a mean time of 29.7 seconds. The range of times to completion for the PB technique was 15.7-110.9 seconds, with a mean time of 29.4 seconds. There was a non-significant difference of 0.3 seconds (95% confidence interval -2.8 to 3.4 seconds) between the two techniques. Participants rated the relative ease of intubation as 7.3/10 for the standard technique and 7.6/10 for the preloaded technique (p=0.53, 95% confidence interval of the difference -0.97 to 0.50). Thirty of 47 participants subjectively preferred the PB technique (p=0.039). There was no significant difference in success or time to intubation between standard bougie and PB techniques. The majority of participants in this study preferred the PB technique. Until a clear and clinically significant difference is found between these techniques, emergency airway operators should feel confident in using the technique with which they are most comfortable.
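The study's headline statistic (a mean difference with a 95% confidence interval that includes zero) can be sketched as follows; the timing samples below are invented for illustration, not the study's data, and a normal approximation stands in for the exact test the authors used:

```python
import math

def mean_diff_ci(a, b, z=1.96):
    """Approximate 95% CI for the difference in means of two samples
    (normal approximation with a Welch-style standard error; an
    illustration, not a substitute for a proper t-test)."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    se = math.sqrt(va / na + vb / nb)
    d = ma - mb
    return d, (d - z * se, d + z * se)

# Hypothetical intubation times (seconds), standard vs. preloaded bougie:
standard  = [28.0, 31.5, 26.2, 33.1, 29.9, 27.4]
preloaded = [27.5, 30.8, 26.9, 32.0, 30.2, 28.1]
diff, (lo, hi) = mean_diff_ci(standard, preloaded)
# An interval containing 0 means the difference is not statistically
# significant, as the study found (0.3 s, 95% CI -2.8 to 3.4 s).
```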
GAMBIT: the global and modular beyond-the-standard-model inference tool
NASA Astrophysics Data System (ADS)
Athron, Peter; Balazs, Csaba; Bringmann, Torsten; Buckley, Andy; Chrząszcz, Marcin; Conrad, Jan; Cornell, Jonathan M.; Dal, Lars A.; Dickinson, Hugh; Edsjö, Joakim; Farmer, Ben; Gonzalo, Tomás E.; Jackson, Paul; Krislock, Abram; Kvellestad, Anders; Lundberg, Johan; McKay, James; Mahmoudi, Farvah; Martinez, Gregory D.; Putze, Antje; Raklev, Are; Ripken, Joachim; Rogan, Christopher; Saavedra, Aldo; Savage, Christopher; Scott, Pat; Seo, Seon-Hee; Serra, Nicola; Weniger, Christoph; White, Martin; Wild, Sebastian
2017-11-01
We describe the open-source global fitting package GAMBIT: the Global And Modular Beyond-the-Standard-Model Inference Tool. GAMBIT combines extensive calculations of observables and likelihoods in particle and astroparticle physics with a hierarchical model database, advanced tools for automatically building analyses of essentially any model, a flexible and powerful system for interfacing to external codes, a suite of different statistical methods and parameter scanning algorithms, and a host of other utilities designed to make scans faster, safer and more easily extensible than in the past. Here we give a detailed description of the framework, its design and motivation, and the current models and other specific components presently implemented in GAMBIT. Accompanying papers deal with individual modules and present first GAMBIT results. GAMBIT can be downloaded from gambit.hepforge.org.
A Simple Mathematical Model for Standard Model of Elementary Particles and Extension Thereof
NASA Astrophysics Data System (ADS)
Sinha, Ashok
2016-03-01
An algebraically (and geometrically) simple model representing the masses of the elementary particles in terms of the interaction (strong, weak, electromagnetic) constants is developed, including the Higgs bosons. The predicted Higgs boson mass is identical to that discovered by LHC experimental programs; while possibility of additional Higgs bosons (and their masses) is indicated. The model can be analyzed to explain and resolve many puzzles of particle physics and cosmology including the neutrino masses and mixing; origin of the proton mass and the mass-difference between the proton and the neutron; the big bang and cosmological Inflation; the Hubble expansion; etc. A novel interpretation of the model in terms of quaternion and rotation in the six-dimensional space of the elementary particle interaction-space - or, equivalently, in six-dimensional spacetime - is presented. Interrelations among particle masses are derived theoretically. A new approach for defining the interaction parameters leading to an elegant and symmetrical diagram is delineated. Generalization of the model to include supersymmetry is illustrated without recourse to complex mathematical formulation and free from any ambiguity. This Abstract represents some results of the Author's Independent Theoretical Research in Particle Physics, with possible connection to the Superstring Theory. However, only very elementary mathematics and physics is used in my presentation.
ERIC Educational Resources Information Center
Gunzenhauser, Georg W.; And Others
The Department of Educational Foundations at Western Illinois University developed a model preservice curriculum to respond to the problem of preparing teachers who can influence their environment rather than be victims of it and who can effectively transmit that message to students. Entitled Empowerment through Cognitive and Social Management…
Physics Beyond the Standard Model: Exotic Leptons and Black Holes at Future Colliders
NASA Astrophysics Data System (ADS)
Harris, Christopher M.
2005-02-01
The Standard Model of particle physics has been remarkably successful in describing present experimental results. However, it is assumed to be only a low-energy effective theory which will break down at higher energy scales, theoretically motivated to be around 1 TeV. There are a variety of proposed models of new physics beyond the Standard Model, most notably supersymmetric and extra dimension models. New charged and neutral heavy leptons are a feature of a number of theories of new physics, including the `intermediate scale' class of supersymmetric models. Using a time-of-flight technique to detect the charged leptons at the Large Hadron Collider, the discovery range (in the particular scenario studied in the first part of this thesis) is found to extend up to masses of 950 GeV. Extra dimension models, particularly those with large extra dimensions, allow the possible experimental production of black holes. The remainder of the thesis describes some theoretical results and computational tools necessary to model the production and decay of these miniature black holes at future particle colliders. The grey-body factors which describe the Hawking radiation emitted by higher-dimensional black holes are calculated numerically for the first time and then incorporated in a Monte Carlo black hole event generator; this can be used to model black hole production and decay at next-generation colliders. It is hoped that this generator will allow more detailed examination of black hole signatures and help to devise a method for extracting the number of extra dimensions present in nature.
Dixon, John A.
2010-02-10
The SUSY breaking in Cybersusy is proportional to the VEV that breaks the gauge symmetry SU(2)×U(1) down to U(1), and it is rather specific to models like the SSM. Assuming full breaking, as explained below, for the leptons, Cybersusy predicts a spectrum of SUSY breaking that is in accord with experimental results so far. In particular, for the choice of parameters below, Cybersusy predicts that the lowest-mass superpartner for the charged leptons is a charged vector boson lepton (the Velectron), which has a mass of 316 GeV. The Selectron has a mass of 771 GeV for that choice of parameters. The theory also leads to a zero cosmological constant after SUSY breaking. The mechanism generates equations that restrict models like the SSM. This version of this paper incorporates recent results and changes discovered subsequent to the talk.
Statistical approach to Higgs boson couplings in the standard model effective field theory
NASA Astrophysics Data System (ADS)
Murphy, Christopher W.
2018-01-01
We perform a parameter fit in the standard model effective field theory (SMEFT) with an emphasis on using regularized linear regression to tackle the issue of the large number of parameters in the SMEFT. In regularized linear regression, a positive definite function of the parameters of interest is added to the usual cost function. A cross-validation is performed to try to determine the optimal value of the regularization parameter to use, but it selects the standard model (SM) as the best model to explain the measurements. Nevertheless, as a proof of principle of this technique, we apply it to fitting Higgs boson signal strengths in the SMEFT, including the latest Run-2 results. Results are presented in terms of the eigensystem of the covariance matrix of the least squares estimators, as it has a degree of model independence to it. We find several results in this initial work: the SMEFT predicts the total width of the Higgs boson to be consistent with the SM prediction; the ATLAS and CMS experiments at the LHC are currently sensitive to non-resonant double Higgs boson production. Constraints are derived on the viable parameter space for electroweak baryogenesis in the SMEFT, reinforcing the notion that a first-order phase transition requires fairly low-scale beyond-the-SM physics. Finally, we study which future experimental measurements would give the most improvement on the global constraints on the Higgs sector of the SMEFT.
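Regularized linear regression as described here adds a penalty to the least-squares cost, shrinking the fitted coefficients. A minimal ridge-regression sketch on toy, nearly collinear data (not the SMEFT fit itself; the data and penalty value are invented):

```python
def ridge_fit(X, y, lam):
    """Solve (X^T X + lam * I) beta = X^T y for a 2-parameter model,
    using an explicit 2x2 inverse for clarity."""
    a11 = sum(r[0] * r[0] for r in X) + lam
    a12 = sum(r[0] * r[1] for r in X)
    a22 = sum(r[1] * r[1] for r in X) + lam
    b1 = sum(r[0] * yi for r, yi in zip(X, y))
    b2 = sum(r[1] * yi for r, yi in zip(X, y))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Toy data generated exactly by y = 1.0 * x1 + 0.5 * x2, with the two
# columns strongly correlated (the regime where regularization helps).
X = [[1.0, 0.9], [2.0, 1.8], [3.0, 3.1], [4.0, 3.9]]
y = [1.45, 2.90, 4.55, 5.95]
beta_ols   = ridge_fit(X, y, lam=0.0)   # ordinary least squares
beta_ridge = ridge_fit(X, y, lam=1.0)   # shrunk toward zero
# Increasing lam shrinks the coefficient vector, trading bias for
# variance; cross-validation on held-out error is one way to pick lam.
```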
NASA Astrophysics Data System (ADS)
Signell, R. P.; Camossi, E.
2015-11-01
Work over the last decade has resulted in standardized web services and tools that can significantly improve the efficiency and effectiveness of working with meteorological and ocean model data. While many operational modelling centres have enabled query and access to data via common web services, most small research groups have not. The penetration of this approach into the research community, where IT resources are limited, can be dramatically improved by: (1) making it simple for providers to enable web service access to existing output files; (2) using technology that is free, and that is easy to deploy and configure; and (3) providing tools to communicate with web services that work in existing research environments. We present a simple, local brokering approach that lets modelers continue producing custom data, but virtually aggregates and standardizes the data using NetCDF Markup Language. The THREDDS Data Server is used for data delivery, pycsw for data search, NCTOOLBOX (Matlab®)¹ and Iris (Python) for data access, and Open Geospatial Consortium Web Map Service for data preview. We illustrate the effectiveness of this approach with two use cases involving small research modelling groups at NATO and USGS. ¹Mention of trade names or commercial products does not constitute endorsement or recommendation for use by the US Government.
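The NetCDF Markup Language aggregation this abstract describes can be very small. A minimal sketch of such a virtual aggregation, where the directory path and time-dimension name are illustrative assumptions:

```xml
<!-- Virtually aggregate a directory of per-run NetCDF files along the
     time dimension, without rewriting the files; THREDDS then serves
     the result as a single dataset. -->
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2">
  <aggregation dimName="time" type="joinExisting">
    <scan location="/data/model_output/" suffix=".nc" />
  </aggregation>
</netcdf>
```

This is the "brokering" step: the modeler's custom files stay as they are, and the NcML layer presents them with standardized access.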
Surgical stent planning: simulation parameter study for models based on DICOM standards.
Scherer, S; Treichel, T; Ritter, N; Triebel, G; Drossel, W G; Burgert, O
2011-05-01
Endovascular Aneurysm Repair (EVAR) can be facilitated by a realistic simulation model of stent-vessel interaction. Therefore, numerical feasibility and integrability in the clinical environment were evaluated. The finite element method was used to determine the necessary simulation parameters for stent-vessel interaction in EVAR. Input variables and result data of the simulation model were examined for their standardization using DICOM supplements. The study identified four essential parameters for the stent-vessel simulation: blood pressure, intima constitution, plaque occurrence, and the material properties of vessel and plaque. Output quantities such as the radial force of the stent and the contact pressure between stent and vessel can help the surgeon to evaluate implant fixation and sealing. The model geometry can be saved with DICOM "Surface Segmentation" objects and the upcoming "Implant Templates" supplement. Simulation results can be stored using the "Structured Report". A standards-based general simulation model for optimizing stent-graft selection may be feasible. At present, there are limitations due to the specification of individual vessel material parameters and the simulation of the proximal fixation of stent-grafts with hooks. Simulation data with clinical relevance for documentation and presentation can be stored using existing or new DICOM extensions.
V3885 Sagittarius: A Comparison With a Range of Standard Model Accretion Disks
NASA Technical Reports Server (NTRS)
Linnell, Albert P.; Godon, Patrick; Hubeny, Ivan; Sion, Edward M; Szkody, Paula; Barrett, Paul E.
2009-01-01
A chi-squared analysis of standard model accretion disk synthetic spectrum fits to combined Far Ultraviolet Spectroscopic Explorer and Space Telescope Imaging Spectrograph spectra of V3885 Sagittarius, on an absolute flux basis, selects a model that accurately represents the observed spectral energy distribution. Calculation of the synthetic spectrum requires the following system parameters. The cataclysmic variable secondary star period-mass relation calibrated by Knigge in 2006 and 2007 sets the secondary component mass. A mean white dwarf (WD) mass from the same study, which is consistent with an observationally determined mass ratio, sets the adopted WD mass of 0.7 solar masses, and the WD radius follows from standard theoretical models. The adopted inclination, i = 65 deg, is a literature consensus, and is subsequently supported by the chi-squared analysis. The mass transfer rate is the remaining parameter needed to set the accretion disk T_eff profile, and the Hipparcos parallax constrains that rate to (5.0 ± 2.0) × 10⁻⁹ solar masses/yr by comparison with the observed spectra. The fit to the observed spectra adopts the contribution of a 57,000 ± 5000 K WD. The model thus provides realistic constraints on the mass transfer rate and T_eff for a large mass transfer system above the period gap.
A Search for the Standard Model Higgs Boson Produced in Association with a $W$ Boson
Frank, Martin Johannes
2011-05-01
We present a search for a standard model Higgs boson produced in association with a W boson using data collected with the CDF II detector from pp̄ collisions at √s = 1.96 TeV. The search is performed in the WH → ℓνbb̄ channel. The two quarks usually fragment into two jets, but sometimes a third jet can be produced via gluon radiation, so we have increased the standard two-jet sample by including events that contain three jets. We reconstruct the Higgs boson using two or three jets depending on the kinematics of the event. We find an improvement in our search sensitivity using the larger sample together with this multijet reconstruction technique. Our data show no evidence of a Higgs boson, so we set 95% confidence level upper limits on the WH production rate. We set limits between 3.36 and 28.7 times the standard model prediction for Higgs boson masses ranging from 100 to 150 GeV/c².
Managing public health in the Army through a standard community health promotion council model.
Courie, Anna F; Rivera, Moira Shaw; Pompey, Allison
2014-01-01
Public health processes in the US Army remain uncoordinated due to competing lines of command, funding streams and multiple subject matter experts in overlapping public health concerns. The US Army Public Health Command (USAPHC) has identified a standard model for community health promotion councils (CHPCs) as an effective framework for synchronizing and integrating these overlapping systems to ensure a coordinated approach to managing the public health process. The purpose of this study is to test a foundational assumption of the CHPC effectiveness theory: the 3 features of a standard CHPC model - a CHPC chaired by a strong leader, i.e., the senior commander; a full-time health promotion team dedicated to the process; and centralized management through the USAPHC - will lead to high quality health promotion councils capable of providing a coordinated approach to addressing public health on Army installations. The study employed 2 evaluation questions: (1) Do CHPCs with centralized management through the USAPHC, alignment with the senior commander, and a health promotion operations team adhere more closely to the evidence-based CHPC program framework than CHPCs without these 3 features? (2) Do members of standard CHPCs report that participation in the CHPC leads to a well-coordinated approach to public health at the installation? The results revealed that both time (F(5,76)=25.02, P<.0001) and the 3 critical features of the standard CHPC model (F(1,76)=28.40, P<.0001) independently predicted program adherence. Evaluation evidence supports the USAPHC's approach to CHPC implementation as part of public health management on Army installations. Preliminary evidence suggests that the standard CHPC model may lead to a more coordinated approach to public health and may assure that CHPCs follow an evidence-informed design. This is consistent with past research demonstrating that community coalitions and public health systems that have strong leadership; dedicated staff time
40 CFR 86.099-9 - Emission standards for 1999 and later model year light-duty trucks.
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 86.099-9 Emission standards for 1999 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 18 (2011-07-01).
40 CFR 86.004-9 - Emission standards for 2004 and later model year light-duty trucks.
Code of Federal Regulations, 2013 CFR
2013-07-01
§ 86.004-9 Emission standards for 2004 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 19 (2013-07-01).
40 CFR 86.004-9 - Emission standards for 2004 and later model year light-duty trucks.
Code of Federal Regulations, 2012 CFR
2012-07-01
§ 86.004-9 Emission standards for 2004 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 19 (2012-07-01).
40 CFR 86.004-9 - Emission standards for 2004 and later model year light-duty trucks.
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 86.004-9 Emission standards for 2004 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 18 (2010-07-01).
40 CFR 86.004-9 - Emission standards for 2004 and later model year light-duty trucks.
Code of Federal Regulations, 2011 CFR
2011-07-01
§ 86.004-9 Emission standards for 2004 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 18 (2011-07-01).
40 CFR 86.099-9 - Emission standards for 1999 and later model year light-duty trucks.
Code of Federal Regulations, 2010 CFR
2010-07-01
§ 86.099-9 Emission standards for 1999 and later model year light-duty trucks. Title 40, Protection of Environment, vol. 18 (2010-07-01).
Mid-infrared interferometry of Seyfert galaxies: Challenging the Standard Model
NASA Astrophysics Data System (ADS)
López-Gonzaga, N.; Jaffe, W.
2016-06-01
Aims: We aim to find torus models that explain the observed high-resolution mid-infrared (MIR) measurements of active galactic nuclei (AGN). Our goal is to determine the general properties of the circumnuclear dusty environments. Methods: We used the MIR interferometric data of a sample of AGNs provided by the instrument MIDI/VLTI and followed a statistical approach to compare the observed distribution of the interferometric measurements with the distributions computed from clumpy torus models. We mainly tested whether the diversity of Seyfert galaxies can be described using the Standard Model idea, where differences are solely due to a line-of-sight (LOS) effect. In addition to the LOS effects, we performed different realizations of the same model to include possible variations that are caused by the stochastic nature of the dusty models. Results: We find that our entire sample of AGNs, which contains both Seyfert types, cannot be explained merely by an inclination effect and by including random variations of the clouds. Instead, we find that each subset of Seyfert type can be explained by different models, where the filling factor at the inner radius seems to be the largest difference. For the type 1 objects we find that about two thirds of our objects could also be described using a dusty torus similar to the type 2 objects. For the remaining third, it was not possible to find a good description using models with high filling factors, while we found good fits with models with low filling factors. Conclusions: Within our model assumptions, we did not find one single set of model parameters that could simultaneously explain the MIR data of all 21 AGN with LOS effects and random variations alone. We conclude that at least two distinct cloud configurations are required to model the differences in Seyfert galaxies, with volume-filling factors differing by a factor of about 5-10. A continuous transition between the two types cannot be excluded.
Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model
NASA Astrophysics Data System (ADS)
Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley
2017-05-01
Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it carries the greatest uncertainty when the disease is rare or the geographical area is small. Bayesian models, or statistical smoothing based on the log-normal model, are therefore introduced to address this problem. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method and can overcome the SMR problem when no bladder cancer cases are observed in an area.
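The SMR described above is a simple ratio, which makes its instability for small areas easy to see: an area with zero observed cases yields SMR = 0 regardless of the underlying risk, which is the problem the log-normal smoothing model addresses. A minimal sketch:

```python
def smr(observed, expected):
    """Standardized Morbidity Ratio per area: observed / expected case counts.

    observed and expected are same-length sequences, one entry per area.
    Note the instability for sparse data: zero observed cases force the
    estimate to 0, and small expected counts inflate the variance.
    """
    return [o / e for o, e in zip(observed, expected)]
```

A log-normal (or fully Bayesian) model instead treats the log relative risk as normally distributed, borrowing strength across areas to smooth these unstable raw ratios.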
Model-Invariant Hybrid Computations of Separated Flows for RCA Standard Test Cases
NASA Technical Reports Server (NTRS)
Woodruff, Stephen
2016-01-01
NASA's Revolutionary Computational Aerosciences (RCA) subproject has identified several smooth-body separated flows as standard test cases to emphasize the challenge these flows present for computational methods and their importance to the aerospace community. Results of computations of two of these test cases, the NASA hump and the FAITH experiment, are presented. The computations were performed with the model-invariant hybrid LES-RANS formulation, implemented in the NASA code VULCAN-CFD. The model-invariant formulation employs gradual LES-RANS transitions and compensation for model variation to provide more accurate and efficient hybrid computations. Comparisons revealed that the LES-RANS transitions employed in these computations were sufficiently gradual that the compensating terms were unnecessary. Agreement with experiment was achieved only after reducing the turbulent viscosity to mitigate the effect of numerical dissipation. The stream-wise evolution of peak Reynolds shear stress was employed as a measure of turbulence dynamics in separated flows useful for evaluating computations.
Standard Model and new physics for ε'_K/ε_K
NASA Astrophysics Data System (ADS)
Kitahara, Teppei
2018-05-01
The first result of the lattice simulation and improved perturbative calculations have pointed to a discrepancy between data on ε'_K/ε_K and the standard-model (SM) prediction. Several new physics (NP) models can explain this discrepancy, and such NP models are likely to predict deviations of ℬ(K → πν
Ignition-and-Growth Modeling of NASA Standard Detonator and a Linear Shaped Charge
NASA Technical Reports Server (NTRS)
Oguz, Sirri
2010-01-01
The main objective of this study is to quantitatively investigate the ignition and shock sensitivity of the NASA Standard Detonator (NSD) and the shock wave propagation of a linear shaped charge (LSC) after being shocked by the NSD flyer plate. This combined explosive train was modeled as a coupled Arbitrary Lagrangian-Eulerian (ALE) model with the LS-DYNA hydrocode. An ignition-and-growth (I&G) reactive model based on unreacted and reacted Jones-Wilkins-Lee (JWL) equations of state was used to simulate the shock initiation. Various NSD-to-LSC stand-off distances were analyzed to calculate the shock initiation (or failure to initiate) and detonation wave propagation along the shaped charge. Simulation results were verified against experimental data, which included VISAR tests for NSD flyer plate velocity measurement and an aluminum target severance test for LSC performance verification. Parameters used for the analysis were obtained from various published data or by using the CHEETAH thermo-chemical code.
NASA Astrophysics Data System (ADS)
Hawkins, Keith; Leistedt, Boris; Bovy, Jo; Hogg, David W.
2017-10-01
Distances to individual stars in our own Galaxy are critical in order to piece together the nature of its velocity and spatial structure. Core helium burning red clump (RC) stars have similar luminosities, are abundant throughout the Galaxy and thus constitute good standard candles. We build a hierarchical probabilistic model to quantify the quality of RC stars as standard candles using parallax measurements from the first Gaia data release. A unique aspect of our methodology is to fully account for (and marginalize over) parallax, photometry and dust correction uncertainties, which lead to more robust results than standard approaches. We determine the absolute magnitude and intrinsic dispersion of the RC in 2MASS bands J, H, Ks, Gaia G band and WISE bands W1, W2, W3 and W4. We find that the absolute magnitude of the RC is -1.61 ± 0.01 (in Ks), +0.44 ± 0.01 (in G), -0.93 ± 0.01 (in J), -1.46 ± 0.01 (in H), -1.68 ± 0.02 (in W1), -1.69 ± 0.02 (in W2), -1.67 ± 0.02 (in W3) and -1.76 ± 0.01 mag (in W4). The mean intrinsic dispersion is ˜0.17 ± 0.03 mag across all bands (yielding a typical distance precision of ˜8 per cent). Thus RC stars are reliable and precise standard candles. In addition, we have also re-calibrated the zero-point of the absolute magnitude of the RC in each band, which provides a benchmark for future studies to estimate distances to RC stars. Finally, the parallax error shrinkage in the hierarchical model outlined in this work can be used to obtain more precise parallaxes than Gaia for the most distant RC stars across the Galaxy.
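Given the calibrated absolute magnitudes above, distances to individual RC stars follow from the standard distance-modulus relation. A minimal sketch (the default M value is the Ks-band calibration quoted in the abstract):

```python
def distance_pc(m_apparent, M_absolute=-1.61):
    """Distance in parsecs from the distance modulus m - M = 5 log10(d) - 5.

    Default M_absolute is the red clump calibration in Ks from the abstract.
    """
    return 10 ** ((m_apparent - M_absolute + 5) / 5)

# Consistency check with the quoted ~8 per cent distance precision:
# an intrinsic dispersion of ~0.17 mag maps to a fractional distance
# error of ln(10)/5 * 0.17 ~= 0.078, i.e. about 8 per cent.
```

For example, an RC star observed at Ks = 10.0 lies at roughly 2.1 kpc under this calibration.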
Wang, Haiyin; Jin, Chunlin; Jiang, Qingwu
2017-11-20
Traditional Chinese medicine (TCM) is an important part of China's medical system. Due to the prolonged low price of TCM procedures and the lack of an effective mechanism for dynamic price adjustment, the development of TCM has markedly lagged behind Western medicine. The World Health Organization (WHO) has emphasized the need to enhance the development of alternative and traditional medicine when creating national health care systems. The establishment of scientific and appropriate mechanisms to adjust the price of medical procedures in TCM is crucial to promoting the development of TCM. This study examined the incorporation of value indicators (data on basic manpower expended, time spent, technical difficulty, and degree of risk) into the latest standards for the price of medical procedures in China, and it offers a price adjustment model with the relative price ratio as a key index. This study examined 144 TCM procedures and found that prices of TCM procedures were mainly based on the value of the medical care provided; on average, medical care provided accounted for 89% of the price. Current price levels were generally low: the current price accounted, on average, for 56% of the standardized value of a procedure, and represented a markedly lower share of the standardized value for acupuncture, moxibustion, special treatment with TCM, and comprehensive TCM procedures. This study selected a total of 79 procedures and adjusted them by priority. The relationship between the price of TCM procedures and the suggested price was significantly optimized (p < 0.01). This study suggests that adjustment of the price of medical procedures based on a standardized value parity model is a scientific and suitable method of price adjustment that can serve as a reference for other provinces and municipalities in China and for other countries and regions whose medical care is mainly fee-for-service (FFS).
NASA Technical Reports Server (NTRS)
Hildreth, Bruce L.; Jackson, E. Bruce
2009-01-01
The American Institute of Aeronautics and Astronautics (AIAA) Modeling and Simulation Technical Committee is in final preparation of a new standard for the exchange of flight dynamics models. The standard will become an ANSI standard and is under consideration for submission to ISO for acceptance by the international community. The standard has some aspects that should provide benefits to the simulation training community. Use of the new standard by the training simulation community will reduce development, maintenance and technical refresh investment on each device. Furthermore, it will significantly lower the cost of performing model updates to improve fidelity or expand the envelope of the training device. Higher flight fidelity should result in better transfer of training, a direct benefit to the pilots under instruction. Costs of adopting the standard are minimal and should be paid back within the cost of the first use for that training device. The standard achieves these advantages by making it easier to update the aerodynamic model. It provides a standard format for the model in a custom eXtensible Markup Language (XML) grammar, the Dynamic Aerospace Vehicle Exchange Markup Language (DAVE-ML). It employs an existing XML grammar, MathML, to describe the aerodynamic model in an input data file, eliminating the requirement for actual software compilation. The major components of the aero model become simply an input data file, and updates are simply new XML input files. It includes naming and axis system conventions to further simplify the exchange of information.
NASA Technical Reports Server (NTRS)
Sakuraba, K.; Tsuruda, Y.; Hanada, T.; Liou, J.-C.; Akahoshi, Y.
2007-01-01
This paper summarizes two new satellite impact tests conducted to investigate the outcome of low- and hyper-velocity impacts on two identical target satellites. The first experiment was performed at a low velocity of 1.5 km/s using a 40-gram aluminum alloy sphere, whereas the second experiment was performed at a hyper-velocity of 4.4 km/s using a 4-gram aluminum alloy sphere fired by a two-stage light gas gun at the Kyushu Institute of Technology. To date, approximately 1,500 fragments from each impact test have been collected for detailed analysis. Each piece was analyzed based on the method used in the NASA Standard Breakup Model 2000 revision. The detailed analysis will conclude: 1) the similarity in the mass distribution of fragments between low- and hyper-velocity impacts encourages the development of a general-purpose distribution model applicable over a wide impact velocity range, and 2) the difference in the area-to-mass ratio distribution between the impact experiments and the NASA standard breakup model suggests describing the area-to-mass ratio by a bi-normal distribution.
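The size distribution against which such fragments are compared is commonly summarized by the NASA standard breakup model's power law for catastrophic collisions. A sketch of that cumulative distribution (the coefficients 0.1, 0.75, and -1.71 are the commonly cited values for the 2000-revision collision model; treat them as indicative rather than authoritative here):

```python
def cumulative_fragments(l_c_m, mass_kg):
    """Cumulative number of fragments with characteristic length > l_c_m.

    Power-law form commonly quoted for the NASA standard breakup model
    (catastrophic collision case): N(>L_c) = 0.1 * M^0.75 * L_c^-1.71,
    with l_c_m in meters and mass_kg the total mass involved in kilograms.
    """
    return 0.1 * mass_kg ** 0.75 * l_c_m ** (-1.71)
```

Counting collected fragments above a series of size thresholds and comparing against this curve is the kind of check the detailed analysis performs.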
The GeoDataPortal: A Standards-based Environmental Modeling Data Access and Manipulation Toolkit
NASA Astrophysics Data System (ADS)
Blodgett, D. L.; Kunicki, T.; Booth, N.; Suftin, I.; Zoerb, R.; Walker, J.
2010-12-01
Environmental modelers from fields of study such as climatology, hydrology, geology, and ecology rely on many data sources and processing methods that are common across these disciplines. Interest in inter-disciplinary, loosely coupled modeling and data sharing is increasing among scientists from the USGS, other agencies, and academia. For example, hydrologic modelers need downscaled climate change scenarios and land cover data summarized for the watersheds they are modeling. Subsequently, ecological modelers are interested in soil moisture information for a particular habitat type as predicted by the hydrologic modeler. The USGS Center for Integrated Data Analytics Geo Data Portal (GDP) project seeks to facilitate this loose model coupling and data sharing through broadly applicable open-source web processing services. These services simplify and streamline the time-consuming and resource-intensive tasks that are barriers to inter-disciplinary collaboration. The GDP framework includes a catalog describing projects, models, data, processes, and how they relate. Using newly introduced data, or sources already known to the catalog, the GDP facilitates access to subsets and common derivatives of data in numerous formats on disparate web servers. The GDP performs many of the critical functions needed to summarize data sources into modeling units regardless of scale or volume. A user can specify their analysis zones or modeling units as an Open Geospatial Consortium (OGC) standard Web Feature Service (WFS). Utilities to cache Shapefiles and other common GIS input formats have been developed to aid in making the geometry available for processing via WFS. Dataset access in the GDP relies primarily on the Unidata NetCDF-Java library's common data model. Data transfer relies on methods provided by Unidata's Thematic Real-time Environmental Data Distribution System Data Server (TDS). TDS services of interest include the Open-source Project for a Network Data Access Protocol
Primordial gravitational waves, precisely: the role of thermodynamics in the Standard Model
NASA Astrophysics Data System (ADS)
Saikawa, Ken'ichi; Shirai, Satoshi
2018-05-01
In this paper, we revisit the estimation of the spectrum of primordial gravitational waves originating from inflation, particularly focusing on the effect of thermodynamics in the Standard Model of particle physics. By collecting recent results of perturbative and non-perturbative analysis of thermodynamic quantities in the Standard Model, we obtain the effective degrees of freedom, including the corrections due to non-trivial interaction properties of particles in the Standard Model, for a wide temperature interval. The impact of such corrections on the spectrum of primordial gravitational waves, as well as the damping effect due to free-streaming particles, is investigated by numerically solving the evolution equation of tensor perturbations in the expanding universe. It is shown that the reevaluation of the effects of free-streaming photons and neutrinos gives rise to some additional damping features overlooked in previous studies. We also observe that the continuous nature of the QCD crossover results in a smooth spectrum for modes that reenter the horizon at around the epoch of the QCD phase transition. Furthermore, we explicitly show that the values of the effective degrees of freedom remain smaller than the commonly used value 106.75 even at temperatures much higher than the critical temperature of the electroweak crossover, and that the amplitude of primordial gravitational waves at a frequency range relevant to direct detection experiments becomes O(1)% larger than previous estimates that do not include such corrections. This effect can be relevant to future high-sensitivity gravitational wave experiments such as ultimate DECIGO. Our results on the temperature evolution of the effective degrees of freedom are made available as tabulated data and fitting functions, which can also be used in the analysis of other cosmological relics.
Desoubeaux, Guillaume; Cray, Carolyn
2017-01-01
Invasive aspergillosis has been studied in the laboratory by means of a plethora of distinct animal models. They were developed to address pathophysiology, therapy, diagnosis, or other associated concerns. However, there are great discrepancies in the experimental variables of these animal models, and a thorough focus on them is needed. This systematic review completed a comprehensive bibliographic analysis specifically based on the technical features of rodent models infected with Aspergillus fumigatus. Of the 800 articles reviewed, it was shown that mice remained the preferred model (85.8% of the referenced reports), ahead of rats (10.8%) and guinea pigs (3.8%). Three quarters of the models involved an immunocompromised status, mainly induced by steroids (44.4%) and/or alkylating drugs (42.9%), but only 27.7% were reported to receive antibiotic prophylaxis to prevent bacterial infection. Injection of spores (30.0%) and inhalation/deposition into the respiratory airways (66.9%) were the most used routes of experimental inoculation. Overall, more than 230 distinct A. fumigatus strains were used in the models. Of all the published studies, 18.4% did not mention the use of any diagnostic tool, such as histopathology or mycological culture, to verify correct establishment of the disease and to measure outcome. In light of these findings, a consensus discussion should be undertaken to establish a minimum standardization, although this may not be consistently suitable for addressing all the specific aspects of invasive aspergillosis. PMID:28559881
Martijn, Carolien; Sheeran, Paschal; Wesseldijk, Laura W; Merrick, Hannah; Webb, Thomas L; Roefs, Anne; Jansen, Anita
2013-04-01
The present research tested whether an evaluative conditioning intervention makes thin-ideal models less enviable as standards for appearance-based social comparisons (Study 1), and increases body satisfaction (Study 2). Female participants were randomly assigned to intervention versus control conditions in both studies (ns = 66 and 39). Intervention participants learned to associate thin-ideal models with synonyms of fake whereas control participants completed an equivalent task that did not involve learning this association. The dependent variable in Study 1 was an implicit measure of idealization of slim models assessed via a modified Implicit Association Test (IAT). Study 2 used a validated, self-report measure of body satisfaction as the outcome variable. Intervention participants showed significantly less implicit idealization of slim models on the IAT compared to controls (Study 1). In Study 2, participants who undertook the intervention exhibited an increase in body satisfaction scores whereas no such increase was observed for control participants. The present research indicates that it is possible to overcome the characteristic impact of thin-ideal models on women's judgments of their bodies. An evaluative conditioning intervention made it less likely that slim models were perceived as targets to be emulated, and enhanced body satisfaction. 2013 APA, all rights reserved
A perturbative approach to the redshift space correlation function: beyond the Standard Model
NASA Astrophysics Data System (ADS)
Bose, Benjamin; Koyama, Kazuya
2017-08-01
We extend our previous redshift space power spectrum code to the redshift space correlation function. Here we focus on the Gaussian Streaming Model (GSM). Again, the code accommodates a wide range of modified gravity and dark energy models. For the non-linear real space correlation function used in the GSM we use the Fourier transform of the RegPT 1-loop matter power spectrum. We compare predictions of the GSM for a Vainshtein screened and Chameleon screened model as well as GR. These predictions are compared to the Fourier transform of the Taruya, Nishimichi and Saito (TNS) redshift space power spectrum model which is fit to N-body data. We find very good agreement between the Fourier transform of the TNS model and the GSM predictions, with ≤ 6% deviations in the first two correlation function multipoles for all models for redshift space separations in 50 Mpc/h ≤ s ≤ 180 Mpc/h. Excellent agreement is found in the differences between the modified gravity and GR multipole predictions for both approaches to the redshift space correlation function, highlighting their matched ability in picking up deviations from GR. We elucidate the timeliness of such non-standard templates at the dawn of stage-IV surveys and discuss necessary preparations and extensions needed for upcoming high quality data.
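For reference, the Gaussian Streaming Model maps the real-space correlation function and pairwise velocity statistics into redshift space; in its usual form (a sketch following the standard convention in the literature, not necessarily the exact notation of this paper):

\[
1 + \xi^{s}(s_\perp, s_\parallel) = \int \frac{\mathrm{d}r_\parallel}{\sqrt{2\pi\,\sigma_{12}^{2}(r,\mu)}}\,\bigl[1 + \xi(r)\bigr]\,
\exp\left\{ -\frac{\bigl[s_\parallel - r_\parallel - \mu\, v_{12}(r)\bigr]^{2}}{2\,\sigma_{12}^{2}(r,\mu)} \right\},
\]

where \(r^{2} = s_\perp^{2} + r_\parallel^{2}\), \(\mu = r_\parallel / r\), \(v_{12}(r)\) is the mean pairwise infall velocity, and \(\sigma_{12}^{2}(r,\mu)\) is the pairwise velocity dispersion. Modified gravity enters through the perturbation-theory predictions for \(\xi\), \(v_{12}\), and \(\sigma_{12}\).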
Perceptual video quality assessment in H.264 video coding standard using objective modeling.
Karthikeyan, Ramasamy; Sainarayanan, Gopalakrishnan; Deepa, Subramaniam Nachimuthu
2014-01-01
Since usage of digital video is widespread nowadays, quality considerations have become essential, and industry demand for video quality measurement is rising. This proposal provides a method of perceptual quality assessment for the H.264 standard encoder using objective modeling. For this purpose, quality impairments are calculated and a model is developed to compute the perceptual video quality metric based on a no-reference method. Because of the subtle differences between the original video and the encoded video, the quality of the encoded picture is degraded; this quality difference is introduced by encoding processes such as intra- and inter-prediction. The proposed model takes into account the artifacts introduced by these spatial and temporal activities in hybrid block-based coding methods, and an objective modeling of these artifacts into a subjective quality estimate is proposed. The proposed model calculates the objective quality metric using subjective impairments (blockiness, blur, and jerkiness), in contrast to the existing bitrate-only calculation defined in the ITU-T G.1070 model. The accuracy of the proposed perceptual video quality metrics is compared against popular full-reference objective methods as defined by VQEG.
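The pooling of blockiness, blur, and jerkiness into a single no-reference score can be sketched as a weighted sum. The weights and the linear pooling below are hypothetical, since the abstract does not give the model's actual coefficients or functional form:

```python
def perceptual_quality(blockiness, blur, jerkiness, weights=(0.4, 0.35, 0.25)):
    """Pool three normalized impairments (each in [0, 1]) into a quality score.

    Hypothetical linear pooling for illustration only; a fitted model
    would calibrate both the weights and the pooling function against
    subjective scores. Returns 1.0 for no impairment, 0.0 for worst.
    """
    impairment = sum(w * x for w, x in zip(weights, (blockiness, blur, jerkiness)))
    return max(0.0, 1.0 - impairment)
```

A real model of this kind is validated by correlating such scores against subjective mean opinion scores, as the abstract describes via comparison with VQEG full-reference methods.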
Hoel, D.D.
1984-01-01
Two computer codes have been developed for operational use in performing real time evaluations of atmospheric releases from the Savannah River Plant (SRP) in South Carolina. These codes, based on mathematical models, are part of the SRP WIND (Weather Information and Display) automated emergency response system. Accuracy of ground level concentrations from a Gaussian puff-plume model and a two-dimensional sequential puff model are being evaluated with data from a series of short range diffusion experiments using sulfur hexafluoride as a tracer. The models use meteorological data collected from 7 towers on SRP and at the 300 m WJBF-TV tower about 15 km northwest of SRP. The winds and the stability, which is based on turbulence measurements, are measured at the 60 m stack heights. These results are compared to downwind concentrations using only standard meteorological data, i.e., adjusted 10 m winds and stability determined by the Pasquill-Turner stability classification method. Scattergrams and simple statistics were used for model evaluations. Results indicate predictions within accepted limits for the puff-plume code and a bias in the sequential puff model predictions using the meteorologist-adjusted nonstandard data. 5 references, 4 figures, 2 tables.
Choi, Jeeyae; Jansen, Kay; Coenen, Amy
2015-01-01
In recent years, Decision Support Systems (DSSs) have been developed and used to achieve “meaningful use”. One approach to developing DSSs is to translate clinical guidelines into a computer-interpretable format. However, there is no specific guideline modeling approach to translate nursing guidelines to computer-interpretable guidelines. This results in limited use of DSSs in nursing. Unified modeling language (UML) is a software writing language known to accurately represent the end-users’ perspective, due to its expressive characteristics. Furthermore, standard-terminology-enabled DSSs have been shown to integrate smoothly into existing health information systems. In order to facilitate the development of nursing DSSs, the UML was used to represent a guideline for medication management for older adults encoded with the International Classification for Nursing Practice (ICNP®). The UML was found to be a useful and sufficient tool to model a nursing guideline for a DSS. PMID:26958174
Evaluation of standard radiation atmosphere aerosol models for a coastal environment
NASA Technical Reports Server (NTRS)
Whitlock, C. H.; Suttles, J. T.; Sebacher, D. I.; Fuller, W. H.; Lecroy, S. R.
1986-01-01
Calculations are compared with data from an experiment to evaluate the utility of standard radiation atmosphere (SRA) models for defining aerosol properties in atmospheric radiation computations. Initial calculations with only SRA aerosols in a four-layer atmospheric column simulation allowed a sensitivity study and the detection of spectral trends in optical depth, which differed from measurements. Subsequently, a more detailed analysis provided a revision in the stratospheric layer, which brought calculations in line with both optical depth and skylight radiance data. The simulation procedure allows determination of which atmospheric layers influence both downwelling and upwelling radiation spectra.
Leading-order classical Lagrangians for the nonminimal standard-model extension
NASA Astrophysics Data System (ADS)
Reis, J. A. A. S.; Schreck, M.
2018-03-01
In this paper, we derive the general leading-order classical Lagrangian covering all fermion operators of the nonminimal standard-model extension (SME). Such a Lagrangian is considered to be the point-particle analog of the effective field theory description of Lorentz violation that is provided by the SME. At leading order in Lorentz violation, the Lagrangian obtained satisfies the set of five nonlinear equations that govern the map from the field theory to the classical description. This result can be of use for phenomenological studies of classical bodies in gravitational fields.
Enqvist, Kari; Kasuya, Shinta; Mazumdar, Anupam
2003-03-07
We propose that the inflaton is coupled to ordinary matter only gravitationally and that it decays into a completely hidden sector. In this scenario both baryonic and dark matter originate from the decay of a flat direction of the minimal supersymmetric standard model, which is shown to generate the desired adiabatic perturbation spectrum via the curvaton mechanism. The requirement that the energy density along the flat direction dominates over the inflaton decay products fixes the flat direction almost uniquely. The present residual energy density in the hidden sector is typically shown to be small.
Klise, Geoffrey T.; Hill, Roger; Walker, Andy
The use of the term 'availability' to describe a photovoltaic (PV) system and power plant has been fraught with confusion for many years. A term that is meant to describe equipment operational status is often omitted, misapplied, or inaccurately combined with PV performance metrics due to attempts to measure performance and reliability through the lens of traditional power plant language. This paper discusses three areas where current research in standards, contract language, and performance modeling is improving the way availability is used with regard to photovoltaic systems and power plants.
Phase of the Wilson line at high temperature in the standard model
Korthals Altes, C.P.; Lee, K.; Pisarski, R.D.
1994-09-26
We compute the effective potential for the phase of the Wilson line at high temperature in the standard model to one-loop order. Besides the trivial vacua, there are metastable states in the direction of U(1) hypercharge. Assuming that the Universe starts out in such a metastable state at the Planck scale, it easily persists to the time of the electroweak phase transition, which then proceeds by an unusual mechanism. All remnants of the metastable state evaporate about the time of the QCD phase transition.
Let’s have a coffee with the Standard Model of particle physics!
NASA Astrophysics Data System (ADS)
Woithe, Julia; Wiener, Gerfried J.; Van der Veken, Frederik F.
2017-05-01
The Standard Model of particle physics is one of the most successful theories in physics and describes the fundamental interactions between elementary particles. It is encoded in a compact description, the so-called ‘Lagrangian’, which even fits on t-shirts and coffee mugs. This mathematical formulation, however, is complex and only rarely makes it into the physics classroom. Therefore, to support high school teachers in their challenging endeavour of introducing particle physics in the classroom, we provide a qualitative explanation of the terms of the Lagrangian and discuss their interpretation based on associated Feynman diagrams.
NASA Astrophysics Data System (ADS)
Schreck, M.
2016-05-01
This article is devoted to finding classical point-particle equivalents for the fermion sector of the nonminimal standard model extension (SME). For a series of nonminimal operators, such Lagrangians are derived at first order in Lorentz violation using the algebraic concept of Gröbner bases. Subsequently, the Lagrangians serve as a basis for reanalyzing the results of certain kinematic tests of special relativity that were carried out in the past century. Thereby, a number of new constraints on coefficients of the nonminimal SME is obtained. In the last part of the paper we point out connections to Finsler geometry.
Galkin, A A
2012-01-01
On the basis of graphic models of the human response to environmental factors, two main types of complex quantitative influence were revealed, as well as an interrelation between deterministic effects at the level of the individual and stochastic effects at the level of the population. It is suggested that two main kinds of factors be distinguished: essential factors, which are normal constituents of the environment, and accidental factors, which are foreign to it. The two kinds call for different approaches to hygienic standardization: accidental factors need a point-like approach, whereas a two-level range approach is suitable for the essential factors.
The early universe history from contraction-deformation of the Standard Model
NASA Astrophysics Data System (ADS)
Gromov, N. A.
2017-03-01
The evolution of elementary particles in the early Universe from the Planck time up to several milliseconds is presented. The developed theory is based on the high-temperature (high-energy) limit of the Standard Model, which is generated by contractions of its gauge groups. At infinite temperature all particles lose their masses; only massless neutral Z-bosons, massless Z-quarks, neutrinos and photons survive in this limit. The weak interactions become long-range and are mediated by neutral currents, and quarks have only one color degree of freedom.
Standard model light-by-light scattering in SANC: Analytic and numeric evaluation
Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.r
2010-11-15
The implementation of the Standard Model process γγ → γγ through fermion and boson loops into the framework of the SANC system is described, together with the additional precomputation modules used for the calculation of massive box diagrams. The computation of this process takes into account the nonzero masses of the loop particles. The covariant and helicity amplitudes for this process, some particular cases of the D0 and C0 Passarino-Veltman functions, and numerical results from the evaluation of the corresponding SANC module are presented. Whenever possible, the results are compared with those existing in the literature.
Experimental constraints from flavour changing processes and physics beyond the Standard Model.
Gersabeck, M; Gligorov, V V; Serra, N
Flavour physics has a long tradition of paving the way for direct discoveries of new particles and interactions. Results over the last decade have placed stringent bounds on the parameter space of physics beyond the Standard Model. Early results from the LHC, and its dedicated flavour factory LHCb, have further tightened these constraints and reiterate the ongoing relevance of flavour studies. The experimental status of flavour observables in the charm and beauty sectors is reviewed, covering measurements of CP violation, neutral-meson mixing, and rare decays.
Standard model anatomy of WIMP dark matter direct detection. I. Weak-scale matching
NASA Astrophysics Data System (ADS)
Hill, Richard J.; Solon, Mikhail P.
2015-02-01
We present formalism necessary to determine weak-scale matching coefficients in the computation of scattering cross sections for putative dark matter candidates interacting with the Standard Model. We pay particular attention to the heavy-particle limit. A consistent renormalization scheme in the presence of nontrivial residual masses is implemented. Two-loop diagrams appearing in the matching to gluon operators are evaluated. Details are given for the computation of matching coefficients in the universal limit of WIMP-nucleon scattering for pure states of arbitrary quantum numbers, and for singlet-doublet and doublet-triplet mixed states.
Testing the standard model by precision measurement of the weak charges of quarks.
Young, R D; Carlini, R D; Thomas, A W; Roche, J
2007-09-21
In a global analysis of the latest parity-violating electron scattering measurements on nuclear targets, we demonstrate a significant improvement in the experimental knowledge of the weak neutral-current lepton-quark interactions at low energy. The precision of this new result, combined with earlier atomic parity-violation measurements, places tight constraints on the size of possible contributions from physics beyond the standard model. Consequently, this result improves the lower bound on the scale of relevant new physics to approximately 1 TeV.
NASA Astrophysics Data System (ADS)
Ali, Mumtaz; Deo, Ravinesh C.; Downs, Nathan J.; Maraseni, Tek
2018-07-01
Forecasting drought by means of the World Meteorological Organization-approved Standardized Precipitation Index (SPI) is considered a fundamental task to support socio-economic initiatives and effectively mitigate climate risk. This study aims to develop a robust drought modelling strategy to forecast multi-scalar SPI in drought-rich regions of Pakistan, where statistically significant lagged combinations of antecedent SPI are used to forecast future SPI. An ensemble Adaptive Neuro-Fuzzy Inference System ('ensemble-ANFIS') was executed via a 10-fold cross-validation procedure, with models constructed from randomly partitioned input-target data. The resulting 10-member ensemble-ANFIS outputs were judged by mean square error and correlation coefficient in the training period; the optimal forecasts were attained by averaging the simulations, and the model was benchmarked against the M5 Model Tree and Minimax Probability Machine Regression (MPMR). The results show that the proposed ensemble-ANFIS model's accuracy was notably better (in terms of the root mean square and mean absolute error, including Willmott's, Nash-Sutcliffe and Legates-McCabe's indices) for the 6- and 12-month forecasts compared to the 3-month forecasts, as verified by the largest proportion of errors registering in the smallest error band. Applying 10-member simulations, the ensemble-ANFIS model was validated for its ability to forecast the severity (S), duration (D) and intensity (I) of drought (including the error bound). This enabled uncertainty between multi-models to be rationalized more efficiently, leading to a reduction in forecast error caused by stochasticity in drought behaviours. Through cross-validations at diverse sites, a geographic signature in modelled uncertainties was also calculated. Considering the superiority of the ensemble-ANFIS approach and its ability to generate uncertainty-based information, the study advocates the versatility of a multi-model approach for drought-risk forecasting and its prime importance
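The ensemble averaging and skill scoring described above can be sketched in a few lines; the SPI values below are hypothetical, and the error metrics (RMSE, MAE, Nash-Sutcliffe) follow their standard textbook definitions rather than the study's exact implementation:

```python
import math

def ensemble_average(member_forecasts):
    """Average the member forecasts at each time step (the 'optimal forecast')."""
    return [sum(vals) / len(vals) for vals in zip(*member_forecasts)]

def rmse(obs, pred):
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def nash_sutcliffe(obs, pred):
    """NSE = 1 - (error variance / observed variance); 1.0 is a perfect forecast."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - p) ** 2 for o, p in zip(obs, pred))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# Hypothetical SPI forecasts: 3 ensemble members, 4 forecast steps
members = [[-0.5, 0.2, 1.1, -1.3],
           [-0.4, 0.3, 0.9, -1.1],
           [-0.6, 0.1, 1.0, -1.2]]
observed = [-0.5, 0.25, 1.0, -1.25]

forecast = ensemble_average(members)
print("forecast:", forecast)
print("RMSE:", round(rmse(observed, forecast), 4))
print("NSE :", round(nash_sutcliffe(observed, forecast), 4))
```

The same scoring functions can then be applied per error band to reproduce the kind of error-proportion comparison the abstract reports.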
Use of the Ames Check Standard Model for the Validation of Wall Interference Corrections
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Amaya, M.; Flach, R.
2018-01-01
The new check standard model of the NASA Ames 11-ft Transonic Wind Tunnel was chosen for a future validation of the facility's wall interference correction system. The chosen validation approach takes advantage of the fact that test conditions experienced by a large model in the slotted part of the tunnel's test section will change significantly if a subset of the slots is temporarily sealed. Therefore, the model's aerodynamic coefficients have to be recorded, corrected, and compared for two different test section configurations in order to perform the validation. Test section configurations with highly accurate Mach number and dynamic pressure calibrations were selected for the validation. First, the model is tested with all test section slots in open configuration while keeping the model's center of rotation on the tunnel centerline. In the next step, slots on the test section floor are sealed and the model is moved to a new center of rotation that is 33 inches below the tunnel centerline. Then, the original angle of attack sweeps are repeated. Afterwards, wall interference corrections are applied to both test data sets and response surface models of the resulting aerodynamic coefficients in interference-free flow are generated. Finally, the response surface models are used to predict the aerodynamic coefficients for a family of angles of attack while keeping dynamic pressure, Mach number, and Reynolds number constant. The validation is considered successful if the corrected aerodynamic coefficients obtained from the related response surface model pair show good agreement. Residual differences between the corrected coefficient sets will be analyzed as well because they are an indicator of the overall accuracy of the facility's wall interference correction process.
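The comparison step above (response surface per configuration, then prediction at a common sweep) can be illustrated with a deliberately simplified sketch: a least-squares line fit of corrected lift coefficient versus angle of attack, with hypothetical data, standing in for the facility's actual multivariate response surface models:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x (CL is nearly linear in alpha pre-stall)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Hypothetical wall-interference-corrected lift coefficients vs. angle of attack (deg)
alpha     = [-2.0, 0.0, 2.0, 4.0, 6.0]
cl_open   = [-0.10, 0.12, 0.34, 0.55, 0.77]   # all test section slots open
cl_sealed = [-0.11, 0.11, 0.33, 0.56, 0.78]   # floor slots sealed, model offset

a1, b1 = fit_line(alpha, cl_open)      # response surface, configuration 1
a2, b2 = fit_line(alpha, cl_sealed)    # response surface, configuration 2

# Validation check: corrected coefficients from both configurations should agree
max_diff = max(abs((a1 + b1 * a) - (a2 + b2 * a)) for a in alpha)
print(f"max |delta CL| over the sweep = {max_diff:.4f}")
```

If `max_diff` stays within the facility's repeatability tolerance, the correction system passes this validation criterion; residual differences are then examined as an accuracy indicator, as the abstract describes.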
Leveraging Open Standard Interfaces in Accessing and Processing NASA Data Model Outputs
NASA Astrophysics Data System (ADS)
Falke, S. R.; Alameh, N. S.; Hoijarvi, K.; de La Beaujardiere, J.; Bambacus, M. J.
2006-12-01
An objective of NASA's Earth Science Division is to develop advanced information technologies for processing, archiving, accessing, visualizing, and communicating Earth Science data. To this end, NASA and other federal agencies have collaborated with the Open Geospatial Consortium (OGC) to research, develop, and test interoperability specifications within projects and testbeds benefiting the government, industry, and the public. This paper summarizes the results of a recent effort under the auspices of the OGC Web Services testbed phase 4 (OWS-4) to explore standardization approaches for accessing and processing the outputs of NASA models of physical phenomena. Within the OWS-4 context, experiments were designed to leverage the emerging OGC Web Processing Service (WPS) and Web Coverage Service (WCS) specifications to access, filter and manipulate the outputs of the NASA Goddard Earth Observing System (GEOS) and Goddard Chemistry Aerosol Radiation and Transport (GOCART) forecast models. In OWS-4, the intent is to provide users with more control over the subsets of data that they can extract from the model results, as well as over the final portrayal of that data. To meet that goal, experiments were designed to test the suitability of OGC's Web Processing Service (WPS) and Web Coverage Service (WCS) for filtering, processing and portraying the model results (including slices by height or by time), and to identify any enhancements to the specifications needed to meet the desired objectives. This paper summarizes the findings of the experiments, highlighting the value of the Web Processing Service in providing standard interfaces for accessing and manipulating model data within spatial and temporal frameworks. The paper also points out key shortcomings of the WPS, especially in comparison with a SOAP/WSDL approach to solving the same problem.
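As a concrete illustration of the kind of standardized access being tested, a WPS Execute request can be encoded as key-value pairs in a URL per the OGC WPS 1.0.0 KVP binding; the endpoint, process identifier, and data inputs below are hypothetical placeholders, not the actual OWS-4 services:

```python
from urllib.parse import urlencode

# Hypothetical WPS endpoint; parameter names follow the OGC WPS 1.0.0 KVP encoding
endpoint = "https://example.gov/wps"
params = {
    "service": "WPS",
    "request": "Execute",
    "version": "1.0.0",
    "identifier": "SubsetModelOutput",   # hypothetical process name
    # DataInputs are semicolon-separated key=value pairs (URL-encoded here);
    # layer/time/height names are illustrative, not GEOS/GOCART identifiers
    "datainputs": "layer=GOCART_AOD;time=2006-07-15T00:00:00Z;height=500",
}
url = endpoint + "?" + urlencode(params)
print(url)
```

A client issuing such a request would receive an XML Execute response (or a referenced coverage) that can then be handed to a WCS or portrayal service, which is the chaining pattern the testbed exercised.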
Wu, Xiaowu; Corona, Benjamin T.; Chen, Xiaoyu
2012-01-01
Soft tissue injuries involving volumetric muscle loss (VML) are defined as the traumatic or surgical loss of skeletal muscle with resultant functional impairment and represent a challenging clinical problem for both military and civilian medicine. In response, a variety of tissue engineering and regenerative medicine treatments are under preclinical development. A wide variety of animal models are being used, all with critical limitations. The objective of this study was to develop a model of VML that was reproducible and technically uncomplicated to provide a standardized platform for the development of tissue engineering and regenerative medicine solutions to VML repair. A rat model of VML involving excision of ∼20% of the muscle's mass from the superficial portion of the middle third of the tibialis anterior (TA) muscle was developed and was functionally characterized. The contralateral TA muscle served as the uninjured control. Additionally, uninjured age-matched control rats were also tested to determine the effect of VML on the contralateral limb. TA muscles were assessed at 2 and 4 months postinjury. VML muscles weighed 22.7% and 19.5% less than contralateral muscles at 2 and 4 months postinjury, respectively. These differences were accompanied by a reduction in peak isometric tetanic force (Po) of 28.4% and 32.5% at 2 and 4 months. Importantly, Po corrected for differences in body weight and muscle wet weights were similar between contralateral and age-matched control muscles, indicating that VML did not have a significant impact on the contralateral limb. Lastly, repair of the injury with a biological scaffold resulted in rapid vascularization and integration with the wound. The technical simplicity, reliability, and clinical relevance of the VML model developed in this study make it ideal as a standard model for the development of tissue engineering solutions for VML. PMID:23515319
Ouellet, D; Norback, J P
1993-11-01
Continuous quality improvement is the new requirement of the Joint Commission on Accreditation of Healthcare Organizations. This means that meeting quality standards will not be enough. Dietitians will need to improve those standards and the way they are selected. Because quality is defined in terms of the customers, all quality improvement projects must start by defining what customers want. Using a salad bar as an example, this article presents and illustrates a technique developed in Japan to identify which elements in a product or service will satisfy or dissatisfy consumers. Using a model and a questionnaire format developed by Kano and coworkers, 273 students were surveyed to classify six quality elements of a salad bar. Four elements showed a dominant "must-be" characteristic: food freshness, labeling of the dressings, no spills in the food, and no spills on the salad bar. The two other elements (food easy to reach and food variety) showed a dominant one-dimensional characteristic. By better understanding consumer perceptions of quality elements, foodservice managers can select quality standards that focus on what really matters to their consumers.
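The Kano technique referenced above classifies each quality element by pairing a respondent's answer to a "functional" question (the element is present) with the answer to a "dysfunctional" question (it is absent) and looking the pair up in an evaluation table. A minimal sketch, assuming the standard Kano evaluation table; the survey responses are hypothetical, not the study's data:

```python
from collections import Counter

# Standard Kano evaluation table: (functional answer, dysfunctional answer) -> category
# A = attractive, O = one-dimensional, M = must-be, I = indifferent,
# R = reverse, Q = questionable
TABLE = {
    "like":      {"like": "Q", "must-be": "A", "neutral": "A", "live-with": "A", "dislike": "O"},
    "must-be":   {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "neutral":   {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "live-with": {"like": "R", "must-be": "I", "neutral": "I", "live-with": "I", "dislike": "M"},
    "dislike":   {"like": "R", "must-be": "R", "neutral": "R", "live-with": "R", "dislike": "Q"},
}

def classify(responses):
    """Dominant Kano category for one quality element.

    responses: list of (functional, dysfunctional) answer pairs, one per respondent."""
    counts = Counter(TABLE[f][d] for f, d in responses)
    return counts.most_common(1)[0][0]

# Hypothetical survey slice for the "food freshness" element
freshness = [("must-be", "dislike"), ("neutral", "dislike"),
             ("must-be", "dislike"), ("like", "dislike")]
print(classify(freshness))  # dominant "must-be" pattern, as the study found
```

Elements classified "M" (must-be) only dissatisfy when absent, so they belong in baseline standards, while "O" (one-dimensional) elements such as variety reward continued improvement, which is exactly the distinction the salad-bar study exploits.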
Safakish, Ramin
2017-01-01
Lower back pain (LBP) is a global public health issue and is associated with substantial financial costs and loss of quality of life. Over the years, the literature has provided varying statistics regarding the causes of back pain; the following statistic is the closest estimate for our patient population: sacroiliac (SI) joint pain is responsible for LBP in 18%-30% of individuals with LBP. Quadrapolar™ radiofrequency ablation, which involves ablation of the nerves of the SI joint using heat, is a commonly used treatment for SI joint pain. However, the standard Quadrapolar radiofrequency procedure is not always effective at ablating all the sensory nerves that cause the pain in the SI joint. One of the major limitations of the standard Quadrapolar radiofrequency procedure is that it produces small lesions of ~4 mm in diameter; smaller lesions increase the likelihood of failure to ablate all nociceptive input. In this study, we compare the standard Quadrapolar radiofrequency ablation technique to a modified Quadrapolar ablation technique that has produced improved patient outcomes in our clinic. The methodologies of the two techniques are compared. In addition, we compare results from an experimental model comparing the lesion sizes produced by the two techniques. Taken together, the findings from this study suggest that the modified Quadrapolar technique provides longer-lasting relief for back pain caused by SI joint dysfunction. A randomized controlled clinical trial is the next step required to quantify the difference in symptom relief and quality of life produced by the two techniques.
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with the development of an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web-service-oriented architecture. We designed the system to support solar system bodies in general, including asteroids, Earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch displays with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services made use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS), and Web Feature Service (WFS). Its data management services have been built on top of a set of open technologies including Object Oriented Data Technology (OODT) - open source data catalog, archive, file management, data grid framework
NASA Technical Reports Server (NTRS)
Albus, James S.; Mccain, Harry G.; Lumia, Ronald
1989-01-01
The document describes the NASA Standard Reference Model (NASREM) Architecture for the Space Station Telerobot Control System. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station, serving as a functional specification and a guideline for the development of the control system architecture of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.
Brodin, N. Patrik, E-mail: nils.patrik.brodin@rh.dk; Niels Bohr Institute, University of Copenhagen, Copenhagen; Vogelius, Ivan R.
2013-10-01
Purpose: As pediatric medulloblastoma (MB) is a relatively rare disease, it is important to extract the maximum information from trials and cohort studies. Here, a framework was developed for modeling tumor control with multiple modes of failure and time-to-progression for standard-risk MB, using published pattern of failure data. Methods and Materials: Outcome data for standard-risk MB published after 1990 with pattern of relapse information were used to fit a tumor control dose-response model addressing failures in both the high-dose boost volume and the elective craniospinal volume. Estimates of 5-year event-free survival from 2 large randomized MB trials were used to model the time-to-progression distribution. Uncertainty in freedom from progression (FFP) was estimated by Monte Carlo sampling over the statistical uncertainty in input data. Results: The estimated 5-year FFP (95% confidence intervals [CI]) for craniospinal doses of 15, 18, 24, and 36 Gy while maintaining 54 Gy to the posterior fossa was 77% (95% CI, 70%-81%), 78% (95% CI, 73%-81%), 79% (95% CI, 76%-82%), and 80% (95% CI, 77%-84%) respectively. The uncertainty in FFP was considerably larger for craniospinal doses below 18 Gy, reflecting the lack of data in the lower dose range. Conclusions: Estimates of tumor control and time-to-progression for standard-risk MB provides a data-driven setting for hypothesis generation or power calculations for prospective trials, taking the uncertainties into account. The presented methods can also be applied to incorporate further risk-stratification for example based on molecular biomarkers, when the necessary data become available.
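The Monte Carlo propagation of input uncertainty into an FFP confidence interval can be sketched as follows; the logistic dose-response form and every parameter value here are illustrative assumptions, not the fitted model from the study:

```python
import math
import random
import statistics

random.seed(1)

def logistic_tcp(dose, d50, gamma):
    """Simple logistic dose-response for tumor control probability (illustrative only)."""
    return 1.0 / (1.0 + math.exp(4.0 * gamma * (1.0 - dose / d50)))

# Hypothetical parameter estimates with standard errors from a fit to outcome data
D50_HAT, D50_SE = 12.0, 1.5      # Gy
GAMMA_HAT, GAMMA_SE = 0.6, 0.1   # normalized slope

def ffp_samples(dose, n=5000):
    """Sample the fit parameters from their uncertainty and evaluate FFP each time."""
    out = []
    for _ in range(n):
        d50 = random.gauss(D50_HAT, D50_SE)
        gamma = random.gauss(GAMMA_HAT, GAMMA_SE)
        out.append(logistic_tcp(dose, d50, gamma))
    return out

samples = sorted(ffp_samples(dose=18.0))
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"FFP at 18 Gy: median {statistics.median(samples):.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Repeating this at each craniospinal dose level reproduces the widening of the interval at low doses, where the sampled parameters are least constrained by data.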
Meystre, Stéphane M; Lee, Sanghoon; Jung, Chai Young; Chevrier, Raphaël D
2012-08-01
An increasing need for collaboration and resources sharing in the Natural Language Processing (NLP) research and development community motivates efforts to create and share a common data model and a common terminology for all information annotated and extracted from clinical text. We have combined two existing standards: the HL7 Clinical Document Architecture (CDA), and the ISO Graph Annotation Format (GrAF; in development), to develop such a data model entitled "CDA+GrAF". We experimented with several methods to combine these existing standards, and eventually selected a method wrapping separate CDA and GrAF parts in a common standoff annotation (i.e., separate from the annotated text) XML document. Two use cases, clinical document sections, and the 2010 i2b2/VA NLP Challenge (i.e., problems, tests, and treatments, with their assertions and relations), were used to create examples of such standoff annotation documents, and were successfully validated with the XML schemata provided with both standards. We developed a tool to automatically translate annotation documents from the 2010 i2b2/VA NLP Challenge format to GrAF, and automatically generated 50 annotation documents using this tool, all successfully validated. Finally, we adapted the XSL stylesheet provided with HL7 CDA to allow viewing annotation XML documents in a web browser, and plan to adapt existing tools for translating annotation documents between CDA+GrAF and the UIMA and GATE frameworks. This common data model may ease directly comparing NLP tools and applications, combining their output, transforming and "translating" annotations between different NLP applications, and eventually "plug-and-play" of different modules in NLP applications. Copyright © 2011 Elsevier Inc. All rights reserved.
Targeting the minimal supersymmetric standard model with the compact muon solenoid experiment
NASA Astrophysics Data System (ADS)
Bein, Samuel Louis
An interpretation of CMS searches for evidence of supersymmetry in the context of the minimal supersymmetric Standard Model (MSSM) is given. It is found that supersymmetric particles with color charge are excluded in the mass range below about 400 GeV, but neutral and weakly-charged sparticles remain non-excluded in all mass ranges. Discussion of the non-excluded regions of the model parameter space is given, including details on the strengths and weaknesses of existing searches, and recommendations for future analysis strategies. Advancements in the modeling of events arising from quantum chromodynamics and electroweak boson production, which are major backgrounds in searches for new physics at the LHC, are also presented. These methods have been implemented as components of CMS searches for supersymmetry in proton-proton collisions resulting in purely hadronic events (i.e., events with no identified leptons) at a center of momentum energy of 13 TeV. These searches, interpreted in the context of simplified models, exclude supersymmetric gluons (gluinos) up to masses of 1400 to 1600 GeV, depending on the model considered, and exclude scalar top quarks with masses up to about 800 GeV, assuming a massless lightest supersymmetric particle. A search for non-excluded supersymmetry models is also presented, which uses multivariate discriminants to isolate potential signal candidate events. The search achieves sensitivity to new physics models in background-dominated kinematic regions not typically considered by analyses, and rules out supersymmetry models that survived 7 and 8 TeV searches performed by CMS.
NASA Technical Reports Server (NTRS)
Guenther, D. B.
1994-01-01
The nonadiabatic frequencies of a standard solar model and a solar model that includes helium diffusion are discussed. The nonadiabatic pulsation calculation includes physics that describes the losses and gains due to radiation. Radiative gains and losses are modeled in both the diffusion approximation, which is only valid in optically thick regions, and the Eddington approximation, which is valid in both optically thin and thick regions. The calculated pulsation frequencies for modes with l less than or equal to 1320 are compared to the observed spectrum of the Sun. Compared to a strictly adiabatic calculation, the nonadiabatic calculation of p-mode frequencies improves the agreement between model and observation. When helium diffusion is included in the model, the frequencies of the modes that are sensitive to regions near the base of the convection zone are improved (i.e., brought into closer agreement with observation), but the agreement is made worse for other modes. Cyclic variations in the frequency spacings of the Sun as a function of frequency or n are presented as evidence for a discontinuity in the structure of the Sun, possibly located near the base of the convection zone.
Low energy analysis of νN→νNγ in the standard model
NASA Astrophysics Data System (ADS)
Hill, Richard J.
2010-01-01
The production of single photons in low energy (~1 GeV) neutrino scattering off nucleons is analyzed in the standard model. At very low energies, Eν ≪ GeV, a simple description of the chiral Lagrangian involving baryons and arbitrary SU(2)_L × U(1)_Y gauge fields is developed. Extrapolation of the process into the ~1-2 GeV region is treated in a simple phenomenological model. Coherent enhancements in compound nuclei are studied. The relevance of single-photon events as a background to experimental searches for νμ → νe is discussed. In particular, single photons are a plausible explanation for excess events observed by the MiniBooNE experiment.
Comparisons of a standard galaxy model with stellar observations in five fields
NASA Technical Reports Server (NTRS)
Bahcall, J. N.; Soneira, R. M.
1984-01-01
Modern data on the distribution of stellar colors and on the number of stars as a function of apparent magnitude in five directions in the Galaxy are analyzed. It is found that the standard model is consistent with all the available data. Detailed comparisons with the data for five separate fields are presented. The bright end of the spheroid luminosity function and the blue tip of the spheroid horizontal branch are analyzed. The allowed range of the disk scale heights and of fluctuations in the volume density is determined, and a lower limit is set on the disk scale length. Calculations based on the thick disk model of Gilmore and Reid (1983) are presented.
Softened gravity and the extension of the standard model up to infinite energy
NASA Astrophysics Data System (ADS)
Giudice, Gian F.; Isidori, Gino; Salvio, Alberto; Strumia, Alessandro
2015-02-01
Attempts to solve naturalness by having the weak scale as the only breaking of classical scale invariance have to deal with two severe difficulties: gravity and the absence of Landau poles. We show that solutions to the first problem require premature modifications of gravity at scales no larger than 10^11 GeV, while the second problem calls for many new particles at the weak scale. To build models that fulfill these properties, we classify 4-dimensional Quantum Field Theories that satisfy Total Asymptotic Freedom (TAF): the theory holds up to infinite energy, where all coupling constants flow to zero. We develop a technique to identify such theories and determine their low-energy predictions. Since the Standard Model turns out to be asymptotically free only under the unphysical conditions g_1 = 0, M_t = 186 GeV, M_τ = 0, M_h = 163 GeV, we explore some of its weak-scale extensions that satisfy the requirements for TAF.
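The TAF criterion above requires every coupling constant to flow to zero in the ultraviolet. A minimal numerical sketch of that behavior (a generic one-loop asymptotically free coupling with a hypothetical beta-function coefficient, not any specific coupling of the models in the abstract) is:

```python
import numpy as np

# A coupling g with negative one-loop beta function dg/dlnμ = -b g^3, b > 0,
# is asymptotically free: g -> 0 as μ -> infinity.  The one-loop solution is
# g(t) = g0 / sqrt(1 + 2 b g0^2 t), with t = ln(μ/μ0).

def run_coupling(g0, b, t):
    """One-loop running of an asymptotically free coupling."""
    return g0 / np.sqrt(1.0 + 2.0 * b * g0**2 * t)

g0 = 1.2                       # hypothetical coupling at the reference scale μ0
b = 7.0 / (16.0 * np.pi**2)    # hypothetical positive beta-function coefficient

for t in (0.0, 10.0, 100.0, 1000.0):
    print(f"ln(mu/mu0) = {t:6.0f}  g = {run_coupling(g0, b, t):.4f}")
```

The printed values decrease monotonically toward zero, which is the defining signature of a theory that "holds up to infinite energy" in the TAF sense.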
What matters in the classroom: A structural model of standards-based scientific literacy
NASA Astrophysics Data System (ADS)
Shive, Louise E.
For over two decades educators and policy makers have been particularly concerned with student achievement in the wake of A Nation at Risk. A majority of studies indicates that students' family background has the strongest influence on achievement, although characteristics of their teachers and schools have significant impact as well. This study considered achievement in science in particular, investigating the influence of alterable factors within the classroom on students' gains in scientific literacy. Scientific literacy included three elements: content knowledge, scientific process skills, and attitude towards science. Based on a review of the literature on student achievement, a structural equation model was constructed with five latent variables: teacher's education, instructional practices, teacher's attitudes, school's context, and students' scientific literacy. The model was tested using data from the five-month implementation of a standards-based integrated text/technology/laboratory program, Biology: Exploring Life. The sixteen biology teachers completed two pre-implementation surveys, and 664 of their students completed the three pretests and the corresponding posttests. The initial model did not fit well (χ²(80) = 2784.16; χ²/df = 34.80; GFI = .70; IFI = .49; CFI = .49) and was inadmissible due to the presence of negative variances. After revision of the model, fit improved somewhat (χ²(53) = 1623.97; χ²/df = 30.64; GFI = .77; IFI = .65; CFI = .65), although a negative variance migrated and persisted. The total effects were greatest for the teacher's attitudes (largely indirect, mediated through instructional practices), followed by school's context, and instructional practices. Teacher's education had the lowest total effects due to almost equal but opposite direct effects (positive) and indirect effects (mediated through instructional practices and teacher's attitudes). The investigator concluded that alterable factors such as teachers
Creating NDA working standards through high-fidelity spent fuel modeling
Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E
2012-01-01
The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from
Measurement of the fine-structure constant as a test of the Standard Model
NASA Astrophysics Data System (ADS)
Parker, Richard H.; Yu, Chenghui; Zhong, Weicheng; Estey, Brian; Müller, Holger
2018-04-01
Measurements of the fine-structure constant α require methods from across subfields and are thus powerful tests of the consistency of theory and experiment in physics. Using the recoil frequency of cesium-133 atoms in a matter-wave interferometer, we recorded the most accurate measurement of the fine-structure constant to date: α = 1/137.035999046(27) at 2.0 × 10^-10 accuracy. Using multiphoton interactions (Bragg diffraction and Bloch oscillations), we demonstrate the largest phase (12 million radians) of any Ramsey-Bordé interferometer and control systematic effects at a level of 0.12 part per billion. Comparison with Penning trap measurements of the electron gyromagnetic anomaly g_e - 2 via the Standard Model of particle physics is now limited by the uncertainty in g_e - 2; a 2.5σ tension rejects dark photons as the reason for the unexplained part of the muon's magnetic moment at a 99% confidence level. Implications for dark-sector candidates and electron substructure may be a sign of physics beyond the Standard Model that warrants further investigation.
NASA Astrophysics Data System (ADS)
Liao, Yi; Ma, Xiao-Dong
2017-07-01
We revisit the effective field theory of the standard model that is extended with sterile neutrinos, N. We examine the basis of complete and independent effective operators involving N up to mass dimension seven (dim-7). By employing equations of motion, integration by parts, and Fierz and group identities, we construct relations among operators that were considered independent in the previous literature, and we find 7 redundant operators at dim-6, as well as 16 redundant operators and two new operators at dim-7. The correct numbers of operators involving N are, without counting Hermitian conjugates, 16 (L ∩B )+1 (L ∩B )+2 (L ∩ B) at dim-6 and 47 (L ∩B )+5 (L ∩ B) at dim-7. Here L /B (L/B) stands for lepton/baryon number conservation (violation). We verify our counting by the Hilbert series approach for n_f generations of the standard model fermions and sterile neutrinos. When operators involving different flavors of fermions are counted separately and their Hermitian conjugates are included, we find there are 29 (1614) and 80 (4206) operators involving sterile neutrinos at dim-6 and dim-7, respectively, for n_f = 1 (3).
Constraining the top-Higgs sector of the standard model effective field theory
NASA Astrophysics Data System (ADS)
Cirigliano, V.; Dekens, W.; de Vries, J.; Mereghetti, E.
2016-08-01
Working in the framework of the Standard Model effective field theory, we study chirality-flipping couplings of the top quark to Higgs and gauge bosons. We discuss in detail the renormalization-group evolution to lower energies and investigate direct and indirect contributions to high- and low-energy CP-conserving and CP-violating observables. Our analysis includes constraints from collider observables, precision electroweak tests, flavor physics, and electric dipole moments. We find that indirect probes are competitive or dominant for both CP-even and CP-odd observables, even after accounting for uncertainties associated with hadronic and nuclear matrix elements, illustrating the importance of including operator mixing in constraining the Standard Model effective field theory. We also study scenarios where multiple anomalous top couplings are generated at the high scale, showing that while the bounds on individual couplings relax, strong correlations among couplings survive. Finally, we find that enforcing minimal flavor violation does not significantly affect the bounds on the top couplings.
Direct CP violation in K⁰→ππ: Standard Model Status.
Gisbert, Hector; Pich, Antonio
2018-05-01
In 1988 the NA31 experiment presented the first evidence of direct CP violation in the K⁰→ππ decay amplitudes. A clear signal with a 7.2σ statistical significance was later established with the full data samples from the NA31, E731, NA48 and KTeV experiments, confirming that CP violation is associated with a ΔS=1 quark transition, as predicted by the Standard Model. However, the theoretical prediction for the measured ratio ε'/ε has been a subject of strong controversy over the years. Although the underlying physics was already clarified in 2001, the recent release of improved lattice data has revived the theoretical debate. We review the current status, discussing in detail the different ingredients that enter into the calculation of this observable and the reasons why seemingly contradictory predictions were obtained in the past by several groups. An update of the Standard Model prediction is presented and the prospects for future improvements are analysed. Taking into account all known short-distance and long-distance contributions, one obtains Re(ε'/ε) = (15 ± 7) × 10⁻⁴, in good agreement with the experimental measurement. © 2018 IOP Publishing Ltd.
The Standard Model: how far can it go and how can we tell?
Butterworth, J M
2016-08-28
The Standard Model of particle physics encapsulates our current best understanding of physics at the smallest distances and highest energies. It incorporates quantum electrodynamics (the quantized version of Maxwell's electromagnetism) and the weak and strong interactions, and has survived unmodified for decades, save for the inclusion of non-zero neutrino masses after the observation of neutrino oscillations in the late 1990s. It describes a vast array of data over a wide range of energy scales. I review a selection of these successes, including the remarkably successful prediction of a new scalar boson, a qualitatively new kind of object observed in 2012 at the Large Hadron Collider. New calculational techniques and experimental advances challenge the Standard Model across an ever-wider range of phenomena, now extending significantly above the electroweak symmetry breaking scale. I will outline some of the consequences of these new challenges, and briefly discuss what is still to be found. This article is part of the themed issue 'Unifying physics and technology in light of Maxwell's equations'. © 2016 The Author(s).
Strongly interacting dynamics beyond the standard model on a space-time lattice.
Lucini, Biagio
2010-08-13
Strong theoretical arguments suggest that the Higgs sector of the standard model of electroweak interactions is an effective low-energy theory, with a more fundamental theory expected to emerge at an energy scale of the order of a teraelectronvolt. One possibility is that the more fundamental theory is strongly interacting and the Higgs sector is given by the low-energy dynamics of the underlying theory. I review recent works aimed at determining observable quantities by numerical simulations of strongly interacting theories proposed in the literature to explain the electroweak symmetry-breaking mechanism. These investigations are based on Monte Carlo simulations of the theory formulated on a space-time lattice. I focus on the so-called minimal walking technicolour scenario, an SU(2) gauge theory with two flavours of fermions in the adjoint representation. The emerging picture is that this theory has an infrared fixed point that dominates the large-distance physics. I shall discuss the first numerical determinations of quantities of phenomenological interest for this theory and analyse future directions of quantitative studies of strongly interacting theories beyond the standard model with lattice techniques. In particular, I report on a finite size scaling determination of the chiral condensate anomalous dimension γ, for which 0.05 ≤ γ ≤ 0.25.
NASA Astrophysics Data System (ADS)
Liao, Yi; Ma, Xiao-Dong
2018-03-01
We study two aspects of higher dimensional operators in standard model effective field theory. We first introduce a perturbative power counting rule for the entries in the anomalous dimension matrix of operators with equal mass dimension. The power counting is determined by the number of loops and the difference of the indices of the two operators involved, which in turn is defined by assuming that all terms in the standard model Lagrangian have an equal perturbative power. Then we show that the operators with the lowest index are unique at each mass dimension d, i.e., (H†H)^{d/2} for even d ≥ 4, and (L^T ε H) C (L^T ε H)^T (H†H)^{(d-5)/2} for odd d ≥ 5. Here H, L are the Higgs and lepton doublets, and ε, C are the rank-two antisymmetric matrix and the charge conjugation matrix, respectively. The renormalization group running of these operators can be studied separately from other operators of equal mass dimension at the leading order in power counting. We compute their anomalous dimensions at one loop for general d and find that they are enhanced quadratically in d due to combinatorics. We also make connections with classification of operators in terms of their holomorphic and anti-holomorphic weights. Supported by the National Natural Science Foundation of China under Grant Nos. 11025525, 11575089, and by the CAS Center for Excellence in Particle Physics (CCEPP).
Could a Weak Coupling Massless SU(5) Theory Underly the Standard Model S-Matrix
NASA Astrophysics Data System (ADS)
White, Alan R.
2011-04-01
The unitary Critical Pomeron connects to a unique massless left-handed SU(5) theory that, remarkably, might provide an unconventional underlying unification for the Standard Model. Multi-Regge theory suggests the existence of a bound-state high-energy S-Matrix that replicates Standard Model states and interactions via massless fermion anomaly dynamics. Configurations of anomalous wee gauge boson reggeons play a vacuum-like role. All particles, including neutrinos, are bound-states with dynamical masses (there is no Higgs field) that are formed (in part) by anomaly poles. The contributing zero-momentum chirality transitions break the SU(5) symmetry to vector SU(3)⊗U(1) in the S-Matrix. The high-energy interactions are vector reggeon exchanges accompanied by wee boson sums (odd-signature for the strong interaction and even-signature for the electroweak interaction) that strongly enhance couplings. The very small SU(5) coupling, α_QUD ≲ 1/120, should be reflected in small (Majorana) neutrino masses. A color sextet quark sector, still to be discovered, produces both Dark Matter and Electroweak Symmetry Breaking. Anomaly color factors imply this sector could be produced at the LHC with large cross-sections, and would be definitively identified in double pomeron processes.
The Weak Charge of the Proton. A Search For Physics Beyond the Standard Model
MacEwan, Scott J.
2015-05-01
The Qweak experiment, which completed running in May of 2012 at Jefferson Laboratory, has measured the parity-violating asymmetry in elastic electron-proton scattering at four-momentum transfer Q² = 0.025 (GeV/c)² in order to provide the first direct measurement of the proton's weak charge, Q_W^p. The Standard Model makes firm predictions for the weak charge; deviations from the predicted value would provide strong evidence of new physics beyond the Standard Model. Using an 89% polarized electron beam at 145 μA scattering from a 34.4 cm long liquid hydrogen target, scattered electrons were detected using an array of eight fused-silica detectors placed symmetrically about the beam axis. The parity-violating asymmetry was then measured by reversing the helicity of the incoming electrons and measuring the normalized difference in rate seen in the detectors. The low Q² enables a theoretically clean measurement; the higher-order hadronic corrections are constrained using previous parity-violating electron scattering world data. The experimental method will be discussed, with recent results constituting 4% of our total data and projections of our proposed uncertainties on the full data set.
Paleodynamics of large closed lakes as a standard for climate modeling data verification
NASA Astrophysics Data System (ADS)
Kislov, Alexander
2015-04-01
Observed and reconstructed variations of large closed lakes can serve as a standard for assessing the quality of the runoff simulated by climate models. This provides the opportunity to assess whether models designed for future scenarios are skillful in 'out-of-sample' climate change experiments. Based on general ideas about the temporal dynamics of massive inertial objects, slow changes of the lake level under a semi-steady climate state can be represented as resulting from the accumulation of small anomalies in the water regime; the lake behaves like a kind of "self-developing" system. To test this hypothesis, the water balance model of the Caspian Sea (CS) was used. The relaxation time scale for the CS is estimated as ~20 years. The model is interpreted stochastically; from this perspective, it is a Langevin equation in which the anomalies of precipitation and evaporation act as random white noise, so that the whole can be thought of as an analogue of Brownian motion. Under these conditions, the CS palaeostages during the Holocene are represented by a system undergoing a random walk. It should be emphasized that the modeling results are interpreted from the probabilistic point of view, despite the fact that the model is deterministically based on the physical law of conservation of water mass. Besides the CS, another candidate as a potential evaluation tool for climate model simulations is the Black Sea (BS), until its merger with the Mediterranean. Therefore, although the representation of the CS, BS and other lakes within climate models is very simplified (or absent), changes in their levels could be used to assess the ability of climate models to reproduce the water budget over the catchment areas. The CS and BS catchments cover large parts of the East European Plain and can serve as indicators of climate model quality. However, the use of reconstructed data from other closed lakes is problematic, because their water budget components cannot be simulated with the needed
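The Langevin picture described in this abstract can be sketched numerically. The toy model below (all parameter values hypothetical, not calibrated to the Caspian Sea) integrates dh/dt = -h/τ + ξ(t) with τ = 20 yr by the Euler-Maruyama method, where ξ is white noise representing precipitation/evaporation anomalies:

```python
import numpy as np

rng = np.random.default_rng(0)

tau = 20.0      # relaxation time scale, years (order of the CS estimate)
sigma = 0.3     # noise amplitude, m/yr^0.5 (hypothetical)
dt = 1.0        # time step, years
n_years = 10000

h = np.zeros(n_years)   # lake-level anomaly, m
for i in range(1, n_years):
    # Euler-Maruyama step of the Langevin equation dh = -(h/tau) dt + sigma dW
    h[i] = h[i-1] - (h[i-1] / tau) * dt + sigma * np.sqrt(dt) * rng.normal()

# For t >> tau the level forgets its initial state and fluctuates around zero
# with stationary standard deviation ~ sigma * sqrt(tau / 2)
print(h.std(), sigma * np.sqrt(tau / 2.0))
```

On time scales short compared with τ the trajectory looks like Brownian motion (a random walk), while the -h/τ term keeps the long-run excursions bounded, which is the behavior the abstract invokes for the Holocene palaeostages.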
NASA Astrophysics Data System (ADS)
Postpischl, L.; Morelli, A.; Danecek, P.
2009-04-01
Formats used to represent (and distribute) tomographic earth models differ considerably and are rarely self-consistent. In fact, each earth scientist, or research group, uses specific conventions to encode the various parameterizations used to describe, e.g., seismic wave speed or density in three dimensions, and complete information is often found in related documents or publications (if available at all) only. As a consequence, use of various tomographic models from different authors requires considerable effort, is more cumbersome than it should be and prevents widespread exchange and circulation within the community. We propose a format, based on modern web standards, able to represent different (grid-based) model parameterizations within the same simple text-based environment, easy to write, to parse, and to visualise. The aim is the creation of self-describing data-structures, both human and machine readable, that are automatically recognised by general-purpose software agents, and easily imported in the scientific programming environment. We think that the adoption of such a representation as a standard for the exchange and distribution of earth models can greatly ease their usage and enhance their circulation, both among fellow seismologists and among a broader non-specialist community. The proposed solution uses semantic web technologies, fully fitting the current trends in data accessibility. It is based on JSON (JavaScript Object Notation), a plain-text, human-readable lightweight computer data interchange format, which adopts a hierarchical name-value model for representing simple data structures and associative arrays (called objects). Our implementation allows integration of large datasets with metadata (authors, affiliations, bibliographic references, units of measure etc.) into a single resource. It is equally suited to represent other geo-referenced volumetric quantities — beyond tomographic models — as well as (structured and unstructured
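The flavor of such a self-describing JSON resource can be illustrated with a hypothetical fragment (all field names invented for illustration; the actual schema proposed by the authors may differ):

```python
import json

# Hypothetical self-describing tomographic-model fragment: metadata and a
# regular grid of a volumetric quantity live together in one JSON object.
doc = json.loads("""
{
  "metadata": {
    "authors": ["A. Seismologist"],
    "reference": "Example et al. (2009)",
    "quantity": "vs_anomaly",
    "units": "percent"
  },
  "grid": {
    "lat": [40.0, 41.0],
    "lon": [10.0, 11.0],
    "depth_km": [100.0, 200.0]
  },
  "values": [[[0.5, -0.2], [0.1, 0.3]],
             [[-0.4, 0.2], [0.0, -0.1]]]
}
""")

# Machine readable: a generic agent recovers units and grid shape directly,
# with no side documentation needed.
print(doc["metadata"]["units"])
print(len(doc["values"]), len(doc["values"][0]), len(doc["values"][0][0]))
```

Because the metadata travel inside the same object as the values, a parser in any language can validate and import the model without consulting external documentation, which is the core of the proposal.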
The Top Quark as a Window to Beyond the Standard Model Physics
NASA Astrophysics Data System (ADS)
Yu, Chiu-Tien
The top quark was the last of the Standard Model quarks to be discovered, and is of considerable interest. The closeness of the top quark mass to the electroweak scale suggests that the top quark could be closely related to the mechanisms of electroweak symmetry breaking. Any new physics in electroweak symmetry breaking models could then preferentially couple to the top quark, making the top quark a promising probe for new physics. In this thesis, we explore two aspects of the top quark as a harbinger of new physics: the top forward-backward asymmetry as seen at the Tevatron and the search for stops. We discuss the Asymmetric Left-Right Model (ALRM), a model based on the gauge group U'(1) × SU(2) × SU'(2) with couplings g'_1, g_2, and g'_2 associated with the fields B', W, W', respectively, and show how this model can explain the top forward-backward asymmetry. We then explore the scalar sector of the ALRM, and provide a specific Higgs mechanism that generates the masses for the W' and Z' bosons. The top forward-backward asymmetry is a test of charge-conjugation invariance. Thus, we look at the X-gluon model, a model motivated by the top forward-backward asymmetry, and show that one can use the longitudinal polarization of the top quark to test parity conservation. Finally, we investigate searches for stop squarks, the supersymmetric partners of the top quark, at the Large Hadron Collider (LHC) using shape-based analyses.
Hume, Sam; Aerts, Jozef; Sarnikar, Surendra; Huser, Vojtech
2016-04-01
In order to further advance research and development on the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) standard, the existing research must be well understood. This paper presents a methodological review of the ODM literature. Specifically, it develops a classification schema to categorize the ODM literature according to how the standard has been applied within the clinical research data lifecycle. This paper suggests areas for future research and development that address ODM's limitations and capitalize on its strengths to support new trends in clinical research informatics. A systematic scan of the following databases was performed: (1) ABI/Inform, (2) ACM Digital, (3) AIS eLibrary, (4) Europe PubMed Central, (5) Google Scholar, (6) IEEE Xplore, (7) PubMed, and (8) ScienceDirect. A Web of Science citation analysis was also performed. The search term used on all databases was "CDISC ODM." The two primary inclusion criteria were: (1) the research must examine the use of ODM as an information system solution component, or (2) the research must critically evaluate ODM against a stated solution usage scenario. Out of 2686 articles identified, 266 were included in a title-level review, resulting in 183 articles. An abstract review followed, resulting in 121 remaining articles; and after a full-text scan 69 articles met the inclusion criteria. As the demand for interoperability has increased, ODM has shown remarkable flexibility and has been extended to cover a broad range of data and metadata requirements that reach well beyond ODM's original use cases. This flexibility has yielded research literature that covers a diverse array of topic areas. A classification schema reflecting the use of ODM within the clinical research data lifecycle was created to provide a categorized and consolidated view of the ODM literature. The elements of the framework include: (1) EDC (Electronic Data Capture) and EHR (Electronic Health Record
NASA Technical Reports Server (NTRS)
Sibonga, J. D.; Feiveson, A. H.
2014-01-01
This work was accomplished in support of the Finite Element [FE] Strength Task Group, NASA Johnson Space Center [JSC], Houston, TX. This group was charged with the task of developing rules for using finite-element [FE] bone-strength measures to construct operating bands for bone health that are relevant to astronauts following exposure to spaceflight. FE modeling is a computational tool used by engineers to estimate the failure loads of complex structures. Recently, some engineers have used this tool to characterize the failure loads of the hip in population studies that also monitored fracture outcomes. A Directed Research Task was authorized in July 2012 to investigate FE data from these population studies to derive the proposed standards of bone health as a function of age and gender. The proposed standards make use of an FE-based index that integrates multiple contributors to bone strength, an expanded evaluation that is critical after an astronaut is exposed to spaceflight. The current index of bone health used by NASA is the measurement of areal BMD. There was a concern voiced by a research and clinical advisory panel that the sole use of areal BMD would be insufficient to fully evaluate the effects of spaceflight on the hip. Hence, NASA may not have a full understanding of fracture risk, both during and after a mission, and may be poorly estimating in-flight countermeasure efficacy. The FE Strength Task Group - composed of principal investigators of the aforementioned population studies and of FE modelers - donated some of its population QCT data to estimate hip bone strength by FE modeling for this specific purpose. Consequently, Human Health Countermeasures [HHC] has compiled a dataset of FE hip strengths, generated by a single FE modeling approach, from human subjects (approx. 1060) with ages covering the age range of the astronauts. The dataset has been analyzed to generate a set of FE strength cutoffs for the following scenarios: a) Qualify an
The standardized live patient and mechanical patient models--their roles in trauma teaching.
Ali, Jameel; Al Ahmadi, Khalid; Williams, Jack Ivan; Cherry, Robert Allen
2009-01-01
We have previously demonstrated improved medical student performance using standardized live patient models in the Trauma Evaluation and Management (TEAM) program. The trauma manikin has also been offered as an option for teaching trauma skills in this program. In this study, we compare performance using both models. Final year medical students were randomly assigned to three groups: group I (n = 22) with neither model, group II (n = 24) with the patient model, and group III (n = 24) with the mechanical model using the same clinical scenario. All students completed pre-TEAM and post-TEAM multiple choice question (MCQ) exams and an evaluation questionnaire scoring five items on a scale of 1 to 5, with 5 being the highest. The items were: objectives met, knowledge improved, skills improved, overall satisfaction, and course should be mandatory. Students (groups II and III) then switched models, rating preferences in six categories: more challenging, more interesting, more dynamic, more enjoyable learning, more realistic, and overall better model. Scores were analyzed by ANOVA, with p < 0.05 considered statistically significant. All groups had similar scores (mean % ± SD) in the pretest (group I - 50.8 ± 7.4, group II - 51.3 ± 6.4, group III - 51.1 ± 6.6). All groups improved their post-test scores, but groups II and III scored higher than group I, with no difference in scores between groups II and III (group I - 77.5 ± 3.8, group II - 84.8 ± 3.6, group III - 86.3 ± 3.2). The percentages of students scoring 5 in the questionnaire are as follows: objectives met - 100% for all groups; knowledge improved: group I - 91%, group II - 96%, group III - 92%; skills improved: group I - 9%, group II - 83%, group III - 96%; overall satisfaction: group I - 91%, group II - 92%, group III - 92%; should be mandatory: group I - 32%, group II - 96%, group III - 100%. Student preferences (48 students) are as follows: the mechanical model was more challenging (44 of 48); more
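As a check on the group comparison reported in this abstract, a one-way ANOVA F statistic can be reconstructed from the published summary statistics alone (a sketch; the group sizes, post-test means, and SDs are taken from the abstract, and this is the standard summary-statistics formula, not necessarily the exact analysis the authors ran):

```python
# One-way ANOVA from summary statistics (n, mean, SD) for the post-test
# MCQ scores: group I 77.5±3.8 (n=22), II 84.8±3.6 (n=24), III 86.3±3.2 (n=24).
ns = [22, 24, 24]
means = [77.5, 84.8, 86.3]
sds = [3.8, 3.6, 3.2]

N = sum(ns)
k = len(ns)
grand_mean = sum(n * m for n, m in zip(ns, means)) / N

# Between-group and within-group sums of squares
ss_between = sum(n * (m - grand_mean) ** 2 for n, m in zip(ns, means))
ss_within = sum((n - 1) * s ** 2 for n, s in zip(ns, sds))

F = (ss_between / (k - 1)) / (ss_within / (N - k))
print(f"F({k - 1},{N - k}) = {F:.1f}")
```

The resulting F is far above the critical value for p < 0.05 with (2, 67) degrees of freedom, consistent with the significant group differences the study reports.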
Bosonic seesaw mechanism in a classically conformal extension of the Standard Model
Haba, Naoyuki; Ishida, Hiroyuki; Okada, Nobuchika; ...
2016-01-29
We suggest the so-called bosonic seesaw mechanism in the context of a classically conformal U(1) B-L extension of the Standard Model with two Higgs doublet fields. The U(1) B-L symmetry is radiatively broken via the Coleman–Weinberg mechanism, which also generates the mass terms for the two Higgs doublets through quartic Higgs couplings. Their masses are all positive but, nevertheless, the electroweak symmetry breaking is realized by the bosonic seesaw mechanism. Analyzing the renormalization group evolutions for all model couplings, we find that a large hierarchy among the quartic Higgs couplings, which is crucial for the bosonic seesaw mechanism to work, is dramatically reduced toward high energies. Therefore, the bosonic seesaw is naturally realized with only a mild hierarchy, if some fundamental theory, which provides the origin of the classically conformal invariance, completes our model at some high energy, for example, the Planck scale. In conclusion, we identify the regions of model parameters which satisfy the perturbativity of the running couplings and the electroweak vacuum stability as well as the naturalness of the electroweak scale.
Mincarone, Pierpaolo; Leo, Carlo Giacomo; Trujillo-Martín, Maria Del Mar; Manson, Jan; Guarino, Roberto; Ponzini, Giuseppe; Sabina, Saverio
2018-04-01
The importance of working toward quality improvement in healthcare implies an increasing interest in analysing, understanding and optimizing the process logic and sequences of activities embedded in healthcare processes. Their graphical representation promotes faster learning, higher retention and better compliance. This study identifies standardized graphical languages and notations applied to patient care processes and investigates their usefulness in the healthcare setting. Peer-reviewed literature up to 19 May 2016 was searched, and the information was complemented by a questionnaire sent to the authors of the selected studies. The systematic review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, and five authors extracted the results of the selected studies. Ten articles met the inclusion criteria. One notation and one language for healthcare process modelling were identified with applications to patient care processes: Business Process Model and Notation (BPMN) and the Unified Modeling Language™ (UML). One of the authors of every selected study completed the questionnaire. Comprehensibility for users and facilitation of inter-professional analysis of processes were recognized, in the completed questionnaires, as major strengths of process modelling in healthcare. Both the notation and the language could increase clarity of presentation thanks to their visual properties, their capacity to manage macro and micro scenarios easily, and their ability to represent process logic clearly and precisely. Both could also increase the applicability of guidelines and pathways by representing complex scenarios through charts and algorithms, hence helping to reduce unjustified practice variations that negatively impact quality of care and patient safety.
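As an illustration of the kind of patient care process such notations capture, a minimal BPMN 2.0 XML fragment for a hypothetical triage pathway might look as follows (the element names are standard BPMN 2.0; the process itself is invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<definitions xmlns="http://www.omg.org/spec/BPMN/20100524/MODEL"
             targetNamespace="http://example.org/care-pathway">
  <process id="triagePathway" isExecutable="false">
    <startEvent id="patientArrives"/>
    <sequenceFlow id="f1" sourceRef="patientArrives" targetRef="assessPatient"/>
    <userTask id="assessPatient" name="Triage assessment"/>
    <sequenceFlow id="f2" sourceRef="assessPatient" targetRef="urgent"/>
    <exclusiveGateway id="urgent" name="Urgent case?"/>
    <sequenceFlow id="f3" sourceRef="urgent" targetRef="treatNow"/>
    <sequenceFlow id="f4" sourceRef="urgent" targetRef="scheduleVisit"/>
    <userTask id="treatNow" name="Immediate treatment"/>
    <userTask id="scheduleVisit" name="Schedule outpatient visit"/>
    <sequenceFlow id="f5" sourceRef="treatNow" targetRef="done"/>
    <sequenceFlow id="f6" sourceRef="scheduleVisit" targetRef="done"/>
    <endEvent id="done"/>
  </process>
</definitions>
```

The exclusive gateway makes the clinical decision point explicit, which is the kind of visual clarity the reviewed studies credit to these notations.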
Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process
NASA Astrophysics Data System (ADS)
Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph
2012-08-01
Even powerful computational techniques like simulation have limited validity domains. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. The fidelity, accuracy and validity of simulation models shall therefore be monitored in context all along the design phases to build confidence that the goals of modelling and simulation are achieved. In the CRESCENDO project, we adapt the Credibility Assessment Scale method from the NASA standard for models and simulations, developed for the space programme, to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the level of confidence in the results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal finite element analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.
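A dashboard roll-up of per-factor credibility scores of the kind described above can be sketched as follows. This is a hypothetical illustration, not the CRESCENDO implementation: the factor names follow NASA-STD-7009's credibility assessment factors, but the "worst factor" roll-up and the flagging threshold are assumptions chosen for the example.

```python
# Hypothetical credibility-dashboard roll-up for simulation quality metrics.
# Factor names follow NASA-STD-7009; the aggregation rule (minimum score)
# and the flagging threshold are illustrative assumptions.

FACTORS = [
    "verification", "validation", "input pedigree", "results uncertainty",
    "results robustness", "use history", "M&S management",
    "people qualifications",
]

def credibility_summary(scores):
    """scores: dict mapping factor name -> level on a 0-4 scale (higher = more credible).

    Returns the per-factor scores, the worst (floor) score, and the
    factors flagged as too weak to support a design decision.
    """
    assert set(scores) == set(FACTORS), "one score per factor is required"
    worst = min(scores.values())
    return {
        "per_factor": scores,
        "overall_floor": worst,                       # a chain is as strong as its weakest link
        "flagged": [f for f, s in scores.items() if s < 2],
    }

# Example: a simulation scoring well everywhere except validation
scores = {f: 3 for f in FACTORS}
scores["validation"] = 1
summary = credibility_summary(scores)
print(summary["overall_floor"], summary["flagged"])
```

A dashboard built on such a summary would surface the weak factor (here, validation) before a trade-off decision is green-lit at a programme milestone.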